Automated Service Ticket Routing with Deep Learning on Azure
Published Sep 23 2019 06:59 PM 9,864 Views
Microsoft

In this two-part blog series, we explore a robust end-to-end architecture powered by modern deep learning techniques and built on Microsoft Azure to implement an automated service ticket routing solution.

In the first part, we discuss key architectural details highlighting the usage of serverless and PaaS services in Microsoft Azure that allow the rapid implementation of the solution presented here or a similar one.

In the upcoming second part of this series we get into the details of designing and developing the machine learning model used for the categorization of service tickets, using advanced deep learning and NLP (Natural Language Processing) techniques on the Azure Machine Learning service platform.

 

Why service ticket routing

Here we consider service ticket routing as the process of delivering a customer service ticket to the right recipient in a customer service organization.

As businesses become increasingly customer-centric, accurate and agile service ticket routing has become ever more important as the foundation of an efficient and enjoyable communication channel for customers.

Moreover, modern businesses rely less on traditional human interactions in their Help Desk systems in favor of more automated channels such as bots, mobile apps, or text messaging.

In this context, being able to automatically route support tickets, customer inquiries, or complaints to the right channel can have a major impact on customer satisfaction, and beyond that on operational efficiency and sales.

 

The challenges of service ticket routing

Traditionally, this process is initiated by a user interacting with a product or service provider through one of its available customer interaction channels.

This will then trigger a service ticket creation in a Help Desk or Service Management solution.

Finally, this ticket ideally needs to be delivered to the right recipient, which can be, for example, the product or service support organization, a sales team, or a service or product specialist.

Routing those tickets manually is usually a slow, inefficient, error-prone, and non-scalable process. If a ticket is not routed to the right recipient the first time, it can compromise a business in several ways: from breached service-level agreements to damaged brand reputation. This is where a machine-learning-enabled solution can help.

 

Automated ticket routing enabled by Machine Learning

Unless the service ticket is created or processed by a knowledgeable person who can correctly identify the right recipient, implementing automated routing requires an automated way to categorize the information that comprises the ticket.

A critical part of this process is to rely on NLP (Natural Language Processing) and Machine Learning techniques to automatically categorize text information extracted from the tickets, in order to match a given category with the correct recipient.
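To make the categorization step concrete, here is a minimal classical baseline using TF-IDF features and logistic regression from scikit-learn. The example tickets and category names ("billing", "network") are hypothetical, and this simple model is only a stand-in for the deep learning model that the second part of this series covers.

```python
# A minimal text-categorization baseline: TF-IDF + logistic regression.
# Categories and example tickets are hypothetical illustrations.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tickets = [
    "I was charged twice on my last invoice",
    "Please refund the duplicate payment on my bill",
    "The wifi connection keeps dropping every hour",
    "Router is offline and the network is unreachable",
]
categories = ["billing", "billing", "network", "network"]

# Fit a pipeline that vectorizes raw text and trains a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tickets, categories)

# Categorize a new, unseen ticket.
print(model.predict(["Why is there an extra charge on my invoice?"])[0])
```

In the real solution, each predicted category is then matched to a recipient such as a support queue or a specialist team.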

 

Proposed end-to-end solution architecture on Azure

Here we discuss, at a high level, the solution architecture proposed on Microsoft Azure that supports a generic workflow for automated ticket routing.

We considered the following architecture principles when designing it:

  • Modularity: the solution should be flexible in order to easily integrate with Help Desk or Service Management platforms through common interfaces. Consider, as an example, an integration with the Zendesk Support platform.
  • Elasticity: the solution should rapidly and automatically accommodate unpredictable processing demands and then return to normal operations, consuming only the resources needed.
  • Simplified Management and Operations: the solution should abstract the details of the underlying platform, requiring as few management and operations tasks as possible.
  • Availability: the solution should consider inherent or easy-to-implement highly available components.
  • Pay per use: the solution should consider components that are billed per amount of data stored and/or processed, or per hour of usage.
  • Non-intrusive: the solution should be as non-intrusive as possible, avoiding customized code to be developed in the service management platform.

Considering these principles, a best practice is to use Serverless and PaaS solutions on Azure to design the proposed architecture.

The core objective of this architecture is to support the data ingestion, integration, and processing workflows needed to develop and implement the machine learning model for text categorization, as well as triggering the automated ticket routing in the service ticket management platform.

In the diagram below we have a high-level overview of this solution architecture:

 


Fig. 1: high-level overview of the proposed solution architecture

 

There are only two integration points in the proposed architecture:

  • The REST API in the service management platform: here we show an example suited for the Zendesk Support platform but competing solutions also have REST-based APIs. Moreover, as the proposed solution is modular, it is easy to integrate with other interfaces.
  • The Data Export functionality in Zendesk Support: this functionality allows data to be exported from the internal Zendesk database to text files. Although this is specific to Zendesk, the architecture can easily be adapted to capture data in other ways, such as reading directly from a database.

This architecture supports two distinct data ingestion workflow patterns:

  • Data Ingestion (batch): this workflow is executed for the first batch of historical data ingestion from the service ticket management platform, and subsequent batch data ingestions according to the frequency needed for the machine learning model retraining.
  • Data Ingestion (near real-time): this workflow is executed following a time-based trigger (for example, every minute) in order to process the latest service tickets created.
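The time-based trigger for the near real-time ingestion workflow can be expressed as an Azure Functions timer binding. The sketch below is a hypothetical `function.json` fragment; the six-field NCRONTAB expression `0 * * * * *` fires at second zero of every minute.

```json
{
  "bindings": [
    {
      "name": "timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 * * * * *"
    }
  ]
}
```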

Following each of the two data ingestion workflows described above, we also have two data processing workflows:

  • Data Preparation and Model Development: this workflow is executed whenever a new machine learning development cycle is needed. It performs the data preparation, model training, and model operationalization tasks needed to deploy a new machine learning model.
  • Model Scoring: this workflow is triggered, following the near real-time data ingestion, in order to categorize the latest service tickets ingested and trigger the ticket routing process in the service management platform.

 

How a typical workflow is processed in the proposed solution

Here we describe in more detail how the typical workflows for model development, training, and scoring are performed following the numeric labels depicted in the diagram below:

 


Fig. 2: workflows for model development, training, and scoring

 

Training and deploying the machine learning model:

  1. Historical service ticket data is exported from the Zendesk platform, using its Data Export functionality, and placed into a file share provided by Azure Files.
  2. A Copy Data job in Azure Data Factory is executed to copy the service ticket data into an Azure Data Lake Storage.
  3. Azure Data Factory executes a Databricks Notebook Activity, in order to trigger the training of the machine learning model that performs service ticket categorization.
  4. The Notebook referenced above is executed on Azure Databricks. It contains the Python code that reads the historical service ticket data from Azure Data Lake Storage, prepares the data for model training, trains the model, evaluates it, and saves the data preparation script, the scoring script, and the trained model back in the Azure Data Lake Storage for further deployment. Model training leverages the functionality provided by Azure Machine Learning service.
  5. Azure Data Factory executes another Databricks Notebook Activity, this time to trigger the deployment of both the data preparation script and the model scoring script to be used in the process of categorizing new service tickets.
  6. The Notebook referenced above is executed on Azure Databricks. It reads both the data preparation script and the model scoring script with the trained model saved in Azure Data Lake Storage and uses Azure Machine Learning service to deploy those scripts as web services in Azure Kubernetes Service. Both Azure Machine Learning service and Azure Databricks can benefit from MLOps (Machine Learning Operations) best practices. Notebooks in Azure Databricks can be integrated with Azure DevOps for version control and Azure ML service can automate the end-to-end machine learning life cycle for activities like model updating, testing, and continuous integration.

 

Using the Machine Learning model to drive the routing of new service tickets:

  1. A script that runs in Azure Functions is used to call the Zendesk Support REST API, which allows it to retrieve the latest support tickets created in the system since a given timestamp. It is configured to be executed through a built-in time-based trigger.
  2. After retrieving the tickets, this script writes them to an Azure Event Hub, so that they can be consumed further down in the processing workflow.
  3. That same script also logs its operations to Azure Application Insights, for application logging purposes.
  4. Another script running in Azure Functions is triggered upon the publishing of service ticket data into the Azure Event Hub.
  5. That script first calls the data preparation service that was published as a web service in Azure Kubernetes Service, passing the raw service ticket data to be prepared. Then, it calls the machine learning scoring script, also published in Azure Kubernetes Service, passing the prepared service ticket data to be categorized.
  6. The script above then uses the categorized service ticket in another call to the Zendesk Support API, now to trigger the service ticket routing in Zendesk.
  7. Finally, the script also logs its operations to Azure Application Insights, for logging purposes.

 

Final remarks

The modular and composable architectural approach, coupled with serverless and PaaS services and platforms available on Microsoft Azure, allows organizations of any size and budget to benefit from the transformational power provided by cloud-based, AI-driven solutions.

Here we showed an example of a solution architecture implemented to address the problem of automated service ticket routing, with recommended patterns for data ingestion, data preparation, machine learning model development and operationalization on Microsoft Azure. To learn more about Microsoft-proposed solution architectures on Azure, please refer to this documentation.

Stay tuned for the second part of this series, where we explore the details of the machine learning model and how to develop and operationalize it using the Azure Machine Learning service platform.

 

 


Version history
Last update: Feb 12 2020