Using CRM data for grounding in a retrieval-augmented generation (RAG) pattern
Solution idea
This solution shows how data from a CRM can be used as the grounding data in a retrieval augmented generation (RAG) pattern. The AI assistant can help resolve or suggest resolutions for customer requests using language models from Azure OpenAI with customer data from Dataverse or any other CRM solution. CRM data can be used for situations such as:
- Requests for clarification or information about the service offered
- Customer complaints about an issue with the service they received
- Requests for a refund or warranty claim on a service or product
The application understands the request, which can be multi-modal (text, voice, or images), and generates human-like responses. The solution supports summarization, category classification, sentiment analysis, and suggested next steps.
Architecture
The architecture diagram shows the dataflow in solid lines and the inference flow in dotted lines.
Dataflow
- Dynamics Dataverse houses all CRM customer activity, such as purchases, requests for information, and customer service claims. This data source is the source of truth for all customer processing operations. File systems and Azure Data Lake Storage store all the internal and customer-facing policy and guideline files.
- Azure Document Intelligence and Data Factory are used to extract the relevant data and load it into AI Search. This process runs daily to update the data in AI Search. After this process, Azure AI Search has all the grounding data needed to support the AI functionality in this workload. During inferencing, this supporting information is retrieved and sent to the language model for automated claim processing.
- Azure OpenAI Service generates the responses by using the retrieved search results. Prompt flow provides the framework for orchestrating this model.
- a) Model design: The language model, prompt crafting, and custom code make it easier to build a flow.
- b) Model evaluation: The built-in metrics help evaluate the generated language output. For this application, relevancy and groundedness are suitable evaluation metrics.
- c) Model deployment: The best performing model can be deployed into a new or existing endpoint.
- d) The Azure OpenAI language model GPT-4o is used with prompt flow RAG-pattern interactions to process the customer requests.
- The best-performing model is deployed as an endpoint.
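The daily ingestion step above typically involves splitting the extracted policy and guideline text into overlapping chunks before indexing it in AI Search. The following is a minimal, hypothetical sketch of that chunking step; the chunk size and overlap values are assumptions for illustration, not values prescribed by this solution.

```python
# Hypothetical chunking sketch: after Document Intelligence extracts text from
# a policy document, the text is split into overlapping chunks so each indexed
# document in AI Search stays small enough to use as grounding context.
# chunk_size and overlap values below are illustrative assumptions.

def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split extracted text into fixed-size chunks that overlap slightly."""
    chunks = []
    start = 0
    while start < len(text):
        end = min(start + chunk_size, len(text))
        chunks.append(text[start:end])
        if end == len(text):
            break
        start = end - overlap  # overlap preserves context across chunk boundaries
    return chunks

# Example: a repeated policy sentence standing in for extracted document text.
policy_text = "Baggage allowance: each passenger may check one bag up to 23 kg. " * 20
chunks = chunk_text(policy_text)
```

In practice, chunking on sentence or paragraph boundaries usually produces better retrieval quality than fixed character windows, but the fixed-window version keeps the idea visible.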
Inference Flow
- The model inference works as follows:
- a) Real-time endpoint: Prompt flow receives a customer request online, evaluates it, classifies it into the appropriate group, identifies the necessary next steps, and suggests a response.
- b) The application can call the endpoint for quick customer information requests.
- c) Both real-time and batch processing generate the responses for customer requests. The generated results are quality checked and audited regularly.
- Batch processing resolves requests at regularly scheduled times.
- a) Logic Apps executes the batch processing as a daily overnight run that processes the customer requests.
- b) Generated results are saved in Azure Data Lake Storage (ADLS). For example, when a request about airline baggage limits is received, the response email contains the necessary details, and that response is stored in ADLS.
- Customer requests for relevant information that come through the web application can be processed without further manual intervention. Remediation that involves a financial transaction requires a human representative to process the claim.
- Insights into the processed customer claims, the suggested compensation, and the remediation measures are presented in the end-user UI for customer support agents or business managers.
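The classify-then-route behavior of the inference flow can be sketched as a small decision function. This is an illustrative assumption, not the solution's actual logic: in the real flow the category comes from the language model's classification, and the auto-approval threshold is a business rule you would define.

```python
# Illustrative routing sketch for the inference flow described above.
# Assumptions: category labels ("information", "remediation") and the
# auto-approval limit are hypothetical; classification is normally produced
# by the language model in prompt flow, not by application code.

def route_request(category: str, amount: float = 0.0,
                  auto_approve_limit: float = 50.0) -> str:
    """Decide how a classified customer request is handled."""
    if category == "information":
        return "auto_respond"       # look up details and respond automatically
    if category == "remediation":
        if amount <= auto_approve_limit:
            return "auto_remediate"  # small refunds can be handled without an agent
        return "human_review"        # financial transactions need a representative
    return "human_review"            # unknown categories are escalated

decisions = [
    route_request("information"),
    route_request("remediation", amount=20.0),
    route_request("remediation", amount=500.0),
]
```

The same function applies whether the request arrives through the real-time endpoint or through the nightly Logic Apps batch run; only the invocation schedule differs.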
Components
- Azure OpenAI Service is a platform provided by Microsoft that offers access to powerful language models developed by OpenAI, including GPT-4, GPT-4o, GPT-4o mini, and others. In this scenario, the service is used for all natural language understanding and for generating communications to the customers.
- AI Search is a service that provides secure information retrieval at scale over user-owned content. The service enables semantic search over a copy of the data originally found in CRM.
- Prompt flow is a development tool designed to streamline the entire development cycle of AI applications, including those powered by language models. In this solution, this is the orchestration layer for prototyping, experimenting, iterating, and evaluating model parameters.
- Azure Data Factory is a cloud-based data integration service that allows creation of data-driven workflows for orchestrating and automating data movement and data transformation. Data Factory is used to bring the data from Dataverse and then add records to Azure AI Search.
- Data Lake Storage is a secure cloud platform that provides scalable, cost-effective storage for big data analytics. Once a day, Azure Document Intelligence and Azure Data Factory are used to extract new and updated data from ADLS and Dataverse, chunk it, and store relevant data into Azure AI Search.
- Azure AI Document Intelligence is a cloud-based Azure AI service that uses machine learning to extract text, key-value pairs, and tables from documents. The prebuilt models are used to extract text from the policy documents.
- Azure Web Apps hosts web applications, and provides autoscaling, load balancing, high availability, and auto-patching. It supports both Windows and Linux platforms. The customer-facing UI that is used by the customer to submit requests or get responses is hosted in this service.
- Power BI is a self-service, enterprise analytics solution that allows you to visualize your data, share insights across your organization, and embed them in your app or website. Power BI is used in this scenario to surface a summary of the requests that are processed in the application.
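The ingestion path through Data Factory into AI Search ultimately pushes document batches to the index. As a hedged sketch, the JSON body below follows the shape of the Azure AI Search "index documents" REST operation (`POST /indexes/{index-name}/docs/index`), which uses an `@search.action` per document; the field names `id`, `content`, and `sourcefile` are assumptions standing in for whatever index schema the workload actually defines.

```python
import json

# Sketch of a document batch for the Azure AI Search "index documents" REST
# operation. "@search.action" is part of the real REST contract; the field
# names below (id, content, sourcefile) are hypothetical schema fields.

def build_upload_batch(chunks: list[dict]) -> str:
    """Serialize chunked documents into an AI Search upload batch."""
    batch = {
        "value": [
            {
                "@search.action": "mergeOrUpload",  # upsert each chunk by key
                "id": c["id"],
                "content": c["content"],
                "sourcefile": c["sourcefile"],
            }
            for c in chunks
        ]
    }
    return json.dumps(batch)

payload = build_upload_batch(
    [{"id": "policy-001-0",
      "content": "Refunds are issued within 14 days of an approved claim.",
      "sourcefile": "refund-policy.pdf"}]
)
```

In the deployed solution, Data Factory's built-in AI Search sink (or the `azure-search-documents` SDK) performs this upload; the sketch only makes the payload shape concrete.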
Solution details
Conventional CRM process:
Customer claim processing is complex and can require significant costs in customer service business units. The support team would manually identify the claim classification and eventually process or remediate the issue, with only a subset needing compensation.
AI Knowledge assistant using CRM data:
The language model prompt combines CRM data with the original intent, following the RAG pattern. This gives the language model new context that it wasn't trained with. The Dynamics system holds the customer data and the CRM requests. Azure Data Factory ingests the data from Dataverse into Azure AI Search through data transformation pipelines. Document Intelligence extracts information from the relevant guidelines and attachments in the data store. Azure Data Factory also loads this extracted data into Azure AI Search.
All the information in AI Search forms the knowledge base for the LLM processing. Azure OpenAI interacts with the corresponding indexes to retrieve relevant supporting information for its inference. The claim processing follows these steps:
- Classification of the incoming requests
- Based on the identified request type, the process follows one of two paths:
- For a request asking for information, look up the correct details and present the response.
- For a request asking for remediation, follow the guideline documents and past resolution history to infer the appropriate action. The actions could include providing assistance or compensation, depending on the kind of issue. An additional review is needed to audit the relevant results.
- All the responses generated are retained and updated back in Dataverse.
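The grounding step in these claim-processing stages can be made concrete with a small sketch of prompt assembly: retrieved AI Search results are injected as context alongside the customer's request, in the chat-message format Azure OpenAI consumes. The system-message wording and the source-tagging scheme are assumptions for illustration, not the solution's actual prompt.

```python
# Hypothetical RAG prompt assembly: grounding documents retrieved from
# AI Search are placed in the system message, and the customer's request
# becomes the user message. Wording and source tags are illustrative.

def build_grounded_prompt(request: str, search_results: list[str]) -> list[dict]:
    """Combine retrieved grounding documents with the customer request."""
    context = "\n\n".join(
        f"[source {i + 1}] {doc}" for i, doc in enumerate(search_results)
    )
    system = (
        "You are a customer service assistant. Answer only from the sources "
        "below. If the sources do not contain the answer, say so.\n\n" + context
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": request},
    ]

messages = build_grounded_prompt(
    "What is the checked baggage limit?",
    ["Each passenger may check one bag up to 23 kg."],
)
```

In the deployed solution, prompt flow performs this orchestration (retrieval, prompt construction, and the model call); the sketch isolates only the prompt-construction step.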
The online endpoint is used for real-time inference, and batch processing with Logic Apps is used for daily runs. Here is a snapshot of the prompt flow, which references customer purchases of a product or service and the relevant service-level guidelines for those purchases.
Claim processing with AI
The CRM process can be optimized by introducing AI processing. Not all customer requests need the attention of an expert in claim processing. Where the customer needs information, quick processing of the request can be handled by model inference. Claims can be processed by classifying them and applying the appropriate remediation if needed. Only a few claims need the analysis that leads to awarding a refund or compensation.
By using AI technologies, companies can streamline their operations, improve accuracy, and provide better customer experiences. Here are some key applications of AI in customer claim processing:
- Automated Data Extraction: Document Intelligence can expedite the processing by reading the supporting documents so information can eventually be added to the context used by the language model.
- Natural Language Processing (NLP) using Azure OpenAI and Azure AI Search: NLP enables understanding and interpreting customer communications, such as emails, claim requests, chat messages. This allows AI to categorize and prioritize claims based on urgency and complexity, ensuring that high-priority cases are addressed promptly. Also, automatic emails can be generated with relevant text rather than template-email formats.
- Predictive Analytics: AI can analyze historical claim data to predict future trends and identify potential risks. This helps companies to proactively manage their resources and make informed decisions about claim settlements.
- Customer Support: AI-powered chatbots and virtual assistants can provide instant support to customers, answering their queries and guiding them through the CRM process. This improves customer satisfaction and reduces the workload on human agents.
- Image Recognition: AI can analyze images and videos submitted as part of a claim to assess the extent of damage or verify the authenticity of the claim. This is particularly useful in industries like insurance, where visual evidence is often required. Using models that support multi-modal analysis can help with evaluating the claims.
Alternative solution
The solution can be extended with new functionality or using some alternative technologies. Some examples follow.
- Fine-tuning Azure OpenAI models enables the language models to gain specific knowledge of your industry or company, so responses are more relevant to your CRM operations.
- Data Factory can be replaced with near-real-time mirroring of the CRM data from Dynamics. This could be a more efficient dataflow setup.
- Copilots in Dynamics: Custom copilots can use out-of-the-box features when the claims are less complex.
- Other frameworks, such as Semantic Kernel or the Assistants API, can be used for the implementation.
- Instead of responses being stored back in Azure Data Lake Storage, those responses could be stored in another database, such as Cosmos DB.
Potential use cases
This solution applies to claim processing across industry verticals in customer relationship management services. It can be used in any customer service situation where a response is needed based on customer data (products purchased or feedback from the customer) and the details of the offerings. If the remediation requires a refund or monetary award, only claims for small amounts can be auto-approved; any higher amount is reviewed by the representative servicing the claim. The solution can be applied to the following areas:
- E-commerce or retail: The customer complaints or claims processing can be automated based on the purchase transactions and the product details.
- Telecom: The application can refer to product billing and explain the details the customer is looking for.
- Insurance quotes: Auto/Home insurance product documentation can be used as the knowledge base. The historical and new actuarial analysis can help tailor a relevant quote to a customer.
- Hospitality: Reservations can be automated with process automation. If the customer is looking for information or wants to provide feedback, the language model can respond with a more relevant conversation.
- Manufacturing: Explanations of product manuals and services can be more nuanced when the customer reaches out with a question. The AI solution goes beyond plain text search; the responses generated are based on language understanding.
Contributors
This article is maintained by Microsoft. It was originally written by the following contributors.
Principal author:
- Charitha Basani | Senior Cloud Solutions Architect