Azure Functions triggers and bindings for building intelligent apps with Azure OpenAI
Published Jun 19 2024 07:25 AM
Microsoft


 

Over the last year, we have seen strong interest from customers in bringing intelligence to their existing and new applications, driven by the recent innovation in artificial intelligence and particularly Azure OpenAI.

 

After working with customers, a few common challenges have shown up when building these intelligent applications.

  1. Developers often feel that they need to become AI engineers to integrate these capabilities into their applications.
  2. Most samples are in languages like Python, which is often not the expertise of existing developers or the language the organization's applications are built with.
  3. Developers also find it hard to predict how successful their OpenAI integration will be, so right-sizing their application in this new space can be very challenging.
  4. Lastly, although Microsoft provides multiple SDKs for working with OpenAI, sophisticated applications still seem to require quite a few external tools and SDKs for which clear support is not present.

 

To help with these challenges, Azure Functions now supports a new extension for OpenAI that contains a set of triggers and bindings, making it easier to build applications that require the following capabilities:

 

Retrieval Augmented Generation (Bring your own data for semantic search)

  • Data ingestion with Functions bindings.
  • Automatic chunking and embeddings creation.
  • Store embeddings in a vector database, including Azure AI Search, Azure Cosmos DB for MongoDB, and Azure Data Explorer.
  • Binding that takes a prompt, retrieves relevant documents, sends them to the OpenAI LLM, and returns the response to the user.

 

Text completion for content summarization and creation

  • Input binding that takes a prompt or content and returns the response from the LLM.

 

Chat assistants

  • Input binding to chat with LLMs.
  • Output binding to retrieve chat history from persisted storage.
  • Skills trigger to extend capabilities of OpenAI LLM through natural language. 

 

These bindings are available in preview for C#, Java, Python, Node, and PowerShell. See the documentation for more information.

 

 

Use your own data with Azure OpenAI

A typical architecture when bringing your own data for semantic search using Retrieval Augmented Generation is shown below.

 

[Architecture diagram: Retrieval Augmented Generation with Azure Functions and Azure OpenAI]

 

 

In this example, customer documents are uploaded by a client and stored for later retrieval. These documents are broken into chunks and sent to Azure OpenAI to create embeddings of the content. The embeddings are then stored in a vector database like Azure AI Search, Azure Cosmos DB for MongoDB, or Azure Data Explorer. These are the vector stores currently supported by the Azure Functions OpenAI extension, and more will be added in the future.
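The ingestion path above can be sketched in a framework-agnostic way: split a document into overlapping chunks, embed each chunk, and persist the (chunk, vector) pairs. Everything here is illustrative, not the extension's API — the `embed` stub stands in for a call to an Azure OpenAI embeddings deployment, and the in-memory `vector_store` list stands in for a real vector database.

```python
def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into overlapping character chunks (real chunkers use tokens)."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed(text: str, dim: int = 8) -> list[float]:
    """Stand-in for an Azure OpenAI embeddings call: a toy, normalized
    bag-of-bytes vector. A real embedding has ~1536 learned dimensions."""
    vec = [0.0] * dim
    for b in text.encode():
        vec[b % dim] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

# Stand-in for AI Search / Cosmos DB for MongoDB / Azure Data Explorer.
vector_store: list[tuple[str, list[float]]] = []

def ingest(document: str) -> int:
    """Chunk a document, embed each chunk, and persist it; return store size."""
    for piece in chunk(document):
        vector_store.append((piece, embed(piece)))
    return len(vector_store)
```

The extension performs these same steps automatically behind its data-ingestion output binding, so application code never handles chunking or embedding directly.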

 

Once the documents are successfully stored in the vector store, the client can ask questions of this content, enhancing the application with organizational data through natural language and Azure OpenAI.

 

This is done by sending the prompt to the Azure Functions binding, which forwards it to Azure OpenAI to create an embedding. That embedding is then used to run a semantic search against the vector store and retrieve relevant content. The retrieved content is sent to Azure OpenAI along with the initial prompt, and the resulting answer is returned to the client to enhance the application.
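The query-time half of that flow is, at its core, a similarity search followed by prompt assembly. A minimal sketch, again with an illustrative `embed` stub in place of the Azure OpenAI embeddings call and a plain list in place of the vector store:

```python
def embed(text: str, dim: int = 8) -> list[float]:
    """Toy stand-in for an Azure OpenAI embeddings call (normalized vector)."""
    vec = [0.0] * dim
    for b in text.encode():
        vec[b % dim] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

def retrieve(prompt: str, store: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """Embed the prompt, then rank stored chunks by cosine similarity
    (dot product, since the vectors are normalized) and keep the top k."""
    query = embed(prompt)
    ranked = sorted(store, key=lambda item: -sum(q * v for q, v in zip(query, item[1])))
    return [text for text, _ in ranked[:k]]

def build_augmented_prompt(prompt: str, context_chunks: list[str]) -> str:
    """Combine retrieved content with the user's question before calling the LLM."""
    context = "\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {prompt}"
```

The semantic search input binding wraps exactly this embed-search-augment sequence, then sends the augmented prompt to the chat model and returns the completion.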

 

 

Extending Azure OpenAI chat experience with function calling

A second common scenario we see is customers extending the Azure OpenAI chat experience to perform actions that the LLM cannot perform itself. These can be things like sending an email about the findings, looking up support ticket information, performing timecard reporting, or any other action that makes sense in a chat experience tailored for the user.

 

This capability in Azure OpenAI is made available through function calling (https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/function-calling) using the Assistants API.

 

A typical flow for this scenario is described below:

 

[Diagram: chat flow with the assistant trigger and function calling]

 

In this example, the client starts a chat session with Azure OpenAI, and interactions are automatically saved to a table storage account for history, for auditing as required, and so that chats can be resumed later. The skills registered with the Azure Functions assistant trigger are automatically sent with each interaction, so if a question comes in that OpenAI cannot answer on its own, OpenAI can direct the Azure Functions bindings to invoke the trigger and perform those custom actions for the user.

This capability is delivered automatically by the Functions triggers and bindings, enabling faster development and giving the developer more time to focus on the business integration of the application.
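Conceptually, the dispatch behind this flow is a registry of named skills plus a handler that invokes whichever function the model's reply requests. The sketch below is illustrative only — the `skill` decorator, the hypothetical `get_ticket_status` action, and the simplified reply shape (real function-calling responses carry arguments as a JSON string) are not the extension's API:

```python
# Registry of custom actions the LLM can request by name.
skills: dict = {}

def skill(fn):
    """Register a function so a model reply can invoke it by name."""
    skills[fn.__name__] = fn
    return fn

@skill
def get_ticket_status(ticket_id: str) -> str:
    # Hypothetical business action the LLM cannot perform on its own.
    return f"Ticket {ticket_id} is open"

def handle_model_reply(reply: dict) -> str:
    """If the model asked for a function call, run the matching skill;
    otherwise return the model's text content as the chat answer."""
    call = reply.get("function_call")
    if call:
        return skills[call["name"]](**call["arguments"])
    return reply["content"]
```

With the extension, the registration and dispatch are handled for you: functions marked with the assistant skill trigger are advertised to the model automatically, and the runtime routes the model's function calls to them.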

 

Content summarization and generation

 

One of the most common use cases for customers using Azure OpenAI is to integrate the ability to create content, improve existing content, or convert content to another form.

 

In this scenario, the client makes a prompt request with the content and instructions, and the Azure Functions text completion binding takes this prompt, sends it to OpenAI, and returns the response to the client.

 

[Diagram: text completion flow with the Azure Functions binding]

 

 

Different parameters are supported on the binding to help enhance the response.
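As an illustration of those parameters, here is the rough shape of a chat-completions request a text completion call assembles on the app's behalf. The `messages`, `temperature`, and `max_tokens` fields follow the Azure OpenAI chat completions API; the helper function itself is hypothetical, not part of the extension:

```python
def build_completion_request(content: str, instruction: str,
                             temperature: float = 0.3,
                             max_tokens: int = 200) -> dict:
    """Assemble a chat-completions payload: the instruction becomes the
    system message, the content to summarize/transform the user message."""
    return {
        "messages": [
            {"role": "system", "content": instruction},
            {"role": "user", "content": content},
        ],
        "temperature": temperature,  # lower = more deterministic output
        "max_tokens": max_tokens,    # caps the length of the response
    }
```

Tuning parameters like these on the binding lets you trade creativity for consistency (temperature) and bound response length (max tokens) without changing application code.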

 

As you can see in these three scenarios, the Azure Functions bindings and triggers for Azure OpenAI deliver built-in capabilities for developers to integrate intelligence into their new and existing applications. The bindings are available in all supported languages in Azure Functions, including .NET, Java, Python, Node, and PowerShell.

 

Given the serverless nature of Azure Functions, the OpenAI integration automatically scales based on customer demand, so organizations can experiment knowing that the underlying platform can meet any usage scenario. For intelligent function applications that need orchestration and workflow capabilities, native support for Durable Functions enables developers to take full advantage of the service to deliver the right solution for their users.

 

Lastly, for customers who need real-time RAG (Retrieval augmented generation) for their organizational data, existing triggers and bindings in the Azure Functions ecosystem enable automatic processing of data in files, databases, messaging services, or any of the systems natively supported for integration https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings?#supported-bindi...  

 

If you would like to get started building intelligence into your applications with Azure Functions and Azure OpenAI, please visit the documentation, or deploy the end-to-end demo solution to see all of the capabilities.

 

The Azure Functions OpenAI triggers and bindings are currently in public preview, and we would love to hear your feedback on improvements or issues you experience. You can file them on the Issues page of the Azure/azure-functions-openai-extension repository on GitHub.

 

Version history
Last update: Jun 19 2024 07:22 AM