This blog gives an overview of the newly released Azure OpenAI extension, which connects the Azure OpenAI service with Azure Function Apps. We will cover the following topics:
Please note that because AI services evolve quickly, some of this content may become outdated. This article is based on the released version as of July 2024.
Compared with a standard Azure OpenAI API call, the extension provides:
To use this extension, the following requirements must be met:
Get access to the Azure OpenAI service. If you are new to Azure, remember to request Azure OpenAI access via this link. After approval, go to the Deployments section and deploy one of the LLMs (Large Language Models) for later use. Example:
An Azure Function App using one of the following language versions:
The extension supports all of the languages and versions listed above.
Support scope: Public Preview. Since this feature is still in preview, you may run into issues. If so, please reach out via a support request, or file an issue on the extension's GitHub Issues page.
The following resources are highly recommended for getting started with this extension:
The following demos use .NET 8 as the language and GPT-4o as the LLM. We will dive into three sections:
Essentially, this extension helps you make API calls to the Azure OpenAI endpoint with a smoother experience.
Chat allows users to converse with the Azure OpenAI service; responses are generated based on pre-defined prompts and user questions. Keep in mind that if you want long-term memory of the chat history, two options are available when using the extension:
[Local Debugging]
Clone this GitHub repo; take a look at README.md for detailed instructions.
Go to csharp-ooproc/local.settings.json and replace the values so they match your Azure OpenAI endpoint. Also remember to add the setting CHAT_MODEL_DEPLOYMENT_NAME = <name of your deployed LLM>.
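For reference, the finished local.settings.json might look roughly like the sketch below. Only CHAT_MODEL_DEPLOYMENT_NAME comes from this walkthrough; the endpoint/key setting names and other values are assumptions you should verify against the sample repo's README.md:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated",
    "AZURE_OPENAI_ENDPOINT": "https://<your-resource>.openai.azure.com/",
    "AZURE_OPENAI_KEY": "<your-key>",
    "CHAT_MODEL_DEPLOYMENT_NAME": "<name of your deployed LLM>"
  }
}
```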
Head to the ChatBot.cs file to review how the REST API calls to the Azure OpenAI service are made. Modify the trigger type if needed.
Remember to install the NuGet package Microsoft.Azure.Functions.Worker.Extensions.OpenAI, version 0.16.0-alpha. You can add it with the .NET CLI (`dotnet add package Microsoft.Azure.Functions.Worker.Extensions.OpenAI --version 0.16.0-alpha`) in VS Code, or install it with a single click in the Visual Studio NuGet package manager.
From the root folder, run cd samples/chat/csharp-ooproc && func start. We will not demo how to test, as this is already covered in the GitHub repo. In total, we will use the three API requests below:
Functions:
CreateChatBot: [PUT] <http://localhost:7071/api/chats/{chatId}>
GetChatState: [GET] <http://localhost:7071/api/chats/{chatId}>
PostUserResponse: [POST] <http://localhost:7071/api/chats/{chatId}>
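As a sketch of how the three requests above fit together, the snippet below builds the method, URL, and JSON body for each call. The URL shape comes from the endpoints listed above, but the body shapes and the helper name are my assumptions; the sample's README.md documents the exact payloads:

```python
import json

BASE = "http://localhost:7071/api/chats"

def build_chat_request(action, chat_id, message=None):
    """Return (method, url, body) for one of the sample's three chat endpoints.

    Body shapes here are illustrative assumptions; check the sample's
    README.md for the exact payloads it expects.
    """
    url = f"{BASE}/{chat_id}"
    if action == "create":
        # CreateChatBot: PUT seeds the bot, e.g. with system instructions.
        return ("PUT", url, {"instructions": "You are a helpful chatbot."})
    if action == "post":
        # PostUserResponse: POST sends the user's next message.
        return ("POST", url, {"message": message})
    if action == "state":
        # GetChatState: GET retrieves the accumulated chat history.
        return ("GET", url, None)
    raise ValueError(f"unknown action: {action}")

# Typical flow: create the bot, send a question, then read the state.
for step in [("create", "chat123", None),
             ("post", "chat123", "What is Azure Functions?"),
             ("state", "chat123", None)]:
    method, url, body = build_chat_request(*step)
    print(method, url, json.dumps(body) if body else "")
```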
Once local debugging works, we can move on to publishing to Azure.
[Publish to Azure]
Use one of the following methods to publish your project:
Visual Studio Code: use the Azure extension to publish.
Visual Studio: use the built-in publish profile. Note that you must enable SCM Basic Auth Publishing Credentials in the Azure portal, as this is required for Visual Studio deployment:
Grant the user or the function app's managed identity the Cognitive Services OpenAI User role on the Azure OpenAI resource. This is important, as the platform uses it to authenticate the connection:
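If you prefer the command line, the role assignment can be sketched with the Azure CLI as below. All IDs are placeholders; this assumes the Azure CLI is installed and you are signed in with sufficient permissions:

```shell
# Assign "Cognitive Services OpenAI User" on the Azure OpenAI resource.
# Replace every <placeholder> with your own values.
az role assignment create \
  --assignee <principal-id-of-user-or-managed-identity> \
  --role "Cognitive Services OpenAI User" \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<aoai-resource-name>
```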
Result:
Create a new chatbot
Make conversations
Text completion allows the Azure OpenAI service to extend or answer given sentences. It's commonly used for paper writing, storytelling, and many other scenarios. The example below demos how to use the completion APIs to perform text completion:
[Local Debugging]
We will skip the testing details, as they are covered in the README.md. Once local debugging works, we can move on to publishing to Azure.
[Publish to Azure]
As in the chat section, pay attention to the differences between IDEs when publishing.
Result:
WhoIs
GenericCompletion
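To illustrate, the two completion endpoints above (WhoIs and GenericCompletion) can be exercised with requests shaped roughly like this. The routes and body shapes are assumptions based on the sample repo, so verify them against the sample's README.md:

```python
BASE = "http://localhost:7071/api"

def whois_request(name):
    # WhoIs: GET with the subject's name in the route; the function fills a
    # "Who is {name}?" style prompt and returns the model's completion.
    return ("GET", f"{BASE}/whois/{name}", None)

def generic_completion_request(prompt_text):
    # GenericCompletion: POST an arbitrary prompt for the model to complete.
    return ("POST", f"{BASE}/GenericCompletion", {"prompt": prompt_text})

print(whois_request("Ada Lovelace"))
print(generic_completion_request("Once upon a time,"))
```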
Azure Cosmos DB provides a good option for storing previous chat history or a company-level knowledge base. In this example, we will show how to use Cosmos DB to store the information, then use semantic search to locate and return the required content.
[Requirements]
[How to work with Azure Cosmos DB]
To work with Azure Cosmos DB, first prepare the environment:
Then follow the steps below to use Cosmos DB:
Insert documents into Cosmos DB. We need a storage account or an equivalent service to host the TXT/JSON file; this is the source where you add or edit content. Then invoke a POST request like the one below:
My example: "Mengyang Chen is a Support Engineer working for the Azure App Service team."
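If you want to script the same ingestion call, it can be sketched as below. The route name (IngestFile) and body shape are assumptions based on the sample repo; verify them against its README.md:

```python
BASE = "http://localhost:7071/api"

def ingest_request(source_url):
    # Points the ingest function at the hosted TXT/JSON file; the function
    # computes embeddings for its content and stores them in Cosmos DB.
    # Route name and body shape are assumptions; check the sample's README.md.
    return ("POST", f"{BASE}/IngestFile", {"url": source_url})

print(ingest_request("https://<storage-account>.blob.core.windows.net/docs/kb.txt"))
```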
You can also validate that the ingestion succeeded in the terminal logs:
Query by prompt. By invoking a POST request, we can retrieve the desired result via semantic search.
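The query call can be sketched similarly. The route name (PromptFile) and body shape are assumptions based on the sample repo; verify them against its README.md:

```python
BASE = "http://localhost:7071/api"

def prompt_request(question):
    # Sends the user's question; the function embeds it, runs a semantic
    # search over the stored documents, and returns the matching content.
    # Route name and body shape are assumptions; check the sample's README.md.
    return ("POST", f"{BASE}/PromptFile", {"prompt": question})

print(prompt_request("Who is Mengyang Chen?"))
```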
As you can see, it returns the result along with where the information was found. This is helpful if you want to build a custom knowledge base or store long-term memory.
We hope this blog gives you a good start with this extension. If you have any questions, please feel free to leave a comment; we would be glad to help.