Azure AI services Blog

Azure OpenAI Service expands .NET SDK support

Travis_Wilson_MSFT
Jun 08, 2024

Since its first release in December 2022, Azure OpenAI Service has continuously worked to meet developers where they are with rich, idiomatic client libraries that expand on OpenAI’s language availability in Python and JavaScript with options for .NET, Java, and Go development. This week, we’re excited to share two major announcements for .NET customers: the preview release of OpenAI’s official .NET library and the matched update of the preview Azure OpenAI Service client library for .NET.

 

.NET becomes OpenAI’s third officially supported language

As recently announced on the official .NET blog, OpenAI this week released a new OpenAI 2.0.0-beta package on NuGet.org that marks its first official support for .NET developers using programming languages like C#. This new, open-source library is produced and maintained through close, ongoing collaboration with Microsoft; OpenAI’s openai-dotnet repository joins openai-python and openai-node as the next client library project available directly from OpenAI. In addition to empowering .NET developers with access to OpenAI’s models and capabilities in their programming language of choice, this new library also features substantial strides in simplifying usage patterns to make even data-rich operations -- like streaming with v2 of the Assistants (beta) API -- easier and more intuitive.
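
As a rough illustration of those simplified patterns, the following is a minimal streaming chat sketch against the new OpenAI 2.0.0-beta package. The model name, the prompt, and the OPENAI_API_KEY environment variable are placeholder assumptions for the example, not anything prescribed by the release.

    using OpenAI.Chat;

    // A ChatClient is scoped to one model; the key here is read from an assumed environment variable.
    ChatClient client = new(
        model: "gpt-4o",
        apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

    // Streaming no longer needs manual event plumbing: iterate the updates as they arrive.
    foreach (StreamingChatCompletionUpdate update in
             client.CompleteChatStreaming("Write a one-line greeting."))
    {
        foreach (ChatMessageContentPart part in update.ContentUpdate)
        {
            Console.Write(part.Text);
        }
    }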

 

Azure OpenAI Service adopts and extends OpenAI’s library

In reflection of this partnership, Azure.AI.OpenAI, the Azure SDK library for Azure OpenAI Service, has released a new 2.0.0-beta.1 update that converts the previously standalone .NET library into a companion of the official OpenAI .NET library. This new version streamlines Azure client configuration and provides additional, strongly typed support for concepts and capabilities specific to Azure OpenAI Service, such as Responsible AI Content Filtering annotations and On Your Data data sources and citations. With its extension of OpenAI’s .NET library, seamlessly switching between OpenAI and Azure OpenAI Service endpoints is easier than ever, and new language feature support can now arrive faster, independently of service API release vehicles.
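
To make the companion-library relationship concrete, here is a minimal sketch of the 2.0.0-beta.1 configuration pattern: the Azure-specific client owns the endpoint, credential, and deployment selection, and everything after GetChatClient uses the same OpenAI.Chat types as the official library. The endpoint URL and deployment name below are placeholders, and Entra ID authentication via Azure.Identity is just one of the supported options.

    using Azure.AI.OpenAI;
    using Azure.Identity;
    using OpenAI.Chat;

    // Azure-specific configuration: resource endpoint plus a credential (placeholder values shown).
    AzureOpenAIClient azureClient = new(
        new Uri("https://your-resource.openai.azure.com/"),
        new DefaultAzureCredential());

    // From here on, the types come from the shared OpenAI.Chat namespace.
    ChatClient chatClient = azureClient.GetChatClient("your-gpt-4o-deployment");

    ChatCompletion completion = chatClient.CompleteChat(
        new UserChatMessage("Which endpoint am I talking to?"));
    Console.WriteLine(completion.Content[0].Text);

Swapping the AzureOpenAIClient construction for the official library's client is, in principle, the only change needed to target the public OpenAI endpoint instead.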

 

Although this change brings a major version increment that will require migration, Azure OpenAI Service will also continue to support the previous 1.0.0-beta.17 package through the lifetime of that version’s most recently supported 2024-04-01-preview service API version. Developers are encouraged to upgrade for the latest features and substantial improvements to functionality and usability, but that upgrade is not mandatory for customers already using the previous library version in conjunction with an older service API version.

 

What’s next for .NET and OpenAI

Together with OpenAI, we’re eager to refine and improve our .NET libraries to reach a General Availability (GA) status as soon as possible. Developer feedback on both OpenAI’s openai-dotnet discussions and Azure SDK’s azure-sdk-for-net issues is welcomed, appreciated, and will help accelerate the continued evolution of .NET support for OpenAI and Azure OpenAI Service.

Published Jun 08, 2024
Version 1.0
  • We're working on a migration guide from 1.0.0-beta* to 2.0.0-beta* and fully empathize with the hassle of needing to update code. We're confident that the new alignment with OpenAI and the updated usability patterns are substantial net positives, but that doesn't make it any less important to keep the path to that state as short and painless as possible.

     

    Data_Juggler, the code you shared can be rewritten using 2.0.0-beta* as:

    using Azure.AI.OpenAI;
    using Azure.Identity;
    using OpenAI.Chat;

    // An endpoint and credential are required; substitute your own resource endpoint and preferred credential.
    AzureOpenAIClient azureClient = new(
        new Uri("https://your-resource.openai.azure.com/"),
        new DefaultAzureCredential());
    ChatClient client = azureClient.GetChatClient(
        deploymentName: "gpt-35-turbo");
    
    const string textFromUser = "Hello, assistant!"; // PromptTextBox.Text
    
    ChatCompletion completion = client.CompleteChat(
        [
            new SystemChatMessage("You are an AI assistant.")
            {
                ParticipantName = "Steve",
            },
            new UserChatMessage(textFromUser),
        ]);
    
    Console.WriteLine(completion.Content[0].Text);
    

     

    You can find more samples and walkthroughs of the API in both OpenAI's GitHub repository and the Azure.AI.OpenAI project.

  • Essenbee
    Copper Contributor

    Hi,

     

    Will you be providing a migration guide from 1.0.0-beta.17 to 2.0.0, since it's obvious that a lot has changed?

     

    Thanks,

     

    Stu

  • I migrated from 1.0.0-beta.17 to 2.0.0 in just a few hours on a small proof-of-concept project. Very straightforward.

  • Essenbee
    Copper Contributor

    Thanks for the response. I am using Semantic Kernel, and the upgrade does not play well with it. I opened an issue about it on the SK GitHub.

     

    Cheers,

     

    Stu

  • Data_Juggler
    Copper Contributor

    I upgraded to 2.0, and my code doesn't compile.

     

    Why did they break existing functionality? ChatRequestMessage no longer exists.

     

    ChatRequestMessage message1 = new ChatRequestAssistantMessage("You are an AI Assistant named Steve.");
    message1.Role = ChatRole.Assistant;

    // Get the user's question
    ChatRequestMessage message2 = new ChatRequestUserMessage(PromptTextBox.Text);
    message2.Role = ChatRole.User;

    var chatCompletionsOptions = new ChatCompletionsOptions
    {
        DeploymentName = "DataJugglerGPT4o",
        Messages = { message1, message2 }
    };

     

    I also had to change OpenAIClient to AzureOpenAIClient.

     

    When existing code breaks due to an upgrade, that is bad if you want people to use the latest code. 

    Is there a breaking changes document?

  • Surendratt
    Copper Contributor

    I'm getting compatibility issues when I try to use this in my existing .NET Framework project.

  • hkithmin
    Copper Contributor

    I have upgraded my code to 2.0, but it gives some errors. How do I fix them?

     

    OpenAIClient client = new OpenAIClient(
        new Uri(openAiEndpoint),
        new AzureKeyCredential(openAiApiKey));

    var chatCompletionsOptions = new ChatCompletionsOptions
    {
        Messages =
        {
            new ChatMessage(ChatRole.System, @"You are the virtual assistance of ...."),
            new ChatMessage(ChatRole.User, _conversationHistory)
        },
        Temperature = 0.5f,
        MaxTokens = 150,
        NucleusSamplingFactor = 0.95f,
        FrequencyPenalty = 0,
        PresencePenalty = 0,
    };

    var chatCompletionsResponse = await client.GetChatCompletionsAsync("gpt35turbo", chatCompletionsOptions);
    string chatbotResponse = chatCompletionsResponse.Value.Choices[0].Message.Content;

  • ThomasBarron
    Copper Contributor

    I, too, am having trouble migrating from v1 to v2. Any word on when the migration document will be complete?

     

    public class ChatService
    {
        private readonly IConfiguration _configuration;

        private string SystemMessage = "You are an AI assistant that helps people find information about food. For anything other than food, respond with 'I can only answer questions about food.'";

        public ChatService(IConfiguration configuration)
        {
            _configuration = configuration;
        }

        public async Task<Message> GetResponse(List<Message> messagechain)
        {
            string response = "";

            AzureOpenAIClient client = new AzureOpenAIClient(
                new Uri(_configuration.GetSection("Azure")["OpenAIUrl"]!),
                new AzureKeyCredential(_configuration.GetSection("Azure")["OpenAIKey"]!));

            ChatCompletionsOptions options = new ChatCompletionsOptions();
            options.Temperature = (float)0.7;
            options.MaxTokens = 800;
            options.NucleusSamplingFactor = (float)0.95;
            options.FrequencyPenalty = 0;
            options.PresencePenalty = 0;
            options.Messages.Add(new ChatMessage(ChatRole.System, SystemMessage));

            foreach (var msg in messagechain)
            {
                if (msg.IsRequest)
                {
                    options.Messages.Add(new ChatMessage(ChatRole.User, msg.Body));
                }
                else
                {
                    options.Messages.Add(new ChatMessage(ChatRole.Assistant, msg.Body));
                }
            }

            Response<ChatCompletions> resp = await client.GetChatCompletionsAsync(
                _configuration.GetSection("Azure")["OpenAIDeploymentModel"]!,
                options);

            ChatCompletions completions = resp.Value;

            response = completions.Choices[0].Message.Content;

            Message responseMessage = new Message(response, false);
            return responseMessage;
        }
    }