The last few months have shown us the true potential of generative AI, and the next big breakthrough goes beyond just having powerful models. It's about connecting that AI with your data and systems.
Our generative AI capabilities give you access to some of the most advanced language models on the planet, including GPT-4 and ChatGPT (GPT-3.5 Turbo), all generally available. With Azure OpenAI Service, you can easily customize these models for any task you have in mind, from content summarization to chatting with your data to unlocking customer insights. The service is accessible through REST APIs, the Python SDK, or the user-friendly web-based interface in Azure AI Studio.
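To make the REST surface concrete, here is a minimal sketch of how a chat completions request to Azure OpenAI is shaped. The resource name, deployment name, and API version below are placeholders (your own values will differ), and the actual HTTP POST (with an `api-key` header) is left to whichever client library you prefer.

```python
import json

# Placeholder names -- replace with your own Azure OpenAI resource and deployment.
RESOURCE = "my-resource"
DEPLOYMENT = "my-gpt4-deployment"
API_VERSION = "2023-05-15"  # the api-version current at the time of writing

def build_chat_request(messages, temperature=0.7):
    """Build the URL and JSON body for an Azure OpenAI chat completions call."""
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    body = json.dumps({"messages": messages, "temperature": temperature})
    return url, body

url, body = build_chat_request(
    [{"role": "user", "content": "Summarize this quarter's sales report."}]
)
# POST `body` to `url` with an `api-key` header to get a completion back.
```

The same request shape works from the Python SDK or any HTTP client; only the authentication and transport details change.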
At Microsoft Build 2023, we're excited to unveil groundbreaking new features that will help you integrate AI with your data and systems, allowing you to create never-before-seen innovations. You can now bring your own data to these cutting-edge models, add plugins to simplify integrating external data sources and APIs, and reserve provisioned throughput to gain control over the configuration and performance of OpenAI's large language models at scale. Plus, you can manage your quota and rate limits and create and configure content filters. Now let's take a closer look at the announcements.
Azure OpenAI Service on your data
Azure OpenAI Service on your data is a new feature coming to public preview in June. The feature allows enterprise users to utilize OpenAI's powerful conversational AI models, such as ChatGPT and GPT-4, on their own data while complying with their organizational policies. With Azure OpenAI Service on your data, businesses can use these models to chat, view data citations, and customize chat experiences based on their data.
This feature is highly customizable and can be tailored to meet the specific needs of individual organizations, providing direct answers to questions grounded in their own data. Enterprise users can enjoy faster and more accurate communication, improved customer service, and increased productivity across their organization.
Customers including IKEA and Volvo are leveraging this feature to discover business insights at scale and improve end-user journeys.
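To sketch how "on your data" is wired up, the request body extends a normal chat call with a `dataSources` section pointing at your own index. The endpoint, key, and index name below are placeholders for a hypothetical Azure Cognitive Search index; as a preview feature, the exact schema may change, so treat this as illustrative rather than definitive.

```python
import json

def build_on_your_data_body(question):
    """Sketch of a chat request grounded in the caller's own search index."""
    return json.dumps({
        "messages": [{"role": "user", "content": question}],
        "dataSources": [  # tells the service where to retrieve grounding data
            {
                "type": "AzureCognitiveSearch",
                "parameters": {
                    # Placeholder values -- substitute your own resource details.
                    "endpoint": "https://my-search.search.windows.net",
                    "key": "<search-admin-key>",
                    "indexName": "my-company-docs",
                },
            }
        ],
    })

body = build_on_your_data_body("What is our parental-leave policy?")
```

Responses can then include citations back to the matching documents, which is what powers the "view data citations" experience described above.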
Plugins
Plugins are a standardized interface that allows developers to build and consume APIs that extend the capabilities of large language models (LLMs) and enable deep integration of GPT-4 across Azure and the Microsoft ecosystem. The limited preview coming to developers in July includes the following plugins:
Azure Cognitive Search
Azure Cosmos DB
Although only prebuilt plugins will be available in the preview, customers will soon be able to use their own plugins with Azure OpenAI Service too. We'll follow the same standards for building plugins as OpenAI, so plugins will be fully interoperable between the two platforms.
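Because plugins follow the OpenAI standard, a plugin is described by an `ai-plugin.json` manifest that tells the model what the plugin does and where its OpenAPI spec lives. The sketch below shows the general shape of such a manifest; every value is a placeholder for a hypothetical order-lookup plugin, not a real service.

```python
# Sketch of an OpenAI-style plugin manifest (ai-plugin.json).
# All names and URLs below are placeholders for a hypothetical plugin.
manifest = {
    "schema_version": "v1",
    "name_for_human": "Contoso Orders",
    "name_for_model": "contoso_orders",
    "description_for_human": "Look up order status in Contoso's system.",
    "description_for_model": "Retrieve order status by order ID.",
    "auth": {"type": "none"},
    "api": {
        "type": "openapi",
        # The OpenAPI spec is what the model reads to learn the plugin's endpoints.
        "url": "https://example.com/openapi.yaml",
    },
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}
```

The two descriptions serve different audiences: `description_for_human` appears in plugin listings, while `description_for_model` is what the LLM uses to decide when to call the plugin.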
Provisioned Throughput Model
The provisioned throughput feature is a new offering coming to Azure OpenAI Service that gives customers more control over the configuration and performance of OpenAI's large language models at scale. It provides a dedicated connection to OpenAI models with guaranteed throughput, measured in tokens per second for prompts and completions. The feature will be available with limited access in June.
Customers can specify the total number of throughput units they will use and have the flexibility and control to allocate their commitment to any OpenAI model they desire. Each model requires a different number of units to run. Customers can choose from various commitment options with different prices per 100 units and overage charges for exceeding the specified limit.
With a 1-month or 1-year commitment, customers can secure provisioned throughput and receive savings in pricing. The provisioned throughput model offers greater control and flexibility over workload needs, ensuring that the system is ready when higher workloads appear. This feature also enables a consistent product experience and throughput for real-time applications.
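The commitment arithmetic above can be sketched as a small cost estimator. All prices in this example are made-up placeholders; consult the official price list for real figures.

```python
def committed_cost(units, price_per_100_units, overage_units=0, overage_rate=0.0):
    """Estimate the cost of a provisioned-throughput commitment.

    Billing is per 100 committed units, plus a per-unit charge for any
    usage beyond the commitment. All rates here are hypothetical.
    """
    base = (units / 100) * price_per_100_units
    return base + overage_units * overage_rate

# 300 committed units at a made-up $1,000 per 100 units:
cost = committed_cost(300, 1000.0)  # 3 * 1000 = 3000.0

# The same commitment with 10 overage units at a made-up $5 each:
cost_with_overage = committed_cost(100, 1000.0, overage_units=10, overage_rate=5.0)
```

The trade-off mirrors reserved capacity elsewhere in Azure: a longer commitment lowers the effective rate, while overage charges only apply past the committed throughput.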
Quotas and Rate Limits
Azure OpenAI Service now includes quotas to prevent overuse and ensure fair usage across all customers. Quotas can be set at the resource group level, with separate quotas for prompts, completions, and training. Customers can view their current usage and remaining quota in the Azure portal.
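When a request exceeds its rate limit, the service answers with HTTP 429, and clients are expected to back off and retry. A minimal sketch of that pattern, using a stand-in exception rather than a real HTTP client:

```python
import random
import time

class RateLimited(Exception):
    """Stand-in for an HTTP 429 (rate limited) response from the service."""

def call_with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` with exponential backoff and jitter on rate limiting."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimited:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # Delays double each attempt: base, 2*base, 4*base, ... plus jitter.
            time.sleep(base_delay * 2 ** attempt + random.random() * 0.1)

# Example: a flaky call that succeeds on the third try.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimited()
    return "ok"

result = call_with_backoff(flaky, base_delay=0.0)
```

In production the `RateLimited` exception would be raised when the client sees a 429 status, and the delay would respect any `Retry-After` header the service returns.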
Configurable Content Filters
Configurable content filters allow customers to specify a list of banned or allowed words, phrases, and entities. These filters are applied to all text prompts and completions and help ensure that the output is appropriate for the intended audience. The feature is fully customizable: customers can specify their own filters or use the default filters provided with the service.
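To illustrate the banned-word idea, here is a small client-side sketch of whole-word blocklist filtering. This is only an illustration of the concept; the actual configurable filters are applied service-side on the Azure OpenAI resource, not in your application code.

```python
import re

def apply_blocklist(text, banned, replacement="[filtered]"):
    """Replace any banned term in `text` (case-insensitive, whole-word match).

    Client-side illustration only -- the service applies its configured
    filters to prompts and completions before they reach your application.
    """
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, banned)) + r")\b", re.IGNORECASE
    )
    return pattern.sub(replacement, text)

filtered = apply_blocklist("The secret code is ABC123.", ["secret code"])
# -> "The [filtered] is ABC123."
```

Using `re.escape` keeps user-supplied terms from being interpreted as regex syntax, and the `\b` word boundaries prevent the filter from mangling longer words that merely contain a banned substring.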
Our Commitment to Responsible AI
At Microsoft, we're committed to the advancement of AI driven by principles that put people first. Generative models such as the ones available in Azure OpenAI Service have significant potential benefits, but without careful design and thoughtful mitigations, such models have the potential to generate incorrect or even harmful content. Microsoft has made significant investments to help guard against abuse and unintended harm, which includes requiring applicants to show well-defined use cases, incorporating Microsoft's principles for responsible AI use, building content filters to support customers, and providing responsible AI implementation guidance to onboarded customers.
We are inspired by our customers around the world using generative AI to improve business and end user experiences. Here are some of our favorite recent case studies.
In the world of scientific research, accuracy and efficiency are crucial. That's why Thermo Fisher Scientific, a leading provider of scientific equipment and services, turned to Microsoft partner Copy.AI's generative AI solution for help. Copy.AI's advanced AI algorithms allow Thermo Fisher Scientific to quickly generate high-quality written content for technical manuals, product descriptions, and more.

Construction leader Strabag SE leveraged Azure OpenAI Service to identify potential impacts and mitigate risks in new construction projects. By utilizing AI capabilities, they were able to enhance their risk assessment process and make informed decisions upfront.

KPMG accelerated their automation efforts by harnessing the power of Azure OpenAI Service. With AI-driven tax data extraction and prediction, KPMG reduced risk factors and increased customers' confidence, leading to improved efficiency and accuracy in their tax-related services.

Moveworks developed a conversational AI platform using Azure OpenAI Service that revolutionizes workplace request resolution. Their platform understands natural language requests, allowing for rapid and personalized responses, ultimately improving employee productivity and satisfaction.

Thread utilized Azure OpenAI Service to enhance their customer service capabilities. By automating time-consuming tasks with AI, Thread's IT technicians were able to focus on delivering excellent customer service, resulting in improved efficiency and customer satisfaction.

Zammo's conversational AI platform, powered by Azure OpenAI Service, enables organizations of all sizes to create customized AI experiences. With Zammo, businesses can quickly build and deploy conversational AI solutions, empowering them to engage with customers more effectively.

National Taiwan Normal University developed an innovative learning platform using Azure OpenAI Service to help K-12 students learn English faster. By integrating AI capabilities, they offer personalized learning experiences, enhancing students' language acquisition journey.

Take Blip leveraged Azure OpenAI Service to create a chatbot for a holiday marketing campaign, enabling over 15,000 interactions. By harnessing the power of AI, Take Blip enhanced customer experiences, providing an engaging and interactive platform for users.
What will you create for your business, customers, and end users with Azure OpenAI Service?