Announcing Azure OpenAI Service Assistants Public Preview Refresh
Published May 21, 2024

 

In early 2024, we introduced the groundbreaking Assistants API in Azure OpenAI Service (Public Preview) to empower developers to easily build agent-like features into their applications. Building these agentic features was possible before, but often required significant engineering, the use of third-party libraries, and multiple integrations. Now with Assistants, leveraging the latest GPT models, tools, and knowledge, developers are rapidly creating customized stateful copilots grounded in their enterprise data and capable of handling a diverse range of tasks.

 

Today, we are announcing the Public Preview Refresh of Assistants with a range of new features, including File Search and Browse tools, enhanced data security features, improved controls, new models, expanded region support, and various enhancements to make it easy to get from prototyping to production.   

 

Expand your copilot's capabilities with powerful new Assistants Tools 

 

Azure OpenAI's Assistants API comes packed with prebuilt tools that make it easier than ever for developers to extend the capabilities of their AI applications. The Code Interpreter tool enables advanced data analysis and data visualization with charts and graphs, writing and running Python code and solving math problems. With Function Calling, developers can write custom functions and have the app intelligently return the functions that need to be called along with their arguments. Today we are announcing two new tools to supercharge your copilots further:
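With Function Calling, the Assistant returns the tool calls it wants made; your app executes them and submits the outputs back. Here is a minimal sketch of the app side of that loop (the get_weather tool and its implementation are hypothetical; the dict shapes follow the requires_action payload the Assistants API returns):

```python
import json

# Hypothetical tool definition, in the JSON-schema format the Assistants API expects.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Local implementations keyed by name; the app runs whichever
# functions the Assistant asks for and submits the outputs back.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stand-in for a real weather lookup

LOCAL_FUNCTIONS = {"get_weather": get_weather}

def dispatch_tool_call(tool_call: dict) -> dict:
    """Run one required tool call and shape the result for
    submit_tool_outputs (tool_call_id plus a string output)."""
    fn = LOCAL_FUNCTIONS[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])
    return {"tool_call_id": tool_call["id"], "output": fn(**args)}
```

In a real run loop, you would call `dispatch_tool_call` for each entry in the run's `required_action.submit_tool_outputs.tool_calls` and pass the collected results back to the API.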

 

File Search (Public Preview): Enables you to easily connect your enterprise data sources, enable vector search, and implement Retrieval-Augmented Generation (RAG). File Search supports up to 10,000 files per Assistant, supports parallel queries through multi-threaded searches, and features enhanced reranking and query rewriting. We are also introducing vector_store as a new object in the API. Once a file is added to a vector store, it is automatically parsed, chunked, and embedded, ready to be searched. Vector stores can be used across assistants and threads, simplifying file management and billing. File Search uses the text-embedding-3-large model at 256 dimensions, with a default chunk size of 800 tokens and a chunk overlap of 400 tokens.
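The default chunking scheme (800-token chunks with 400 tokens of overlap) is a sliding window with a 400-token stride. A plain-Python sketch of that windowing, using list elements as a stand-in for tokens (this illustrates the scheme only, not the service's actual tokenizer):

```python
def chunk(tokens, size=800, overlap=400):
    """Sliding-window chunking matching the File Search defaults:
    `size`-token chunks that overlap by `overlap` tokens, so each
    new chunk starts `size - overlap` tokens after the previous one."""
    stride = size - overlap
    chunks = []
    for start in range(0, max(len(tokens) - overlap, 1), stride):
        chunks.append(tokens[start:start + size])
    return chunks
```

With the defaults, a 1,000-token document produces two chunks: tokens 0-799 and tokens 400-999, so every token appears in at least one chunk and most appear in two.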

 

File Search is available in the Assistants API starting today and will be available in Azure AI Studio in June. See it in action in this notebook. File Search is priced at $0.10/GB of vector store storage per day (the first GB of storage is free). This tool will be offered free of charge until June 17, 2024.
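The storage billing works out as a simple daily calculation. A small sketch (the $0.10/GB/day rate and 1 GB free tier are from the pricing above; the helper name is made up):

```python
def file_search_storage_cost(gb: float, days: int,
                             rate: float = 0.10, free_gb: float = 1.0) -> float:
    """Vector store storage cost in dollars: the first `free_gb` is free,
    and the remainder is billed at `rate` per GB per day."""
    billable_gb = max(gb - free_gb, 0.0)
    return round(billable_gb * rate * days, 2)
```

For example, 3 GB of vector store storage held for 30 days bills 2 GB at $0.10/day, or $6.00; anything at or under 1 GB costs nothing.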

 

Bring Your Indexes to File Search (Public Preview coming in July 2024): Allows existing users of Azure OpenAI On Your Data to connect their existing On Your Data indexes to the File Search tool. On Your Data indexes can be created with data in Azure AI Search, Azure Cosmos DB for MongoDB vCore, Azure Blob Storage, Pinecone, Elasticsearch, and more. Simply select your AML project indexes in the File Search tool and enable Retrieval-Augmented Generation on this data.

 

Browse (Public Preview coming in July 2024): Enable your Assistant to search the web to help answer questions that benefit from the most up-to-date information. With the Browse tool, you can bring intelligent search to your apps and harness the ability to comb billions of webpages, images, videos, and news with a single API call. If your users ask a question that requires the use of the Browse feature (e.g. What is the weather today in Seattle?), your Assistant will formulate a keyword search based on this query and submit it to the Bing search engine to retrieve results. 

 

Develop enterprise-ready copilots with enhanced data security and controls

 

Customer-managed key (CMK) support for Assistants Thread State and Files (Public Preview coming in June 2024): Enables users to protect and control access to stateful entities and files. Create your own keys and store them in Azure Key Vault or a managed HSM, or use the Azure Key Vault APIs to generate keys. CMK support for File Search is coming soon.

 

Control Assistants Outputs and Manage Costs (Public Preview): Allow users to view the input and output tokens used in a thread, message and run, control the maximum number of tokens a run uses in the Assistants API, and set limits on the number of previous messages used in each run. If your scenario necessitates use of a specific tool, you can force the use of a specific tool like file_search or code_interpreter in a particular run with the new tool_choice parameter.  Learn more about Assistants token management here.

 

We are also announcing support for popular model configuration parameters, including temperature, response_format (JSON mode) and top_p in Assistant and Run objects. By adjusting the temperature and top_p parameters, you can achieve different levels of creativity and control in an Assistant’s outputs, making them suitable for a wide range of applications. 
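For instance, a low-temperature, JSON-mode configuration on an Assistant might look like the following (a sketch of the parameter shapes, not a complete Assistant definition):

```python
# Sampling and output-format settings now supported on Assistant and Run
# objects; response_format with "json_object" enables JSON mode.
assistant_config = {
    "model": "gpt-4-turbo",
    "temperature": 0.2,   # lower values give more deterministic outputs
    "top_p": 0.9,         # nucleus sampling cutoff
    "response_format": {"type": "json_object"},  # constrain output to valid JSON
}
```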
 

Cross-Prompt Injection Attack (XPIA) Mitigation (Public Preview): Cross-prompt injection attacks, or XPIA for short, are a kind of attack on generative AI systems (copilots and platforms) that can happen whenever your copilot processes information that wasn't directly authored by either the developer or the user, for example when summarizing a document or web page, or describing an image. An attacker can embed text inside that content which the AI misinterprets as instructions and then acts upon. To ensure that you and your users are protected from cross-prompt injection attacks, we are launching support for prompt shields with Assistants. Prompt shields are designed to protect against actions that may be malicious or attempt to subvert the rules set in the system message; you can also protect against regurgitation of protected code or text. To enable the XPIA filter, navigate to the Content Filter Creation wizard in Azure AI Studio and enable prompt shields for indirect attacks for the model deployment you are using with your Assistant.

 


 

Learn about additional mitigation approaches in Security guidance for Large Language Models | Microsoft Learn.

 

Unlock net-new scenarios with expanded model support

 

GPT-4o Support (Public Preview coming in June 2024): GPT-4o is the latest preview model from OpenAI. GPT-4o integrates text and images in a single model, enabling it to handle multiple data types simultaneously. This multimodal approach enhances accuracy and responsiveness in human-computer interactions. We are announcing support for gpt-4o on Assistants to enable you to create interactive multi-modal experiences.
 

Fine-tuned Model Support (Public Preview): Use fine-tuned gpt-35-turbo (0125) in the Sweden Central and East US 2 regions. We will be expanding model version and region support for fine-tuned models in the future.

 

Vision Support (Public Preview coming in June 2024): We are announcing vision support for Assistants through gpt-4-turbo (0409) on Assistants. You can create messages with image URLs or uploaded files, and your assistant will use the visuals as part of its context for the conversation. 
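An image-URL message in this model mixes text and image content parts. A sketch of the payload shape (the URL is a placeholder), following the content-part format used by the multimodal message APIs:

```python
# A user message combining a text question with an image the
# assistant should use as context for its answer.
message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What trend does this chart show?"},
        {"type": "image_url",
         "image_url": {"url": "https://example.com/chart.png"}},  # placeholder URL
    ],
}
```

Uploaded files work similarly, with a file reference in place of the URL content part.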

 

Regional Availability Expansion (Public Preview): We have also expanded regional availability for Assistants to include Japan East, UK South, West US and West US3. For information on regional model availability, consult the region-model matrix for Assistants. 

 

Additional Features and Enhancements

 

Streaming and Polling Support (Public Preview): We're excited to announce support for streaming responses to reduce perceived latency in your applications using Assistants. With our Python SDK, you can use 'create and stream' helpers to create runs and stream responses seamlessly. We've also added SDK helpers to share object status updates without needing to poll. 
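Under the hood, streaming delivers a sequence of typed events whose text deltas you accumulate as they arrive. A sketch of that consumption loop over dicts mimicking the wire shape of thread.message.delta events (in the real Python SDK you would instead subclass the event handler and override its text-delta callback):

```python
def collect_text(events) -> str:
    """Accumulate streamed text deltas into the full reply,
    ignoring non-message events such as run-step updates."""
    parts = []
    for ev in events:
        if ev["event"] == "thread.message.delta":
            for content in ev["data"]["delta"]["content"]:
                if content["type"] == "text":
                    parts.append(content["text"]["value"])
    return "".join(parts)
```

In an app, you would render each delta as it arrives rather than joining at the end; that incremental rendering is what reduces perceived latency.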

 

Custom Conversation Histories in Threads (Public Preview): Create messages with the assistant role to build custom conversation histories in Threads.
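For example, you can seed a new thread with a prior exchange, including assistant turns the model never actually generated in that thread (a sketch of the message payload; the content is illustrative):

```python
# Messages with role "assistant" let you replay earlier conversation
# turns when creating a thread, so the model picks up mid-conversation.
thread_payload = {
    "messages": [
        {"role": "user", "content": "Which tools does my copilot support?"},
        {"role": "assistant", "content": "It supports File Search and Code Interpreter."},
        {"role": "user", "content": "Show me an example using File Search."},
    ]
}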

 

Python, JavaScript/TypeScript, .NET, Java and Go SDK Support (Public Preview): Python developers can start using Azure OpenAI Assistants right now with the OpenAI library for Python (https://aka.ms/oai/py/asst). We are also announcing the new Azure OpenAI client (https://aka.ms/oai/js) in the OpenAI library for JavaScript/TypeScript; developers can access the latest Azure OpenAI Assistants APIs via this unified SDK (https://aka.ms/oai/js/asst). .NET support will be available in June, and Java and Go in July.

 

Assistants Tracing and Evaluation with the PromptFlow SDK (Public Preview coming in June 2024): As developers move from prototyping with Assistants to production, the complexity of components and calls increases, especially because of the non-deterministic nature of Assistants. As a result, fast switching, evaluating, and comparing of Assistants configurations becomes important. We are announcing support for instrumentation of the Assistants API in the PromptFlow SDK to enable better transparency and debuggability by tracing functions and tool calls. In addition, PromptFlow's evaluation feature will enable developers to measure and assess the quality and safety of their Assistants' outputs through built-in and custom evaluators.  

 


 

Tracing: The PromptFlow SDK allows users to trace the tool calls of their existing application by instrumenting commonly used libraries and by providing a rich UX to visualize the traces locally and in AI Studio. Tracing empowers developers to understand the flow of their LLM app both in development and as part of monitoring in production.  
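As a sketch of what that instrumentation can look like (untested; assumes the PromptFlow tracing package's start_trace and trace APIs, and the inner Assistants call is elided):

```python
def traced_ask(question: str) -> str:
    """Untested sketch: wrap an Assistants call so PromptFlow records it.
    start_trace() begins a local trace session; @trace records each
    decorated call so it appears in the trace visualization."""
    from promptflow.tracing import start_trace, trace  # deferred: needs promptflow installed

    start_trace()  # begin collecting traces for this session

    @trace
    def ask_assistant(q: str) -> str:
        # ... create a thread, add the message, run the Assistant,
        # poll or stream until the run completes ...
        return "answer"

    return ask_assistant(question)
```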

 

Evaluation: The PromptFlow Evaluator SDK helps you thoroughly assess the performance of your Assistant-powered application both during development and in operation. With the SDK's built-in or custom evaluators, you can measure Assistant outputs as well as individual tools' quality and safety, using both code-based metrics and AI-assisted quality and safety evaluators. Built-in evaluators cover performance and quality (groundedness, relevance, coherence, fluency, similarity, F1 score) and risk and safety (violence, sexual, self-harm, hate & unfairness).

 

Build multi-agent setups with Azure OpenAI Assistants and AutoGen: AutoGen by Microsoft Research provides a multi-agent conversation framework for conveniently building LLM workflows across a wide range of applications. Azure OpenAI Assistants are now integrated into AutoGen via GPTAssistantAgent, a new experimental agent that lets you seamlessly add Assistants into AutoGen-based multi-agent workflows. This enables multiple Azure OpenAI Assistants, which can be specialized by task or domain, to collaborate and tackle complex tasks. Learn more about AutoGen here and see notebooks with examples of how to use GPTAssistantAgent here and here.
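A sketch of wiring two specialized Assistants into an AutoGen group chat (untested; agent names and instructions are illustrative, and the imports follow AutoGen's experimental GPTAssistantAgent integration):

```python
def build_team(llm_config: dict):
    """Untested sketch: two task-specialized Azure OpenAI Assistants
    collaborating in an AutoGen group chat."""
    from autogen import GroupChat, GroupChatManager, UserProxyAgent
    from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent

    analyst = GPTAssistantAgent(
        name="analyst",
        instructions="Analyze the data using the code interpreter.",
        llm_config=llm_config,
    )
    writer = GPTAssistantAgent(
        name="writer",
        instructions="Summarize the analyst's findings for executives.",
        llm_config=llm_config,
    )
    user = UserProxyAgent(name="user", human_input_mode="NEVER")
    chat = GroupChat(agents=[user, analyst, writer], messages=[])
    return GroupChatManager(groupchat=chat, llm_config=llm_config)
```

The manager routes turns between the agents, so the analyst's output becomes the writer's input without custom orchestration code.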

 

Leverage Azure Logic Apps to build Assistants with function calling (Public Preview coming May 30): You can now discover, import, and invoke Azure Logic Apps workflows from the Azure OpenAI Assistants playground in Azure OpenAI Studio, without writing code. Testing Assistants with function calling is now easy: you can import your Logic Apps workflows as AI functions through a browse-and-select experience, and the function specification and other configuration are generated automatically from the workflow's Swagger definition. The workflows are invoked by OpenAI function calling based on user prompts, and all the appropriate parameters are passed based on the definition.

 

Now let's look at a few early adopters of Azure OpenAI Assistants who have unlocked net-new scenarios and accelerated their AI transformation.

 

Assistants Customer Showcase at Microsoft Build 2024

 


The Coca-Cola Company is enhancing productivity for its 30,000+ associates by integrating the Azure OpenAI Assistants API with its KO Assist tool. This AI-driven platform is tailored to Coca-Cola's specific needs, speeding up insights and analysis, aiding scenario planning, and standardizing tools across the organization to improve consistency and efficiency in communications and tasks.

 

“We have seamlessly integrated Azure OpenAI Assistants API into our operations with the launch of KO Assist, a powerful tool that merges human ingenuity with cutting-edge technology to enhance collaboration and co-creation across our workforce. KO Assist is widely appreciated for its ability to boost productivity by providing timely and effortless access to critical business insights and enterprise data. Utilizing features like the Assistant’s Code Interpreter and File Search tools, our associates can navigate complex data with ease and deploy customized Assistance for their business functions. The rapid deployment of KO Assist—operational within weeks rather than months—underscores Azure’s commitment to enterprise efficiency and effectiveness from Day One, significantly accelerating our time-to-market and reinforcing our competitive edge in the industry.”  - Punit Vir, VP Emerging Technologies, The Coca-Cola Company

 

 


 

Freshworks Inc. makes it easy for companies to delight their customers and their employees. Its AI-powered customer and employee-service solutions increase efficiency and improve engagement for companies of all sizes. The result is happier customers and more productive employees. Headquartered in San Mateo, California, Freshworks operates around the world to serve more than 67,000 customers, including American Express, Bridgestone, Databricks, Fila, Nucor and Sony.  
 

“Our Freddy AI platform leverages Assistants API from Azure OpenAI to enable our customers to build AI Agents with near zero configuration or coding. Assistants' advanced file search and parallel function calling capabilities provide intelligent, accurate responses from our customer's data corpuses spread within their enterprise. This enables the AI agent to take intelligent automated actions resulting in improved deflection rates and significant personalization."    - Ramesh Parthasarthy, Chief Architect, Freshworks 

 


 

 


 

Microsoft Copilot for Finance is a new Copilot experience for Microsoft 365 that unlocks AI-assisted competencies for financial professionals, right from within the productivity applications they use every day. Now available in public preview, Copilot for Finance connects to the organization's financial systems, including Dynamics 365 and SAP, to provide role-specific workflow automation, guided actions, and recommendations in Microsoft Outlook, Excel, Microsoft Teams, and other Microsoft 365 applications, helping to save time and focus on what truly matters: navigating the company to success.

 

Copilot for Finance leverages the Assistants API and its Code Interpreter tool to conduct variance analysis. By uploading and processing data, writing and executing code, and performing transformations, calculations, and exploratory analysis, it delivers faster and more accurate results. 

 

"Copilot-led variance analysis unlocks incredible value for finance professionals by accelerating the time it takes to move from analysis to action. With Azure Open AI Service Assistants, we're able to synthesize vast amounts of diverse and complex financial sources to surface unique insights and empower users with the information they need." - Georg Glantschnig, Vice President, Dynamics 365 ERP Applications

 

 

Looking ahead

 

2024 is the year of AI agents. As we continue to innovate and push the boundaries of what is possible when building enterprise-ready copilots and agentic AI, Azure OpenAI Service remains committed to providing developers with the capabilities they need to succeed in an ever-evolving digital world. These new features are just the beginning of what is to come, and we cannot wait to see how enterprises will leverage them to achieve new heights of productivity and security.

 

Join us at Microsoft Build to learn more about these exciting developments and how they can transform your business.

 

Resources 

Last updated: Jun 03, 2024