Azure Cognitive Search is now Azure AI Search, and semantic search is now semantic ranker. See below for more details.
Over the past few months, we have delivered new capabilities toward our goal of making Azure AI Search the best retrieval system on the market for generative AI applications.
Today, we are pleased to announce vector search and semantic ranker (previously known as ‘semantic search’) are now generally available in Azure AI Search.
Vector search in Azure AI Search offers a comprehensive vector database solution to store, index, query, filter, and retrieve your AI data in a secure, enterprise-grade environment.
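To illustrate what a vector retrieval step does conceptually, here is a minimal pure-Python sketch: documents and the query are represented as embedding vectors, and results are ranked by cosine similarity. This is not the Azure SDK; the document ids and two-dimensional vectors are invented for illustration (real embeddings have hundreds or thousands of dimensions).

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_k(query_vector, documents, k=3):
    """Rank (doc_id, vector) pairs by similarity to the query vector."""
    scored = [(doc_id, cosine_similarity(query_vector, vec))
              for doc_id, vec in documents]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

# Toy corpus: ids and 2-D embeddings, purely illustrative.
docs = [("doc-a", [0.9, 0.1]), ("doc-b", [0.1, 0.9]), ("doc-c", [0.7, 0.3])]
print(top_k([1.0, 0.0], docs, k=2))  # doc-a and doc-c are closest
```

In the actual service, approximate nearest-neighbor indexes make this lookup scale to millions of vectors instead of a linear scan.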
Finally, we are broadening our LLM community footprint with integrations with OpenAI Cookbook, LangChain, LlamaIndex, Semantic Kernel and more. We remain focused on delivering exceptional performance, industry-leading relevance, and user-friendly experiences to our customers, across all Generative AI scenarios.
Now generally available, semantic ranker is a feature in Azure AI Search that leverages advanced language models to improve the relevance and quality of search results. Semantic ranker is powered by state-of-the-art technology developed in partnership with Bing, which draws on huge data assets and extensive machine learning expertise to rank documents. It can understand the intent and meaning behind a user's query to surface the most relevant matches. Semantic ranker also provides text highlights to show why a document is relevant and how it answers the user's question.
We therefore consider this capability a critical asset for anyone building the highest-quality retrieval system. To make it more accessible to customers, we have lowered and simplified the semantic ranker cost model:
Free Plan: 1,000 requests/month at no charge
Standard Plan: 1,000 requests/month at no charge, then $1.00 per 1,000 additional requests
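As a quick sanity check of the Standard Plan pricing above, here is a hypothetical cost helper (not part of any official SDK). It assumes partial blocks of 1,000 requests are rounded up to a full block; actual metering may differ.

```python
import math

def semantic_ranker_monthly_cost(requests, free_requests=1000, price_per_1000=1.00):
    """Estimate Standard Plan monthly cost: first 1,000 requests free,
    then $1.00 per 1,000 additional requests.
    Assumption: partial blocks of 1,000 are billed as a full block."""
    billable = max(0, requests - free_requests)
    return math.ceil(billable / 1000) * price_per_1000

print(semantic_ranker_monthly_cost(800))    # within free tier -> 0.0
print(semantic_ranker_monthly_cost(3000))   # 2,000 billable -> 2.0
```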
For more information on semantic ranker regional availability, please visit our Azure AI Search availability by region page.
Powering the next generation of generative AI applications
Vector search and semantic ranker are now production ready, and we encourage you to use them in your generative AI applications using the Retrieval Augmented Generation (RAG) pattern. Many customers have already started using these features in production and are realizing business value.
Fonterra, headquartered in New Zealand, is one of the largest dairy companies in the world. Fonterra's Research & Development Centre uses Azure AI Search and Azure OpenAI Service for an interactive chatbot, where the team can search thousands of documents and discover information pertinent to their research. This helps the team deliver more comprehensive research while saving hours of valuable time.
“Azure AI Search has provided us a way to use Azure OpenAI embeddings and perform vector similarity searches over our research & development documentation. Using vector search has been a fantastic experience, and the ability to perform hybrid search is amazing because the algorithm includes company-specific jargon as part of its retrieval. We look forward to expanding our application and adding more data sources to our enterprise search use case." - Alex Novikov, Fonterra Research & Development Centre Data Scientist.
The best results are beyond vector search
Since retrieval quality is critical for RAG applications, we've invested heavily in delivering the best relevance out of the box. Our extensive evaluations show that while vector search improves results in many cases, a combination of hybrid retrieval (keywords + vector search) and a reranking step (with semantic ranker) delivers significantly better results.
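Hybrid retrieval merges two independently ranked result lists, one from keyword search and one from vector search. A common way to combine them (and the fusion method Azure AI Search documents for hybrid queries) is Reciprocal Rank Fusion (RRF). The sketch below shows the idea with invented document ids; the constant k=60 is the value commonly cited in the RRF literature.

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse multiple ranked lists of document ids with Reciprocal Rank Fusion.
    Each document's score is the sum of 1 / (k + rank) over every list
    in which it appears; higher combined score means higher final rank."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Illustrative result lists from a keyword query and a vector query.
keyword_hits = ["doc-3", "doc-1", "doc-7"]
vector_hits = ["doc-1", "doc-5", "doc-3"]
print(reciprocal_rank_fusion([keyword_hits, vector_hits]))
```

Documents that rank well in both lists (here doc-1 and doc-3) rise to the top of the fused list; in the full pipeline, semantic ranker then rescores this fused set with a language model.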
We published details of our evaluations in this blog post, and shared our findings on Microsoft Mechanics.
With integrated vectorization, we bring seamless integration of Azure OpenAI service to the forefront, providing automatic chunking and vectorization as part of ingestion for data from Blob Storage, Cosmos DB, Azure SQL, and more. This capability allows you to easily generate embeddings with Azure OpenAI service, or you can pass your own endpoint for custom embedding models. Additionally, Azure AI Search now includes vectorizers to automatically vectorize your queries, eliminating the need for you to generate query vectors manually.
By focusing on automatic chunking and vectorization as part of ingestion, we provide a more streamlined and user-friendly experience that caters to a wider range of users. For more information on how this builds on indexers, please refer to our documentation.
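To make the chunking step concrete, here is a simplified stand-in for what the service's built-in chunking does during ingestion: splitting long text into overlapping fixed-size windows before each window is embedded. The function name and the word-based window sizes are illustrative assumptions, not the service's actual parameters.

```python
def chunk_text(text, max_words=100, overlap=20):
    """Split text into overlapping word windows for embedding.
    Overlap preserves context that would otherwise be cut at chunk
    boundaries. Simplified sketch; the real skill also handles
    sentence boundaries, tokens, and document structure."""
    words = text.split()
    step = max_words - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

sample = " ".join(f"word{i}" for i in range(250))
print(len(chunk_text(sample)))  # 250 words -> 3 overlapping chunks
```

Each resulting chunk would then be sent to an embedding model (for example, via Azure OpenAI) and stored as a separate vector in the index.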
Azure AI Search is dedicated to helping customers build amazing experiences using Generative AI. As an industry-leading vector database, our offering empowers customers to surpass the limits of conventional keyword and vector-based systems, providing a cutting-edge solution for diverse search needs.
Azure Cognitive Search is now Azure AI Search
As of November 15, 2023, Azure Cognitive Search has a new name: Azure AI Search. All existing deployments, configurations, and integrations will continue to function as they currently do. There are no changes to pricing and no breaking changes to APIs or SDKs.
Semantic search, a feature of Azure AI Search (formerly Azure Cognitive Search), was renamed 'semantic ranker' as of November 15, 2023. The new name does not change functionality or the customer experience.