AI - Azure AI services Blog

LlamaIndex TypeScript now supports Azure AI Search as a vector store

fsunavala-msft
Jan 07, 2025

We are excited to announce that Azure AI Search is now available as a vector store in LlamaIndex.TS, the TypeScript implementation of the popular LlamaIndex framework. This integration helps JavaScript and TypeScript developers build smarter generative AI applications by grounding LLM responses in their own data, all while leveraging the enterprise-grade features of Azure AI Search.

LlamaIndex.TS: Empowering JavaScript and TypeScript Developers

LlamaIndex.TS is the TypeScript implementation of the popular LlamaIndex framework, designed for building AI applications that integrate large language models (LLMs) with your data. With strong typing in TypeScript and support for multiple JavaScript server-side environments (Node.js, Deno, Bun, React Server Components, and Cloudflare Workers), LlamaIndex.TS enables developers to:

  • Connect to a variety of data sources, such as APIs, PDFs, and SQL databases.
  • Index data into structures optimized for LLM consumption, including vector store indices for semantic search.
  • Query indexed data naturally using large language models, enabling question-answering and chatbot scenarios.
  • Build Agents capable of performing tasks like research and data extraction.


Azure AI Search: Beyond Just a Vector Store

Azure AI Search delivers enterprise-grade features—like metadata filtering, hybrid search, and semantic ranking—that go beyond simple vector storage. By integrating Azure AI Search as a vector store in LlamaIndex.TS, developers can tap into these advanced capabilities to build robust retrieval-augmented generation (RAG) applications.

Key Features of the Integration:

  • Metadata Filtering: Apply filters based on document metadata to refine search results.
  • Hybrid Search: Combine traditional keyword search with vector search in a single query to improve accuracy and relevance.
  • Semantic Ranking: Use Microsoft’s state-of-the-art reranking model built into Azure AI Search to deliver more contextually relevant search results.
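
To make the hybrid-search idea above concrete: Azure AI Search fuses the keyword ranking and the vector ranking for a query using Reciprocal Rank Fusion (RRF), where each document scores the sum of 1 / (k + rank) across the rankings it appears in. The stand-alone sketch below illustrates just that fusion step; the function name `rrfFuse` and the sample rankings are illustrative, not part of either library:

```typescript
// Reciprocal Rank Fusion: score(d) = Σ over rankings of 1 / (k + rank(d)).
// k = 60 is the value commonly cited for RRF; ranks are 1-based.
function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((doc, i) => {
      scores.set(doc, (scores.get(doc) ?? 0) + 1 / (k + i + 1));
    });
  }
  // Sort documents by fused score, highest first
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([doc]) => doc);
}

// A keyword ranking and a vector ranking for the same query
const keywordRanking = ["doc1", "doc3", "doc2"];
const vectorRanking = ["doc3", "doc2", "doc1"];
console.log(rrfFuse([keywordRanking, vectorRanking])); // → ["doc3", "doc1", "doc2"]
```

Documents that rank well in both lists (like doc3 here) rise to the top of the fused ranking, which is why hybrid search tends to outperform either method on its own.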

Sample Code: Integrating Azure AI Search with LlamaIndex.TS

import { AzureAISearchVectorStore, Document, VectorStoreIndex } from "llamaindex";
import { DefaultAzureCredential } from "@azure/identity";

// Initialize Azure AI Search as the vector store
const vectorStore = new AzureAISearchVectorStore({
  endpoint: "https://your-search-service.search.windows.net",
  indexName: "your-index-name",
  credential: new DefaultAzureCredential(),
});

// Create a vector store index backed by Azure AI Search
const index = await VectorStoreIndex.fromVectorStore(vectorStore);

// Insert a document into the index
await index.insert(
  new Document({
    text: "Sample document text",
    metadata: { author: "Author Name" },
  })
);

// Query the index through a query engine
const queryEngine = index.asQueryEngine();
const response = await queryEngine.query({ query: "Search query" });
console.log(response.response);


Resources & Documentation

To begin building context-augmented web applications with LlamaIndex.TS and Azure AI Search, explore the following resources:
