Model Context Protocol (MCP) Server for Azure Database for MySQL
We are excited to introduce a new MCP Server for integrating your AI models with data hosted in Azure Database for MySQL. By using this server, you can connect any AI application that supports MCP to your MySQL flexible server (using either MySQL password-based authentication or Microsoft Entra authentication), enabling you to provide your business data as meaningful context in a standardized and secure manner.

Fueling the Agentic Web Revolution with NLWeb and PostgreSQL
We’re excited to announce that NLWeb (Natural Language Web), Microsoft’s open project for natural language interfaces on websites, now supports PostgreSQL. With this enhancement, developers can leverage PostgreSQL and NLWeb to transform any website into an AI-powered application or Model Context Protocol (MCP) server. This integration allows organizations to use a familiar, robust database as the foundation for conversational AI experiences, streamlining deployment while maximizing data security and scalability.

Soon, autonomous agents, not just human users, will consume and interpret website content, transforming how information is accessed and used online. At Microsoft //Build 2025, Microsoft introduced the era of the open agentic web: a new paradigm in which autonomous agents seamlessly interact across individual, organizational, team, and end-to-end business contexts. To realize this future, Microsoft announced the NLWeb project. NLWeb transforms any website into an AI-powered application with just a few lines of code, by connecting it to an AI model and a knowledge base.

In this post, we’ll cover:

- What NLWeb is and how it works with vector databases
- How pgvector enables vector similarity search in PostgreSQL for NLWeb
- How to get started using NLWeb with Postgres

Let’s dive in and see how Postgres + NLWeb can redefine conversational web interfaces while keeping your data in a familiar, powerful database.

What is NLWeb? A Quick Overview of Conversational Web Interfaces

NLWeb is an open project developed by Microsoft to simplify adding conversational AI interfaces to websites.
How NLWeb works under the hood:

- Processes existing website content in semi-structured formats such as Schema.org, RSS, and other data that websites already publish
- Embeds and indexes all the content in a vector store (e.g., PostgreSQL with pgvector)
- Routes user queries through several processes that handle natural language understanding, reranking, and retrieval
- Answers queries with an LLM

The result is a high-quality natural language interface on top of web data, giving developers the ability to let users “talk to” web data. By default, every NLWeb instance is also a Model Context Protocol (MCP) server, allowing websites to make their content discoverable and accessible to agents and other participants in the MCP ecosystem if they choose. Importantly, NLWeb is platform-agnostic, supporting many major operating systems, AI models, and vector stores; the project is also modular by design, so developers can bring their own retrieval system and model APIs and define their own extensions.

NLWeb with PostgreSQL

PostgreSQL is now embedded into the NLWeb reference stack as a native retriever, creating a scalable and flexible path for deploying NLWeb instances on open-source infrastructure.

Retrieval Powered by pgvector

NLWeb leverages pgvector, a PostgreSQL extension for efficient vector similarity search, to handle natural language retrieval at scale. By integrating pgvector into the NLWeb stack, teams can eliminate the need for external vector databases. Web data stored in PostgreSQL becomes immediately searchable and usable for NLWeb experiences, streamlining infrastructure and enhancing security. PostgreSQL’s robust governance features and wide adoption align with NLWeb’s mission to enable conversational AI for any website or content platform. With pgvector retrieval built in, developers can confidently launch NLWeb instances on their own databases, with no additional infrastructure required.
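Conceptually, the retrieval step above boils down to nearest-neighbor search over embeddings. The standalone Python sketch below illustrates the idea with toy vectors (the document IDs and the 3-dimensional "embeddings" are invented for illustration); in production, pgvector performs this comparison inside Postgres with an indexed distance operator rather than in application code.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, documents, k=2):
    """Rank (doc_id, vector) pairs by similarity to the query vector,
    analogous to what an ORDER BY on a pgvector distance does in SQL."""
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in documents]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]

# Toy 3-dimensional "embeddings" standing in for real model output.
docs = [
    ("episode-12", [0.9, 0.1, 0.0]),
    ("episode-07", [0.0, 1.0, 0.1]),
    ("episode-33", [0.8, 0.2, 0.1]),
]
results = top_k([1.0, 0.0, 0.0], docs)
```

With the toy data above, `episode-12` ranks first because its vector points almost exactly along the query direction.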
Implementation example

We are going to use NLWeb and Postgres to create a conversational AI app and MCP server that will let us chat with content from the Talking Postgres with Claire Giordano podcast!

Prerequisites

- An active Azure account
- The pgvector extension enabled and configured
- An Azure AI Foundry project
- Deployed models: gpt-4.1, gpt-4.1-mini, and text-embedding-3-small
- Visual Studio Code with the Python extension installed
- Python 3.11.x
- The Azure CLI (latest version)

Getting started

All the code and sample datasets are available in this GitHub repository.

Step 1: Set up the NLWeb server

1. Clone or download the code from the repo:

```
git clone https://github.com/microsoft/NLWeb
cd NLWeb
```

2. Open a terminal, create a virtual Python environment, and activate it:

```
python -m venv myenv
source myenv/bin/activate   # Or on Windows: myenv\Scripts\activate
```

3. Go to the 'code/python' folder in NLWeb to install the dependencies:

```
cd code/python
pip install -r requirements.txt
```

4. Go to the project root folder in NLWeb and copy the .env.template file to a new .env file:

```
cd ../../
cp .env.template .env
```

5. In the .env file, set the API key for your LLM endpoint of choice and update the Postgres connection string. For example:

```
AZURE_OPENAI_ENDPOINT="https://TODO.openai.azure.com/"
AZURE_OPENAI_API_KEY="<TODO>"

# If using Postgres connection string
POSTGRES_CONNECTION_STRING="postgresql://<HOST>:<PORT>/<DATABASE>?user=<USERNAME>&sslmode=require"
POSTGRES_PASSWORD="<PASSWORD>"
```

6. Update your config files (located in the config folder) so that your preferred providers match your .env file. There are three files that may need changes.

- config_llm.yaml: Update the first line to the LLM provider you set in the .env file. By default it is Azure OpenAI. You can also adjust the models called here; by default, gpt-4.1 and gpt-4.1-mini are assumed.
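Connection-string typos are a common source of setup failures, so it can help to verify that the pieces of your POSTGRES_CONNECTION_STRING assemble as expected before running anything. The sketch below uses only the Python standard library; the host, database, and user values are placeholders, not real credentials.

```python
import urllib.parse

# Placeholder values mirroring the .env entry above (not real credentials).
conn_str = "postgresql://db.example.com:5432/nlweb?user=admin&sslmode=require"

parsed = urllib.parse.urlsplit(conn_str)
params = dict(urllib.parse.parse_qsl(parsed.query))

host = parsed.hostname               # network host of the flexible server
port = parsed.port                   # 5432 unless overridden
database = parsed.path.lstrip("/")   # database name after the slash
user = params.get("user")            # login user from the query string

# Azure Database for PostgreSQL requires SSL by default.
assert params.get("sslmode") == "require", "sslmode=require is expected"
```

If the assertion fails or any field comes back as None, fix the connection string before moving on to the Postgres initializer.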
- config_embedding.yaml: Update the first line to your preferred embedding provider. By default it is Azure OpenAI, using text-embedding-3-small.
- config_retrieval.yaml: Update the first line to postgres. Set write_endpoint to postgres, and set enabled to 'true' for the postgres retrieval endpoint in the list of possible endpoints.

Step 2: Initialize the Postgres server

Go to the 'code/python/misc' folder in NLWeb to run the Postgres initializer. NOTE: If you are using Azure Database for PostgreSQL flexible server, make sure the vector extension is allow-listed and enabled in your database.

```
cd code/python/misc
python postgres_load.py
```

Step 3: Ingest data from the Talking Postgres podcast

Now we will load some data into our vector database to test with. Go to the 'code/python' folder in NLWeb and run the following command (make sure you are still in the 'python' folder when you run it). The format of the command is:

```
python -m data_loading.db_load <RSS URL> <site-name>
```

For the Talking Postgres with Claire Giordano podcast:

```
python -m data_loading.db_load https://feeds.transistor.fm/talkingpostgres Talking-Postgres
```

(Optional) You can check the documents table in your Postgres database to verify that all the data from the feed was uploaded.

Test the NLWeb server

Start your NLWeb server (again from the 'python' folder):

```
python app-file.py
```

Go to http://localhost:8000/ and start asking questions about the Talking Postgres with Claire Giordano podcast. You can try different modes:

List mode sample prompt: “I want to listen to something that talks about the advances in vector search such as DiskANN”

Generate mode sample prompt: “What did Shireesh Thota say about the future of Postgres?”

Running NLWeb with MCP

1. If you do not already have it, install MCP in your venv:

```
pip install mcp
```

2. Next, configure your Claude MCP server.
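Under the hood, the ingestion step boils down to parsing RSS items and preparing them for embedding. The sketch below illustrates that first stage with the standard library and an inline feed snippet (the podcast titles here are invented for illustration); the real db_load loader additionally computes embeddings and writes rows to Postgres.

```python
import xml.etree.ElementTree as ET

# A tiny inline RSS 2.0 feed standing in for a real podcast feed URL.
RSS = """<rss version="2.0"><channel>
  <title>Example Podcast</title>
  <item><title>Episode 1: Vector search</title>
        <description>All about pgvector.</description></item>
  <item><title>Episode 2: Connection pooling</title>
        <description>PgBouncer tips.</description></item>
</channel></rss>"""

def parse_items(rss_text):
    """Extract (title, description) pairs from an RSS 2.0 feed --
    the documents that would then be embedded and indexed."""
    root = ET.fromstring(rss_text)
    return [(item.findtext("title"), item.findtext("description"))
            for item in root.iter("item")]

docs = parse_items(RSS)
```

Each extracted pair corresponds to one row in the documents table once the loader has attached an embedding to it.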
If you don’t have the config file already, you can create it at the following location:

- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json

The default MCP JSON file needs to be modified as shown below.

macOS example configuration:

```
{
  "mcpServers": {
    "ask_nlw": {
      "command": "/Users/yourname/NLWeb/myenv/bin/python",
      "args": [
        "/Users/yourname/NLWeb/code/chatbot_interface.py",
        "--server",
        "http://localhost:8000",
        "--endpoint",
        "/mcp"
      ],
      "cwd": "/Users/yourname/NLWeb/code"
    }
  }
}
```

Windows example configuration:

```
{
  "mcpServers": {
    "ask_nlw": {
      "command": "C:\\Users\\yourusername\\NLWeb\\myenv\\Scripts\\python",
      "args": [
        "C:\\Users\\yourusername\\NLWeb\\code\\chatbot_interface.py",
        "--server",
        "http://localhost:8000",
        "--endpoint",
        "/mcp"
      ],
      "cwd": "C:\\Users\\yourusername\\NLWeb\\code"
    }
  }
}
```

Note: For Windows paths, you need to use double backslashes (\\) to escape the backslash character in JSON.

3. Go to the 'code/python' folder in NLWeb, enter your virtual environment, and start your NLWeb local server. Make sure it is configured to access the data you would like to ask about from Claude.

```
# On macOS
source ../myenv/bin/activate
python app-file.py

# On Windows
..\myenv\Scripts\activate
python app-file.py
```

4. Open Claude Desktop. It should ask you to trust the 'ask_nlw' external connection if it is configured correctly. After you click yes and the welcome page appears, you should see 'ask_nlw' in the bottom-right '+' options. Select it to start a query.

5. To query NLWeb, just type 'ask_nlw' in your prompt to Claude. You'll notice that you also get the full JSON for your results. Remember, your local NLWeb server must be running to use this option.
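Smart quotes, single backslashes in Windows paths, or trailing commas will make the config file invalid JSON, and Claude Desktop will silently ignore it. A quick standard-library sanity check can catch this; the sketch below validates an inline copy of the macOS config (replace the string with the contents of your own claude_desktop_config.json):

```python
import json

# Inline copy of the example macOS config; paste your own file contents here.
config_text = """
{
  "mcpServers": {
    "ask_nlw": {
      "command": "/Users/yourname/NLWeb/myenv/bin/python",
      "args": ["/Users/yourname/NLWeb/code/chatbot_interface.py",
               "--server", "http://localhost:8000",
               "--endpoint", "/mcp"],
      "cwd": "/Users/yourname/NLWeb/code"
    }
  }
}
"""

# json.loads raises ValueError on smart quotes, bad escapes, and the like.
config = json.loads(config_text)
server = config["mcpServers"]["ask_nlw"]
missing = [key for key in ("command", "args", "cwd") if key not in server]
```

If parsing succeeds and `missing` is empty, the file is at least structurally valid; any remaining problems are usually wrong paths.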
Learn More

- Vector store in Azure Database for PostgreSQL flexible server
- Generative AI in Azure Database for PostgreSQL flexible server
- The NLWeb GitHub repo, which includes a reference server for handling natural language queries and the pgvector integration

Curious About Model Context Protocol? Dive into MCP with Us!
Global Workshops for All Skill Levels

We’re hosting a series of free online workshops to introduce you to MCP—available in multiple languages and programming languages! You’ll get hands-on experience building your first MCP server, guided by friendly experts ready to answer your questions. Register now: https://aka.ms/letslearnmcp

Who Should Join?

This workshop is built for:

- Students exploring tech careers
- Beginner devs eager to learn how AI agents and MCP work
- Curious coders and seasoned pros alike

If you’ve got some code curiosity and a laptop, you’re good to go.

Workshop Schedule (English Sessions)

| Date | Tech Focus | Registration Link |
|------|-----------|-------------------|
| July 9 | C# | Join Here |
| July 15 | Java | Join Here |
| July 16 | Python | Join Here |
| July 17 | C# + Visual Studio | Join Here |
| July 21 | TypeScript | Join Here |

Multilingual Sessions

We’re also hosting workshops in Spanish, Portuguese, Japanese, Korean, Chinese, Vietnamese, and more! Explore different tech stacks while learning in your preferred language:

| Date | Language | Technology | Link |
|------|----------|------------|------|
| July 15 | 한국어 (Korean) | C# | Join |
| July 15 | 日本語 (Japanese) | C# | Join |
| July 17 | Español | C# | Join |
| July 18 | Tiếng Việt | C# | Join |
| July 18 | 한국어 | JavaScript | Join |
| July 22 | 한국어 | Python | Join |
| July 22 | Português | Java | Join |
| July 23 | 中文 (Chinese) | C# | Join |
| July 23 | Türkçe | C# | Join |
| July 23 | Español | JavaScript/TS | Join |
| July 23 | Português | C# | Join |
| July 24 | Deutsch | Java | Join |
| July 24 | Italiano | Python | Join |

🗓️ Save your seat: https://aka.ms/letslearnmcp

What You’ll Need

Before the event starts, make sure you’ve got:

- Visual Studio Code set up for your language of choice
- Docker installed
- A GitHub account (you can sign up for Copilot for free!)
- A curious mindset: no MCP experience required

You can check out the MCP for Beginners course at https://aka.ms/mcp-for-beginners

What’s Next? MCP Dev Days!

Once you’ve wrapped up the workshop, why not go deeper? MCP Dev Days is happening July 29–30, and it’s packed with pro sessions from the Microsoft team and beyond.
You’ll explore the MCP ecosystem, learn from insiders, and connect with other learners and devs. 👉 Info and registration: https://aka.ms/mcpdevdays

Whether you're writing your first line of code or fine-tuning models like a pro, MCP is a game-changer. Come learn with us, and let’s build the future together.

Skill Up On The Latest AI Models & Tools on Model Mondays - Season 2 starts Jun 16!
Quick Links

RSVP for each episode:

- EP1: Advanced Reasoning Models: https://developer.microsoft.com/en-us/reactor/events/25905/
- EP2: Model Context Protocol: https://developer.microsoft.com/en-us/reactor/events/25906/
- EP3: SLMs (and Reasoning): https://developer.microsoft.com/en-us/reactor/events/25907/

Get all the details: https://aka.ms/model-mondays

Azure AI Foundry offers the best model choice

Did you manage to catch up on all the talks from Microsoft Build 2025? If, like me, you are interested in building AI-driven applications on Azure, you probably started by looking at what’s new in Azure AI Foundry. I recommend you read Asha Sharma’s post for the top 10 things you need to know in this context. And it starts with New Models & Smarter Models!

- New Models | Azure AI Foundry now has 11,000+ models to choose from, including frontier models from partner providers and thousands of open-source community variants from Hugging Face. But how do you pick the right one for your needs?
- Smarter Models | It's not just about model selection, but about the effort required to use the model and validate the results. How can you improve the user experience and build trust and confidence in your customers? Solutions like Model Router and the Azure AI Evaluations SDK help.
- The Challenge? | With new models, features, and tools being released daily, the information overload is real. How do we keep up with the latest updates and skill up on model choices?

Say Hello to Model Mondays!

Model Mondays is a weekly series with a livestream on Microsoft Reactor (on Mondays) and a follow-up AMA on Azure AI Foundry Discord (on Fridays). Here are the three links to know:

- Visit the Model Mondays repo: https://aka.ms/model-mondays
- Watch the Model Mondays playlist: https://aka.ms/model-mondays/playlist
- Join #model-mondays on Discord: https://aka.ms/model-mondays/discord

Visit the playlist or repo to catch up on replays from Season 1.
Learn about topics like reasoning models, visual generative models, open-source AI, forecasting models, and local AI development with the Visual Studio AI Toolkit. Each 30-minute episode consists of:

- 5-min Highlights. Catch up on top model-related news from the previous week.
- 15-min Spotlight. Get a deep dive into a specific model, model family, or related tool.
- Q&A. Ask questions in chat during the livestream, or join the AMA on Discord on Friday.

Register Now to Join Us for Season 2!

Microsoft Build showed us that the model catalog is expanding quickly, and so are the tools that help you select, customize, and evaluate your AI application. In Season 2, we’re going to dive deeper into advanced topics in models and tools. Some of the topics we hope to cover include:

- Advanced Reasoning Models – Deep Research, Visual Reasoning, and more
- Model Context Protocol – what it is, with examples of MCP servers today
- SLMs and Reasoning – a dive into the Phi-4 model ecosystem for insights
- Foundry Labs – explore projects like Magentic-UI and MCP Server
- Open-Source Models – explore the 10K+ community models from Hugging Face
- Edge Models – explore Foundry Local and the rise of on-device AI capabilities
- Models for AI Agents – explore the Agent Catalog samples like Red-Teaming
- Model Playgrounds – explore the Image, Video, Agent, Chat, and Language playgrounds
- Advanced Fine-Tuning – learn to fine-tune GPT models and use the Foundry Portal
- AI Developer Experience – get productive with the AI Toolkit & VS Code extension pack

Our first three episodes are open for registration right now:

- EP1: Advanced Reasoning Models: https://developer.microsoft.com/en-us/reactor/events/25905/
- EP2: Model Context Protocol: https://developer.microsoft.com/en-us/reactor/events/25906/
- EP3: SLMs (and Reasoning): https://developer.microsoft.com/en-us/reactor/events/25907/

Let's build our model IQ and get started developing AI applications on Azure!
Want to build AI apps and need resources to accelerate your journey?

- Chat with us on Azure AI Foundry Discord: https://aka.ms/aifoundrydiscord
- Provide feedback on our Discussion Forum: https://aka.ms/aifoundryforum
- Skill up with the Azure AI Foundry Learn course: http://aka.ms/learnatbuild
- Review the Azure AI Foundry documentation: http://aka.ms/AzureAI
- Download and explore the Azure AI Foundry SDK: http://aka.ms/aifoundrysdk