# Serverless MCP Agent with LangChain.js v1 — Burgers, Tools, and Traces 🍔
AI agents that can actually do stuff (not just chat) are the fun part nowadays, but wiring them cleanly into real APIs, keeping things observable, and shipping them to the cloud can get... messy. So we built a fresh end-to-end sample to show how to do it right with the brand new LangChain.js v1 and Model Context Protocol (MCP). In case you missed it, MCP is a recent open standard that makes it easy for LLM agents to consume tools and APIs, and LangChain.js, a great framework for building GenAI apps and agents, has first-class support for it. You can quickly get up to speed with the MCP for Beginners course and AI Agents for Beginners course.

This new sample gives you:

- A LangChain.js v1 agent that streams its result, along with reasoning + tool steps
- An MCP server exposing real tools (burger menu + ordering) from a business API
- A web interface with authentication, session history, and a debug panel (for developers)
- A production-ready multi-service architecture
- Serverless deployment on Azure in one command (`azd up`)

Yes, it's a burger ordering system. Who doesn't like burgers? Grab your favorite beverage ☕, and let's dive in for a quick tour!

## TL;DR key takeaways

- New sample: full-stack Node.js AI agent using LangChain.js v1 + MCP tools
- Architecture: web app → agent API → MCP server → burger API
- Runs locally with a single `npm start`, deploys with `azd up`
- Uses streaming (NDJSON) with intermediate tool + LLM steps surfaced to the UI
- Ready to fork, extend, and plug into your own domain / tools

## What will you learn here?

- What this sample is about and its high-level architecture
- What LangChain.js v1 brings to the table for agents
- How to deploy and run the sample
- How MCP tools can expose real-world APIs

Reference links for everything we use:

- GitHub repo
- LangChain.js docs
- Model Context Protocol
- Azure Developer CLI
- MCP Inspector

## Use case

You want an AI assistant that can take a natural language request like "Order two spicy burgers and show me my pending orders" and:

1. Understand intent (query menu, then place order)
2. Call the right MCP tools in sequence, calling in turn the necessary APIs
3. Stream progress (LLM tokens + tool steps)
4. Return a clean final answer

Swap "burgers" for "inventory", "bookings", "support tickets", or "IoT devices" and you've got a reusable pattern!

## Sample overview

Before we play a bit with the sample, let's have a look at the main services implemented here:

| Service | Role | Tech |
|---|---|---|
| Agent Web App (`agent-webapp`) | Chat UI + streaming + session history | Azure Static Web Apps, Lit web components |
| Agent API (`agent-api`) | LangChain.js v1 agent orchestration + auth + history | Azure Functions, Node.js |
| Burger MCP Server (`burger-mcp`) | Exposes burger API as tools over MCP (Streamable HTTP + SSE) | Azure Functions, Express, MCP SDK |
| Burger API (`burger-api`) | Business logic: burgers, toppings, orders lifecycle | Azure Functions, Cosmos DB |

There are also other supporting components like databases and storage, not shown here for clarity. For this quickstart we'll only interact with the Agent Web App and the Burger MCP Server, as they are the main stars of the show here.

## LangChain.js v1 agent features

The recent release of LangChain.js v1 is a huge milestone for the JavaScript AI community! It marks a significant shift from experimental tools to a production-ready framework. The new version doubles down on what's needed to build robust AI applications, with a strong focus on agents. This includes first-class support for streaming not just the final output, but also intermediate steps like tool calls and agent reasoning. This makes building transparent and interactive agent experiences (like the one in this sample) much more straightforward.
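To make that streaming story concrete, here is a minimal sketch of a v1 agent wired to the burger MCP server. It assumes the `createAgent` API from LangChain.js v1 and the `@langchain/mcp-adapters` package; the model name, transport option, and stream mode shown are illustrative, not the exact code from the sample:

```typescript
import { createAgent } from "langchain";
import { MultiServerMCPClient } from "@langchain/mcp-adapters";

// Connect to the burger MCP server over Streamable HTTP (local URL from this sample).
const mcp = new MultiServerMCPClient({
  mcpServers: {
    burger: { transport: "http", url: "http://localhost:3000/mcp" },
  },
});
const tools = await mcp.getTools();

// A v1 agent that can plan and call the MCP tools.
const agent = createAgent({ model: "openai:gpt-4o-mini", tools });

// Stream intermediate steps (tool calls, model chunks), not just the final answer.
const stream = await agent.stream(
  { messages: [{ role: "user", content: "Order two spicy burgers" }] },
  { streamMode: "updates" }
);
for await (const step of stream) {
  console.log(step); // each update: a model message or a tool result
}

await mcp.close();
```

In the sample, each of these updates is serialized as an NDJSON line and forwarded to the web app, which is how the UI can show tool calls as they happen.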
## Quickstart

### Requirements

- GitHub account
- Azure account (free signup, or if you're a student, get free credits here)
- Azure Developer CLI

### Deploy and run the sample

We'll use GitHub Codespaces for a quick zero-install setup here, but if you prefer to run it locally, check the README. Click on the following link or open it in a new tab to launch a Codespace: Create Codespace. This will open a VS Code environment in your browser with the repo already cloned and all the tools installed and ready to go.

### Provision and deploy to Azure

Open a terminal and run these commands:

```bash
# Install dependencies
npm install

# Login to Azure
azd auth login

# Provision and deploy all resources
azd up
```

Follow the prompts to select your Azure subscription and region. If you're unsure which one to pick, choose East US 2. The deployment will take about 15 minutes the first time to create all the necessary resources (Functions, Static Web Apps, Cosmos DB, AI models). If you're curious about what happens under the hood, you can take a look at the `main.bicep` file in the `infra` folder, which defines the infrastructure as code for this sample.

### Test the MCP server

While the deployment is running, you can run the MCP server and API locally (even in Codespaces) to see how it works. Open another terminal and run:

```bash
npm start
```

This will start all services locally, including the Burger API and the MCP server, which will be available at http://localhost:3000/mcp. This may take a few seconds; wait until you see this message in the terminal:

```
🚀 All services ready 🚀
```

When these services are running without Azure resources provisioned, they will use in-memory data instead of Cosmos DB so you can experiment freely with the API and MCP server, though the agent won't be functional as it requires an LLM resource.

### MCP tools

The MCP server exposes the following tools, which the agent can use to interact with the burger ordering system:

| Tool name | Description |
|---|---|
| `get_burgers` | Get a list of all burgers in the menu |
| `get_burger_by_id` | Get a specific burger by its ID |
| `get_toppings` | Get a list of all toppings in the menu |
| `get_topping_by_id` | Get a specific topping by its ID |
| `get_topping_categories` | Get a list of all topping categories |
| `get_orders` | Get a list of all orders in the system |
| `get_order_by_id` | Get a specific order by its ID |
| `place_order` | Place a new order with burgers (requires `userId`, optional `nickname`) |
| `delete_order_by_id` | Cancel an order if it has not yet been started (status must be `pending`, requires `userId`) |

You can test these tools using the MCP Inspector. Open another terminal and run:

```bash
npx -y @modelcontextprotocol/inspector
```

Then open the URL printed in the terminal in your browser and connect using these settings:

- Transport: Streamable HTTP
- URL: http://localhost:3000/mcp
- Connection Type: Via Proxy (should be the default)

Click on Connect, then try listing the tools first, and run the `get_burgers` tool to get the menu info.
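If you'd rather script this smoke test than click through the Inspector, the MCP TypeScript SDK can do the same thing in a few lines. A minimal sketch (the client name and version are arbitrary):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Connect to the local burger MCP server over Streamable HTTP.
const transport = new StreamableHTTPClientTransport(new URL("http://localhost:3000/mcp"));
const client = new Client({ name: "burger-smoke-test", version: "1.0.0" });
await client.connect(transport);

// List the available tools, then fetch the menu.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // expect get_burgers, place_order, ...

const menu = await client.callTool({ name: "get_burgers", arguments: {} });
console.log(menu.content);

await client.close();
```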
### Test the Agent Web App

After the deployment is completed, you can run the command `npm run env` to print the URLs of the deployed services. Open the Agent Web App URL in your browser (it should look like `https://<your-web-app>.azurestaticapps.net`).

You'll first be greeted by an authentication page; you can sign in with either your GitHub or Microsoft account, and then you should be able to access the chat interface. From there, you can start asking any question or use one of the suggested prompts. For example, try asking: `Recommend me an extra spicy burger`. As the agent processes your request, you'll see the response streaming in real time, along with the intermediate steps and tool calls. Once the response is complete, you can also unfold the debug panel to see the full reasoning chain and the tools that were invoked.

Tip: our agent service also sends detailed tracing data using OpenTelemetry. You can explore these traces either in Azure Monitor for the deployed service, or locally using an OpenTelemetry collector. We'll cover this in more detail in a future post.

## Wrap it up

Congratulations, you just finished spinning up a full-stack serverless AI agent using LangChain.js v1, MCP tools, and Azure's serverless platform. Now it's your turn to dive into the code and extend it for your use cases! 😎 And don't forget to run `azd down` once you're done to avoid any unwanted costs.

## Going further

This was just a quick introduction to this sample, and you can expect more in-depth posts and tutorials soon. Since we're in the era of AI agents, we've also made sure that this sample can be explored and extended easily with code agents like GitHub Copilot. We even built a custom chat mode to help you discover and understand the codebase faster! Check out the Copilot setup guide in the repo to get started. You can quickly get up to speed with the MCP for Beginners course and AI Agents for Beginners course.

If you like this sample, don't forget to star the repo ⭐️! You can also join us in the Azure AI community Discord to chat and ask any questions. Happy coding and burger ordering! 🍔

# Strategic Solutions for Seamless Integration of Third-Party SaaS
Modern systems must be modular and interoperable by design. Integration is no longer a feature; it's a requirement. Developers are expected to build architectures that connect easily with third-party platforms, but too often core systems are designed in isolation. This disconnect creates friction for downstream teams and slows delivery.

At Microsoft, SaaS platforms like SAP SuccessFactors and Eightfold support Talent Acquisition by handling functions such as requisition tracking, application workflows, and interview coordination. These tools help reduce costs and free up engineering focus for high-priority areas like Azure and AI. The real challenge is integrating them with internal systems such as Demand Planning, Offer Management, and Employee Central.

This blog post outlines a strategy centered on two foundational components: an Integration and Orchestration Layer, and a Messaging Platform. Together, these enable real-time communication, consistent data models, and scalable integration. While Talent Acquisition is the use case here, the architectural patterns apply broadly across domains. Whether you're embedding AI pipelines, managing edge deployments, or building platform services, thoughtful integration needs to be built into the foundation, not bolted on later.
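As a concrete illustration of the messaging half of this pattern, here is a hedged sketch of an integration layer publishing a domain event to Azure Service Bus for downstream consumers. The topic name, event shape, and connection setting are all hypothetical:

```typescript
import { ServiceBusClient } from "@azure/service-bus";

// Publish a "requisition created" event so downstream systems (offer management,
// demand planning) can react without point-to-point coupling to the ATS.
const client = new ServiceBusClient(process.env.SERVICE_BUS_CONNECTION_STRING!);
const sender = client.createSender("talent-events"); // hypothetical topic

await sender.sendMessages({
  contentType: "application/json",
  subject: "requisition.created",
  body: { requisitionId: "R-1024", source: "successfactors", region: "EMEA" },
});

await sender.close();
await client.close();
```

The design point: producers emit events against a shared, versioned schema, and the messaging platform fans them out, which is what keeps the integration layer scalable instead of a web of bespoke API calls.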
# Azure Database for MySQL bindings for Azure Functions (General Availability)

We're thrilled to announce the general availability (GA) of Azure Database for MySQL input and output bindings for Azure Functions: a powerful way to build event-driven, serverless applications that seamlessly integrate with your MySQL databases.

## Key Capabilities

With this GA release, your applications can use:

- Input bindings, which allow your function to retrieve data from a MySQL database without writing any connection or query logic.
- Output bindings, which allow your function to insert or update data in a MySQL table without writing explicit SQL commands.

In addition, you can use both the input and output bindings in the same function for read-modify-write data patterns. For example, retrieve a record, update a field, and write it back, all without managing connections or writing SQL. These bindings are fully supported for both the in-process and isolated worker models, giving you flexibility in how you build and deploy your Azure Functions.

## How It Works

Azure Functions bindings abstract away the boilerplate code required to connect to external services. With the MySQL input and output bindings, you can now declaratively connect your serverless functions to your Azure Database for MySQL database with minimal configuration. You can configure these bindings using attributes in C#, decorators in Python, or annotations in JavaScript/Java. The bindings use the MySql.Data.MySqlClient library under the hood and support Azure Database for MySQL Flexible Server.

## Getting Started

To use the bindings, install the appropriate NuGet or npm package:

```bash
# For isolated worker model (C#)
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.MySql

# For in-process model (C#)
dotnet add package Microsoft.Azure.WebJobs.Extensions.MySql
```

Then, configure your function with a connection string and binding metadata. Full samples for all the supported programming frameworks are available in our GitHub repository. Here is a sample C# in-process function that retrieves a user by ID, increments their login count, and saves the updated record back to the MySQL database; the same pattern works for lightweight data transformations such as modifying status fields or updating counters and timestamps:

```csharp
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int LoginCount { get; set; }
}

public static class UpdateLoginCountFunction
{
    [FunctionName("UpdateLoginCount")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "user/{id}/login")] HttpRequest req,
        [MySql("SELECT * FROM users WHERE id = @id",
            CommandType = System.Data.CommandType.Text,
            Parameters = "@id={id}",
            ConnectionStringSetting = "MySqlConnectionString")] User user,
        [MySql("users", ConnectionStringSetting = "MySqlConnectionString")] IAsyncCollector<User> userCollector,
        ILogger log)
    {
        if (user == null)
        {
            return new NotFoundObjectResult("User not found.");
        }

        // Modify the user object
        user.LoginCount += 1;

        // Write the updated user back to the database
        await userCollector.AddAsync(user);

        return new OkObjectResult($"Login count updated to {user.LoginCount} for user {user.Name}.");
    }
}
```
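For Node.js, the equivalent read could look roughly like the sketch below, using the v4 programming model's generic input binding. This is an assumption-heavy translation of the C# attribute above: the `type: "mysql"` value and the property names are inferred from the C# sample rather than taken from the shipped extension, so treat it as a shape to verify against the official samples, not a definitive implementation:

```typescript
import { app, input, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// Generic input binding; properties mirror the C# [MySql] attribute above (assumed names).
const userById = input.generic({
  type: "mysql",
  commandText: "SELECT * FROM users WHERE id = @id",
  commandType: "Text",
  parameters: "@id={id}",
  connectionStringSetting: "MySqlConnectionString",
});

app.http("GetUser", {
  methods: ["GET"],
  route: "user/{id}",
  extraInputs: [userById],
  handler: async (req: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    // The binding runs the query before the handler is invoked.
    const rows = context.extraInputs.get(userById);
    return { jsonBody: rows };
  },
});
```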
## Learn More

- Azure Functions MySQL bindings
- Azure Functions

## Conclusion

With input and output bindings for Azure Database for MySQL now generally available, building serverless apps on Azure with MySQL has never been simpler or more efficient. By eliminating the need for manual connection management and boilerplate code, these bindings empower you to focus on what matters most: building scalable, event-driven applications with clean, maintainable code. Whether you're building real-time dashboards, automating workflows, or syncing data across systems, these bindings unlock new levels of productivity and performance. We can't wait to see what you'll build with them.

If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!

# How to Use Postgres MCP Server with GitHub Copilot in VS Code
GitHub Copilot has changed how developers write code, but when combined with an MCP (Model Context Protocol) server, it can also connect to your services. With it, Copilot can understand your database schema and generate relevant code for your API, data models, or business logic. In this guide, you'll learn how to use the Neon Serverless Postgres MCP server with GitHub Copilot in VS Code to build a sample REST API quickly. We'll walk through how to create an Azure Function that fetches data from a Neon database, all with minimal setup and no manual query writing.

## From Code Generation to Database Management with GitHub Copilot

AI agents are no longer just helping write code: they're creating and managing databases. When a chatbot logs a customer conversation, or a new region spins up in the Azure cloud, or a new user signs up, an AI agent can automatically create a database in seconds. No need to open a dashboard or call an API. This is the next evolution of software development: infrastructure as intent. With tools like database MCP servers, agents can now generate real backend services as easily as they generate code.

GitHub Copilot becomes your full-stack teammate. It can answer database-related questions, fetch your database connection string, update environment variables in your Azure Function, generate SQL queries to populate tables with mock data, and even help you create new databases or tables on the fly. All directly from your editor, with natural language prompts. Neon has a dedicated MCP server that makes it possible for Copilot to directly understand the structure of your Postgres database. Let's get started with using the Neon MCP server and GitHub Copilot.

## What You'll Need

- Node.js (>= v18.0.0) and npm: download from nodejs.org
- An Azure subscription (create one for free)
- The stable or Insiders release of VS Code
- The GitHub Copilot, GitHub Copilot for Azure, and GitHub Copilot Chat extensions for VS Code
- Azure Functions Core Tools (for local testing)

## Connect GitHub Copilot to Neon MCP Server

### Create a Neon Database

Visit the Neon on Azure Marketplace page and follow the Create a Neon resource guide to deploy Neon on Azure for your subscription. Neon's free plan is more than enough to build a proof of concept or kick off a new startup project.

### Install the Neon MCP Server for VS Code

Neon MCP Server offers two options for connecting your VS Code MCP client to Neon. We will use the Remote Hosted MCP Server option. This is the simplest setup: no need to install anything locally or configure a Neon API key in your client. Add the following Neon MCP server configuration to your user settings in VS Code:

```json
{
  "mcp": {
    "servers": {
      "Neon": {
        "command": "npx",
        "args": ["-y", "mcp-remote", "https://mcp.neon.tech/sse"]
      }
    }
  }
}
```

Click Start on the MCP server. A browser window will open with an OAuth prompt; just follow the steps to give your VS Code MCP client access to your Neon account.

## Generate an Azure Function REST API using GitHub Copilot

Step 1: Create an empty folder (e.g., `mcp-server-vs-code`) and open it in VS Code.

Step 2: Open GitHub Copilot Chat in VS Code and switch to Agent mode. You should see the available tools.

Step 3: Ask Copilot something like "Create an Azure function with an HTTP trigger". Copilot will generate a REST API using Azure Functions in JavaScript with a basic setup to run the functions locally, along the lines of the sketch below.
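Here is roughly what that generated starter function looks like (shown in TypeScript with the v4 programming model; your Copilot output will vary):

```typescript
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

app.http("hello", {
  methods: ["GET", "POST"],
  authLevel: "anonymous",
  handler: async (request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> => {
    context.log(`Processing request for ${request.url}`);
    const name = request.query.get("name") ?? "world";
    return { jsonBody: { message: `Hello, ${name}!` } };
  },
});
```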
Step 4: Next, ask Copilot to list your existing Neon projects.

Step 5: Try out different prompts to fetch the connection string for the chosen database and set it in the Azure Functions settings. Then ask Copilot to create a sample Customer table, and so on. You can even prompt it to create a new database branch on Neon.

Step 6: Finally, prompt Copilot to update the Azure Functions code to fetch data from the table.

## Combine the Azure MCP Server

Neon MCP gives GitHub Copilot full access to your database schema, so it can help you write SQL queries, connect to the database, and build API endpoints. But when you add the Azure MCP Server into the mix, Copilot can also understand your Azure services, like Blob Storage, Queues, and Azure AI. You can run both Neon MCP and Azure MCP at the same time to create a full cloud-native developer experience. For example:

- Use Neon MCP for serverless Postgres with branching and instant scale.
- Use Azure MCP to connect to other Azure services from the same Copilot chat.

Even better: Azure MCP is evolving. In newer versions, you can spin up Azure Functions and other services directly from Copilot chat, without ever leaving your editor. Copilot pulls context from both MCP servers, which means smarter code suggestions and faster development. You can mix and match based on your stack and let Copilot help you build real, working apps in minutes.

## Final Thoughts

With GitHub Copilot, the Neon MCP server, and Azure Functions, you're no longer writing backend code line by line; building APIs is fast, and you're orchestrating services with intent. This is not the future: it's something you can use today.

# Feedback Loops in GenAI with Azure Functions, Azure OpenAI and Neon serverless Postgres
Generative Feedback Loops (GFL) focus on optimizing and improving the AI's outputs over time through a cycle of feedback and learning based on production data. Learn how to build a GenAI solution with feedback loops using Azure OpenAI, Azure Functions, and Neon Serverless Postgres.

# Azure Database for MySQL triggers for Azure Functions (Public Preview)
Developers building event-driven applications with Azure Database for MySQL as the backend data store can now accelerate development and focus only on the core business logic of their applications. We are excited to announce that you can now invoke an Azure Function based on changes to an Azure Database for MySQL table. This new capability is made possible through the Azure Database for MySQL triggers for Azure Functions, now available in public preview.

## Azure Database for MySQL triggers

The Azure Database for MySQL trigger uses change tracking functionality to monitor a MySQL table for changes and trigger a function when a row is created, updated, or deleted, enabling customers to build highly scalable event-driven applications. As with the Azure Database for MySQL input and output bindings for Azure Functions, a connection string for the MySQL database is stored in the application settings of the Azure Function, and the function is triggered when a change is detected on the tracked tables.

Note: in public preview, Azure Database for MySQL triggers for Azure Functions are available only for the dedicated and premium plans of Azure Functions.

To enable change tracking on an existing Azure Database for MySQL table so it can be used with trigger bindings, you need to alter the table structure. For example, to enable change tracking on an `employees` table:

```sql
ALTER TABLE employees
ADD COLUMN az_func_updated_at TIMESTAMP
DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP;
```

The trigger uses the `az_func_updated_at` column's data to monitor the table for changes. Changes are then processed in the order they were made, with the oldest changes processed first.

Important: if changes to multiple rows are made at once, the exact order they're sent to the function is determined by the ascending order of `az_func_updated_at` and the primary key columns. If multiple changes are made to a row between iterations, only the latest change for that row is considered.

The following example demonstrates a C# function that is triggered when changes occur in the `employees` table. The MySQL trigger uses attributes for the table name and the connection string setting:

```csharp
using System.Collections.Generic;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.MySql;
using Microsoft.Extensions.Logging;

namespace EmployeeSample.Function
{
    public static class EmployeesTrigger
    {
        [FunctionName(nameof(EmployeesTrigger))]
        public static void Run(
            [MySqlTrigger("Employees", "MySqlConnectionString")]
            IReadOnlyList<MySqlChange<Employee>> changes,
            ILogger logger)
        {
            foreach (MySqlChange<Employee> change in changes)
            {
                Employee employee = change.Item;
                logger.LogInformation($"Change operation: {change.Operation}");
                logger.LogInformation($"EmployeeId: {employee.EmployeeId}, FirstName: {employee.FirstName}, " +
                    $"LastName: {employee.LastName}, Company: {employee.Company}, " +
                    $"Department: {employee.Department}, Role: {employee.Role}");
            }
        }
    }
}
```
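Once the column is in place, any ordinary write will stamp `az_func_updated_at` and surface in the function on the next polling iteration. An illustrative change (the column values here are made up):

```sql
-- MySQL updates az_func_updated_at automatically via the ON UPDATE clause,
-- so this row shows up in the trigger's next batch of changes.
UPDATE employees
SET Role = 'Senior Engineer'
WHERE EmployeeId = 42;
```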
## Join the preview and share your feedback!

We are eager for you to try out the new Azure Database for MySQL triggers for Azure Functions and build highly scalable event-driven and serverless applications. For more information, refer to https://aka.ms/mysqltriggers, which covers using MySQL triggers from all the supported programming frameworks with detailed step-by-step instructions. If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!

# Adding users to an AD group with Azure Functions/Logic Apps
I want to add users to an Entra ID/Azure AD group. The list of users will be retrieved from a REST API call with Azure Functions, and then saved into a database, probably Azure SQL. I'm planning on then using Azure Logic Apps to connect the database to the AD group. How can I make the script run every time the REST API changes? Can I add users to the AD group from SQL? Is there a better way to go about this?

# Enhancing Data Security and Digital Trust in the Cloud using Azure Services
Enhancing data security and digital trust in the cloud by implementing Client-Side Encryption (CSE) using Azure Apps, Azure Storage, and Azure Key Vault. Think of Client-Side Encryption (CSE) as a strategy that has proven to be most effective in augmenting data security, and a modern alternative to traditional approaches. CSE can provide superior protection for your data, particularly if an authentication and authorization account is compromised.
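To make the CSE idea concrete, here is a minimal envelope-encryption sketch in TypeScript: encrypt locally with a one-off AES key, wrap that key with a Key Vault RSA key (which never leaves the vault), and upload only ciphertext plus the wrapped key. The vault, key, container, and blob names are all hypothetical:

```typescript
import { randomBytes, createCipheriv } from "node:crypto";
import { DefaultAzureCredential } from "@azure/identity";
import { CryptographyClient, KeyClient } from "@azure/keyvault-keys";
import { BlobServiceClient } from "@azure/storage-blob";

const credential = new DefaultAzureCredential();
const keyClient = new KeyClient("https://my-vault.vault.azure.net", credential);
const wrappingKey = await keyClient.getKey("cse-wrapping-key");
const cryptoClient = new CryptographyClient(wrappingKey, credential);

// 1. Encrypt locally with a fresh AES-256-GCM data key.
const dataKey = randomBytes(32);
const iv = randomBytes(12);
const cipher = createCipheriv("aes-256-gcm", dataKey, iv);
const ciphertext = Buffer.concat([cipher.update("sensitive payload"), cipher.final()]);
const tag = cipher.getAuthTag();

// 2. Wrap the data key with the Key Vault RSA key; the RSA key never leaves the vault.
const { result: wrappedKey } = await cryptoClient.wrapKey("RSA-OAEP-256", dataKey);

// 3. Upload ciphertext only; keep the wrapped key and IV as metadata for later unwrap + decrypt.
const blob = BlobServiceClient.fromConnectionString(process.env.STORAGE_CONNECTION_STRING!)
  .getContainerClient("encrypted")
  .getBlockBlobClient("doc.bin");
await blob.upload(ciphertext, ciphertext.length, {
  metadata: {
    wrappedkey: Buffer.from(wrappedKey).toString("base64"),
    iv: iv.toString("base64"),
    tag: tag.toString("base64"),
  },
});
```

The storage account never sees plaintext or the unwrapped data key, which is exactly the property that protects the data even if the storage credentials are compromised.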
# Hack Together: RAG Hack - Building RAG Applications with LangChain.js

In the rapidly evolving landscape of artificial intelligence and natural language processing, Retrieval Augmented Generation (RAG) has emerged as a powerful solution for enhancing the accuracy and relevance of responses generated by language models. In this article, we explore the talk given during the Hack Together: RAG Hack event, where Glaucia Lemos, a Cloud Advocate at Microsoft, and Yohan Lasorsa, a Senior Cloud Advocate at Microsoft, demonstrated how LangChain.js is revolutionizing the development of RAG applications, making it easier to create intelligent applications that combine large language models (LLMs) with your own data sources.
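The core RAG loop in LangChain.js fits in a few lines: embed your documents, retrieve the ones most relevant to a question, and have the model answer from that context only. A minimal in-memory sketch (the documents, model, and prompt are illustrative):

```typescript
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings, ChatOpenAI } from "@langchain/openai";
import { Document } from "@langchain/core/documents";

// 1. Index your own data as embeddings (in memory here; use a real vector store in production).
const store = await MemoryVectorStore.fromDocuments(
  [
    new Document({ pageContent: "Contoso support hours are 9am-5pm CET, Monday to Friday." }),
    new Document({ pageContent: "Contoso offers a 30-day refund policy on all plans." }),
  ],
  new OpenAIEmbeddings()
);

// 2. Retrieve the chunks most relevant to the user question.
const question = "When can I reach support?";
const docs = await store.similaritySearch(question, 2);
const context = docs.map((d) => d.pageContent).join("\n");

// 3. Ground the model's answer in the retrieved context.
const llm = new ChatOpenAI({ model: "gpt-4o-mini" });
const answer = await llm.invoke(
  `Answer using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`
);
console.log(answer.content);
```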
# Building Intelligent Apps with Azure Cache for Redis, EntraID, Azure Functions, E1 SKU, and more!

We're excited to announce the latest updates to Azure Cache for Redis that will improve your data management and application performance as we kick off Microsoft Build 2024. Coming soon, the Enterprise E1 SKU (preview) will offer a lower entry price, Redis modules, and enterprise-grade features. The Azure Functions triggers and bindings for Redis are now generally available, simplifying your workflow with seamless integration. Microsoft Entra ID support in Azure Cache for Redis is also now GA, providing enhanced security management. And there's more: we are also sharing added resources for developing intelligent applications using Azure Cache for Redis Enterprise, enabling you to build smarter, more responsive apps. Read the blog below to find out more about these updates and how they can enhance your Azure Cache for Redis experience.
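As an example of what the now-GA Redis triggers enable, here is a hedged sketch of a function reacting to a pub/sub channel using the v4 generic trigger. The `redisPubSubTrigger` type and its property names are recalled from the extension's docs and may have changed, so verify them against the current reference before relying on this:

```typescript
import { app, trigger, InvocationContext } from "@azure/functions";

// Fires whenever a message is published to the "orders" channel.
app.generic("OnOrderMessage", {
  trigger: trigger.generic({
    type: "redisPubSubTrigger",
    connectionStringSetting: "RedisConnectionString", // app setting name (assumed property)
    channel: "orders",
  }),
  handler: async (message: unknown, context: InvocationContext) => {
    context.log(`Redis message received: ${JSON.stringify(message)}`);
  },
});
```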