Logic Apps Community Day 2025
We have delayed the speaker announcements, as we had to keep the sessions open for an extra day. Speakers will now be notified by email by September 13th, 2025. Speakers and sessions will be published by September 19th, 2025.

We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star! Logic Apps Community Day is a free event driven by Microsoft, for anyone who wants to learn more about Logic Apps and how it can help to solve real-life integration problems.

This year, we want to learn how you have been using AI with Logic Apps, so our themes for the sessions are:

Creating Intelligent Applications with Logic Apps and AI: tell us how you have been using Logic Apps features to implement your intelligent application scenarios - from Agent Loops, to Logic Apps exposed as MCP tools, to improving your intelligent application knowledge in real time - we want to see the scenarios you created!

Accelerating Logic Apps Developer Velocity with AI: how are you taking advantage of generative AI to make your developer life easier with Logic Apps? From prompts to create test data, to automated creation of maps, unit tests or custom code, and anything in between. Maybe you have been using prompts, instructions or chat modes to make your development life easier? We want to see it all!

Call for papers

Within the themes above we are looking for breakout sessions (around 28 minutes in length) or lightning sessions (around 12 minutes in length). The call for papers is open until September 7, 2025 - 11:00 PM (PST). Selected sessions and speakers will be informed by email by September 15, 2025.

Click on the image below to submit your session to Logic Apps Community Day 2025! We are looking forward to your proposals!
Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster

K3s is a lightweight Kubernetes distribution, certified by the Cloud Native Computing Foundation (CNCF) and originally developed by Rancher. It is optimized for on-premises environments with limited resources, making it ideal for edge computing and lightweight hybrid scenarios. Unlike a full Kubernetes distribution, K3s reduces overhead while maintaining full Kubernetes API compatibility. This makes K3s an ideal choice for hosting Logic Apps Standard near your data sources - such as an on-premises SQL Server or local file shares - when you have lightweight workloads.

Five steps are needed to set up Hybrid Logic Apps, including the underlying infrastructure, as illustrated in the following diagram. Most of these steps are the same as described in the Hybrid Logic Apps documentation, except for the K3s setup: Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn.

Step 1: Prepare the K3s Cluster

Docker Desktop setup: in this case the host machine runs Windows 11, so Docker Desktop with WSL2 is used to run the containers. Install Docker Desktop using WSL2 (Docker Desktop: The #1 Containerization Tool for Developers | Docker) and make sure WSL2 is selected.

Install K3s on your infrastructure and create a single-node cluster using k3d:

# Install choco, kubectl, Helm and k3d
Set-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))
choco install kubernetes-cli -y
choco install kubernetes-helm -y
choco install k3d -y

# Open a new PowerShell window and create a single-node cluster
k3d cluster create

# Delete the default Traefik load balancer, as it conflicts with ports 80 and 443
# (the load balancer can be configured to use other ports if needed)
kubectl delete svc traefik -n kube-system
kubectl delete deployment traefik -n kube-system

The next two steps are the same as given in Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn.

Step 2: Connect the Kubernetes cluster to Azure Arc

Step 3: Set up the Azure Container Apps extension and environment

You can skip the CoreDNS setup required for Azure Local, as given in Update CoreDNS.

Step 4: Configure storage for SQL and SMB

SQL database (runtime store): Hybrid Logic Apps use a SQL database for runtime operations and run history. In this scenario an on-premises SQL Server with SQL authentication is used. I set up SQL Server 2022 on the Windows host machine, enabled SQL Server authentication, and added a new SQL admin user. Please follow the link for more details.
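As a rough illustration of that last step, the SQL login mentioned above could be created with sqlcmd from the host machine. This is only a sketch - the login name, password, and database name are placeholders, and granting sysadmin is a shortcut you may want to narrow in production:

# Sketch: create a SQL login for the Logic Apps runtime store (names and passwords are placeholders)
$sqlUser = "logicappsql"
$sqlPassword = "<strong password>"
$database = "LogicAppsRuntime"

# Connect with Windows authentication (-E) to the local default instance
sqlcmd -S localhost -E -Q "CREATE LOGIN [$sqlUser] WITH PASSWORD = '$sqlPassword'; ALTER SERVER ROLE [sysadmin] ADD MEMBER [$sqlUser];"
sqlcmd -S localhost -E -Q "IF DB_ID('$database') IS NULL CREATE DATABASE [$database];"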
The SQL connection string can be validated using the following PowerShell script:

$connectionString = "Server=<server IP address>;Initial Catalog=<databaseName>;Persist Security Info=False;User ID=<sqluser>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"
try {
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = $connectionString
    $connection.Open()
    Write-Host "✅ Connection successful"
    $connection.Close()
} catch {
    Write-Host "❌ Connection failed: $($_.Exception.Message)"
}

SMB file share: SMB is used as a local file share on the Windows host machine. It is advised to create a new user for the Windows SMB share:

$Username = "k3suser"
$Password = ConvertTo-SecureString "<password complex>" -AsPlainText -Force
$FullName = "K3s user"
$Description = "Created via PowerShell"

# Create the user and add it to the local Users group
New-LocalUser -Name $Username -Password $Password -FullName $FullName -Description $Description
Add-LocalGroupMember -Group "Users" -Member $Username

Once the above user is created, create an Artifacts folder on the Windows host machine and grant the user read and write access (see the sketch after Step 5 below). Please follow the link for more details.

Step 5: Create your Logic App (Hybrid)

With all prerequisites and infrastructure in place for creating Hybrid Logic Apps, the next step is to build the Logic App using the specified connection string and SMB share path. This can be accomplished through the Azure portal, as outlined below. Now you can create Logic Apps workflows using the designer and run them.
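As referenced in Step 4, here is a minimal sketch of creating the Artifacts folder, sharing it over SMB, and granting the new user read/write access. The folder path and share name are assumptions; adjust them to your environment:

# Sketch: create the Artifacts folder and share it for the k3suser account (path and share name are assumptions)
$sharePath = "C:\HybridLogicApps\Artifacts"
New-Item -ItemType Directory -Path $sharePath -Force | Out-Null

# Create the SMB share and grant the SMB user read/write (Change) access
New-SmbShare -Name "Artifacts" -Path $sharePath -ChangeAccess "k3suser"

# Also grant NTFS modify rights so writes succeed at the file-system level
$acl = Get-Acl $sharePath
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("k3suser", "Modify", "ContainerInherit,ObjectInherit", "None", "Allow")
$acl.AddAccessRule($rule)
Set-Acl -Path $sharePath -AclObject $acl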
Announcing the Public Preview of the Applications feature in Azure API Management

API Management now supports built-in OAuth 2.0 application-based access to product APIs using the client credentials flow. This feature allows API managers to register Microsoft Entra ID applications, streamlining secure API access for developers through OAuth 2.0 authorization. API publishers and developers can now more effectively manage client identity, access, and authorization flows.

With this feature:

API managers can identify which products require OAuth authorization by setting a product property to enable application-based access.
API managers can create and manage client applications and assign them access to specific products.
Developers can see their registered applications in the API Management developer portal and use OAuth tokens to securely call APIs and products.
OAuth tokens presented in API requests are validated by the API Management gateway to authorize access to the product's APIs.

This feature simplifies identity and access management in API programs, enabling a more secure and scalable approach to API consumption.

Enable OAuth authorization

API managers can now identify specific products which are protected by Microsoft Entra identity by enabling "Application based access". This ensures that only valid client applications which have a secure OAuth token from Microsoft Entra identity can access the APIs associated with this product. An application is created in Microsoft Entra corresponding to the product, with an appropriate app role.

Register client applications and assign products

API managers can register client applications, identify specific developers as owners of these applications, and assign products to these applications. This creates a new application in Microsoft Entra and assigns API permissions to access the product.

Securely access the API using client applications

Developers can log in to the API Management developer portal and see the applications assigned to them. They can retrieve the application credentials, call Microsoft Entra to get an OAuth token, and use this token to call the API Management gateway and securely access the product/API (a rough sketch of this flow follows at the end of this post).

Preview limitations

The public preview of Applications is a limited-access feature. To participate in the preview and enable Applications in your API Management service instance, you must complete a request form. The Azure API Management team will review your request and respond via email within five business days.

Learn more

Securely access product APIs with Microsoft Entra applications
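For illustration only, the developer-side flow described above might look roughly like this in PowerShell. The tenant ID, client ID, secret, scope, and gateway URL are all placeholders and assumptions - the actual scope value for your product's Entra application comes from your API Management configuration:

# Sketch: acquire a client credentials token from Microsoft Entra ID and call an APIM-fronted API
$tenantId = "<tenant-id>"
$clientId = "<application-client-id>"
$secret   = "<client-secret>"
$scope    = "api://<product-app-id>/.default"   # assumption: app ID URI of the product's Entra application

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body @{
        grant_type    = "client_credentials"
        client_id     = $clientId
        client_secret = $secret
        scope         = $scope
    }

# Call the product API through the API Management gateway with the bearer token
Invoke-RestMethod -Uri "https://<your-apim>.azure-api.net/<api-path>" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" }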
Enforce or Audit Policy Inheritance in API Management

We're excited to announce a new Azure Policy definition that lets you enforce or audit policy inheritance in Azure API Management. With this capability, platform and governance teams can ensure that API Management policies are always inherited across all policy scopes - operations, APIs, products, and workspaces - strengthening consistency, compliance, and security across your API estate.

Why this matters

In Azure API Management, the <base /> policy element plays a critical role: it ensures that a runtime policy inherits policies defined at a higher scope, such as product, workspace, or all APIs (global). Without <base />, developers can inadvertently (or intentionally) bypass important platform rules, for example:

Security controls like authentication or IP restrictions
Operational requirements such as logging, tracing, or rate limiting
Business policies such as quota enforcement

The result can be inconsistent behavior, compliance drift, and gaps in governance.

How the new policy helps

With the new Azure Policy definition, you can automatically ensure that <base /> is located at the start of each API Management policy section - <inbound>, <outbound>, <backend>, and <on-error> - across policies configured on operations, APIs, products, and workspaces (a sketch of a compliant policy document follows at the end of this post). You can set the effect parameter to:

Audit: identify operation, API, product, or workspace policies where <base /> is missing.
Deny: prevent deployment of policies that do not include <base />.

Get started

To enable this new Azure Policy definition:

Navigate to Azure Policy in the Azure portal.
Select "Definitions" from the menu and choose "API Management policies should inherit parent scope policies using <base />".
In the policy definition view, select "Assign".
Configure the policy assignment scope, parameter (audit or deny), and other details.

View built-in Azure Policy definitions for API Management.
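As a rough sketch of what the policy definition expects, here is an API-scope policy document with <base /> at the start of every section, applied with the Az PowerShell module. The context values, API ID, and rate-limit settings are placeholders; this is illustrative rather than a prescribed deployment method:

# Sketch: apply an API-scope policy whose sections all start with <base /> (placeholder names throughout)
$policyXml = @"
<policies>
    <inbound>
        <base />
        <rate-limit calls="100" renewal-period="60" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
"@

$context = New-AzApiManagementContext -ResourceGroupName "<resource-group>" -ServiceName "<apim-service>"
Set-AzApiManagementPolicy -Context $context -ApiId "<api-id>" -Policy $policyXml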
Codeful Workflows: A New Authoring Model for Logic Apps Standard

📝 This blog introduces early concepts of pre-release functionality and is subject to change.

Azure Logic Apps Standard offers you a powerful cloud orchestration engine, enabling you to build and run automated workflows that effortlessly integrate resources from various services, systems, apps, and data sources. Whether you're looking to streamline processes across a complex enterprise or simply reduce the need for extensive coding, this platform provides a solution that's both efficient and flexible.

For those of you who require more control over workflow designs or want to leverage your expertise in frameworks like .NET and the Durable Tasks framework, Logic Apps Standard now introduces an exciting new feature: Codeful Workflows. With Codeful Workflows, you can define workflows using an imperative programming style, blending the flexibility of coding with the simplicity and operational strengths of Logic Apps. This means you can structure your workflows the way that makes sense to you while still tapping into the rich ecosystem of connectors and tools built into Logic Apps.

What Are Codeful Workflows?

Codeful Workflows expand the authoring and execution models of Logic Apps Standard, offering developers the ability to implement, test and run workflows using an imperative programming model, both locally and in the cloud. Built on frameworks like .NET and the Durable Tasks framework, Codeful Workflows allow you to structure workflows in code while seamlessly integrating with the Logic Apps Standard rich connector ecosystem and leveraging its operational capabilities.

The core elements of a Logic App workflow - triggers, actions and connections - are translated into durable task concepts within this codeful model:

Triggers are implemented as Client Functions that invoke durable orchestrations, which contain the body of the workflow, blending logic implemented with the language primitives and connector actions for external connectivity.
Connector actions are presented as Activity Functions. The Logic Apps connector ecosystem is exposed to you via an SDK, bringing discoverability and rich IntelliSense support when creating action inputs, invoking actions, or reusing action outputs in later steps. The SDK vastly simplifies the execution of those connectors by wrapping them internally in an Activity Function, so you don't need to create new activities for each connector action you want to invoke.
Connections, which manage the connectivity between actions and end systems, remain unchanged, allowing you to set up once and share connections between multiple orchestrations and Logic Apps declarative workflows. Connector actions use a reference to a connection, providing flexibility between local and cloud configurations.

Using those building blocks, you can create workflows using familiar programming paradigms while still benefiting from the easy configuration and operational features of Logic Apps Standard. If you are an existing Logic Apps Standard customer, your codeful and visual workflows can coexist within the same application, bridging the gap between pro-code and low-code approaches. With those two execution models working hand in hand in the same application, Logic Apps Standard becomes a comprehensive orchestration tool that caters to all developer personas, from integration specialists to enterprise teams, with no cliffs in their experience.
Creating Codeful Workflows

Designing codeful workflows begins with creating a new Logic Apps project within Visual Studio Code, configured for .NET and the Durable Tasks framework. From triggers to actions, developers gain full flexibility to define their workflows programmatically.

Implementing Triggers

Triggers are the entry points of workflows, and in Codeful Workflows they are defined as Client Functions. For example, an HTTP trigger can start a workflow when a request is received:

[FunctionName("HelloTrigger")]
public static async Task<HttpResponseMessage> HttpStart(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
    [DurableClient] IDurableOrchestrationClient starter,
    ILogger log)
{
    var requestContent = await req.Content.ReadAsStringAsync();
    var workflowInput = new HTTPHelloInput
    {
        Greeting = $"Hello from Codeful workflows. You said '{requestContent}'"
    };

    log.LogInformation("Workflow Input = '{workflowInput}'.", JsonSerializer.Serialize(workflowInput));

    string instanceId = await starter.StartNewAsync("HelloOrchestrator", workflowInput);
    log.LogInformation("Started orchestration with ID = '{instanceId}'.", instanceId);

    return await starter.WaitForCompletionOrCreateCheckStatusResponseAsync(req, instanceId);
}

The orchestration started here ("HelloOrchestrator") contains the body of the workflow; a sketch of what such an orchestrator might look like appears at the end of this post.

Using Connector Actions

Both Managed and Service Provider actions are available to be used within your orchestrations. They are organized in the SDK by type, making it easy to find the right connector to use. Once you identify the action to use, you can use the rich IntelliSense experience to generate inputs and call the action directly in your orchestration code.

Deployment and Operations

Deploying a Logic Apps Standard app that uses both codeful and codeless workflows follows the same practices already available in Logic Apps Standard. Operational insights, such as endpoint visibility and execution monitoring, are provided within the Azure portal, ensuring parity with the functionality available for codeless workflows. This cohesive deployment model allows organizations to maximize their resources and cater to diverse development needs, whether they require quick prototyping via low-code tools or robust, scalable solutions through pro-code implementations.

Codeful Workflows and Intelligent Agents

You can take advantage of codeful workflows and Logic Apps Standard Agent Loop to create new intelligent applications that embed advanced AI decision-making directly into your processes - enabling your apps and automation to not just follow predefined steps, but to reason, adapt, and act autonomously towards goals. See this demo where we share two approaches to implement agent loops - combining codeful and codeless workflows, where you can reuse existing workflows as tools, and writing agent loop actions directly with code.

Looking for feedback on Codeful Workflows

We are looking for early feedback on this feature. If you are interested in participating in a private preview, please use the form below to register your interest and we will contact you to share the instructions.

https://aka.ms/lacodeful/privatepreview/form
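For context only, here is a minimal sketch of what a "HelloOrchestrator" orchestration could look like using the standard Durable Functions programming model. It does not use the pre-release Logic Apps connector SDK mentioned above (whose API surface is not public yet); the activity name and the reuse of the HTTPHelloInput type are assumptions:

// Sketch only: a plain Durable Functions orchestrator, not the pre-release Logic Apps connector SDK.
[FunctionName("HelloOrchestrator")]
public static async Task<string> RunOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    // Read the input passed by the client function (HTTPHelloInput is the type from the trigger example)
    var input = context.GetInput<HTTPHelloInput>();

    // Call an activity function; in the codeful model, connector actions would be invoked
    // through the SDK instead of hand-written activities like this one.
    string result = await context.CallActivityAsync<string>("SayHello", input.Greeting);

    return result;
}

[FunctionName("SayHello")]
public static string SayHello([ActivityTrigger] string greeting, ILogger log)
{
    log.LogInformation("Activity received greeting: {greeting}", greeting);
    return $"Processed: {greeting}";
}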
Update to API Management Workspaces Breaking Changes: Built-in Gateway & Tiers Support

What's changing?

If your API Management service uses preview workspaces on the built-in gateway and meets the tier-based limits below, those workspaces will continue to function as-is and will automatically transition to general availability once built-in gateway support is fully announced.

Limit of workspaces on the built-in gateway, by API Management tier:
Premium and Premium v2: up to 30 workspaces
Standard and Standard v2: up to 5 workspaces
Basic and Basic v2: up to 1 workspace
Developer: up to 1 workspace

Why this change?

We introduced the requirement for workspace gateways to improve reliability and scalability in large, federated API environments. While we continue to recommend workspace gateways, especially for scenarios that require greater scalability, isolation, and long-term flexibility, we understand that many customers have established workflows using the preview workspaces model or need workspaces support in non-Premium tiers.

What's not changing?

Other aspects of the workspace-related breaking changes remain in effect. For example, service-level managed identities are not available within workspaces. In addition to workspaces support on the built-in gateway described above, Premium and Premium v2 services will continue to support deploying workspaces with workspace gateways.

Resources

Workspaces in Azure API Management
Original breaking changes announcements:
Reduced tier availability
Requirement for workspace gateways
Logic Apps Aviators Newsletter - September 25

In this issue:
Ace Aviator of the Month
News from our product group
Community Playbook
News from our community

Ace Aviator of the Month

September's Ace Aviator: Kritika Singh, Integration Architect & Sr. Consultant at Capgemini Norge AS

What's your role and title? What are your responsibilities?

I work as an Integration Architect & Sr. Consultant at Capgemini Norge AS. In my role I assist clients in addressing a wide range of integration challenges, with a particular emphasis on modernizing legacy systems, such as BizTalk, by transitioning them to cloud-native Azure iPaaS solutions. I'm responsible for architecting secure and scalable integration landscapes, designing and developing solutions, mentoring team members, and engaging with stakeholders and cross-functional teams. I work extensively with technologies such as Azure Logic Apps, Azure Functions, API Management, Service Bus, App Service Environment (ASEv3), virtual networks, and CI/CD pipelines using GitHub. One of my proudest achievements was successfully delivering a complex BizTalk modernization project to Azure that required deep technical expertise and strategic coordination.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?

As part of a distributed team across different locations and countries, my day starts with stand-ups involving both offshore and onshore team members to review progress and assign tasks. I spend time with developers, helping them navigate technical challenges and mentoring them through dedicated sessions - this has helped improve delivery quality and team confidence. I collaborate closely with cross-functional teams and stakeholders to align on requirements and solution design. Throughout the day, I work on resolving issues, improving existing solutions, working on innovation ideas, and managing tasks.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?

I am deeply passionate about technology - having worked with BizTalk throughout my career and, for the past 5+ years, diving into Azure iPaaS. Microsoft products evolve constantly, and that sparks my curiosity to explore, learn, and innovate every day. What truly drives me is the opportunity to give back to the community by sharing my learnings, challenges, and even failures. It's all about growing together and inspiring others along the way.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?

Curiosity and consistency matter more than perfection. Don't be afraid to ask questions, experiment, and fail - that's where the real learning happens. Also, find a community that supports you; sharing your journey, both wins and setbacks, can inspire others and help you grow faster.

What has helped you grow professionally?

My professional growth has been shaped by a blend of curiosity, courage, and the right opportunities. Starting with BizTalk and evolving into Azure iPaaS, I've embraced every challenge as a chance to learn. What's made the biggest difference is having the boldness to take on responsibility, the willingness to take risks, and the drive to keep growing. Sharing my journey with the community has not only helped others but also deepened my own learning.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
If I had a magic wand to enhance Logic Apps, I'd bring in a few powerful features to supercharge developer productivity and solution resilience:

Seamless Version Control & Rollback: Git-like capabilities built into Logic Apps - track every change, compare versions, and roll back instantly when needed. This would empower teams to experiment confidently and collaborate more effectively without fear of breaking production workflows.

Effortless Disaster Recovery Setup: setting up DR should be as simple as a few clicks. A built-in, automated DR configuration for AIS would ensure business continuity, reduce downtime, and give developers peace of mind - especially in mission-critical environments.

Native JSON Mapper (not Liquid): a visual, intuitive JSON mapping tool would simplify complex data transformations, reduce manual coding, and speed up development. This would be a game-changer for integration scenarios, especially when working with dynamic schemas and APIs.

Simplified Authorization, like Claim Checks, for Logic Apps Standard (beyond EasyAuth): a more developer-friendly authorization setup that minimizes manual configuration and integrates seamlessly with identity providers. This would make securing Logic Apps faster, easier, and more consistent across environments.

News from our product group

Logic Apps Live August 2025
Missed Logic Apps Live in August? You can watch it here. We had a recap on Logic Apps Hybrid, our special guest Kritika Singh talking about her learnings with BizTalk migration to AIS, and updates on Data Mapper GA and Logic Apps Standard Deployment Center.

Logic Apps Community Day 2025
We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star! Call for Speakers is still open until September 07, 2025 - so hurry and submit your session!

General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We're excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

Announcing: Set up CD in Azure Logic Apps Standard with Deployment Center
Looking to automate your Azure Logic Apps code deployments in a faster way? Deployment Center - a built-in feature in Azure Logic Apps Standard - is now available, with built-in support for your VS Code projects, making it easier to deploy Logic Apps from your source control repository. Deployment Center is designed to make deploying, updating, and managing your Logic Apps workflows simple and straightforward.

Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster
Explore how Hybrid Logic Apps run effortlessly on K3s - delivering the power of Hybrid Logic Apps without the complexity and heavy infrastructure demands of a full Kubernetes cluster!

News from our community

What gets returned to the LLM by my Logic App Agent Loop tool?
Video by Michael Stephenson
Michael has been experimenting with Logic Apps Agent Loop and, following a discussion with Kent Weare about the interaction between the workflow and the LLM, aimed to understand what data is returned to the model, since the number of tokens influences cost. He summarizes his findings from that conversation in this brief video.
SOAP 1.2 Calls from Logic Apps - Fixing Unsupported Media & WS-Addressing Errors
Post by Prashant Singh
Struggling with SOAP 1.2 in Azure Logic Apps? Learn how to fix Unsupported Media errors, decode MTOM responses, and handle WS-Addressing headers for seamless integration in this post by Prashant.

Demystifying AI Agent Loops in Logic Apps: The Future of Integration (But Not Everywhere)
Post by Al Ghoniem
Explore how AI Agent Loops enhance Azure Logic Apps for non-deterministic tasks like anomaly detection and IT Ops triage - while knowing when traditional workflows are the better fit.

Can I use AI to create and deploy an Azure Logic Apps with Business Central connector?
Post by Stefano Demiliani
Stefano is testing the boundaries of what AI can do, so you don't have to - he ran a blind test showing that AI can deploy Azure resources well, but struggles with external connectors like Business Central. Learn what worked, what didn't, and why better prompts matter.

How to use ChatGPT Agent Mode with Azure!
Video by Stephen W. Thomas
It looks like August was the month to experiment with Azure resources. Stephen did some research too and shows you how easily ChatGPT-5 Agent Mode can auto-provision resources in Azure. This video demonstrates how to use a single prompt to build a Logic App and create a resource group. Follow along with his video to see how to get the ChatGPT-5 agent working for you!

Integration Love Story - Andrew Wilson
Video by Ahmed Bayoumy and Robin Wilde
In this special fast-paced episode recorded at INTEGRATE, Ahmed and Robin sit down with the brilliant Andrew, newly awarded Microsoft MVP and Logic Apps Ace Aviator, to talk about his journey, passions, and why integration is the powerhouse behind every digital experience.

Tips for Migrating SAP IDoc Reception Workloads from BizTalk to Azure Logic Apps
Post by Francois Malgreve
Learn how to reuse BizTalk XSLTs in Azure Logic Apps! In this post by Francois, you will learn how to configure the SAP trigger with the right IDoc format and namespace settings, minimizing code changes and easing migration.

Query Azure DevOps work items with Logic App and Managed Identity
Post by Michael Stephenson
Learn how to use a reusable Logic App and a user-assigned managed identity to securely query Azure DevOps work items using WIQL - ideal for building scalable, secure workflows.

Understand Agent Loops in Azure Logic Apps
Video by Srikanth Gunnala
In this video, Srikanth explores Azure Logic Apps AI agents - also known as agent workflows or agent loops - and how they're redefining workflow automation with Azure OpenAI. You'll learn what an AI agent is in Azure Logic Apps, how it works, and see a live demo of building an AI-powered, adaptive workflow.

Automate Microsoft Fabric Cost Savings with Logic Apps
Post by Sherry L. Robinson
Learn how to pause and resume Microsoft Fabric capacity using Azure Logic Apps - cutting costs during off-hours with minimal code and seamless integration via REST API or Resource Manager, in this insightful post by Sherry.

Use Graph API to send Emails in Logic Apps
Post by Şahin Özdemir
In this post, Şahin shows you how to use the Microsoft Graph API with service principals to securely send emails from Logic Apps using app registrations and access policies. This will be quite useful in cases where you can't associate the calls with a user account, which is a requirement for the Office 365 connector.
Announcing the availability of TLS 1.3 in Azure API Management in Preview

TLS 1.3 is the latest version of the internet's most deployed security protocol, which encrypts data to provide a secure communication channel between two endpoints. TLS 1.3 support in Azure API Management is planned to roll out during the first week of February 2024. The rollout will happen in stages, which means some regions will get it first as we roll out globally.
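If you want to check which TLS version your client negotiates with your gateway once the rollout reaches your region, a quick sketch like the following can help. The hostname is a placeholder, and whether TLS 1.3 is actually offered also depends on your client operating system and .NET support:

# Sketch: check the TLS version negotiated with an API Management gateway (hostname is a placeholder)
$apimHost = "contoso.azure-api.net"

$tcp = New-Object System.Net.Sockets.TcpClient($apimHost, 443)
$ssl = New-Object System.Net.Security.SslStream($tcp.GetStream())
$ssl.AuthenticateAsClient($apimHost)

# Prints the negotiated protocol, e.g. Tls12 or Tls13
$ssl.SslProtocol

$ssl.Dispose()
$tcp.Close()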
Expose REST APIs as MCP servers with Azure API Management and API Center (now in preview)

As AI-powered agents and large language models (LLMs) become central to modern application experiences, developers and enterprises need seamless, secure ways to connect these models to real-world data and capabilities. Today, we're excited to introduce two powerful preview capabilities in the Azure API Management platform:

Expose REST APIs in Azure API Management as remote Model Context Protocol (MCP) servers
Discover and manage MCP servers using API Center as a centralized enterprise registry

Together, these updates help customers securely operationalize APIs for AI workloads and improve how APIs are managed and shared across organizations.

Unlocking the value of AI through secure API integration

While LLMs are incredibly capable, they are stateless and isolated unless connected to external tools and systems. Model Context Protocol (MCP) is an open standard designed to bridge this gap by allowing agents to invoke tools - such as APIs - via a standardized, JSON-RPC-based interface. With this release, Azure empowers you to operationalize your APIs for AI integration - securely, observably, and at scale.

1. Expose REST APIs as MCP servers with Azure API Management

An MCP server exposes selected API operations to AI clients over JSON-RPC via HTTP or Server-Sent Events (SSE). These operations, referred to as "tools", can be invoked by AI agents through natural language prompts. With this new capability, you can expose your existing REST APIs in Azure API Management as MCP servers - without rebuilding or rehosting them.

Addressing common challenges

Before this capability, customers faced several challenges when implementing MCP support:

Duplicating development efforts: building MCP servers from scratch often led to unnecessary work when existing REST APIs already provided much of the needed functionality.
Security concerns: server trust (malicious servers could impersonate trusted ones) and credential management (self-hosted MCP implementations often had to manage sensitive credentials like OAuth tokens).
Registry and discovery: without a centralized registry, discovering and managing MCP tools was manual and fragmented, making it hard to scale securely across teams.

API Management now addresses these concerns by serving as a managed, policy-enforced hosting surface for MCP tools - offering centralized control, observability, and security.

Benefits of using Azure API Management with MCP

By exposing MCP servers through Azure API Management, customers gain:

Centralized governance for API access, authentication, and usage policies
Secure connectivity using OAuth 2.0 and subscription keys
Granular control over which API operations are exposed to AI agents as tools
Built-in observability through API Management's monitoring and diagnostics features

How it works

MCP servers: in your API Management instance, navigate to MCP servers.
Choose an API: select + Create a new MCP server and choose the REST API you wish to expose.
Configure the MCP server: select the API operations you want to expose as tools. These can be all or a subset of your API's methods.
Test and integrate: use tools like MCP Inspector or Visual Studio Code (in agent mode) to connect, test, and invoke the tools from your AI host.

Getting started and availability

This feature is now in public preview and being gradually rolled out to early access customers.
To use the MCP server capability in Azure API Management:

Prerequisites
Your API Management instance must be on a SKUv1 tier: Premium, Standard, or Basic.
Your service must be enrolled in the AI Gateway early update group (activation may take up to 2 hours).

Use the Azure portal with a feature flag: append ?Microsoft_Azure_ApiManagement=mcp to your portal URL to access the MCP server configuration experience.

Note: support for SKUv2 and broader availability will follow in upcoming updates. Full setup instructions and test guidance can be found via aka.ms/apimdocs/exportmcp.

2. Centralized MCP registry and discovery with Azure API Center

As enterprises adopt MCP servers at scale, the need for a centralized, governed registry becomes critical. Azure API Center now provides this capability - serving as a single, enterprise-grade system of record for managing MCP endpoints. With API Center, teams can:

Maintain a comprehensive inventory of MCP servers.
Track version history, ownership, and metadata.
Enforce governance policies across environments.
Simplify compliance and reduce operational overhead.

API Center also addresses enterprise-grade security by allowing administrators to define who can discover, access, and consume specific MCP servers - ensuring only authorized users can interact with sensitive tools.

To support developer adoption, API Center includes:

Semantic search and a modern discovery UI.
Easy filtering based on capabilities, metadata, and usage context.
Tight integration with Copilot Studio and GitHub Copilot, enabling developers to use MCP tools directly within their coding workflows.

These capabilities reduce duplication, streamline workflows, and help teams securely scale MCP usage across the organization.

Getting started

This feature is now in preview and accessible to customers:
https://aka.ms/apicenter/docs/mcp
AI Gateway Lab | MCP Registry

3. What's next

These new previews are just the beginning. We're already working on:

Azure API Management (APIM) passthrough MCP server support: we're enabling APIM to act as a transparent proxy between your APIs and AI agents - no custom server logic needed. This will simplify onboarding and reduce operational overhead.
Azure API Center (APIC) deeper integration with Copilot Studio and VS Code: today, developers must perform manual steps to surface API Center data in Copilot workflows. We're working to make this experience more visual and seamless, allowing developers to discover and consume MCP servers directly from familiar tools like VS Code and Copilot Studio.

For questions or feedback, reach out to your Microsoft account team or visit:
Azure API Management documentation
Azure API Center documentation

- The Azure API Management & API Center Teams
🚀 New in Azure API Management: MCP in v2 SKUs + external MCP-compliant server support

Your APIs are becoming tools. Your users are becoming agents. Your platform needs to adapt. Azure API Management is becoming the secure, scalable control plane for connecting agents, tools, and APIs - with governance built in.

Today, we're announcing two major updates to bring the power of the Model Context Protocol (MCP) in Azure API Management to more environments and scenarios:

MCP support in v2 SKUs - now in public preview
Expose existing MCP-compliant servers through API Management

These features make it easier than ever to connect APIs and agents with enterprise-grade control - without rewriting your backends.

Why MCP?

MCP is an open protocol that enables AI agents - like GitHub Copilot, ChatGPT, and Azure OpenAI - to discover and invoke APIs as tools. It turns traditional REST APIs into structured, secure tools that agents can call during execution, powering real-time, context-aware workflows.

Why API Management for MCP?

Azure API Management is the single, secure control plane for exposing and governing MCP capabilities - whether from your REST APIs, Azure-hosted services, or external MCP-compliant runtimes. With built-in support for:

Security using OAuth 2.1, Microsoft Entra ID, API keys, IP filtering, and rate limiting
Outbound token injection via Credential Manager with policy-based routing
Monitoring and diagnostics using Azure Monitor, Logs, and Application Insights
Discovery and reuse with Azure API Center integration
A comprehensive policy engine for request/response transformation, caching, validation, header manipulation, throttling, and more

...you get end-to-end governance for both inbound and outbound agent interactions - with no new infrastructure or code rewrites.

✅ What's New?

1. MCP support in v2 SKUs

Previously available only in classic tiers (Basic, Standard, Premium), MCP support is now in public preview for v2 SKUs - Basic v2, Standard v2, and Premium v2 - with no prerequisites or manual enablement required. You can now:

Expose any REST API as an MCP server in v2 SKUs
Protect it with Microsoft Entra ID, keys or tokens
Register tools in Azure API Center

2. Expose existing MCP-compliant servers (pass-through scenario)

Already using tools hosted in Logic Apps, Azure Functions, LangChain or custom runtimes? Now you can govern those external tool servers by exposing them through API Management. Use API Management to:

Secure external MCP servers with OAuth, rate limits, and Credential Manager
Monitor and log usage with Azure Monitor and Application Insights
Unify discovery with internal tools via Azure API Center

🔗 You bring the tools. API Management brings the governance.

🧭 What's Next

We're actively expanding MCP capabilities in API Management:

Tool-level access policies for granular governance
Support for MCP resources and prompts to expand beyond tools

📚 Get Started

📘 Expose APIs as MCP servers
🌐 Connect external MCP servers
🔐 Secure access to MCP servers
🔎 Discover tools in API Center

Summary

Azure API Management is your single control plane for agents, tools and APIs - whether you're building internal copilots or connecting external toolchains. This preview unlocks more flexibility, less friction, and a secure foundation for the next wave of agent-powered applications. No new infrastructure. Secure by default. Built for the future.
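To make the agent-to-gateway interaction in the two MCP posts above a little more concrete, here is a hedged sketch of what a raw MCP tool-listing call through an API Management-fronted endpoint might look like. The endpoint path is an assumption (check the MCP server details in your API Management instance), and real MCP clients such as MCP Inspector or VS Code also perform an initialize handshake and session handling that is omitted here:

# Sketch only: list the tools exposed by an APIM-hosted MCP server with a raw JSON-RPC request.
# The URL path is an assumption; the Ocp-Apim-Subscription-Key header is standard APIM.
$mcpEndpoint = "https://<your-apim>.azure-api.net/<your-mcp-server>/mcp"

$request = @{
    jsonrpc = "2.0"
    id      = 1
    method  = "tools/list"
    params  = @{}
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri $mcpEndpoint `
    -ContentType "application/json" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = "<subscription-key>" } `
    -Body $request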