Azure API Management: Your Auth Gateway For MCP Servers
The Model Context Protocol (MCP) is quickly becoming the standard for integrating tools with agents, and Azure API Management is at the forefront, ready to support this open-source protocol. You may have already encountered discussions about MCP, so let's clarify some key concepts:

Model Context Protocol (MCP): a standardized protocol that lets AI models interact with external tools, either reading data or performing actions, and enrich the context available to any language model.
AI Agents/Assistants: autonomous LLM-powered applications that use tools to connect to the external services required to accomplish tasks on behalf of users.
Tools: components made available to agents, allowing them to interact with external systems, perform computation, and take actions to achieve specific goals.
Azure API Management: a platform-as-a-service that supports the complete API lifecycle, enabling organizations to create, publish, secure, and analyze APIs with built-in governance, security, analytics, and scalability.

New Cool Kid in Town - MCP
AI agents are becoming widely adopted thanks to enhanced Large Language Model (LLM) capabilities. However, even the most advanced models face limitations because they are isolated from external data. Each new data source traditionally requires a custom implementation to extract, prepare, and make data accessible to the model - a lot of heavy lifting. Anthropic developed an open-source standard, the Model Context Protocol (MCP), to connect your agents to external data sources, whether local (databases or files on your computer) or remote (systems available over the internet, for example through APIs).

MCP Hosts: LLM applications such as chat apps or AI assistants in your IDE (like GitHub Copilot in VS Code) that need access to external capabilities
MCP Clients: protocol clients inside the host application that maintain 1:1 connections with servers
MCP Servers: lightweight programs that each expose specific capabilities and provide context, tools, and prompts to clients
MCP Protocol: the transport layer in the middle

At its core, MCP follows a client-server architecture in which a host application can connect to multiple servers. Whenever your MCP host or client needs a tool, it connects to an MCP server, and the MCP server in turn connects to, for example, a database or an API. Hosts and servers communicate with each other through the MCP protocol. You can create your own custom MCP servers that connect to your own or organizational data sources. For a quick start, visit our GitHub repository to learn how to build a remote MCP server using Azure Functions without authentication: https://aka.ms/mcp-remote
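Before looking at how to secure these servers, it helps to see how small a remote MCP server can be. The sketch below uses the open-source MCP Python SDK's FastMCP helper; the server name, tool, and stubbed logic are illustrative, and no authentication is applied yet:

```python
# Minimal remote MCP server sketch using the open-source MCP Python SDK
# (FastMCP). The server name, tool, and stubbed logic are illustrative only;
# a real server would wrap a database, API, or other organizational system.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather-tools")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short weather summary for a city (stubbed for this sketch)."""
    return f"Forecast for {city}: sunny, 22 degrees C"

if __name__ == "__main__":
    # "sse" serves the tools over HTTP + Server-Sent Events for remote clients;
    # the default stdio transport is what local MCP servers use instead.
    mcp.run(transport="sse")
```

Served this way over HTTP and Server-Sent Events, the tool takes exactly the shape of a remote MCP server, which is the mode described next and the one this post puts behind API Management.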
Remote vs. Local MCP Servers
The MCP standard supports two modes of operation:
Remote MCP servers: MCP clients connect to MCP servers over the internet, establishing a connection using HTTP and Server-Sent Events (SSE) and authorizing the MCP client's access to resources on the user's account using OAuth.
Local MCP servers: MCP clients connect to MCP servers on the same machine, using stdio as the local transport.

Azure API Management as the AI Auth Gateway
Now that we have seen that MCP servers can connect to remote services through an API, the question arises: how can we expose our remote MCP servers in a secure and scalable way? This is where Azure API Management comes in - a way to securely and safely expose tools as MCP servers.

Azure API Management provides:
Security: AI agents often need to access sensitive data. API Management, acting as a remote MCP proxy, safeguards organizational data through authentication and authorization.
Scalability: As the number of LLM interactions and external tool integrations grows, API Management ensures the system can handle the load.

Security remains a critical piece of building MCP servers, as agents need to securely connect to protected endpoints (tools) to perform actions or read protected data. When building remote MCP servers, you need a way to let users log in (authentication) and to let them grant the MCP client access to resources on their account (authorization).

MCP - Current Authorization Challenges (state: 4/10/2025)
Recent changes in MCP authorization have sparked significant debate within the community. Key challenges with the authorization changes: the MCP server is now treated as both a resource server AND an authorization server. This dual role has fundamental implications for MCP server developers and runtime operations.
Our solution: to address these challenges, we recommend using Azure API Management as your authorization gateway for remote MCP servers. For an enterprise-ready solution, check out our azd up sample repo to learn how to build a remote MCP server using Azure API Management as your authentication gateway: https://aka.ms/mcp-remote-apim-auth

The Authorization Flow
The workflow involves three core components: the MCP client, the APIM gateway, and the MCP server, with Microsoft Entra managing authentication (AuthN) and authorization (AuthZ). Using the OAuth protocol, the client starts by calling the APIM gateway, which redirects the user to Entra for login and consent. Once authenticated, Entra provides an access token to the gateway, which then exchanges a code with the client to generate an MCP server token. This token allows the client to communicate securely with the server via the gateway, ensuring user validation and scope verification. Finally, the MCP server establishes a session key for ongoing communication through a dedicated message endpoint. Diagram source: https://aka.ms/mcp-remote-apim-auth-diagram
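To make the flow concrete from the client's side, here is a rough sketch of connecting to a gateway-fronted MCP server over SSE once an access token has been obtained from the sign-in flow above. The gateway URL, token, and tool name are placeholders, and the sketch assumes the MCP Python SDK's SSE client, which accepts custom headers for the bearer token:

```python
# Sketch: an MCP client calling a remote MCP server exposed through the
# API Management gateway, presenting a previously acquired OAuth access token.
# The gateway URL, token, and tool name are placeholders for illustration.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

GATEWAY_SSE_URL = "https://contoso-apim.azure-api.net/mcp/sse"  # placeholder
ACCESS_TOKEN = "<token issued through the Entra ID sign-in flow above>"

async def main() -> None:
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    async with sse_client(GATEWAY_SSE_URL, headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Tools exposed by the server:", [t.name for t in tools.tools])
            result = await session.call_tool("get_forecast", {"city": "Seattle"})
            print(result.content)

asyncio.run(main())
```

In the sample linked above, API Management validates the token and scopes on every call before any request reaches the MCP server's message endpoint.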
Conclusion
Azure API Management (APIM) is an essential tool for enterprise customers looking to integrate AI models with external tools using the Model Context Protocol (MCP). In this blog, we've emphasized the simplicity of connecting AI agents to various data sources through MCP, streamlining previously complex implementations. Given the critical role of secure access to platforms and services for AI agents, APIM offers robust solutions for managing OAuth tokens and ensuring secure access to protected endpoints, making it an invaluable asset for enterprises despite the challenges of authentication.

API Management: An Enterprise Solution for Securing MCP Servers
Azure API Management is an essential tool for enterprise customers looking to integrate AI models with external tools using MCP, and it is designed to help you securely expose your remote MCP servers. MCP servers are still very new, and as the technology evolves, API Management provides an enterprise-ready solution that will evolve with the latest technology. Stay tuned for further feature announcements soon!

Acknowledgments
This post and work were made possible thanks to the hard work and dedication of our incredible team. Special thanks to Pranami Jhawar, Julia Kasper, Julia Muiruri, Annaji Sharma Ganti, Jack Pa, Chaoyi Yuan and Alex Vieira for their invaluable contributions.

Additional Resources
MCP Client Server integration with APIM as AI gateway:
Blog Post: https://aka.ms/remote-mcp-apim-auth-blog
Sequence Diagram: https://aka.ms/mcp-remote-apim-auth-diagram
APIM lab: https://aka.ms/ai-gateway-lab-mcp-client-auth
Python: https://aka.ms/mcp-remote-apim-auth
.NET: https://aka.ms/mcp-remote-apim-auth-dotnet
On-Behalf-Of Authorization: https://aka.ms/mcp-obo-sample
3rd Party APIs - Backend Auth via Credential Manager:
Blog Post: https://aka.ms/remote-mcp-apim-lab-blog
APIM lab: https://aka.ms/ai-gateway-lab-mcp
YouTube Video: https://aka.ms/ai-gateway-lab-demo

Announcing General Availability of AI & RAG Connectors in Logic Apps (Standard)
We're excited to share that a comprehensive set of AI and Retrieval-Augmented Generation (RAG) capabilities is now generally available in Azure Logic Apps (Standard). This release brings native support for document processing, semantic retrieval, embeddings, and grounded reasoning directly into the Logic Apps workflow engine.

Available AI Connectors in Logic Apps Standard
Logic Apps (Standard) had previously previewed four AI-focused connectors that open the door to a new generation of intelligent automation across the enterprise. Whether you're processing large volumes of documents, enriching operational data with intelligence, or enabling employees to interact with systems using natural language, these connectors provide the foundation for building solutions that are smarter, faster, and more adaptable to business needs. These connectors are now generally available. They allow teams to move from routine workflow automation to AI-assisted decisioning, contextual responses, and multi-step orchestration that reflects real business intent. Below is the full set of built-in connectors and their actions as they appear in the designer.

1. Azure OpenAI
Actions: Get an embedding, Get chat completions, Get chat completions using Prompt Template, Get completion, Get multiple chat completions, Get multiple embeddings
What this unlocks: Bring natural language reasoning and structured AI responses directly into workflows. Common scenarios include guided decisioning, user-facing assistants, classification and routing, or preparing embeddings for semantic search and RAG workflows.

2. Azure AI Search
Actions: Delete a document, Delete multiple documents, Get agentic retrieval output (Preview), Index a document, Index multiple documents, Merge document, Search vectors, Search vectors with natural language
What this unlocks: Add vector, hybrid semantic, and natural language search directly to workflow logic. Ideal for retrieving relevant content from enterprise data, powering search-driven workflows, and grounding AI responses with context from your own documents.

3. Azure AI Document Intelligence
Action: Analyze document
What this unlocks: Document Intelligence serves as the entry point for document-heavy scenarios. It extracts structured information from PDFs, images, and forms, allowing workflows to validate documents, trigger downstream processes, or feed high-quality data into search and embeddings pipelines.

4. AI Operations
Actions: Chunk text with metadata, Parse document with metadata
What this unlocks: Transform unstructured files into enriched, structured content. Enables token-aware chunking, page-level metadata, and clean preparation of content for embeddings and semantic search at scale.

Advanced AI & Agentic Workflows with Agent Loop
Logic Apps (Standard) also supports Agent Loop (also generally available), allowing AI models to use workflow actions as tools and iterate until a task is complete. Combined with chunking, embeddings, and natural language search, this opens the door to advanced agentic scenarios such as document intelligence agents, RAG-based assistants, and iterative evaluators.
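For intuition about what these connector actions do on your behalf, here is a rough sketch of the equivalent chunk-embed-index-search calls using the openai and azure-search-documents Python SDKs. The endpoints, keys, deployment name, index name, and field names are illustrative, the index is assumed to already exist with matching fields, and in Logic Apps the connectors perform these steps for you without any code:

```python
# Sketch of the chunk -> embed -> index -> vector search pipeline that the
# connectors above automate. Endpoints, keys, deployment, index, and field
# names are placeholders; the index is assumed to exist with matching fields.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

aoai = AzureOpenAI(
    azure_endpoint="https://contoso-openai.openai.azure.com",  # placeholder
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)
search = SearchClient(
    endpoint="https://contoso-search.search.windows.net",  # placeholder
    index_name="docs-index",
    credential=AzureKeyCredential("<search-key>"),
)

def embed(text: str) -> list[float]:
    # Roughly the "Get an embedding" action.
    response = aoai.embeddings.create(model="text-embedding-3-small", input=text)
    return response.data[0].embedding

# Roughly the "Index a document" action, with a precomputed vector field.
chunk = "Contoso travel policy: economy class for flights under 6 hours."
search.upload_documents([{"id": "1", "content": chunk, "contentVector": embed(chunk)}])

# Roughly the "Search vectors" action, used to ground an agent's answer.
results = search.search(
    search_text=None,
    vector_queries=[
        VectorizedQuery(
            vector=embed("What is the flight policy?"),
            k_nearest_neighbors=3,
            fields="contentVector",
        )
    ],
)
for doc in results:
    print(doc["content"])
```

In a workflow, the same pipeline is assembled visually from the AI Operations, Azure OpenAI, and Azure AI Search actions listed above.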
Conclusion
With these capabilities now built into Logic Apps Standard, teams can bring AI directly into their integration workflows without additional infrastructure or complexity. Whether you're streamlining document-heavy processes, enabling richer search experiences, or exploring more advanced agentic patterns, these capabilities provide a strong foundation to start building today.

Public Preview: Azure Logic Apps Connectors as MCP Tools in Microsoft Foundry
At Ignite 2025, Foundry tools were introduced in Microsoft Foundry: a unified catalog of tools, connectors, and MCP servers. We are excited to share more about Azure Logic Apps connectors, now available as tools in Microsoft Foundry. This unlocks a seamless way for developers to give their agents secure, governed access to the systems they rely on, without writing boilerplate authentication code or managing API plumbing. With this feature, agents can now use any Logic Apps connector - including SAP, ServiceNow, Dynamics, Salesforce, SQL, GitHub, and hundreds more - as a first-class MCP tool inside Foundry. This builds on our recent public preview of Logic Apps MCP Servers, which enables both connectors and workflows to be exposed as MCP tools.

How it works
Step 1 - Select a Logic Apps connector as a tool. In the Agent Tools catalog, you'll now find Logic Apps connectors available alongside existing MCP tools. You can search by name or filter to show only Logic Apps connectors (marked with the Custom tag). After selecting a connector, you'll move on to choosing a Logic App resource.
Step 2 - Create or select a Standard Logic App. Logic Apps connectors require a Standard Logic Apps resource as their host. If you don't already have one, a new Logic App will be created automatically. If you do, you'll see a dropdown allowing you to pick an existing Logic App.
Step 3 - Configure the connector as an MCP server. This is where Logic Apps provides its power and flexibility. You effectively generate an MCP server from your connector: choose which operations you want to expose, optionally configure parameter behavior, specify whether parameters are provided by the model or the user, and review or edit the autogenerated tool description based on the connector's OpenAPI definition. Parameters that require user input are clearly indicated in the UI.
Step 4 - Register the MCP server as a tool in Foundry. This flow begins in the Azure portal and completes in the Foundry portal. After registration completes, your tool appears in the Agent tool list, ready to be added to any agent. In the Connected Resources view, you'll also see the Logic Apps resource that backs your MCP tool.

In just a few steps, agents can now use Logic Apps connectors natively, unlocking secure enterprise connectivity without custom code. Foundry continues to support custom MCP servers, including those created from Logic Apps workflows themselves. If you want to expose a workflow as an MCP tool, you can do so with the same mechanism. (See our detailed document on converting workflows to MCP tools.)

Roadmap
Unified experience: Today, the flow spans the Azure portal and the Foundry portal. A fully Foundry-native experience is coming in an upcoming release.
OAuth-based first-party connectors: Some first-party connectors using OAuth are not yet supported. These will be enabled in the near future.

Next Steps
Watch this short demo video to see the feature in action. Get started with the documentation to try it yourself.

Announcing AI Foundry Agent Service Connector v2 (Preview)
We have recently made some updates to the AI Foundry Agent Service connector that allow Azure Logic Apps to call the latest AI Foundry Agent API and streamline AI integration for enterprises. The connector delivers secure, governed connectivity, accelerates deployment with low-code design, and scales globally. This approach embeds advanced AI into business processes without custom code, reducing complexity and driving faster time-to-value.

There are a few different use cases unlocked as a result of these investments:
Leveraging Logic Apps' large connector library and vast collection of triggers, you can subscribe to events in the enterprise and hand off to a Foundry-hosted agent. For example, you can subscribe to a new document being added to a SharePoint list and then pass the content of that SharePoint document to the Foundry agent. Another use case may be polling a mailbox for new messages that meet your filter criteria; those messages and attachments can subsequently be retrieved and passed to the Foundry agent.
Agentic business processes built with Logic Apps and Agent Loop allow for multi-agent solutions in which Logic Apps participates in the broader agent ecosystem and leverages agents built using AI Foundry.

New Operations
The three new operations include:
External agent activity protocol based on application: track and manage activities across agents operating within a particular application context.
External agent activity protocol based on agent identifier: track and manage activities for a specific agent by its unique identifier, enabling monitoring, auditing, and external control of that agent's actions.
Invoke Agent: trigger an agent to perform a specific task or action within an orchestration or agent-based system.

Demo
In the following video, we take a look at a demo of how Logic Apps can be used to invoke a Foundry agent.

Announcing Foundry Control Plane support for Logic Apps Agent Loop (Preview)
At Ignite 2025, Microsoft announced the Foundry Control Plane as a way to build, operate, and govern every agent from a single location. To align with this vision, Logic Apps Agent Loop implementations participate in this holistic governance experience. This ensures that Logic Apps customers are included in the broader Microsoft agent ecosystem, without trade-offs!

Agent governance isn't just a technical safeguard; it's the foundation for trust and scalability in the agentic era. As AI agents gain autonomy to make decisions and execute tasks, governance ensures they operate within ethical, secure, and compliant boundaries. Without clear guardrails, agents can introduce risks like unauthorized data access, regulatory violations, and costly errors that undermine business confidence. The Foundry Control Plane seeks to address all of these enterprise needs.

Public Preview Launch
No additional actions are required for a customer's Agent Loop instances to show up in the Foundry Control Plane; this support is built in, without any customer intervention. To access the Foundry Control Plane, navigate to this link and toggle on the new experience. Once in the new experience, select a Foundry project and click the Operate link in the upper right-hand corner. Next, click Assets, select the appropriate subscription, and specify Logic Apps Agent Loop from the Source dropdown. You will now see your Agent Loop instances. For a selected row, you also have access to additional capabilities such as stopping/starting the agent and deep linking to that particular instance.

Coming Soon
A capability we are actively working on is making Agent Loop telemetry available in the Foundry Control Plane as well. Note: the images below represent pre-release capabilities. A list of conversations will be displayed, and for each conversation you will see important information such as token usage (inputs/outputs). To dive deeper into a specific conversation, click a specific trace instance to further explore that execution. Once you have clicked on a specific trace ID, you can see rich telemetry for that particular execution.

Additional Content
To see a demo of the Foundry Control Plane and Logic Apps in action, please check out the following video.

Preview: Govern, Secure, and Observe A2A APIs with Azure API Management
Today, we're announcing preview support for A2A (Agent2Agent) APIs in Azure API Management. With this capability, organizations can now manage and govern agent APIs alongside AI model APIs, Model Context Protocol (MCP) tools, and traditional APIs such as REST, SOAP, GraphQL, WebSocket, and gRPC, all within a single, consistent API management plane.

Extending API Governance into the Agentic Ecosystem
As organizations adopt agentic systems, the need for consistent governance, security, and observability grows. With A2A API support, Azure API Management enables you to extend established API practices into the agentic world, ensuring secure access, consistent policy enforcement, and complete visibility for AI agents. A2A APIs in Azure API Management:
Mediate JSON-RPC runtime operations with policy support
Expose and manage agent cards for users, clients, or other agents
Support OpenTelemetry GenAI semantic conventions when logging traces to Application Insights, including the "gen_ai.agent.id" and "gen_ai.agent.name" attributes

How It Works
When you import an A2A API, API Management mediates runtime calls to your agent backend (JSON-RPC only) and exposes the agent card as an operation within the same API. The agent card is transformed automatically to represent the A2A API managed by API Management: the hostname is replaced by API Management's gateway address, security schemes are converted to the authentication configured in API Management, and unsupported interfaces are removed. When integrated with Application Insights, API Management enriches traces with GenAI-compliant telemetry attributes, allowing easy identification of the agent and deep correlation between API and agent execution traces for monitoring and debugging.
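For intuition, here is roughly what a client-side call through the gateway looks like: fetch the rewritten agent card, then send a JSON-RPC request to the managed endpoint. This is a sketch only; the gateway hostname, API path, and subscription-key header are placeholders, and the well-known card path and message/send method follow the public A2A specification as of this writing, so exact names may differ by spec version:

```python
# Sketch: discovering and calling an A2A agent through the API Management
# gateway. The hostname, API path, and subscription key are placeholders,
# and the card path / "message/send" method follow the public A2A spec.
import uuid

import requests

GATEWAY = "https://contoso-apim.azure-api.net/support-agent"  # placeholder
HEADERS = {"Ocp-Apim-Subscription-Key": "<apim-subscription-key>"}  # placeholder

# 1. Fetch the agent card that API Management re-exposes for this A2A API;
#    its URL now points at the gateway rather than the original backend.
card = requests.get(f"{GATEWAY}/.well-known/agent.json", headers=HEADERS).json()
print(card.get("name"), "->", card.get("url"))

# 2. Send a message to the agent via JSON-RPC; the gateway mediates the call,
#    applies policies, and emits GenAI telemetry to Application Insights.
payload = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "parts": [{"kind": "text", "text": "Summarize today's open incidents."}],
            "messageId": str(uuid.uuid4()),
        }
    },
}
response = requests.post(GATEWAY, json=payload, headers=HEADERS, timeout=60)
print(response.json())
```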
Try It Out
To import an A2A API:
Navigate to the APIs page in the Azure portal and select the A2A Agent tile.
Enter your agent card URL. If accessible, the portal will automatically populate the relevant settings.
Configure the remaining properties, such as the API path in API Management.
This functionality is currently available only in the v2 tiers of API Management and will continue to roll out to all tiers in the coming months.

Start Managing Your Agent APIs
With A2A support in Azure API Management, you can now bring agent APIs under the same governance and security umbrella as your existing APIs, strengthening control, security, and observability across your AI and API ecosystems. Learn more about A2A API support in Azure API Management.

Announcing Public Preview of Agent Loop in Azure Logic Apps Consumption
We're excited to announce a major leap forward in democratizing AI-powered business process automation: Agent Loop is now available in Azure Logic Apps Consumption, bringing advanced AI agent capabilities to a broader audience with a frictionless, pay-as-you-go experience. NOTE: This feature is being rolled out and is expected to reach all planned regions by the end of the week.

What's New?
Agent Loop, previously available only in Logic Apps Standard, is now available in Consumption logic apps, giving developers, small and medium-sized businesses, startups, and enterprise teams the ability to create autonomous and conversational AI agents without provisioning or managing dedicated AI infrastructure. With Agent Loop, customers can develop both autonomous and conversational agents, seamlessly transforming any workflow into an intelligent workflow using the agent loop action. These agents are powered by knowledge and tools through access to over 1,400 connectors and MCPs (to be introduced soon).

Why Does This Matter?
By extending Agent Loop to Logic Apps Consumption, we're making AI agent capabilities accessible to everyone, from individual developers to large enterprises, without barriers. This move supports rapid prototyping, experimentation, and production workloads, all while maintaining the flexibility to upgrade as requirements evolve. Key highlights:
Hosted on Behalf Of (HOBO) model: Customers can harness the power of advanced Foundry models directly within their Logic Apps, without provisioning or managing AI resources themselves. Microsoft handles all the underlying large language model (LLM) infrastructure, preserving the serverless, low-overhead nature of Consumption Logic Apps and letting you focus purely on building intelligent workflows.
Frictionless entry point: With Microsoft hosting and managing the Foundry model, customers only need an Azure subscription to set up an agentic workflow. This dramatically reduces entry barriers and enables anyone with access to Azure to leverage powerful AI agent automation right away.
Pay-as-you-go billing: You're billed based on the number of tokens used for each agentic iteration, making experimentation and scaling cost-effective. No fixed infrastructure costs or complex setup.
Extensive connector ecosystem: Access to an ecosystem of over 1,400 connectors facilitates seamless integration with a broad range of enterprise systems, APIs, and data sources.
Enterprise-grade upgrade path: As your needs grow, whether for higher performance, compliance, or custom model hosting, you can seamlessly graduate to Logic Apps Standard, bringing your own model and unlocking advanced features like VNET support and local development. Refer to https://learn.microsoft.com/en-us/azure/logic-apps/clone-consumption-logic-app-to-standard-workflow
Security and tenant isolation: The HOBO model ensures strong tenant isolation and security boundaries, so your data and workflows remain protected.
Chat client authentication: Setting up the chat client is straightforward, with built-in security provided through OAuth policies.

How to Get Started?
Check out the video below to see examples of conversational and autonomous agent workflows in Consumption Logic Apps. For detailed instructions on creating agentic workflows, visit Overview | Logic Apps Labs. Refer to the official documentation for more information on this feature: Workflows with AI Agents and Models - Azure Logic Apps | Microsoft Learn.
Limitations:
Local development capabilities and VNET integration are not supported with Consumption Logic Apps.
Regional data residency isn't guaranteed for the agentic actions. If you have GDPR (General Data Protection Regulation) concerns, use Logic Apps Standard.
Nested agents and MCP tools are currently unavailable but will be added soon. If you need these features, refer to Logic Apps Standard.
Currently, West Europe and West US are the supported regions; additional regions will be available soon.

Agent Loop Ignite Update - New Set of AI Features Arrives in Public Preview
Today at Ignite, we announced the General Availability of Agent Loop in Logic Apps Standard, bringing production-ready agentic automation to every customer. But GA is just the beginning. We're also releasing a broad set of new and powerful AI-first capabilities in Public Preview that dramatically expand what developers can build: run agents in the Consumption SKU, bring your own models through the APIM AI Gateway, call any tool through MCP, deploy agents directly into Teams, secure RAG with document-level permissions, onboard with Okta, and build in a completely redesigned workflow designer. With these preview features layered on top of GA, customers can build AI applications that bring together secure tool calling, user identity, governance, observability, and integration with their existing systems, whether they're running in Standard, Consumption, or the Microsoft 365 ecosystem. Here's a closer look at the new capabilities now available in Public Preview.

Public Preview of Agentic Workflows in Consumption SKU
Agent Loop is now available in Azure Logic Apps Consumption, bringing autonomous and conversational AI agents to everyone through a fully serverless, pay-as-you-go experience. You can now turn any workflow into an intelligent workflow using the agent loop action, without provisioning infrastructure or managing AI models. This release provides instant onboarding, simple authentication, and a frictionless entry point for building agentic automation. Customers can also tap into Logic Apps' ecosystem of 1,400+ connectors for tool calling and system integrations. This update makes AI-powered automation accessible for rapid prototyping while still offering a clear path to scale and production-ready deployments in Logic Apps Standard, including BYOM, VNET integration, and enterprise-grade controls. Preview limitations include limited regions, no VS Code local development, and no nested agents or MCP tools yet. Read more about this in our announcement blog!

Bring Your Own Model
We're excited to introduce Bring Your Own Model (BYOM) support in Agent Loop for Logic Apps Standard, making it possible to use any AI model in your agentic workflows from Foundry, and even on-prem or private cloud models. The key highlight of this feature is the deep integration with the Azure API Management (APIM) AI Gateway, which now serves as the control plane for how Agent Loop connects to models. Instead of wiring agents directly to individual endpoints, AI Gateway creates a single, governed interface that manages authentication, keys, rate limits, and quotas in one place. It provides built-in monitoring, logging, and observability, giving you full visibility into every request. It also ensures a consistent API shape for model interactions, so your workflows remain stable even as backends evolve (a short client-side sketch after the list below illustrates this). With AI Gateway in front, you can test, upgrade, and refine your model configuration without changing your Logic Apps, making model management safer, more predictable, and easier to operate at scale.

Beyond AI Gateway, Agent Loop also supports:
Direct external model integration when you want lightweight, point-to-point access to a third-party model API.
Local/VNET model integration for on-prem, private cloud, or custom fine-tuned models that require strict data residency and private networking.
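To illustrate the consistent-API-shape point, here is a small client-side sketch of what a gateway-fronted model endpoint looks like to any OpenAI-compatible caller. The gateway base URL, API path, and key header are placeholders; Agent Loop wires this up through its model configuration rather than through code, so the sketch is only meant to show why backend changes stay invisible to callers:

```python
# Sketch: the APIM AI Gateway presents one stable, OpenAI-compatible surface,
# so callers (including Agent Loop) are insulated from backend model changes.
# The base URL, API path, and key header below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://contoso-apim.azure-api.net/ai-gateway/v1",  # placeholder
    api_key="not-used-directly",  # the gateway authenticates via its own header
    default_headers={"Ocp-Apim-Subscription-Key": "<apim-subscription-key>"},
)

# Swapping the backend (a new GPT version, a fine-tuned model, a third-party
# model) is a gateway-side configuration change; this calling code, and the
# Logic Apps workflow that plays the same role, stays exactly the same.
reply = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Classify this ticket: 'VPN is down'."}],
)
print(reply.choices[0].message.content)
```

Whichever integration option you choose, the calling pattern stays stable while the backend model can change behind the scenes.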
Together, these capabilities let you treat the model as a pluggable component: start with the model you have today, bring in specialized or cost-optimized models as needed, and maintain enterprise-grade governance, security, and observability throughout. This makes Logic Apps one of the most flexible platforms for building model-agnostic, production-ready AI agent workflows. Ready to try this out? Go to http://aka.ms/agentloop/byom to learn more and get started.

MCP Support for Agent Loop in Logic Apps Standard
Agent Loop in Azure Logic Apps Standard now supports the Model Context Protocol (MCP), enabling agents to discover and call external tools through an open, standardized interface. This brings powerful, flexible tool extensibility to both conversational and autonomous agents. Agent Loop offers three ways to bring MCP tools into your workflows:
Bring Your Own MCP connector - Point to any external MCP server using its URL and credentials, instantly surfacing its published tools in your agent.
Managed MCP connector - Access Azure-hosted MCP servers through the familiar managed connector experience, with shared connections and Azure-managed catalogs.
Custom MCP connector - Build and publish your own OpenAPI-based MCP connector to expose private or tenant-scoped MCP servers. Ideal for reusing MCP servers across the organization.
Managed and Custom MCP connectors support on-behalf-of (OBO) authentication, allowing agents to call MCP tools using the end user's identity. This provides user-context-aware, permission-sensitive tool access across your intelligent workflows. Want to learn more? Check out our announcement blog and how-to documents.

Deploy Conversational Agents to Teams/M365
Workflows with conversational agents in Logic Apps can now be deployed directly into Microsoft Teams, so your agentic workflows show up where your users already live all day. Instead of going to a separate app or portal, employees can ask the agent questions, kick off approvals, check order or incident status, or look up internal policies right from a Teams chat or channel. The agent becomes just another teammate in the conversation, joining stand-ups, project chats, and support rooms as a first-class participant. Because the same Logic Apps agent can also be wired into other Microsoft 365 experiences that speak to bots and web endpoints, this opens the door to a consistent and personalized "organization copilot" that follows users across the M365 ecosystem: Teams for chat, meetings, and channels today, and additional surfaces over time. Azure Bot Service and your proxy handle identity, tokens, and routing, while Logic Apps takes care of reasoning, tools, and back-end systems. The result is an agent that feels native to Teams and Microsoft 365: secure, governed, and always just one @mention away. Ready to bring your agentic workflows into Teams? Here's how to get started.

Secure Knowledge Retrieval for AI Agents in Logic Apps
We've added native document-level authorization to Agent Loop by integrating Azure AI Search ACLs. This ensures AI agents only retrieve information the requesting user is permitted to access, making RAG workflows secure, compliant, and permission-aware by default. Documents are indexed with user or group permissions, and Agent Loop automatically applies those permissions during search using the caller's principal ID or group memberships. Only authorized documents reach the LLM, preventing accidental exposure of sensitive data.
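Agent Loop applies this trimming automatically; for intuition, here is roughly what permission-aware retrieval looks like when done by hand with Azure AI Search's security-trimming filter pattern. The endpoint, index name, group IDs, and the group_ids field are illustrative placeholders:

```python
# Sketch of manual security trimming in Azure AI Search: each document carries
# the group IDs allowed to read it in a filterable "group_ids" field, and every
# query filters on the caller's groups. Agent Loop's ACL integration performs
# the equivalent automatically using the caller's principal and groups.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search = SearchClient(
    endpoint="https://contoso-search.search.windows.net",  # placeholder
    index_name="hr-knowledge",                              # placeholder
    credential=AzureKeyCredential("<search-key>"),
)

# Group IDs resolved from the signed-in user's Entra token (placeholders).
caller_groups = ["geo-emea", "dept-hr"]
group_filter = "group_ids/any(g: search.in(g, '{}', ','))".format(",".join(caller_groups))

results = search.search(
    search_text="What is the parental leave policy?",
    filter=group_filter,  # only documents the caller may read are returned
)
for doc in results:
    print(doc["title"])
```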
This simplifies development, removes custom security code, and allows a single agent to safely serve users with different access levels, whether for HR, IT, or internal knowledge assistants. Here is our blog post to learn more about this feature.

Okta
Agent Loop now supports Okta as an identity provider for conversational agents, alongside Microsoft Entra ID. This makes it easy for organizations using Okta for workforce identity to pass authenticated user context, including user attributes, group membership, and permissions, directly into the agent at runtime. Agents can now make user-aware decisions, enforce access rules, personalize responses, and execute tools with proper user context. This update helps enterprises adopt Agent Loop without changing their existing identity architecture and enables secure, policy-aligned AI interactions across both Okta and Entra environments. Setting up Okta as the identity provider requires a few steps, all explained in detail at Logic Apps Labs.

Designer Makeover!
We've introduced a major redesign of the Azure Logic Apps designer, now in Public Preview for Standard workflows. This release marks the beginning of a broader modernization effort to make building, testing, and operating workflows faster, cleaner, and more intuitive. The new designer focuses on reducing friction and streamlining the development loop. You now land directly in the designer when creating a workflow, with plans to remove early decisions like stateful/stateless or agentic setup. The interface has been simplified into a single unified view, bringing together the visual canvas, code view, settings, and run history so you no longer switch between blades. A major addition is Draft Mode with auto-save, which preserves your work every few seconds without impacting production. Drafts can be tested safely and only go live when you choose to publish, without restarting the app during editing. Search has also been completely rebuilt for speed and accuracy, powered by backend indexing instead of loading thousands of connectors upfront. The designer now supports sticky notes and markdown, making it easy to document workflows directly on the canvas. Monitoring is integrated into the same page, letting you switch between runs instantly and compare draft and published results. A new hierarchical timeline view improves debugging by showing every action executed in order. This release is just the start: many more improvements and a unified designer experience across Logic Apps are on the way as we continue to iterate based on your feedback. Learn more about the designer updates in our announcement blog!

What's Next
We'd love your feedback. Which capabilities should we prioritize, and what would create the biggest impact for your organization?

Announcing General Availability of Agent Loop in Azure Logic Apps
Transforming Business Automation with Intelligent, Collaborative Multi-Agentic Workflows!
Agent Loop is now generally available in Azure Logic Apps Standard, turning the Logic Apps platform into a complete multi-agentic automation system. Build AI agents that work alongside workflows and humans, secured with enterprise-grade identity and access controls, and deployed using your existing CI/CD pipelines. Thousands of customers have already built tens of thousands of agents; now you can take them to production with confidence. Get Started | Workshop | Demo Videos | Ignite 2025 Session

After an incredible journey since we introduced Agent Loop at Build earlier this year, we're thrilled to announce that Agent Loop is now generally available in Azure Logic Apps. This milestone represents more than just a feature release: it's the culmination of learnings from thousands of customers who have been pushing the boundaries of what's possible with agentic workflows. Agent Loop transforms Azure Logic Apps into a complete multi-agentic business process automation platform, where AI agents, automated workflows, and human expertise collaborate seamlessly to solve complex business challenges. With GA, we're delivering the enterprise-grade capabilities that organizations need to confidently deploy intelligent automation at scale.

The Journey to GA: Proven by Customers, Built for Production
Since our preview launch at Build, the response has been extraordinary. Thousands of customers, from innovative startups to Fortune 500 enterprises, have embraced Agent Loop, building thousands of active agents that have collectively processed billions of tokens every month for the past six months. The growth of agents, executions, and token usage has accelerated significantly, doubling month over month. Since the launch of conversational agents in September, they already account for nearly 30% of all agentic workflows. Across the platform, agentic workflows now consume billions of tokens, with overall token usage increasing at nearly 3x month over month.

Cyderes: 5X Faster Security Investigation Cycles
Cyderes leveraged Agent Loop to automate triage and handling of security alerts, leading to faster investigation cycles and significant cost savings.
"We were drowning in data - processing over 10,000 alerts daily while analysts spent more time chasing noise than connecting narratives. Agent Loop changed everything. By empowering our team to design and deploy their own AI agents through low-code orchestration, we've achieved 5X faster investigation cycles and significant cost savings, all while keeping pace with increasingly sophisticated cyber threats that now leverage AI to operate 25X faster than traditional attacks." - Eric Summers, Engineering Manager, AI & SOAR

Vertex Pharmaceuticals: Hours Condensed to Minutes
Vertex Pharmaceuticals unlocked knowledge trapped across dozens of systems via a team of agents. VAIDA, built with Logic Apps and Agent Loop, orchestrates multiple AI agents and helps employees find information faster, while maintaining compliance and supporting multiple languages.
"We had knowledge trapped across dozens of systems - ServiceNow, documentation, training materials - and teams were spending valuable time hunting for answers. Logic Apps Agent Loop changed that. VAIDA now orchestrates multiple AI agents to summarize, search, and analyze this knowledge, then routes approvals right in Teams and Outlook. We've condensed hours into minutes while maintaining compliance and delivering content in multiple languages." - Pratik Shinde, Director, Digital Infrastructure & GenAI Platforms

Where Customers Are Deploying Agent Loop
Customers across industries are using Agent Loop to build AI applications that power both everyday tasks and mission-critical business processes across Healthcare, Retail, Energy, Financial Services, and beyond. These applications drive impact across a wide range of scenarios:
Developer Productivity: Write code, generate unit tests, create workflows, map data between systems, and automate source control, deployment, and release pipelines
IT Operations: Incident management, ticket and issue handling, policy review and enforcement, triage, resource management, cost optimization, and issue remediation
Business Process Automation: Empower sales specialists, retail assistants, order processing/approval flows, and healthcare assistants for intake and scheduling
Customer & Stakeholder Support: Project planning and estimation, content generation, automated communication, and streamlined customer service workflows

Proven Internally at Microsoft
Agent Loop is also powering Microsoft's and the Logic Apps team's own operations, demonstrating its versatility and real-world impact:
IcM Automation Team: Transforming Microsoft's internal incident automation platform into an agent studio that leverages Logic Apps' Agent Loop, enabling teams across Microsoft to build agentic live-site incident automations
Logic Apps Team Use Cases:
Release & Deployment Agent: Streamlines deployment and release management for the Logic Apps platform
Incident Management Agent: An extension of our SRE Agent, leveraging Agent Loop to accelerate incident response and remediation
Analyst Agent: Assists teams in exploring product usage and health data, generating insights directly from analytics

What's Generally Available Today
Core Agent Loop capabilities (GA):
Agent Loop in Logic Apps Standard SKU - Support for both autonomous and conversational workflows. Autonomous workflows run agents automatically based on triggers and conditions; conversational workflows use A2A to enable interactive chat experiences with agents
On-Behalf-Of Authentication - Per-user authentication for 1st-party and 3rd-party connectors
Agent Hand-Off - Enable seamless collaboration in multi-agent workflows
Python Code Interpreter - Execute Python code dynamically for data analysis and computation
Nested Agent Action - Use agents as tools within other agents for sophisticated orchestration
User ACLs Support - Fine-grained document access control for knowledge

Exciting New Agent Loop Features in Public Preview
We've also released several groundbreaking features in Public Preview:
New Designer Experience - Redesigned interface optimized for building agentic workflows
Agent Loop in Consumption SKU - Deploy agents in the serverless Consumption tier
MCP Support - Integrate Model Context Protocol servers as tools, enabling agents to access standardized tool ecosystems
AI Gateway Integration - Use Azure AI Gateway as a model source for unified governance and monitoring
Teams/M365 Deployment - Deploy conversational agents directly in Microsoft Teams and Microsoft 365
Okta Identity Provider - Use Okta as the identity provider for conversational agents
Here's our announcement blog for these new capabilities.

Built on a Platform You Already Trust
Azure Logic Apps is already a proven iPaaS platform with thousands of customers using it for automation, ranging from startups to 100% of the Fortune 500.
Agent Loop doesn't create a separate "agentic workflow automation platform" you have to learn and operate. Instead, it makes Azure Logic Apps itself your agentic platform:
Workflows orchestrate triggers, approvals, retries, and branching
Agent Loop, powered by LLMs, handles reasoning, planning, and tool selection
Humans stay in control through approvals, exceptions, and guided hand-offs
Agent Loop runs inside your Logic Apps Standard environment, so you get the same benefits you already know: enterprise SLAs, VNET integration, data residency controls, hybrid hosting options, and integration with your existing deployment pipelines and governance model.

Enterprise Ready: Secure, User-Aware Agents by Design
Bringing agents into the enterprise only works if security and compliance are first-class. With Agent Loop in Azure Logic Apps, security is built into every layer of the stack.

Per-User Actions with On-Behalf-Of (OBO) and Delegated Permissions
Many agent scenarios require tools to act in the context of the signed-in user. Agent Loop supports the OAuth 2.0 On-Behalf-Of (OBO) flow so that supported connector actions can run with delegated, per-user connections rather than a broad app-only identity. That means when an agent sends mail, reads SharePoint, or updates a service desk system, it does so as the user (where supported), respecting that user's licenses, permissions, and data boundaries. This is critical for scenarios like IT operations, HR requests, and finance approvals where "who did what" must be auditable.

Document-Level Security with Microsoft Entra-Based Access Control
Agents should only see the content a user is entitled to see. With Azure AI Search's Entra-based document-level security, your retrieval-augmented workflows can enforce ACLs and RBAC directly in the index, so queries are automatically trimmed to the documents the user has access to.

Secured Chat Entry Point with Easy Auth and Entra ID
The built-in chat client and your custom clients can be protected using App Service Authentication (Easy Auth) and Microsoft Entra ID, so only authorized users and apps can invoke your conversational endpoints. Together, OBO, document-level security, and Easy Auth give you end-to-end identity and access control, from the chat surface, through the agent, down to your data and systems.
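Connectors with OBO support perform this exchange for you; for intuition, the underlying OAuth 2.0 on-behalf-of token exchange looks roughly like the following with MSAL for Python. The app registration, secret, tenant, incoming user assertion, and scope are placeholders:

```python
# Sketch of the OAuth 2.0 on-behalf-of (OBO) exchange that lets a tool act as
# the signed-in user. Agent Loop connectors with OBO support do this for you;
# the client ID, secret, tenant, user assertion, and scope are placeholders.
import msal

app = msal.ConfidentialClientApplication(
    client_id="<app-registration-client-id>",
    client_credential="<client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# Access token the signed-in user presented to the chat endpoint (placeholder).
incoming_user_token = "<user-access-token-from-easy-auth>"

result = app.acquire_token_on_behalf_of(
    user_assertion=incoming_user_token,
    scopes=["https://graph.microsoft.com/Mail.Send"],  # delegated permission
)
if "access_token" in result:
    # Downstream calls (for example, sending mail) now run as the user, so the
    # user's own permissions, licenses, and audit trail apply.
    print("Delegated token acquired for scope:", result.get("scope"))
else:
    print("OBO exchange failed:", result.get("error_description"))
```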
An Open Toolbox: Connectors, Workflows, MCP Servers, and External Agents
Agent Loop inherits the full power of the Logic Apps ecosystem and more:
1,400+ connectors for SaaS, on-premises, and custom APIs
Workflows and agents as tools - compose sophisticated multi-step capabilities
MCP server support - integrate with the Model Context Protocol for standardized tool access (Preview)
A2A protocol support - enable agent-to-agent communication across platforms
Multi-model flexibility - use Azure OpenAI, Azure AI Foundry hosted models, or bring your own model on any endpoint via AI Gateway
You're not locked into a single vendor or model provider. Agent Loop gives you an open, extensible framework that works with your existing investments and lets you choose the right tools for each job.

Run Agents Wherever You Run Logic Apps
Agent Loop is native to Logic Apps Standard, so your agentic workflows run consistently across cloud, on-premises, or hybrid environments. They inherit the same deployment, scaling, and networking capabilities as your workflows, bringing adaptive, AI-driven automation to wherever your systems and data live.

Getting Started with Agent Loop
These are very exciting times, and we can't wait to see our customers go to production and realize the benefits of these capabilities for their business outcomes and success. Here are some useful links to get started on your AI journey with Logic Apps!
Logic Apps Labs - https://aka.ms/LALabs
Workshop - https://aka.ms/la-agent-in-a-day
Demos - https://aka.ms/agentloopdemos

Announcing MCP Server Support for Logic Apps Agent Loop
At Ignite, we announced that Agent Loop in Azure Logic Apps Standard now supports the Model Context Protocol (MCP), enabling agents to discover and call external tools through an open, standardized interface. This brings powerful, flexible tool extensibility to both conversational and autonomous agents. MCP support gives agents a common language and a secure channel to interact with enterprise systems. By standardizing communication through connector-driven MCP servers, it ensures consistent data exchange, governance, and trust across every agent interaction.

Agent Loop offers three ways to bring MCP tools into your workflows:
Bring Your Own MCP connector - Point to any external MCP server using its URL and credentials, instantly surfacing its published tools in your agent. One example of this type of MCP server is using Logic Apps itself as an MCP server, which lets you dynamically build MCP servers from Logic Apps connectors.
Managed MCP connector - Access Azure-hosted MCP servers through the familiar managed connector experience, with shared connections and Azure-managed catalogs. Inside Logic Apps, a set of managed MCP servers is already available. Popular MCP servers include: Office 365 Email, Office 365 Calendar, Salesforce, Microsoft Learn, Atlassian Jira, and GitHub.
Custom MCP connector - Build and publish your own OpenAPI-based MCP connector to expose private or tenant-scoped MCP servers. Ideal for reusing MCP servers across the organization.
Managed and Custom MCP connectors support on-behalf-of (OBO) authentication, allowing agents to call MCP tools using the end user's identity. This provides user-context-aware, permission-sensitive tool access across your intelligent workflows.

Demo
To see how to set up MCP servers inside Agent Loop, please review the following video.

Additional Resources
Agent Loop Calling MCP Servers - Logic Apps Labs
Logic Apps MCP Demos