Announcing Public Preview of Agent Loop in Azure Logic Apps Consumption
We’re excited to announce a major leap forward in democratizing AI-powered business process automation: Agent Loop is now available in Azure Logic Apps Consumption, bringing advanced AI agent capabilities to a broader audience with a frictionless, pay-as-you-go experience.

NOTE: This feature is being rolled out and is expected to reach all planned regions by the end of the week.

What’s New?

Agent Loop, previously available only in Logic Apps Standard, is now available in Consumption logic apps, giving developers, small and medium-sized businesses, startups, and enterprise teams the ability to create autonomous and conversational AI agents without provisioning or managing dedicated AI infrastructure. With Agent Loop, customers can develop both autonomous and conversational agents, seamlessly transforming any workflow into an intelligent workflow using the agent loop action. These agents are powered by knowledge and tools through access to over 1,400 connectors and MCPs (to be introduced soon).

Why Does This Matter?

By extending Agent Loop to Logic Apps Consumption, we’re making AI agent capabilities accessible to everyone—from individual developers to large enterprises—without barriers. This move supports rapid prototyping, experimentation, and production workloads, all while maintaining the flexibility to upgrade as requirements evolve.

Key highlights:

- Hosted on Behalf Of (HOBO) Model: Customers can harness the power of advanced Foundry models directly within their Logic Apps, without provisioning or managing AI resources themselves. Microsoft handles all the underlying large language model (LLM) infrastructure, preserving the serverless, low-overhead nature of Consumption Logic Apps so you can focus purely on building intelligent workflows.
- Frictionless Entry Point: With Microsoft hosting and managing the Foundry model, customers only need an Azure subscription to set up an agentic workflow. This dramatically reduces entry barriers and enables anyone with access to Azure to leverage powerful AI agent automation right away.
- Pay-As-You-Go Billing: You’re billed based on the number of tokens used in each agentic iteration, making experimentation and scaling cost-effective. No fixed infrastructure costs or complex setup.
- Extensive Connector Ecosystem: Access to an extensive ecosystem of over 1,400 connectors enables seamless integration with a broad range of enterprise systems, APIs, and data sources.
- Enterprise-Grade Upgrade Path: As your needs grow—whether for higher performance, compliance, or custom model hosting—you can seamlessly graduate to Logic Apps Standard, bringing your own model and unlocking advanced features like VNET support and local development. Refer to https://learn.microsoft.com/en-us/azure/logic-apps/clone-consumption-logic-app-to-standard-workflow
- Security and Tenant Isolation: The HOBO model ensures strong tenant isolation and security boundaries, so your data and workflows remain protected.
- Chat Client Authentication: Setting up the chat client is straightforward, with built-in security provided through OAuth policies.

How to Get Started?

Check out the video below to see examples of conversational and autonomous agent workflows in Consumption Logic Apps. For detailed instructions on creating agentic workflows, visit Overview | Logic Apps Labs. Refer to the official documentation for more information on this feature: Workflows with AI Agents and Models - Azure Logic Apps | Microsoft Learn.
Limitations:

- Local development capabilities and VNET integration are not supported with Consumption Logic Apps.
- Regional data residency isn’t guaranteed for the agentic actions. If you have GDPR (General Data Protection Regulation) concerns, use Logic Apps Standard.
- Nested agents and MCP tools are currently unavailable but will be added soon. If you need these features, refer to Logic Apps Standard.
- Currently, West Europe and West US are the supported regions; additional regions will be available soon.

AI Toolkit Extension Pack for Visual Studio Code: Ignite 2025 Update
Unlock the Latest Agentic App Capabilities

The Ignite 2025 update delivers a major leap forward for the AI Toolkit extension pack in VS Code, introducing a unified, end-to-end environment for building, visualizing, and deploying agentic applications to Microsoft Foundry, plus the addition of Anthropic’s frontier Claude models in the Model Catalog! This release enables developers to build and debug locally in VS Code, then deploy to the cloud with a single click. Seamlessly switch between VS Code and the Foundry portal for visualization, orchestration, and evaluation, creating a smooth roundtrip workflow that accelerates innovation and delivers a truly unified AI development experience. Download the AI Toolkit extension pack from http://aka.ms/aitoolkit today and start building next-generation agentic apps in VS Code!

What Can You Do with the AI Toolkit Extension Pack?

Access Anthropic models in the Model Catalog: Following today’s announcement of the Microsoft, NVIDIA, and Anthropic strategic partnership, we are excited to share that Anthropic’s frontier Claude models, including Claude Sonnet 4.5, Claude Opus 4.1, and Claude Haiku 4.5, are now integrated into the AI Toolkit, providing even more choice and flexibility when building intelligent applications and AI agents.

Build AI Agents Using GitHub Copilot: Scaffold agent applications using best-practice patterns, tool-calling examples, tracing hooks, and test scaffolds, all powered by Copilot and aligned with the Microsoft Agent Framework. Generate agent code in Python or .NET, giving you the flexibility to target your preferred runtime (a minimal sketch of this pattern appears at the end of this post).

Build and Customize YAML Workflows: Design YAML-based workflows in the Foundry portal, then continue editing and testing directly in VS Code. To customize your YAML-based workflows, instantly convert them to Agent Framework code using GitHub Copilot. Upgrade from declarative design to code-first customization without starting from scratch.

Visualize Multi-Agent Workflows: Explore your code-based agent workflows with an interactive graph visualizer that reveals each component and how they connect. Watch in real time as each node lights up while your agent runs. Use the visualizer to understand and debug complex agent graphs, making iteration fast and intuitive.

Experiment, Debug, and Evaluate Locally: Use the Hosted Agents Playground to quickly interact with your agents on your development machine. Leverage local tracing support to debug reasoning steps, tool calls, and latency hotspots, so you can quickly diagnose and fix issues. Define metrics, tasks, and datasets for agent evaluation, then implement metrics using the Foundry Evaluation SDK and orchestrate evaluation runs with the help of Copilot.

Seamless Integration Across Environments: Jump from the Foundry portal to VS Code Web for a development environment in your preferred code editor setting. Open YAML workflows, playgrounds, and agent templates directly in VS Code for editing and deployment.

How to Get Started

Install the AI Toolkit extension pack from the VS Code marketplace and check out the documentation. Get started with building workflows with Microsoft Foundry in VS Code:
1. Work with Hosted (Pro-code) Agent workflows in VS Code
2. Work with Declarative (Low-code) Agent workflows in VS Code

Feedback & Support

Try out the extensions and let us know what you think! File issues or feedback on our GitHub repo for the Foundry extension and the AI Toolkit extension. Your input helps us make continuous improvements.
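To make the “Build AI Agents Using GitHub Copilot” pattern above concrete, here is a minimal, framework-agnostic sketch of the tool-calling loop that scaffolded agent projects are built around. The model call is stubbed out, and the tool, names, and wiring are illustrative assumptions rather than Microsoft Agent Framework APIs; in a generated project the stub would be replaced by a real model invocation with tracing hooks around each tool call.

```python
# Minimal sketch of the tool-calling agent pattern the AI Toolkit scaffolds.
# The model call is stubbed; names and wiring are illustrative assumptions.
import json
from typing import Callable

def get_weather(city: str) -> str:
    """Example tool: a real agent would call an actual weather API here."""
    return json.dumps({"city": city, "forecast": "sunny", "high_c": 21})

TOOLS: dict[str, Callable[..., str]] = {"get_weather": get_weather}

def call_model(messages: list[dict]) -> dict:
    """Stub for the LLM call. A real agent would send `messages` plus tool
    schemas to the model and receive either a tool call or a final answer."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "get_weather", "arguments": {"city": "Seattle"}}
    return {"final": "It looks sunny in Seattle today, around 21 C."}

def run_agent(user_prompt: str) -> str:
    messages = [{"role": "user", "content": user_prompt}]
    for _ in range(5):  # bound the loop so a confused model cannot spin forever
        decision = call_model(messages)
        if "final" in decision:
            return decision["final"]
        result = TOOLS[decision["tool"]](**decision["arguments"])
        messages.append({"role": "tool", "name": decision["tool"], "content": result})
    return "Agent stopped without a final answer."

if __name__ == "__main__":
    print(run_agent("What's the weather in Seattle?"))
```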
Announcing Azure HorizonDB

Affan Dar, Vice President of Engineering, PostgreSQL at Microsoft
Charles Feddersen, Partner Director of Program Management, PostgreSQL at Microsoft

Today at Microsoft Ignite, we’re excited to unveil the preview of Azure HorizonDB, a fully managed Postgres-compatible database service designed to meet the needs of modern enterprise workloads. The cloud native architecture of Azure HorizonDB delivers highly scalable shared storage, elastic scale-out compute, and a tiered cache optimized for running cloud applications of any scale.

Postgres is transforming industries worldwide and is emerging as the foundation of modern data solutions across all sectors at an unprecedented pace. For developers, it is the database of choice for building new applications with its rich set of extensions, open-source API, and expansive ecosystem of tools and libraries. At the opposite end of the workload spectrum, enterprises around the world are also increasingly turning to Postgres to modernize their existing applications. Azure HorizonDB is designed to support applications across the entire workload spectrum, from the first line of code in a new app to the migration of large-scale, mission-critical solutions. Developers benefit from the robust Postgres ecosystem and seamless integration with Azure’s advanced AI capabilities, while enterprises gain a secure, highly available, and performant cloud database to host their business applications. Whether you’re building from scratch or transforming legacy infrastructure, Azure HorizonDB empowers you to innovate and scale with confidence, today and into the future.

Azure HorizonDB introduces new levels of performance and scalability to PostgreSQL. The scale-out compute architecture supports up to 3,072 vCores across primary and replica nodes, and the auto-scaling shared storage supports databases up to 128 TB while providing sub-millisecond multi-zone commit latencies. This storage innovation enables Azure HorizonDB to deliver up to 3x more throughput than open-source Postgres for transactional workloads.

Azure HorizonDB is enterprise ready on day one. With native support for Entra ID, Private Endpoints, and data encryption, it provides compliance and security for sensitive data stored in the cloud. All data is replicated across availability zones by default, and maintenance operations are transparent with near-zero downtime. Backups are fully automated, and integration with Azure Defender for Cloud provides additional protection for highly sensitive data. All up, Azure HorizonDB offers enterprise-grade security, compliance, and reliability, making it ready for business use today.

Since the launch of ChatGPT, there has been an explosion of new AI apps being built, and Postgres has become the database of choice due in large part to its vector index support. Azure HorizonDB extends the AI capabilities of Postgres further with two key features. We are introducing advanced filtering capabilities to the DiskANN vector index that enable query predicate pushdown directly into the vector similarity search. This provides significant performance and scalability improvements over pgvector HNSW while maintaining accuracy, and it is ideal for similarity search over transactional data in Postgres. The second feature is built-in AI model management that seamlessly integrates generative, embedding, and reranking models from Microsoft Foundry for developers to use in the database with zero configuration.
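To give a sense of what predicate pushdown into a vector search looks like from the application side, here is a hedged sketch using pgvector-style SQL from Python. The connection string, table, and columns are made up for the example, and the assumption that HorizonDB exposes pgvector-compatible operators is just that; consult the HorizonDB documentation for the exact index and query syntax.

```python
# Illustrative only: a pgvector-style filtered similarity query, the kind of
# workload a filtered vector index is meant to accelerate. Connection details,
# table, and columns are placeholders for the sketch.
import psycopg

query_embedding = [0.12, -0.03, 0.88]  # normally produced by an embedding model

with psycopg.connect("postgresql://user:password@myhorizondb.example.com/appdb") as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT id, title
            FROM products
            WHERE category = %s AND in_stock      -- predicate evaluated alongside the vector search
            ORDER BY embedding <-> %s::vector     -- nearest-neighbour ordering (pgvector operator)
            LIMIT 10
            """,
            ("outdoor", str(query_embedding)),
        )
        for row in cur.fetchall():
            print(row)
```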
In addition to enhanced vector indexing and simplified model management for building powerful new AI apps, we’re also pleased to announce the general availability of Microsoft’s PostgreSQL Extension for VS Code, which provides the tooling Postgres developers need to maximize their productivity. Using this extension, GitHub Copilot is context aware of the Postgres database, which means less prompting and higher quality answers. In the Ignite release, we’ve added live monitoring with one-click GitHub Copilot debugging, where Agent mode can launch directly from the performance monitoring dashboard to diagnose Postgres performance issues and guide users to a fix.

Alpha Life Sciences is an existing Azure customer:

“I’m truly excited about how Azure HorizonDB empowers our AI development. Its seamless support for Vector DB, RAG, and Agentic AI allows us to build intelligent features directly on a reliable Postgres foundation. With Azure HorizonDB, I can focus on advancing AI capabilities instead of managing infrastructure complexities. It’s a smart, forward-looking solution that perfectly aligns with how we design and deliver AI-powered applications.”
Pengcheng Xu, CTO, Alpha Life Sciences

For enterprises that are modernizing their applications to Postgres in the cloud, the security and availability of Azure HorizonDB make it an ideal platform. However, these migrations are often complex and time consuming for large legacy codebase conversions. To simplify this and reduce the risk, we’re pleased to announce the preview of GitHub Copilot powered Oracle migration built into the PostgreSQL Extension for VS Code. Within VS Code, teams of engineers can work with GitHub Copilot to automate the end-to-end conversion of complex database code using rich code editing, version control, text authoring, and deployment in an integrated development environment.

Azure HorizonDB is the next generation of fully managed, cloud native PostgreSQL database service. Built on the latest Azure infrastructure with state-of-the-art cloud architecture, Azure HorizonDB is ready for the most demanding application workloads. In addition to our portfolio of managed Postgres services in Azure, Microsoft is deeply invested in the open source Postgres project and is one of the top corporate upstream contributors and sponsors for the PostgreSQL project, with 19 Postgres project contributors employed by Microsoft. As a hyperscale Postgres vendor, it’s critical to actively participate in the open-source project. It enables us to better support our customers down to the metal in Azure, and to contribute our learnings from running Postgres at scale back to the community. We’re committed to continuing our investment to push the Postgres project forward, and the team is already active in making contributions to Postgres 19, to be released in 2026.

Ready to explore Azure HorizonDB? Azure HorizonDB is initially available in the Central US, West US3, UK South, and Australia East regions. Customers are invited to apply for early preview access to Azure HorizonDB and get hands-on experience with this new service. Participation is limited, so apply now at aka.ms/PreviewHorizonDB

Build and connect Microsoft Foundry agents to MongoDB Atlas with new native integrations
MongoDB Atlas is a leading document database that’s rapidly gaining traction among Azure customers thanks to the deep integrations available across Microsoft platforms. In August 2025, we announced the General Availability (GA) of the native integration for MongoDB Atlas—a milestone that marked a new era for developers and enterprises building AI-powered applications on Azure. Alongside this, we introduced Semantic Kernel connectors and other integrations to further enhance the developer experience. At last year’s Ignite, we announced the Microsoft Foundry integration with MongoDB Atlas to enable developers to reference vector stores.

At Microsoft Ignite 2025, we are announcing the availability of MongoDB Atlas as one of the supported tools in the Microsoft Foundry Tool Catalog (preview), allowing agents to connect to live enterprise data and operational systems. With support for the MongoDB MCP Server in the agent building experience, developers can now build agents that leverage MongoDB for both vector search and database operations. This unlocks new possibilities for intelligent automation and contextual decision-making, empowering organizations to create agents that are not only smarter but also deeply integrated with their operational data.

Build Agents in Microsoft Foundry and add MongoDB as a Tool

The goal is for enterprises to quickly build and deploy agents that can connect to their MongoDB Atlas collections.

Discover MongoDB MCP Server in the Microsoft Foundry Tool Catalog: Start by browsing the Microsoft Foundry Tool Catalog (preview), which features a growing list of enterprise-ready tools. Search for MongoDB MCP Server to find the landing page.

Deploy the MCP Server: The local MCP Server can be deployed to Azure Container Apps by following the documentation. Once hosted, copy the remote endpoint for the MCP Server.

Add the tool to your agent: In Microsoft Foundry, click create a new agent, provide agent instructions, and then click ‘Tools’. Provide the remote endpoint in the custom tool option to connect to MongoDB Atlas. That’s it! Your agent is now ready to query your databases.

Access more Microsoft Foundry features: Agents built in Microsoft Foundry can access a wide range of monitoring and governance capabilities. Organizations can automatically enforce security, compliance, and operational best practices. Every agent-to-tool interaction is authenticated, authorized, and monitored at scale, with unified telemetry and logging available through Azure Monitor and Microsoft Foundry analytics. This gives IT teams full visibility and control, while developers can confidently build and deploy agents knowing that enterprise standards are enforced by default.

What Can Agents Do with MongoDB Atlas?

Contextual Retrieval: Agents can ground their reasoning in fresh, operational data from MongoDB Atlas, supporting vector search and RAG workflows (a query sketch follows below).
Autonomous Database Operations: Agents can go beyond simple queries to perform end-to-end data exploration and management, including creation of projects, clusters, and database users, while adhering to enterprise-grade compliance and security standards.

Conclusion

The MongoDB MCP Server integration for Microsoft Foundry is a major milestone in enterprise AI enablement. By bridging powerful document and vector data capabilities with secure, governed agent workflows, organizations can unlock new levels of automation, intelligence, and operational agility.

Try the integration today: Learn more about MongoDB Atlas on Azure and try out the free cluster today!
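To make the contextual retrieval scenario above concrete, here is a minimal sketch of the kind of Atlas vector search an agent tool might issue. The cluster URI, database, collection, index name, and field names are placeholders, and the query assumes an Atlas vector search index has already been created on the embedding field.

```python
# Illustrative Atlas vector search an agent tool might run via pymongo.
# Cluster URI, database, collection, index, and field names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://user:password@cluster0.example.mongodb.net")
collection = client["support"]["articles"]

query_vector = [0.01, 0.42, -0.17]  # produced by the same embedding model used at indexing time

pipeline = [
    {
        "$vectorSearch": {
            "index": "articles_vector_index",
            "path": "embedding",
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "body": 1, "score": {"$meta": "vectorSearchScore"}}},
]

for doc in collection.aggregate(pipeline):
    print(doc["title"], doc["score"])
```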
Get started with the MongoDB MCP Server. Link to the MongoDB Blog.

New: Microsoft 365 Certified: Copilot and Agent Administration Fundamentals
Are you familiar with Microsoft 365, including its core services, security, identity and access, data protection, and governance, along with Microsoft 365 Copilot and agents? Are you able to perform basic administrative tasks for Copilot and agents? If this is your skill set, we have a new Microsoft Certification for you. The Microsoft 365 Certified: Copilot and Agent Administration Fundamentals Certification validates that you understand what's required to support and protect an AI-enabled Microsoft 365 environment. To earn this Certification, you need to pass Exam AB-900: Microsoft 365 Copilot and Agent Administration Fundamentals, currently in beta. The new Certification shows employers that you understand how to configure, secure, and govern Copilot and agents across Microsoft 365 to help keep your organization productive and compliant, that you know the fundamentals of identity, security, and compliance so you can support AI adoption, and that you not only know how to use AI but also how to enable and manage AI in the workplace.

Is this the right Certification for you?

As a candidate for this Certification, you should be familiar with the admin centers used to access Microsoft 365 workloads, such as Exchange Online, SharePoint in Microsoft 365, Microsoft Teams, Microsoft Entra, and Microsoft Purview. You need to have experience with AI-powered productivity tools and modern IT management strategies. You should also be able to identify the roles of the core features and objects available in Microsoft 365, such as users, groups, teams, sites, and libraries. And you must understand the core security features of Microsoft 365, such as authentication methods, conditional access policies, and single sign-on (SSO). Additionally, you need to be familiar with how to manage Copilot, including licensing and access control, billing and usage monitoring, prompt governance, and strategies for tracking and driving user adoption. You should also be familiar with configuring user access and permissions for agents; creating, editing, testing, and publishing agents; and managing agent approval and governance.

Ready to prove your skills?

Take advantage of the discounted beta exam offer. The first 300 people who take Exam AB-900 (beta), on or before January 3, 2026, can get 80% off the market price. To receive the discount, when you register for the exam and are prompted for payment, use code AB900Goals26. This is not a private access code. The seats are offered on a first-come, first-served basis. As noted, you must take the exam on or before January 3, 2026. Please note that this beta exam is not available in Turkey, Pakistan, India, or China.

Get ready to take Exam AB-900 (beta):

Review the Exam AB-900 (beta) exam page for details. The Exam AB-900 study guide explores key topics covered in the exam. Want even more in-depth, instructor-led training? Connect with Microsoft Training Services Partners in your area for in-person offerings. Need other preparation ideas? Check out Just How Does One Prepare for Beta Exams? Did you know that you can take any Microsoft Certification exam online? Taking your exam from home or the office can be more convenient and less stressful than traveling to a test center—especially when you know what to expect. To find out more, read Online proctored exams: What to expect and how to prepare. The rescore process starts on the day an exam goes live, and final scores for beta exams are released approximately 10 days after that.
For details on the timing of beta exam rescoring and results, check out Creating high-quality exams: The path from beta to live. Ready to get started? Remember, the number of spots is limited to the first 300 candidates taking Exam AB-900 (beta) on or before January 3, 2026. Stay tuned for general availability of this Certification in February 2026. Learn more about Microsoft Credentials.

Related announcements

We recently migrated our subject matter expert (SME) database to LinkedIn. To be notified of beta exam availability or opportunities to help with the development of exam, assessment, or learning content, sign up today for the Microsoft Worldwide Learning SME Group for Credentials.

Bridging the Digital and Physical Worlds with Azure IoT Hub and Azure IoT Operations
Operational excellence starts with people. Empowering those people with the most up-to-date insights and recommendations requires bridging the gap between the physical and digital worlds to generate the best possible outcomes for real-time decision making. Creating this bridge transforms data into insights, insights into intelligent actions, and actions into real-world results. Digital Operations, integrated with AI insights, help make this possible by combining data from connected assets across a variety of physical locations and deployment topologies, and transforming that data into insights and decisions that scale using AI and Analytics. At Microsoft Ignite, we’re extending this vision with new Azure IoT Hub and Azure IoT Operations capabilities to manage connected assets at scale, unify digital operations, and realize AI-enabled outcomes across your enterprise.

Connected Operations in Action

Azure IoT Hub and Azure IoT Operations form the backbone of connected operations, where every asset, sensor, and system contributes to a continuous loop of intelligence by moving data to Microsoft Fabric for real-time analytics and for use with AI agents. This pattern applies to nearly every sector of the economy. In manufacturing, these capabilities allow production engineers to predict and avoid equipment failures by analyzing vibration and temperature data at the edge before costly downtime occurs. In energy and utilities, distributed sensors can provide data to control points that help balance load, optimize grid efficiency, and ensure safe operations even in remote areas. In transportation and logistics, connected fleets use edge AI models to detect safety risks in real time, while cloud-based analytics optimize routing and fuel efficiency across entire regions. Across industries, this edge-to-cloud collaboration enables intelligent systems to sense, reason, and act in the physical world with speed, safety, and precision.

From Data to Intelligent Action

Organizations today must capture and act on data from both geographically dispersed and tightly collocated assets. That data needs to be processed close to where it’s generated, at the edge, to enable real-time decision-making, reduce latency, and enhance security. At the same time, the cloud remains vital for contextualizing operational data with enterprise systems, training AI models, and managing a consistent identity and security framework across all assets. AI models trained in the cloud can then be deployed back to the edge, where they act on events in real time. Operators can work with AI agents to reason over this data, whether it’s structured or unstructured, organized in silos, or contained in free-text fields, to provide results to a mixed team of human and AI operational assets. We have a portfolio of products uniquely designed to make this continuum, from edge to cloud, more intelligent, secure, and repeatable. Together with our partners, we help bridge Operational Technology (OT) with Information Technology (IT) to deliver better business outcomes.

New at Ignite: Accelerating Digital Operations

We’re excited to share our latest set of investments at Ignite across our portfolio of services. A few key announcements:

Azure IoT Hub New Features (Preview): Simplifying Secure Connectivity at Scale

Azure IoT Hub empowers organizations to securely and reliably manage connected assets across the globe, providing real-time visibility and control over diverse operations.
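As a point of reference for what device connectivity to IoT Hub looks like from a device, here is a minimal telemetry sketch using the azure-iot-device Python SDK. The connection string and payload are placeholders, and the new certificate-management and Azure Device Registry capabilities described below are configured on the service side rather than in this device code.

```python
# Minimal device-to-cloud telemetry using the azure-iot-device SDK (sync API).
# The connection string and payload are placeholders for the sketch.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;DeviceId=press-line-07;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

payload = {"temperatureC": 78.4, "vibrationMm": 0.42}
message = Message(json.dumps(payload))
message.content_type = "application/json"
message.content_encoding = "utf-8"

client.send_message(message)  # lands in IoT Hub for routing to analytics or Fabric
client.shutdown()
```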
With proven scalability, broad device support, and robust management tools, IoT Hub delivers a unified platform for developing and operating IoT solutions. As customers evolve, Azure IoT Hub continues to advance, deepening its integration with the Azure ecosystem and enabling AI-driven, connected operations for the next generation of applications. The next generation of Azure IoT Hub investments makes it easier and more secure than ever to connect and manage distributed assets. At Ignite, we’re previewing:

- New certificate management capabilities that simplify device onboarding and lifecycle management.
- Integration with Azure Device Registry (ADR) that brings all devices into a common control plane, enabling unified identity, security, and policy management.
- ADR enhancements that make it easier to register, classify, and monitor assets, paving the way for consistent governance and operational insight across millions of devices.

This deeper Azure integration with ADR standardizes operations, simplifies oversight of edge portfolios including IoT devices, and brings the full power of Azure’s management ecosystem to IoT and Digital Operations workloads.

Azure IoT Operations New Features (GA): The Foundation for AI in the Physical World

Azure IoT Operations is more than an edge-to-cloud data plane; it’s the foundation for achieving AI in the physical world, enabling intelligent operational systems that can perceive, reason, and act to drive new operational efficiencies. Built on Arc-enabled Kubernetes, Azure IoT Operations unifies operational and business data across distributed environments, eliminating silos and providing a repeatable, scalable foundation for autonomous, adaptive operations. By extending familiar Azure management concepts to physical sites, Azure IoT Operations creates an AI-ready infrastructure that supports autonomous, adaptive operations at scale. Our latest GA release of Azure IoT Operations introduced major enhancements:

- Wasm-powered data graphs deliver fast, modular analytics, helping businesses make near real-time decisions at the edge.
- Expanded connectors now include OPC UA, ONVIF, REST/HTTP, Server-Sent Events (SSE), and direct MQTT for richer industrial and IT integrations.
- OpenTelemetry (OTel) endpoint support enables seamless telemetry pipelines and observability.
- Asset health monitoring provides unprecedented visibility and control.

These capabilities help bridge Information Technology, Operational Technology, and data domains, empowering customers to discover, collect, process, and send data using open standards (a minimal publisher sketch is shown below) while laying the groundwork for self-optimizing environments where AI agents and human supervisors collaborate seamlessly.
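For the direct MQTT path listed above, here is a hypothetical edge-side publisher pushing a sensor reading to an MQTT broker such as the one included with Azure IoT Operations. The host, port, topic, and the unauthenticated connection are simplifications for the sketch; a production broker would require TLS and credentials.

```python
# Hypothetical edge-side publisher sending one telemetry reading over MQTT.
# Broker address, port, and topic are placeholders; production setups use TLS/auth.
import json
import time
import paho.mqtt.publish as publish

reading = {
    "assetId": "press-line-07",
    "temperatureC": 78.4,
    "vibrationMm": 0.42,
    "timestamp": time.time(),
}

publish.single(
    topic="factory/press-line-07/telemetry",
    payload=json.dumps(reading),
    hostname="aio-broker.local",  # placeholder broker address
    port=1883,
)
```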
Integration with Fabric IQ and Digital Twin Builder

To fully unlock the value of connected data, organizations need to contextualize it, linking operational signals to business meaning. Fabric IQ, a new offering announced at Ignite, and Digital Twin Builder in Fabric make this possible, transforming raw telemetry into AI-ready context. This integration allows companies to model complex systems, run simulations, and create intelligent feedback loops across manufacturing, logistics, and energy environments.

Edge AI: Real-Time Intelligence in the Physical World

Azure’s AI capabilities for edge environments bring intelligence closer to where it matters most. And, because these services are Arc-enabled, organizations can develop, manage, and scale AI workloads across diverse environments using consistent tooling. Today, we are announcing updates to two of our key services that enable AI at the edge:

- Live Video Analysis features (Public Preview) in Azure AI Video Indexer enabled by Arc: delivers real-time agentic video intelligence to improve safety, quality, and operations.
- Edge RAG (Retrieval Augmented Generation) Public Preview refresh: enables local generative AI reasoning with contextual awareness, empowering AI agents to act within industrial constraints securely and efficiently.

These innovations accelerate time to insight and help organizations deploy AI where milliseconds matter.

Partner Innovation: Scaling Real Business Value

Last year, we showcased the breadth of Azure IoT Operations’ industrial ecosystem. This year, we’re celebrating how partners are integrating, co-innovating, and scaling real customer outcomes. Our partners are packaging repeatable, scalable solutions that connect operational data to enterprise systems—enabling AI-driven insights and automation across sites, regions, and industries. At this year’s Ignite, we’re highlighting some great new partner innovations:

- NVIDIA is working with Microsoft to enable factory digital twins using the OpenUSD standard.
- Siemens is enabling adaptive production through AI- and digital-twin-powered solutions supported by the integration of Siemens Industrial Edge with Azure IoT Operations.
- Litmus Edge integrates with Azure IoT Operations via the Akri framework to automatically discover industrial devices, enable secure data flows, and support Arc-enabled deployment.
- Rockwell Automation is streamlining edge-to-cloud integration with its FactoryTalk Optix platform by delivering contextualized, AI-ready data seamlessly within Microsoft Azure IoT Operations architectures.
- Sight Machine is driving advanced analytics for quality and efficiency across multi-site operations.

Through initiatives like Akri, Co-Innovate, and Co-Sell Readiness, our ecosystem is developing managed applications, packaged solutions, and marketplace offerings that accelerate deployment and unlock new revenue streams. These collaborations show how Azure IoT Operations is not just a platform, but a growth engine for industrial transformation.

The Path Forward

With these advancements, we’re helping organizations bring AI to the physical world by turning data into intelligence and intelligence into action. Customers like Chevron and Husqvarna are scaling beyond initial pilots, expanding their deployments from single-site to multi-site rollouts, unlocking new use cases from predictive maintenance to worker safety, and proving how adaptive cloud architectures deliver measurable impact across global operations. By connecting assets, empowering partners, and delivering open, scalable platform solutions, Microsoft is helping industries achieve resilient, adaptive operations that drive measurable business value. The digital and physical worlds are coming together with solutions that are secure, observable, AI-ready, and built to scale from a single site to global operations. Together, we’re creating a smarter, more connected future.

Learn More

Learn more about Azure IoT Hub and Azure IoT Operations here: Azure IoT – Internet of Things Platform | Microsoft Azure. Learn more about new IoT Hub public preview features here: Azure IoT Hub documentation. Discover partner solutions: learn how Litmus and Sight Machine are advancing industrial analytics and integration with Azure IoT Operations.
Explore Rockwell Automation and Siemens for more on adaptive cloud architectures and shop floor intelligence.

Going to Ignite?

If you’re at Ignite this week, you can learn more about how Microsoft enables Industrial Transformation at the following sessions:

- The New Industrial Frontier
- Reshaping Digital Operations with AI from Cloud and Edge

Or come visit us on the show floor at the Azure Arc Expert Meet Up Focus Area in the Cloud and AI Platforms neighborhood.

📢 Agent Loop Ignite Update - New Set of AI Features Arrive in Public Preview
Today at Ignite, we announced the General Availability of Agent Loop in Logic Apps Standard—bringing production-ready agentic automation to every customer. But GA is just the beginning. We’re also releasing a broad set of new and powerful AI-first capabilities in Public Preview that dramatically expand what developers can build: run agents in the Consumption SKU, bring your own models through APIM AI Gateway, call any tool through MCP, deploy agents directly into Teams, secure RAG with document-level permissions, onboard with Okta, and build in a completely redesigned workflow designer. With these preview features layered on top of GA, customers can build AI applications that bring together secure tool calling, user identity, governance, observability, and integration with their existing systems—whether they’re running in Standard, Consumption, or the Microsoft 365 ecosystem. Here’s a closer look at the new capabilities now available in Public Preview.

Public Preview of Agentic Workflows in Consumption SKU

Agent Loop is now available in Azure Logic Apps Consumption, bringing autonomous and conversational AI agents to everyone through a fully serverless, pay-as-you-go experience. You can now turn any workflow into an intelligent workflow using the agent loop action—without provisioning infrastructure or managing AI models. This release provides instant onboarding, simple authentication, and a frictionless entry point for building agentic automation. Customers can also tap into Logic Apps’ ecosystem of 1,400+ connectors for tool calling and system integrations. This update makes AI-powered automation accessible for rapid prototyping while still offering a clear path to scale and production-ready deployments in Logic Apps Standard, including BYOM, VNET integration, and enterprise-grade controls. Preview limitations include limited regions, no VS Code local development, and no nested agents or MCP tools yet. Read more about this in our announcement blog!

Bring Your Own Model

We’re excited to introduce Bring Your Own Model (BYOM) support in Agent Loop for Logic Apps Standard, making it possible to use any AI model in your agentic workflows—from Foundry, and even on-prem or private cloud models. The key highlight of this feature is the deep integration with the Azure API Management (APIM) AI Gateway, which now serves as the control plane for how Agent Loop connects to models. Instead of wiring agents directly to individual endpoints, AI Gateway creates a single, governed interface that manages authentication, keys, rate limits, and quotas in one place. It provides built-in monitoring, logging, and observability, giving you full visibility into every request. It also ensures a consistent API shape for model interactions, so your workflows remain stable even as backends evolve. With AI Gateway in front, you can test, upgrade, and refine your model configuration without changing your Logic Apps, making model management safer, more predictable, and easier to operate at scale.

Beyond AI Gateway, Agent Loop also supports:

- Direct external model integration when you want lightweight, point-to-point access to a third-party model API.
- Local/VNET model integration for on-prem, private cloud, or custom fine-tuned models that require strict data residency and private networking.

Together, these capabilities let you treat the model as a pluggable component: start with the model you have today, bring in specialized or cost-optimized models as needed, and maintain enterprise-grade governance, security, and observability throughout. This makes Logic Apps one of the most flexible platforms for building model-agnostic, production-ready AI agent workflows. Ready to try this out? Go to http://aka.ms/agentloop/byom to learn more and get started.
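As an illustration of the consistent API shape an AI Gateway can provide, here is a hedged sketch of calling a gateway-fronted model with an OpenAI-compatible client. The gateway URL, deployment name, and subscription-key header are assumptions; use whatever endpoint and authentication your gateway policies define.

```python
# Illustrative: calling a model through an API Management AI Gateway endpoint
# using an OpenAI-compatible client. URL, model name, and header are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://my-apim.azure-api.net/ai-gateway/v1",  # placeholder gateway endpoint
    api_key="unused",  # auth is handled by the header below in this sketch
    default_headers={"Ocp-Apim-Subscription-Key": "<your-apim-subscription-key>"},
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the deployment/model name exposed by the gateway
    messages=[{"role": "user", "content": "Summarize yesterday's failed orders."}],
)
print(response.choices[0].message.content)
```

Because the gateway keeps the request and response shape stable, swapping the backing model or tightening rate limits is a gateway configuration change rather than a change to the calling workflow.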
MCP support for Agent Loop in Logic Apps Standard

Agent Loop in Azure Logic Apps Standard now supports the Model Context Protocol (MCP), enabling agents to discover and call external tools through an open, standardized interface. This brings powerful, flexible tool extensibility to both conversational and autonomous agents. Agent Loop offers three ways to bring MCP tools into your workflows:

- Bring Your Own MCP connector – Point to any external MCP server using its URL and credentials, instantly surfacing its published tools in your agent.
- Managed MCP connector – Access Azure-hosted MCP servers through the familiar managed connector experience, with shared connections and Azure-managed catalogs.
- Custom MCP connector – Build and publish your own OpenAPI-based MCP connector to expose private or tenant-scoped MCP servers. Ideal for reusing MCP servers across your organization.

Managed and Custom MCP connectors support on-behalf-of (OBO) authentication, allowing agents to call MCP tools using the end user’s identity. This provides user-context-aware, permission-sensitive tool access across your intelligent workflows. Want to learn more? Check out our announcement blog and how-to documents.

Deploy Conversational Agents to Teams/M365

Workflows with conversational agents in Logic Apps can now be deployed directly into Microsoft Teams, so your agentic workflows show up where your users already spend their day. Instead of going to a separate app or portal, employees can ask the agent questions, kick off approvals, check order or incident status, or look up internal policies right from a Teams chat or channel. The agent becomes just another teammate in the conversation—joining stand-ups, project chats, and support rooms as a first-class participant. Because the same Logic Apps agent can also be wired into other Microsoft 365 experiences that speak to bots and web endpoints, this opens the door to a consistent and personalized “organization copilot” that follows users across the M365 ecosystem: Teams for chat, meetings, and channels today, and additional surfaces over time. Azure Bot Service and your proxy handle identity, tokens, and routing, while Logic Apps takes care of reasoning, tools, and back-end systems. The result is an agent that feels native to Teams and Microsoft 365—secure, governed, and always just one @mention away. Ready to bring your agentic workflows into Teams? Here’s how to get started.

Secure Knowledge Retrieval for AI Agents in Logic Apps

We’ve added native document-level authorization to Agent Loop by integrating Azure AI Search ACLs. This ensures AI agents only retrieve information the requesting user is permitted to access—making RAG workflows secure, compliant, and permission-aware by default. Documents are indexed with user or group permissions, and Agent Loop automatically applies those permissions during search using the caller’s principal ID or group memberships. Only authorized documents reach the LLM, preventing accidental exposure of sensitive data.
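To illustrate the idea behind document-level trimming (which Agent Loop applies for you automatically), here is a conceptual Azure AI Search query that restricts results to documents whose permitted groups intersect the caller's groups. The index name, the group_ids field, and the group values are illustrative assumptions, not the exact mechanism Agent Loop uses.

```python
# Conceptual security-trimmed search: only documents tagged with one of the
# caller's groups are returned. Index, field, and group names are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint="https://my-search.search.windows.net",
    index_name="hr-policies",
    credential=AzureKeyCredential("<query-key>"),
)

caller_groups = ["hr-managers", "all-employees"]  # normally derived from the user's token
group_filter = "group_ids/any(g: search.in(g, '{}', ','))".format(",".join(caller_groups))

results = search_client.search(
    search_text="parental leave policy",
    filter=group_filter,        # trims results to documents the caller may see
    select=["title", "summary"],
)
for doc in results:
    print(doc["title"])
```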
This document-level authorization simplifies development, removes custom security code, and allows a single agent to safely serve users with different access levels—whether for HR, IT, or internal knowledge assistants. Read our blog post to learn more about this feature.

Okta

Agent Loop now supports Okta as an identity provider for conversational agents, alongside Microsoft Entra ID. This makes it easy for organizations using Okta for workforce identity to pass authenticated user context—including user attributes, group membership, and permissions—directly into the agent at runtime. Agents can now make user-aware decisions, enforce access rules, personalize responses, and execute tools with proper user context. This update helps enterprises adopt Agent Loop without changing their existing identity architecture and enables secure, policy-aligned AI interactions across both Okta and Entra environments. Setting up Okta as the identity provider takes a few steps, all explained in detail at Logic Apps Labs.

Designer makeover!

We’ve introduced a major redesign of the Azure Logic Apps designer, now in Public Preview for Standard workflows. This release marks the beginning of a broader modernization effort to make building, testing, and operating workflows faster, cleaner, and more intuitive. The new designer focuses on reducing friction and streamlining the development loop. You now land directly in the designer when creating a workflow, with plans to remove early decisions like stateful/stateless or agentic setup. The interface has been simplified into a single unified view, bringing together the visual canvas, code view, settings, and run history so you no longer switch between blades. A major addition is Draft Mode with auto-save, which preserves your work every few seconds without impacting production. Drafts can be tested safely and only go live when you choose to publish—without restarting the app during editing. Search has also been completely rebuilt for speed and accuracy, powered by backend indexing instead of loading thousands of connectors upfront. The designer now supports sticky notes and markdown, making it easy to document workflows directly on the canvas. Monitoring is integrated into the same page, letting you switch between runs instantly and compare draft and published results. A new hierarchical timeline view improves debugging by showing every action executed in order. This release is just the start—many more improvements and a unified designer experience across Logic Apps are on the way as we continue to iterate based on your feedback. Learn more about the designer updates in our announcement blog!

What's Next

We’d love your feedback. Which capabilities should we prioritize, and what would create the biggest impact for your organization?

🎉Announcing General Availability of Agent Loop in Azure Logic Apps
Transforming Business Automation with Intelligent, Collaborative Multi-Agentic Workflows!

Agent Loop is now Generally Available in Azure Logic Apps Standard, turning the Logic Apps platform into a complete multi-agentic automation system. Build AI agents that work alongside workflows and humans, secured with enterprise-grade identity and access controls, and deployed using your existing CI/CD pipelines. Thousands of customers have already built tens of thousands of agents—now you can take them to production with confidence.

Get Started | Workshop | Demo Videos | Ignite 2025 Session

After an incredible journey since we introduced Agent Loop at Build earlier this year, we’re thrilled to announce that Agent Loop is now generally available in Azure Logic Apps. This milestone represents more than just a feature release—it’s the culmination of learnings from thousands of customers who have been pushing the boundaries of what’s possible with agentic workflows. Agent Loop transforms Azure Logic Apps into a complete multi-agentic business process automation platform, where AI agents, automated workflows, and human expertise collaborate seamlessly to solve complex business challenges. With GA, we’re delivering the enterprise-grade capabilities that organizations need to confidently deploy intelligent automation at scale.

The Journey to GA: Proven by Customers, Built for Production

Since our preview launch at Build, the response has been extraordinary. Thousands of customers—from innovative startups to Fortune 500 enterprises—have embraced Agent Loop, building thousands of active agents that have collectively processed billions of tokens every month for the past six months. The growth of agents, executions, and token usage has accelerated significantly, doubling month over month. Since the launch of Conversational Agents in September, they already account for nearly 30% of all agentic workflows. Across the platform, agentic workflows now consume billions of tokens, with overall token usage increasing at nearly 3× month over month.

Cyderes: 5X Faster Security Investigation Cycles

Cyderes leveraged Agent Loop to automate triage and handling of security alerts, leading to faster investigation cycles and significant cost savings. "We were drowning in data—processing over 10,000 alerts daily while analysts spent more time chasing noise than connecting narratives. Agent Loop changed everything. By empowering our team to design and deploy their own AI agents through low-code orchestration, we’ve achieved 5X faster investigation cycles and significant cost savings, all while keeping pace with increasingly sophisticated cyber threats that now leverage AI to operate 25X faster than traditional attacks." – Eric Summers, Engineering Manager - AI & SOAR

Vertex Pharmaceuticals: Hours Condensed to Minutes

Vertex Pharmaceuticals unlocked knowledge trapped across dozens of systems via a team of agents. VAIDA, built with Logic Apps and Agent Loop, orchestrates multiple AI agents and helps employees find information faster, while maintaining compliance and supporting multiple languages. "We had knowledge trapped across dozens of systems—ServiceNow, documentation, training materials—and teams were spending valuable time hunting for answers. Logic Apps Agent Loop changed that. VAIDA now orchestrates multiple AI agents to summarize, search, and analyze this knowledge, then routes approvals right in Teams and Outlook. We’ve condensed hours into minutes while maintaining compliance and delivering content in multiple languages."
– Pratik Shinde, Director, Digital Infrastructure & GenAI Platforms

Where Customers Are Deploying Agent Loop

Customers across industries are using Agent Loop to build AI applications that power both everyday tasks and mission-critical business processes across Healthcare, Retail, Energy, Financial Services, and beyond. These applications drive impact across a wide range of scenarios:

- Developer Productivity: Write code, generate unit tests, create workflows, map data between systems, and automate source control, deployment, and release pipelines
- IT Operations: Incident management, ticket and issue handling, policy review and enforcement, triage, resource management, cost optimization, issue remediation
- Business Process Automation: Empower sales specialists, retail assistants, order processing/approval flows, and healthcare assistants for intake and scheduling
- Customer & Stakeholder Support: Project planning and estimation, content generation, automated communication, and streamlined customer service workflows

Proven Internally at Microsoft

Agent Loop is also powering operations at Microsoft, including the Logic Apps team’s own, demonstrating its versatility and real-world impact:

- IcM Automation Team: Transforming Microsoft’s internal incident automation platform into an agent studio that leverages Logic Apps’ Agent Loop, enabling teams across Microsoft to build agentic live site incident automations
- Logic Apps Team Use Cases:
  - Release & Deployment Agent: Streamlines deployment and release management for the Logic Apps platform
  - Incident Management Agent: An extension of our SRE Agent, leveraging Agent Loop to accelerate incident response and remediation
  - Analyst Agent: Assists teams in exploring product usage and health data, generating insights directly from analytics

What's Generally Available Today

Core Agent Loop Capabilities (GA):

- Agent Loop in Logic Apps Standard SKU - Support for both Autonomous and Conversational workflows. Autonomous workflows run agents automatically based on triggers and conditions; Conversational workflows use A2A to enable interactive chat experiences with agents
- On-Behalf-Of Authentication - Per-user authentication for 1st-party and 3rd-party connectors
- Agent Hand-Off - Enable seamless collaboration in multi-agent workflows
- Python Code Interpreter - Execute Python code dynamically for data analysis and computation
- Nested Agent Action - Use agents as tools within other agents for sophisticated orchestration
- User ACLs Support - Fine-grained document access control for knowledge

Exciting New Agent Loop Features in Public Preview

We’ve also released several groundbreaking features in Public Preview:

- New Designer Experience - Redesigned interface optimized for building agentic workflows
- Agent Loop in Consumption SKU - Deploy agents in the serverless Consumption tier
- MCP Support - Integrate Model Context Protocol servers as tools, enabling agents to access standardized tool ecosystems
- AI Gateway Integration - Use Azure AI Gateway as a model source for unified governance and monitoring
- Teams/M365 Deployment - Deploy conversational agents directly in Microsoft Teams and Microsoft 365
- Okta Identity Provider - Use Okta as the identity provider for conversational agents

Here’s our announcement blog for these new capabilities.

Built on a Platform You Already Trust

Azure Logic Apps is already a proven iPaaS platform with thousands of customers using it for automation, ranging from startups to 100% of the Fortune 500.
Agent Loop doesn't create a separate "agentic workflow automation platform" you have to learn and operate. Instead, it makes Azure Logic Apps itself your agentic platform: Workflows orchestrate triggers, approvals, retries, and branching Agent Loop, powered by LLMs, handle reasoning, planning, and tool selection Humans stay in control through approvals, exceptions, and guided hand-offs Agent Loop runs inside your Logic Apps Standard environment, so you get the same benefits you already know: enterprise SLAs, VNET integration, data residency controls, hybrid hosting options, and integration with your existing deployment pipelines and governance model. Enterprise Ready - Secure, User-Aware Agents by Design Bringing agents into the enterprise only works if security and compliance are first-class. With Agent Loop in Azure Logic Apps, security is built into every layer of the stack. Per-User Actions with On-Behalf-Of (OBO) and Delegated Permissions Many agent scenarios require tools to act in the context of the signed-in user. Agent Loop supports the OAuth 2.0 On-Behalf-Of (OBO) flow so that supported connector actions can run with delegated, per-user connections rather than a broad app-only identity. That means when an agent sends mail, reads SharePoint, or updates a service desk system, it does so as the user (where supported), respecting that user's licenses, permissions, and data boundaries. This is critical for scenarios like IT operations, HR requests, and finance approvals where "who did what" must be auditable. Document-Level Security with Microsoft Entra-Based Access Control Agents should only see the content a user is entitled to see. With Azure AI Search's Entra-based document-level security, your retrieval-augmented workflows can enforce ACLs and RBAC directly in the index so that queries are automatically trimmed to documents the user has access to. Secured Chat Entry Point with Easy Auth and Entra ID The built-in chat client and your custom clients can be protected using App Service Authentication (Easy Auth) and Microsoft Entra ID, so only authorized users and apps can invoke your conversational endpoints. Together, OBO, document-level security, and Easy Auth give you end-to-end identity and access control—from the chat surface, through the agent, down to your data and systems. An Open Toolbox: Connectors, Workflows, MCP Servers, and External Agents Agent Loop inherits the full power of the Logic Apps ecosystem and more - 1,400+ connectors for SaaS, on-premises, and custom APIs Workflows and agents as tools - compose sophisticated multi-step capabilities MCP server support - integrate with the Model Context Protocol for standardized tool access (Preview) A2A protocol support - enable agent-to-agent communication across platforms Multi-model flexibility - use Azure OpenAI, Azure AI Foundry hosted models, or bring your own model on any endpoint via AI gateway You're not locked into a single vendor or model provider. Agent Loop gives you an open, extensible framework that works with your existing investments and lets you choose the right tools for each job. Run Agents Wherever You Run Logic Apps Agent Loop is native to Logic Apps Standard, so your agentic workflows run consistently across cloud, on-premises, or hybrid environments. They inherit the same deployment, scaling, and networking capabilities as your workflows, bringing adaptive, AI-driven automation to wherever your systems and data live. 
Getting Started with Agent Loop

These are very exciting times, and we can’t wait to see our customers go to production and realize the benefits of these capabilities for their business outcomes and success. Here are some useful links to get started on your AI journey with Logic Apps:

- Logic Apps Labs - https://aka.ms/LALabs
- Workshop - https://aka.ms/la-agent-in-a-day
- Demos - https://aka.ms/agentloopdemos