integration
Introducing native Service Bus message publishing from Azure API Management (Preview)
We’re excited to announce a preview capability in Azure API Management (APIM) — you can now send messages directly to Azure Service Bus from your APIs using a built-in policy. This enhancement, currently in public preview, simplifies how you connect your API layer with event-driven and asynchronous systems, helping you build more scalable, resilient, and loosely coupled architectures across your enterprise.

Why this matters

Modern applications increasingly rely on asynchronous communication and event-driven designs. With this new integration:

- Any API hosted in API Management can publish to Service Bus — no SDKs, custom code, or middleware required.
- Partners, clients, and IoT devices can send data through standard HTTP calls, even if they don’t support AMQP natively.
- You stay in full control with authentication, throttling, and logging managed centrally in API Management.
- Your systems scale more smoothly by decoupling front-end requests from backend processing.

How it works

The new send-service-bus-message policy allows API Management to forward payloads from API calls directly into Service Bus queues or topics.

High-level flow:

1. A client sends a standard HTTP request to your API endpoint in API Management.
2. The policy executes and sends the payload as a message to Service Bus.
3. Downstream consumers such as Logic Apps, Azure Functions, or microservices process those messages asynchronously.

All configuration happens in API Management — no code changes or new infrastructure are required.

Getting started

You can try it out in minutes:

1. Set up a Service Bus namespace and create a queue or topic.
2. Enable a managed identity (system-assigned or user-assigned) on your API Management instance.
3. Grant the identity the “Azure Service Bus Data Sender” role in Azure RBAC, scoped to your queue or topic.
4. Add the policy to your API operation:

    <send-service-bus-message queue-name="orders">
        <payload>@(context.Request.Body.As<string>())</payload>
    </send-service-bus-message>

Once saved, each API call publishes its payload to the Service Bus queue or topic. 📖 Learn more.

Common use cases

This capability makes it easy to integrate your APIs into event-driven workflows:

- Order processing – Queue incoming orders for fulfillment or billing.
- Event notifications – Trigger internal workflows across multiple applications.
- Telemetry ingestion – Forward IoT or mobile app data to Service Bus for analytics.
- Partner integrations – Offer REST-based endpoints for external systems while maintaining policy-based control.

Each of these scenarios benefits from simplified integration, centralized governance, and improved reliability.

Secure and governed by design

The integration uses managed identities for secure communication between API Management and Service Bus — no secrets required. You can further apply enterprise-grade controls:

- Enforce rate limits, quotas, and authorization through APIM policies.
- Gain API-level logging and tracing for each message sent.
- Use Service Bus metrics to monitor downstream processing.

Together, these tools help you maintain a consistent security posture across your APIs and messaging layer.

Build modern, event-driven architectures

With this feature, API Management can serve as a bridge to your event-driven backbone. Start small by queuing a single API’s workload, or extend to enterprise-wide event distribution using topics and subscriptions. You’ll reduce architectural complexity while enabling more flexible, scalable, and decoupled application patterns.
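To see the flow end to end from the caller’s side, here is a minimal client sketch that posts an order to an API fronted by this policy. The gateway URL, API path, and subscription key are placeholders, and the example assumes the API requires the standard Ocp-Apim-Subscription-Key header; your API may use different authentication.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class OrderPublisher
{
    static async Task Main()
    {
        // Hypothetical APIM gateway URL and subscription key -- replace with your own values.
        var endpoint = "https://contoso-apim.azure-api.net/orders";
        var subscriptionKey = "<your-apim-subscription-key>";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", subscriptionKey);

        // The request body becomes the Service Bus message payload via the policy's <payload> expression.
        var order = "{ \"orderId\": \"12345\", \"quantity\": 2 }";
        var response = await client.PostAsync(endpoint, new StringContent(order, Encoding.UTF8, "application/json"));

        Console.WriteLine($"Gateway returned {(int)response.StatusCode}");
    }
}
```

From the client’s point of view this is just an ordinary HTTP call; the queuing behavior is entirely a gateway concern, which is what keeps the front end decoupled from downstream processing.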
Learn more: Get the full walkthrough and examples in the documentation 👉 here.
How Azure NetApp Files Object REST API powers Azure and ISV Data and AI services – on YOUR data

This article introduces the Azure NetApp Files Object REST API, a transformative solution for enterprises seeking seamless, real-time integration between their data and Azure's advanced analytics and AI services. By enabling direct, secure access to enterprise data—without costly transfers or duplication—the Object REST API accelerates innovation, streamlines workflows, and enhances operational efficiency. With S3-compatible object storage support, it empowers organizations to make faster, data-driven decisions while maintaining compliance and data security. Discover how this new capability unlocks business potential and drives a new era of productivity in the cloud.
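Because the article highlights S3-compatible object access, a minimal sketch of how such an endpoint could be consumed from .NET is shown below, using the AWS SDK for .NET (AWSSDK.S3 NuGet package) as a generic S3-compatible client. The endpoint URL, bucket name, and credentials are placeholders, and the exact surface of the Azure NetApp Files Object REST API may differ from this assumption.

```csharp
using System;
using System.Threading.Tasks;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;

class AnfObjectListing
{
    static async Task Main()
    {
        // Hypothetical S3-compatible endpoint and credentials -- placeholders, not actual ANF values.
        var config = new AmazonS3Config
        {
            ServiceURL = "https://anf-object-endpoint.contoso.com",
            ForcePathStyle = true
        };
        var credentials = new BasicAWSCredentials("<access-key>", "<secret-key>");

        using var client = new AmazonS3Client(credentials, config);

        // List objects in a bucket assumed to map onto an Azure NetApp Files volume.
        var response = await client.ListObjectsV2Async(new ListObjectsV2Request { BucketName = "analytics-data" });
        foreach (var obj in response.S3Objects)
        {
            Console.WriteLine($"{obj.Key} ({obj.Size} bytes)");
        }
    }
}
```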
Logic Apps Aviators Newsletter - October 25

In this issue:
- Ace Aviator of the Month
- News from our product group
- News from our community

Ace Aviator of the Month

October Ace Aviator: Robin Wilde, Business and Marketing Manager @Contica

What's your role and title? What are your responsibilities?
My role is Business and Marketing Manager at Contica. My main responsibility is helping our customers translate business challenges into technical solutions. I come from a technical background as a BizTalk and Azure developer and architect, so I have one foot in the technical world and the other in the business side.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?
On any given day, I’m improving our customer offerings, diving into project challenges with colleagues, and exploring new technologies with Ahmed Bayoumy to find better ways of working. If I’m lucky, we’re recording a new episode of Integration Love Story with a community member/friend and learning from their life and tech experience.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?
The community itself. Since I started working with integration, I’ve always felt that people genuinely want to share their knowledge and experience. The people behind the blog posts that helped me grow as a junior integration developer have turned out to be some of the most humble and generous individuals I’ve met. Everyone is open to sharing their experiences in a kind and respectful way, and that’s incredibly motivating.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?
Everyone has been a beginner. Don’t be afraid to reach out, ask for help and most importantly, stay curious and lean into new technology. “The more you know, the more you realize you don’t know.” And that’s the beauty of it!

What has helped you grow professionally?
Curiosity and the drive to keep learning. I've been something of a "Yes man" when it comes to new challenges. And being in an environment where the people around you accept you for who you are, every day, that’s a recipe for real growth.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
A built-in, AI-powered "next action suggestion" in your Logic App workflow, based on other designs in your resource group or a predefined business process in the same Logic App Standard you are working on. A feature that would help me be more productive!

News from our product group

Logic Apps Live September 2025
Missed Logic Apps Live in September? You can watch it here. We shared our latest updates in AI, including the latest refresh on Agent Loop and the introduction of Logic Apps MCP support.

Logic Apps Community Day 2025
We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time) and we want you to join us as we host a full day of learning where you will be the star! Check the agenda and register for this must-watch event.

Logic Apps - MCP Demos
Explore how Azure Logic Apps and API Center simplify building MCP servers to integrate with Salesforce, Dataverse, SharePoint, ServiceNow, and even Copilot Studio.

Introducing Logic Apps MCP servers (Public Preview)
Microsoft has launched public preview support for building MCP servers using Azure Logic Apps (Standard). This enables developers to turn connectors into modular, reusable tools for scalable agent creation.
Two approaches are supported: streamlined setup via Azure API Center and custom configuration for advanced control. Both methods simplify integration across enterprise systems.

Announcement! Python Code Interpreter in Logic Apps is now in Public Preview
Logic Apps now supports Python code execution via Azure Container Apps, enabling users to analyze data, generate insights, and automate tasks using natural language. This feature empowers business users to explore data without writing code, streamlining workflows across sales, finance, and operations.

Azure Logic Apps: Ushering in the Era of Multi-Agentic Business Process Automation
Microsoft has transformed Azure Logic Apps into a multi-agentic automation platform, introducing features like Agent Loop for AI-driven workflows, Python Code Interpreter for custom logic, and Foundry Agent Service for model integration. These enhancements enable intelligent, collaborative automation with conversational agents, advanced orchestration, and enterprise-grade security and observability.

Calling Logic Apps MCP Server from Copilot Studio
Microsoft’s Azure Logic Apps now supports MCP Server integration with Copilot Studio, enabling secure and scalable access to enterprise data assets. Using API Center and Logic Apps Standard, developers can expose workflows and connectors as MCP tools, authenticated via Entra ID and Managed Identity. This setup allows conversational agents in Copilot Studio to interact with enterprise systems efficiently and securely.

News from our community

What Every Developer Needs to Know about AI!
Post by Stephen W. Thomas
AI is rapidly transforming integration development, and this guide offers a practical roadmap for developers to stay relevant. It emphasizes three focus areas: building foundational AI knowledge, boosting productivity with tools like GitHub Copilot, and mastering enterprise-ready skills through APIs, Microsoft tools, MCP, and agents. With curated resources and expert recommendations, developers can confidently navigate the evolving AI landscape and apply it effectively in business scenarios.

BizTalk to Logic Apps with Harold Campos: Standard or Hybrid, what changes & what maps
Video by Ahmed Bayoumy
Migrating from BizTalk to Azure? In this conversation with Harold Campos, Principal PM - Azure Logic Apps at Microsoft, Ahmed unpacks Logic Apps Hybrid and other Logic Apps options: what it is, when to use it, and how it helps teams modernize without losing on-prem control.

Determinism vs Nondeterminism
Post by Håkan Åkerblom
Determinism or nondeterminism, two concepts that shape everything from science to philosophy. But what do they mean for system integration?

Track Down and Delete Unused Logic Apps in Azure
Post by Dieter Gobeyn
Azure Logic Apps (Consumption) charge per execution, but even inactive ones can still cost money. Triggers like polling or timers may continue to run, leading to hidden costs. Cleaning up unused Logic Apps improves governance and supports FinOps goals.

Handling “When a Blob is Added or Modified” Trigger Limitation and Pick Files from Subfolders with File Extension Filters
Post by Prashant Singh
Learn how to overcome Logic Apps’ limitations with blob triggers in subfolders and file filters. In this post, Prashant covers trigger conditions and Event Grid options to make your workflows smarter and more scalable.
Consume an MCP Endpoint from Azure Logic Apps — with an Agent Loop
Post by Daniel Jonathan
In this post, Daniel shows how to use Azure Logic Apps with an Agent Loop to discover MCP tools, select the right one, and invoke it—all in one flow. Perfect for building smart, dynamic integrations with MCP servers.

Logic Apps ❤️ MCP — Expose Arithmetic Tools (Add & Subtract)
Post by Daniel Jonathan
And Daniel is on a roll. In this post, Daniel shows how to expose simple arithmetic operations—like add and subtract—as MCP tools using Logic Apps. With Postman support, testing these agent-ready workflows is now easier than ever.

Logic Apps & MCP - Leverage Your Existing Integration Platform
Post by Pim Simons and Michel Pauwels
Learn how to turn existing Logic Apps into MCP servers, enabling secure access for AI agents to your APIs, workflows, and data—without changing your architecture. A smart way to future-proof your integrations.

Building Approval Workflows with Logic Apps, Adaptive Cards, and Microsoft Teams
Post by Saroj Kumar
Learn how to build secure, reusable approval workflows using Azure Logic Apps, Adaptive Cards, and Microsoft Teams. Approvers stay in Teams, while automation handles the rest—fast, transparent, and scalable.

Microsoft Introduces Logic Apps as MCP Servers in Public Preview
Post by Steef-Jan Wiggers
In this InfoQ article, Steef-Jan introduces the public preview support for Azure Logic Apps as MCP servers, enabling scalable, secure integration with AI agents and enterprise systems. Build reusable tools and workflows that agents can discover and invoke with ease.

The 56 Resubmits Trap in Logic App Standard
Post by Luis Rigueira
Logic App Standard limits bulk resubmissions to 56 runs every 5 minutes. This post shows how to bypass that using callback URLs for faster, parallel recovery—ideal when dealing with thousands of failed runs.

Stop Using Azure Logic Apps for Data Integration
Post by Al Ghoniem
Workflow tools like Azure Logic Apps aren’t built for heavy data processing. This post explains why ETL/ELT tools are better for data integration—and how to combine both for scalable, reliable pipelines.

Integration Love Story with Tom Canter
Video by Ahmed Bayoumy and Robin Wilde
In this episode of 'Integration Love Story,' Ahmed and Robin interview the legendary Tom Canter. He discusses his early involvement and long-standing focus on integration, dating back to 1998. Tom emphasizes the community-centric nature of the integration field, recounting personal stories that highlight the value of trust and collaboration among peers.
Optimizing Azure DevOps Jira Integration: 5 Practical Use Cases for DevOps Teams

Many teams rely on Azure DevOps (ADO) for development and Jira for project or product management. While each tool is powerful on its own, things often get messy when work items, statuses, and updates live in separate systems. Integrating the two platforms can remove a lot of friction. Below are five common use cases I have seen from real teams, with concrete problems and solutions to make the connection between Jira and Azure DevOps work smoothly.

1. Keeping User Stories and Bugs in Sync
Challenge: Teams use Jira for user requests and Azure DevOps for development tasks. Manually updating both systems is tedious and error-prone.
Solution: Enable two-way synchronization so that changes in Jira automatically reflect in Azure DevOps and vice versa (including comments and status updates). This keeps bugs and stories aligned without duplicate work.
“Before we integrated Jira with Azure DevOps, I spent too much time manually updating task statuses in both systems. Now, with the automatic sync, my team is focused on actual coding work instead of managing project statuses across platforms.” — DevOps Engineer

2. One-Way Sync for Project Management–First Teams
Challenge: Some organizations plan and track everything in Jira but manage code exclusively in Azure DevOps. Developers only need the essentials pushed across.
Solution: Use a one-way sync from Jira → Azure DevOps to bring over metadata like titles, statuses, sprints, and due dates. Developers see the context they need without cluttering both systems with manual updates.
“We rely on Jira for all project planning and management, but the developers need a clean workspace in Azure DevOps. A one-way sync from Jira to ADO helps us keep things efficient and ensures developers always have the latest information without double entry.” — Product Owner

3. Creating Jira Tickets from Azure DevOps Tasks or Bugs
Challenge: External partners or stakeholders may only work in Jira Service Management to manage tickets. Developers in Azure DevOps often need their work mirrored for transparency.
Solution: Configure automated ticket creation in Jira when certain ADO tasks are tagged (see the sketch at the end of this post). Both teams can track progress in their preferred tool without duplicating effort.
“We use Azure DevOps internally, but our external stakeholders only work in Jira. Automating the creation of Jira tickets based on Azure DevOps tasks or bugs has made collaboration seamless and ensured no work is lost in translation.” — DevOps Lead

4. Syncing Epics, Features, and Work Items
Challenge: High-level epics might live in Jira, while features and tasks are managed in Azure DevOps. Without integration, visibility across systems is fragmented.
Solution: Sync epics and features so Jira provides portfolio-level visibility, while Azure DevOps remains the system of record for detailed development work. This keeps roadmaps and execution aligned.
“Tracking epics in Jira while managing the technical work in Azure DevOps used to cause us to lose visibility. Now, everything from high-level epics to individual tasks is in sync, so we always know where we stand.” — Azure DevOps Product Manager

5. Managing Multiple Jira Projects with One Azure DevOps Project
Challenge: Large organizations often run multiple Jira projects (by teams or business units) but only one Azure DevOps project for development. Syncing everything consistently is tough.
Solution: Map multiple Jira projects to a single Azure DevOps project, syncing only the key data (titles, statuses, sprints, custom fields).
This creates a unified development view without losing project-specific details.
“We have multiple teams using different Jira projects, but we consolidate all development work into a single Azure DevOps project. Syncing across these platforms used to be a nightmare, but now everything stays aligned, and we’re able to track all initiatives in one place.” — Azure DevOps Engineer

💬 Have you integrated Jira with Azure DevOps in your team? What worked well, and what challenges did you run into?
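To make use case 3 concrete, here is a minimal sketch of creating a Jira issue over the standard Jira Cloud REST API v2, for example from a handler invoked by an Azure DevOps service hook. The Jira site URL, project key, account, and API token are placeholders; in practice you would more likely use a dedicated sync tool, and the Azure DevOps side of the trigger is not shown here.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class JiraTicketCreator
{
    static async Task Main()
    {
        // Hypothetical Jira Cloud site and API token -- replace with your own values.
        var jiraBaseUrl = "https://contoso.atlassian.net";
        var auth = Convert.ToBase64String(Encoding.UTF8.GetBytes("user@contoso.com:<api-token>"));

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", auth);

        // Minimal issue payload mirroring an Azure DevOps bug; "SUP" is a placeholder project key.
        var payload = @"{
            ""fields"": {
                ""project"": { ""key"": ""SUP"" },
                ""summary"": ""[ADO #12345] Checkout page throws 500 on submit"",
                ""description"": ""Mirrored from Azure DevOps work item 12345."",
                ""issuetype"": { ""name"": ""Bug"" }
            }
        }";

        var response = await client.PostAsync(
            $"{jiraBaseUrl}/rest/api/2/issue",
            new StringContent(payload, Encoding.UTF8, "application/json"));

        Console.WriteLine($"Jira returned {(int)response.StatusCode}");
    }
}
```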
Codeful Workflows: A New Authoring Model for Logic Apps Standard

📝 This blog introduces early concepts of pre-release functionality and is subject to change.

Azure Logic Apps Standard offers you a powerful cloud orchestration engine, enabling you to build and run automated workflows that effortlessly integrate resources from various services, systems, apps, and data sources. Whether you're looking to streamline processes across a complex enterprise or simply reduce the need for extensive coding, this platform provides a solution that's both efficient and flexible.

For those of you who require more control over workflow designs or want to leverage your expertise in frameworks like .NET and the Durable Tasks framework, Logic Apps Standard now introduces an exciting new feature: Codeful Workflows.

With Codeful Workflows, you can define workflows using an imperative programming style, blending the flexibility of coding with the simplicity and operational strengths of Logic Apps. This means you can structure your workflows the way that makes sense to you while still tapping into the rich ecosystem of connectors and tools built into Logic Apps.

What Are Codeful Workflows?

Codeful Workflows expand the authoring and execution models of Logic Apps Standard, offering developers the ability to implement, test and run workflows using an imperative programming model both locally and in the cloud. Built on frameworks like .NET and the Durable Tasks framework, Codeful Workflows allow you to structure workflows in code while seamlessly integrating with the Logic Apps Standard rich connector ecosystem and leveraging its operational capabilities.

The core elements of a Logic App workflow — triggers, actions and connections — are translated into durable task concepts within this codeful model:

- Triggers are implemented as Client Functions that invoke durable orchestrations, which contain the body of the workflow, blending logic implemented with the language primitives and connector actions for external connectivity.
- Connector actions are presented as Activity Functions. The Logic Apps connector ecosystem is exposed to you via an SDK, bringing discoverability and rich IntelliSense support when creating action inputs, invoking actions or reusing action outputs in later steps. The SDK vastly simplifies the execution of those connectors by wrapping them internally in an Activity Function, so you don’t need to create new activities for each connector action you want to invoke.
- Connections, which manage the connectivity between actions and end systems, remain unchanged, allowing you to set them up once and share connections between multiple orchestrations and Logic Apps declarative workflows. Connector actions use a reference to a connection, providing flexibility between local and cloud configurations.

Using those building blocks, you can create workflows using familiar programming paradigms, while still benefiting from the easy configuration and operational features of Logic Apps Standard.

If you are an existing Logic Apps Standard customer, your codeful and visual workflows can coexist within the same application, bridging the gap between pro-code and low-code approaches. With those two execution models working hand in hand in the same application, Logic Apps Standard becomes a comprehensive orchestration tool that caters to all developer personas, from integration specialists to enterprise teams, with no cliffs in their experience.
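To ground the mapping above, here is a small, purely illustrative sketch of an orchestration and an activity written with the standard Durable Functions primitives the model builds on. It deliberately does not use the pre-release connector SDK, whose exact surface is not documented here; the HelloOrchestrator name matches the trigger example later in this post, while the SayHello activity and the input type are hypothetical placeholders.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public class HTTPHelloInput
{
    public string Greeting { get; set; }
}

public static class HelloWorkflow
{
    // Orchestration: the "body" of the workflow, analogous to the designer canvas.
    [FunctionName("HelloOrchestrator")]
    public static async Task<string> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var input = context.GetInput<HTTPHelloInput>();

        // In the codeful model, connector calls surface as activity executions.
        // "SayHello" is a placeholder activity standing in for a connector action.
        var result = await context.CallActivityAsync<string>("SayHello", input.Greeting);
        return result;
    }

    // Activity: a unit of work invoked by the orchestration.
    [FunctionName("SayHello")]
    public static string SayHello([ActivityTrigger] string greeting)
    {
        return $"Processed: {greeting}";
    }
}
```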
Creating Codeful Workflows

Designing codeful workflows begins with creating a new Logic Apps project within Visual Studio Code, configured for .NET and the Durable Tasks framework. From triggers to actions, developers gain full flexibility to define their workflows programmatically.

Implementing Triggers

Triggers are the entry points of workflows, and in Codeful Workflows, they are defined as Client Functions. For example, an HTTP trigger can start a workflow when a request is received:

    [FunctionName("HelloTrigger")]
    public static async Task<HttpResponseMessage> HttpStart(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter,
        ILogger log)
    {
        var requestContent = await req.Content.ReadAsStringAsync();
        var workflowInput = new HTTPHelloInput
        {
            Greeting = $"Hello from Codeful workflows. You said '{requestContent}'"
        };

        log.LogInformation("Workflow Input = '{workflowInput}'.", JsonSerializer.Serialize(workflowInput));

        string instanceId = await starter.StartNewAsync("HelloOrchestrator", workflowInput);
        log.LogInformation("Started orchestration with ID = '{instanceId}'.", instanceId);

        return await starter.WaitForCompletionOrCreateCheckStatusResponseAsync(req, instanceId);
    }

Using Connector Actions

Both Managed and Service Provider actions are available to be used within your orchestrations. They are organized in the SDK by type, making it easy to find the right connector to use. Once you identify the action to use, you can rely on the rich IntelliSense experience to generate inputs and call the action directly in your orchestration code.

Deployment and Operations

Deploying Logic Apps Standard apps that use both codeful and codeless workflows follows the same practices already available in Logic Apps Standard. Operational insights, such as endpoint visibility and execution monitoring, are provided within the Azure portal, ensuring parity with the functionality available for codeless workflows. This cohesive deployment model allows organizations to maximize their resources and cater to diverse development needs, whether they require quick prototyping via low-code tools or robust, scalable solutions through pro-code implementations.

Codeful Workflows and Intelligent Agents

You can take advantage of codeful workflows and the Logic Apps Standard Agent Loop to create new intelligent applications that embed advanced AI decision-making directly into your processes – enabling your apps and automation to not just follow predefined steps, but to reason, adapt, and act autonomously towards goals. See this demo where we share two approaches to implement agent loops – combining codeful and codeless workflows, where you can reuse existing workflows as tools, and writing agent loop actions directly with code.

Looking for feedback on Codeful Workflows

We are looking for early feedback on this feature. If you are interested in participating in a private preview, please use the form below to register your interest and we will contact you to share the instructions.

https://aka.ms/lacodeful/privatepreview/form
Logic Apps Community Day 2025

We have delayed the Speaker announcements, as we had to keep the sessions open for an extra day. Speakers will now be notified by email by September 13th, 2025. Speakers and sessions will be published by September 19th, 2025.

We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time) and we want you to join us as we host a full day of learning where you will be the star! The Logic Apps Community Day is a free event driven by Microsoft, for anyone who wants to learn more about Logic Apps and how it can help to solve real-life integration problems.

This year, we want to learn how you have been using AI with Logic Apps, so our themes for the sessions are:

- Creating Intelligent Applications with Logic Apps and AI: tell us how you have been using Logic Apps features to implement your intelligent application scenarios - from Agent Loops, to Logic Apps exposed as MCP tools, to improving your intelligent application knowledge in real time - we want to see the scenarios you created!
- Accelerating Logic Apps Developer Velocity with AI: How are you taking advantage of Gen AI to make your developer life easier using Logic Apps? From prompts to create test data, to automated creation of maps, unit tests or custom code and anything in between. Maybe you have been using prompts, instructions or chat modes to make your development life easier? We want to see it all!

Registration, Speakers and Sessions

Want to make sure you are notified on the day, and want to add the event to your calendar? You can register on our Reactor event page! Check out our agenda below - it will be a day packed with information and lots of amazing topics!

Speaker: Sebastian Meyer
Session: Intelligent Enterprise Integration – Automated Order Processing with AI and SAP
Abstract: This session explores how modern enterprises can streamline and safeguard their business operations through intelligent system integration. At the core is an AI-powered agent loop implemented in Logic Apps that automatically receives and parses flat files containing individual purchase orders. The agent identifies critical order quantities and routes them to a business user for manual approval—ensuring compliance with internal control policies. Only approved orders are transmitted to the SAP system. Finally, the enriched data—combining raw order details and SAP references—is automatically sent via email to the respective business partner. The result is a seamless, transparent, and scalable process that combines human oversight with AI-driven automation. Attendees will learn how to:
- Create AI agent loops with Logic Apps
- Bring a human into the loop
- Re-use existing Logic Apps workflows as tools
- Use connectors to communicate with external systems
- Handle operational aspects

Speaker: Toon Vanhoutte
Session: Smart invoice processing with Logic Apps
Abstract: Many organizations struggle with identifying the correct internal approver for incoming purchase invoices. It's a process that often involves time-consuming manual work, especially when no purchase order reference is available. In today’s era of generative AI, there must be a smarter way to streamline this challenge. Join me for an engaging session where I demonstrate how Microsoft Logic Apps Agent Loop can revolutionize invoice processing. You’ll see a live demo showcasing how structured data can be intelligently extracted from purchase invoices and how an AI-powered agent can reason over documented business rules to automatically determine the appropriate approver. No need for costly OCR solutions—just the power of Logic Apps and generative AI working together to simplify and accelerate your financial workflows.

Speaker: Michael Stephenson
Session: Can I unit test an agentic workflow with Logic Apps?
Abstract: In this session we will talk about the challenges of testing non-deterministic workflows and look at how you might use the Logic Apps unit test framework to implement some testing strategies.

Speaker: Florian de Langhe
Session: Unlocking Excel Analytics: Logic Apps + Code Interpreter in Action
Abstract: Excel analysis has always been a challenge in Logic Apps workflows, limiting us to basic read/write operations while complex analytics required external services. This session demonstrates how the new Code Interpreter action, combined with intelligent looping patterns, unlocks powerful Python-driven analytical capabilities directly within your Logic Apps flows.

Speaker: Stephen W Thomas
Session: Exposing Your Logic App As An MCP Server – What, Why, and How?
Abstract: Have you heard all the buzz around MCP Servers, but don't really know how they can help you? This session will talk about why using MCP Servers with new and existing Logic Apps will open up new scenarios and gain maximum use across the enterprise. We will take a look at how easy it is to set this up today and expose your workflows to AI agents.

Speaker: Stefano Demiliani
Session: Intelligent applications with Logic Apps and Private AI
Abstract: This session explores the powerful integration of the Azure Logic Apps platform with on-premises and private cloud AI solutions, enabling organizations to maintain data sovereignty while leveraging advanced AI features. You will learn practical implementation strategies and tips for creating intelligent workflows that incorporate private AI models (open LLMs hosted locally, like gpt-oss) for AI-enhanced business process automation, while respecting your organization's privacy and security requirements.

Speaker: Ahmed Bayoumy
Session: The “Quick Task” Trap and How Logic Apps agent loop Kills It
Abstract: A while ago I had a meeting with a new client in the cargo shipping business to map their integration landscape. Everything looked solid, most integrations had the usual needs, nothing out of the ordinary. Then they mentioned the “special bookings” that still land in an operator’s inbox. Yes, manual bookings still occur in the enterprise world. Each one seems tiny until you multiply it across a week, a month, and a year. Those interrupts also break the operations team’s focus. So why not build a Logic Apps agentic loop to take the ad hoc cases off people’s plates, while handing off the real edge cases to humans with full context?

Speaker: Sivaji Gullipalli
Session: Boosting Developer Productivity in Logic Apps with Generative AI
Abstract: In this lightning talk, I will share how Generative AI can be combined with Azure Logic Apps to accelerate integration development and simplify repetitive tasks. Drawing from real-world insurance project scenarios, I will demonstrate how AI-driven prompts can be used to generate realistic test data, assist in building maps, create unit tests, and even support custom code generation. By leveraging AI during the development lifecycle, Logic Apps developers can significantly reduce manual effort, improve accuracy, and deliver solutions faster. This session is designed to provide beginner-friendly, practical insights that attendees can immediately apply to their Logic Apps projects to enhance productivity and efficiency.

Speaker: Dan Probert
Session: What is a prompt - and can I automate it?
Abstract: In this introductory session, we look at what makes a good Gen AI prompt, and look at whether we could automate the generation of a prompt, and what that might take. This session is aimed at those devs that are new to Gen AI/LLMs and want to know more about what a prompt is, and how they might be able to simplify prompt creation.

Speaker: Sovit Charak
Session: Agentic Document Processing with Logic Apps Standard + Document AI
Abstract: We will demo through Logic Apps:
- AI classification & dynamic schemas: identify doc type (contract/invoice/form) and pick the right JSON schema.
- Schema-aware extraction: Document AI + LLM returns exact keys; invalid outputs auto-repair via a validation loop.
- Agentic behaviors: tool selection & fallback OCR.
- Human-in-the-loop: low-confidence fields trigger a Teams Adaptive Card for approve/edit/reject.
- Business rules & routing.
- Storage & integrations: original docs to SharePoint/Blob, structured data to SQL/D365/ERP; notifications to Teams/Email.

Speaker: Cameron McKay
Session: Leveraging Logic Apps to build easy and effective AI Agents
Abstract: This session will provide an overview of Azure Logic Apps as an agent-building tool that solves the problem of organizations wanting to create agents, but not having a way to build effective agents due to disparate and segregated data sources. We will discuss Agent Loop as our tool to build agents using connectors. We will showcase demos of how to build your first agent, explore the different human-in-the-loop capabilities that are available within Agent Loop, and show how we can use Logic Apps to build agents using connectors. The purpose of the session is to showcase Logic Apps as an agent-building platform.
Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster

K3s is a lightweight Kubernetes distribution, certified by the Cloud Native Computing Foundation (CNCF) and originally developed by Rancher. It is optimized for on-premises environments with limited resources, making it ideal for edge computing and lightweight hybrid scenarios. Unlike a full Kubernetes distribution, K3s reduces overhead while maintaining full Kubernetes API compatibility. This makes K3s an ideal choice for hosting Logic Apps Standard near your data sources—such as on-premises SQL Server or local file shares—when you have lightweight workloads.

Setting up Hybrid Logic Apps, including the infrastructure, involves five steps, illustrated in the following diagram. Most of these five steps are the same as discussed in the Hybrid Logic Apps documentation, except for the K3s setup: Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn.

Step 1: Prepare the K3s cluster

Docker Desktop setup - In this case, the host machine is Windows 11, so I decided to use Docker with WSL2 to run the containers. Install Docker Desktop using WSL2 (Docker Desktop: The #1 Containerization Tool for Developers | Docker) and make sure WSL2 is selected.

Install K3s on your infrastructure and create a single-node cluster using k3d:

    # Install choco, kubectl and Helm
    Set-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))
    choco install kubernetes-cli -y
    choco install kubernetes-helm -y
    choco install k3d -y

    # Open a new PowerShell window and create the cluster
    k3d cluster create

    # Delete the default Traefik load balancer as it conflicts with ports 80 and 443 - the load balancer can be configured to use other ports if needed
    kubectl delete svc traefik -n kube-system
    kubectl delete deployment traefik -n kube-system

The next two steps are the same as described in Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn.

Step 2: Connect the Kubernetes cluster to Azure Arc

Step 3: Set up the Azure Container Apps extension and environment

You need to skip the CoreDNS setup required for Azure Local, as given in Update CoreDNS.

Step 4: Configure the storage for SQL and SMB

SQL Database (Runtime Store): Hybrid Logic Apps use a SQL database for runtime operations and run history. In this scenario I used an on-premises SQL Server with SQL authentication. I set up SQL Server 2022 on the Windows host machine, enabled SQL Server authentication, and added a new SQL admin user. Please follow the link for more details.
The SQL connection string can be validated using the following PowerShell script:

    $connectionString = "Server=<server IP address>;Initial Catalog=<databaseName>;Persist Security Info=False;User ID=<sqluser>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"
    try {
        $connection = New-Object System.Data.SqlClient.SqlConnection
        $connection.ConnectionString = $connectionString
        $connection.Open()
        Write-Host "✅ Connection successful"
        $connection.Close()
    } catch {
        Write-Host "❌ Connection failed: $($_.Exception.Message)"
    }

SMB is used as the local file share on the Windows host machine; it is advised to create a new user for the Windows SMB share:

    $Username = "k3suser"
    $Password = ConvertTo-SecureString "<password complex>" -AsPlainText -Force
    $FullName = "K3s user"
    $Description = "Created via PowerShell"

    # Create the user and add it to the Users group
    New-LocalUser -Name $Username -Password $Password -FullName $FullName -Description $Description
    Add-LocalGroupMember -Group "Users" -Member $Username

Once the above user is created, use the Windows host machine to create the Artifacts folder and grant the user read and write access. Please follow the link for more details.

Step 5: Create your Logic App (Hybrid)

With all prerequisites and infrastructure in place for creating Hybrid Logic Apps, the next step is to build the Logic App using the specified connection string and SMB share path. This can be accomplished through the Azure portal, as outlined below. Now you can create Logic Apps workflows using the designer and execute them.
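As a quick sanity check before creating the app, you may also want to verify that the SMB artifacts share from Step 4 is reachable and writable. A minimal sketch is shown below; the host name and share name are placeholders, and it assumes the code runs as a user that has been granted access to the share.

```csharp
using System;
using System.IO;

class SmbShareCheck
{
    static void Main()
    {
        // Hypothetical UNC path to the artifacts share created in Step 4 -- replace with your own.
        var sharePath = @"\\WINDOWS-HOST\Artifacts";
        var probeFile = Path.Combine(sharePath, $"probe-{Guid.NewGuid():N}.txt");

        // Write, read back, and delete a small file to confirm read/write access.
        File.WriteAllText(probeFile, "hybrid logic apps smb probe");
        Console.WriteLine($"Read back: {File.ReadAllText(probeFile)}");
        File.Delete(probeFile);

        Console.WriteLine($"Share {sharePath} is readable and writable.");
    }
}
```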
Enforce or Audit Policy Inheritance in API Management

We’re excited to announce a new Azure Policy definition that lets you enforce or audit policy inheritance in Azure API Management. With this capability, platform and governance teams can ensure that API Management policies are always inherited across all policy scopes — operations, APIs, products, and workspaces — strengthening consistency, compliance, and security across your API estate.

Why this matters

In Azure API Management, the <base /> policy element plays a critical role: it ensures that a runtime policy inherits policies defined at a higher scope, such as product, workspace, or all APIs (global). Without <base />, developers can inadvertently (or intentionally) bypass important platform rules, for example:

- Security controls like authentication or IP restrictions
- Operational requirements such as logging, tracing, or rate-limiting
- Business policies such as quota enforcement

The result can be inconsistent behavior, compliance drift, and gaps in governance.

How the new policy helps

With the new Azure Policy definition, you can automatically ensure that <base /> is located at the start of each API Management policy section — <inbound>, <outbound>, <backend>, and <on-error> — across policies configured on operations, APIs, products, and workspaces.

You can set the effect parameter to:

- Audit: Identify operation, API, product, or workspace policies where <base /> is missing.
- Deny: Prevent deployment of policies that do not include <base />.

Get started

To enable this new Azure Policy definition:

1. Navigate to Azure Policy in the Azure portal.
2. Select “Definitions” from the menu and choose “API Management policies should inherit parent scope policies using <base />”.
3. In the policy definition view, select “Assign”.
4. Configure the policy assignment scope, parameter (audit or deny), and other details.

View built-in Azure Policy definitions for API Management.
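To make the rule concrete, here is a small, purely illustrative sketch of the kind of check the definition performs: validating that each section of a policy document starts with <base />. This is not how Azure Policy itself evaluates resources; it is just a local way to reason about your policy files before deployment, and the sample policy content is hypothetical.

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class BaseInheritanceCheck
{
    static void Main()
    {
        // Sample API-scope policy; the <outbound> section is missing <base /> to simulate a violation.
        var policyXml = @"
            <policies>
                <inbound>
                    <base />
                    <rate-limit calls=""100"" renewal-period=""60"" />
                </inbound>
                <backend>
                    <base />
                </backend>
                <outbound>
                    <set-header name=""X-Powered-By"" exists-action=""delete"" />
                </outbound>
                <on-error>
                    <base />
                </on-error>
            </policies>";

        var doc = XElement.Parse(policyXml);
        foreach (var sectionName in new[] { "inbound", "backend", "outbound", "on-error" })
        {
            // A section "inherits" only when its first child element is <base />.
            var section = doc.Element(sectionName);
            var startsWithBase = section?.Elements().FirstOrDefault()?.Name.LocalName == "base";
            Console.WriteLine($"{sectionName}: {(startsWithBase ? "inherits parent scope" : "missing <base /> at the start")}");
        }
    }
}
```

Running a check like this in a pre-deployment pipeline complements the Azure Policy assignment: the audit or deny effect catches drift in Azure, while a local check gives authors faster feedback.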