Codeful Workflows: A New Authoring Model for Logic Apps Standard
📝 This blog introduces early concepts of pre-release functionality and is subject to change.

Azure Logic Apps Standard offers you a powerful cloud orchestration engine, enabling you to build and run automated workflows that effortlessly integrate resources from various services, systems, apps, and data sources. Whether you're looking to streamline processes across a complex enterprise or simply reduce the need for extensive coding, this platform provides a solution that's both efficient and flexible.

For those of you who require more control over workflow designs or want to leverage your expertise in frameworks like .NET and the Durable Task Framework, Logic Apps Standard now introduces an exciting new feature: Codeful Workflows. With Codeful Workflows, you can define workflows using an imperative programming style, blending the flexibility of coding with the simplicity and operational strengths of Logic Apps. This means you can structure your workflows the way that makes sense to you while still tapping into the rich ecosystem of connectors and tools built into Logic Apps.

What Are Codeful Workflows?

Codeful Workflows expand the authoring and execution models of Logic Apps Standard, offering developers the ability to implement, test, and run workflows using an imperative programming model, both locally and in the cloud. Built on frameworks like .NET and the Durable Task Framework, Codeful Workflows allow you to structure workflows in code while seamlessly integrating with the rich Logic Apps Standard connector ecosystem and leveraging its operational capabilities.

The core elements of a Logic Apps workflow (triggers, actions, and connections) are translated into durable task concepts within this codeful model:

- Triggers are implemented as Client Functions that invoke durable orchestrations, which contain the body of the workflow, blending logic implemented with language primitives and connector actions for external connectivity.
- Connector actions are presented as Activity Functions. The Logic Apps connector ecosystem is exposed to you via an SDK, bringing discoverability and rich IntelliSense support when creating action inputs, invoking actions, or reusing action outputs in later steps. The SDK vastly simplifies the execution of those connectors by wrapping them internally in an Activity Function, so you don't need to create new activities for each connector action you want to invoke.
- Connections, which manage the connectivity between actions and end systems, remain unchanged, allowing you to set them up once and share connections between multiple orchestrations and Logic Apps declarative workflows. Connector actions use a reference to a connection, providing flexibility between local and cloud configurations.

Using those building blocks, you can create workflows using familiar programming paradigms, while still benefiting from the easy configuration and operational features of Logic Apps Standard.

If you are an existing Logic Apps Standard customer, your codeful and visual workflows can coexist within the same application, bridging the gap between pro-code and low-code approaches. With those two execution models working hand in hand in the same application, Logic Apps Standard becomes a comprehensive orchestration tool that caters to all developer personas, from integration specialists to enterprise teams, with no cliffs in their experience.
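To make the mapping concrete, here is a minimal sketch of how these building blocks look in code. It assumes the classic in-process Durable Functions programming model that the HTTP trigger example later in this post also uses; the HelloOrchestrator name and HTTPHelloInput type match that example, while the SendGreeting activity is a hand-written, illustrative stand-in for what the pre-release connector SDK would otherwise wrap for you.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;

// Illustrative input type; matches the object built by the HTTP trigger shown later.
public class HTTPHelloInput
{
    public string Greeting { get; set; }
}

public static class HelloWorkflow
{
    // Orchestrator: contains the body of the workflow, blending language
    // primitives (conditionals, loops, error handling) with calls to activities.
    [FunctionName("HelloOrchestrator")]
    public static async Task<string> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var input = context.GetInput<HTTPHelloInput>();

        // Each external call runs as an Activity Function. The codeful SDK wraps
        // connector actions in activities for you; this hand-written activity is
        // only a placeholder for that pattern.
        var result = await context.CallActivityAsync<string>("SendGreeting", input.Greeting);
        return result;
    }

    // Activity: does the actual work outside the orchestrator's replayed code path.
    [FunctionName("SendGreeting")]
    public static string SendGreeting([ActivityTrigger] string greeting, ILogger log)
    {
        log.LogInformation("Processing greeting: {greeting}", greeting);
        return $"Processed: {greeting}";
    }
}
```

Keeping external calls inside activities is what lets the Durable Task Framework checkpoint and replay the orchestration reliably.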
Creating Codeful Workflows

Designing codeful workflows begins with creating a new Logic Apps project within Visual Studio Code, configured for .NET and the Durable Task Framework. From triggers to actions, developers gain full flexibility to define their workflows programmatically.

Implementing Triggers

Triggers are the entry points of workflows, and in Codeful Workflows they are defined as Client Functions. For example, an HTTP trigger can start a workflow when a request is received:

```csharp
[FunctionName("HelloTrigger")]
public static async Task<HttpResponseMessage> HttpStart(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
    [DurableClient] IDurableOrchestrationClient starter,
    ILogger log)
{
    var requestContent = await req.Content.ReadAsStringAsync();
    var workflowInput = new HTTPHelloInput
    {
        Greeting = $"Hello from Codeful workflows. You said '{requestContent}'"
    };

    log.LogInformation("Workflow Input = '{workflowInput}'.", JsonSerializer.Serialize(workflowInput));

    string instanceId = await starter.StartNewAsync("HelloOrchestrator", workflowInput);
    log.LogInformation("Started orchestration with ID = '{instanceId}'.", instanceId);

    return await starter.WaitForCompletionOrCreateCheckStatusResponseAsync(req, instanceId);
}
```

Using Connector Actions

Both Managed and Service Provider actions are available to be used within your orchestrations. They are organized in the SDK by type, making it easy to find the right connector to use. Once you identify the action to use, you can use the rich IntelliSense experience to generate inputs and call the action directly in your orchestration code.

Deployment and Operations

Deploying a Logic Apps Standard app that uses both codeful and codeless workflows follows the same practices already available in Logic Apps Standard. Operational insights, such as endpoint visibility and execution monitoring, are provided within the Azure portal, ensuring parity with the functionality available for codeless workflows. This cohesive deployment model allows organizations to maximize their resources and cater to diverse development needs, whether they require quick prototyping via low-code tools or robust, scalable solutions through pro-code implementations.

Codeful Workflows and Intelligent Agents

You can take advantage of codeful workflows and Logic Apps Standard Agent Loop to create new intelligent applications that embed advanced AI decision-making directly into your processes, enabling your apps and automation to not just follow predefined steps, but to reason, adapt, and act autonomously towards goals. See this demo where we share two approaches to implement agent loops, combining codeful and codeless workflows, where you can reuse existing workflows as tools, and writing agent loop actions directly with code.

Looking for Feedback on Codeful Workflows

We are looking for early feedback on this feature. If you are interested in participating in a private preview, please use the form below to register your interest and we will contact you to share the instructions. https://aka.ms/lacodeful/privatepreview/form
Logic Apps Community Day 2025

We have delayed the speaker announcements, as we had to keep the sessions open for an extra day. Speakers will now be notified by email by September 13th, 2025. Speakers and sessions will be published by September 19th, 2025.

We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star! The Logic Apps Community Day is a free event driven by Microsoft, for anyone who wants to learn more about Logic Apps and how it can help to solve real-life integration problems.

This year, we want to learn how you have been using AI with Logic Apps, so our themes for the sessions are:

- Creating Intelligent Applications with Logic Apps and AI: tell us how you have been using Logic Apps features to implement your intelligent application scenarios. From Agent Loops, to Logic Apps exposed as MCP tools, to improving your intelligent application knowledge in real time, we want to see the scenarios you created!
- Accelerating Logic Apps Developer Velocity with AI: how are you taking advantage of generative AI to make your developer life easier with Logic Apps? From prompts to create test data, to automated creation of maps, unit tests, or custom code, and anything in between. Maybe you have been using prompts, instructions, or chat modes to make your development life easier? We want to see it all!

Registration, Speakers and Sessions

Want to make sure you are notified on the day and add the event to your calendar? You can register on our Reactor event page! Check out our agenda below - it will be a day packed with information and lots of amazing topics!

Speaker: Sebastian Meyer
Session: Intelligent Enterprise Integration – Automated Order Processing with AI and SAP
This session explores how modern enterprises can streamline and safeguard their business operations through intelligent system integration. At the core is an AI-powered agent loop implemented in Logic Apps that automatically receives and parses flat files containing individual purchase orders. The agent identifies critical order quantities and routes them to a business user for manual approval, ensuring compliance with internal control policies. Only approved orders are transmitted to the SAP system. Finally, the enriched data, combining raw order details and SAP references, is automatically sent via email to the respective business partner. The result is a seamless, transparent, and scalable process that combines human oversight with AI-driven automation. Attendees will learn how to create AI agent loops with Logic Apps, bring a human into the loop, reuse existing Logic Apps workflows as tools, use connectors to communicate with external systems, and handle operational aspects.

Speaker: Toon Vanhoutte
Session: Smart invoice processing with Logic Apps
Many organizations struggle with identifying the correct internal approver for incoming purchase invoices. It's a process that often involves time-consuming manual work, especially when no purchase order reference is available. In today's era of generative AI, there must be a smarter way to streamline this challenge. Join me for an engaging session where I demonstrate how Microsoft Logic Apps Agent Loop can revolutionize invoice processing. You'll see a live demo showcasing how structured data can be intelligently extracted from purchase invoices and how an AI-powered agent can reason over documented business rules to automatically determine the appropriate approver.
No need for costly OCR solutions: just the power of Logic Apps and generative AI working together to simplify and accelerate your financial workflows.

Speaker: Michael Stephenson
Session: Can I unit test an agentic workflow with Logic Apps?
In this session we will talk about the challenges of testing non-deterministic workflows and look at how you might use the Logic Apps unit test framework to implement some testing strategies.

Speaker: Florian de Langhe
Session: Unlocking Excel Analytics: Logic Apps + Code Interpreter in Action
Excel analysis has always been a challenge in Logic Apps workflows, limiting us to basic read/write operations while complex analytics required external services. This session demonstrates how the new Code Interpreter action, combined with intelligent looping patterns, unlocks powerful Python-driven analytical capabilities directly within your Logic Apps flows.

Speaker: Stephen W Thomas
Session: Exposing Your Logic App As An MCP Server – What, Why, and How?
Have you heard all the buzz around MCP Servers, but don't really know how they can help you? This session will talk about why using MCP Servers with new and existing Logic Apps will open up new scenarios and gain maximum use across the enterprise. We will take a look at how easy it is to set this up today and expose your workflows to AI agents.

Speaker: Stefano Demiliani
Session: Intelligent applications with Logic Apps and Private AI
This session explores the powerful integration of the Azure Logic Apps platform with on-premises and private cloud AI solutions, enabling organizations to maintain data sovereignty while leveraging advanced AI features. You will learn practical implementation strategies and tips for creating intelligent workflows that incorporate private AI models (open LLMs hosted locally, like gpt-oss) for AI-enhanced business process automation, while respecting your organization's privacy and security requirements.

Speaker: Ahmed Bayoumy
Session: The “Quick Task” Trap and How Logic Apps Agent Loop Kills It
A while ago I had a meeting with a new client in the cargo shipping business to map their integration landscape. Everything looked solid; most integrations had the usual needs, nothing out of the ordinary. Then they mentioned the “special bookings” that still land in an operator’s inbox. Yes, manual bookings still occur in the enterprise world. Each one seems tiny until you multiply it across a week, a month, and a year. Those interrupts also break the operations team’s focus. So why not build a Logic Apps agentic loop to take the ad hoc cases off people’s plates, while handing off the real edge cases to humans with full context?

Speaker: Sivaji Gullipalli
Session: Boosting Developer Productivity in Logic Apps with Generative AI
In this lightning talk, I will share how generative AI can be combined with Azure Logic Apps to accelerate integration development and simplify repetitive tasks. Drawing from real-world insurance project scenarios, I will demonstrate how AI-driven prompts can be used to generate realistic test data, assist in building maps, create unit tests, and even support custom code generation. By leveraging AI during the development lifecycle, Logic Apps developers can significantly reduce manual effort, improve accuracy, and deliver solutions faster. This session is designed to provide beginner-friendly, practical insights that attendees can immediately apply to their Logic Apps projects to enhance productivity and efficiency.

Speaker: Dan Probert
Session: What is a prompt - and can I automate it?
In this introductory session, we look at what makes a good generative AI prompt, whether we could automate the generation of a prompt, and what that might take. This session is aimed at devs who are new to Gen AI/LLMs and want to know more about what a prompt is and how they might be able to simplify prompt creation.

Speaker: Sovit Charak
Session: Agentic Document Processing with Logic Apps Standard + Document AI
We will demo through Logic Apps:
- AI classification and dynamic schemas: identify the doc type (contract/invoice/form) and pick the right JSON schema.
- Schema-aware extraction: Document AI + LLM returns exact keys; invalid outputs auto-repair via a validation loop.
- Agentic behaviors: tool selection and fallback OCR.
- Human-in-the-loop: low-confidence fields trigger a Teams Adaptive Card for approve/edit/reject.
- Business rules and routing.
- Storage and integrations: original docs to SharePoint/Blob, structured data to SQL/D365/ERP; notifications to Teams/Email.

Speaker: Cameron McKay
Session: Leveraging Logic Apps to build easy and effective AI Agents
This session will provide an overview of Azure Logic Apps as an agent-building tool that solves a common problem: organizations want to create agents but lack a way to build effective ones because of disparate and segregated data sources. We will discuss Agent Loop as our tool to build agents using connectors. We will showcase demos of how to build your first agent, explore the different human-in-the-loop capabilities that are available within Agent Loop, and show how we can use Logic Apps to build agents using connectors. The purpose of the session is to showcase Logic Apps as an agent-building platform.
Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster

K3s is a lightweight Kubernetes distribution, certified by the Cloud Native Computing Foundation (CNCF) and originally developed by Rancher. It is optimized for on-premises environments with limited resources, making it ideal for edge computing and lightweight hybrid scenarios. Unlike a full Kubernetes distribution, K3s reduces overhead while maintaining full Kubernetes API compatibility. This makes K3s an ideal choice for hosting Logic Apps Standard near your data sources, such as on-premises SQL Server or local file shares, when you have lightweight workloads.

Five steps are followed to set up Hybrid Logic Apps, including the infrastructure, as illustrated in the following diagram. Most of these steps are the same as discussed in the Hybrid Logic Apps documentation, except the K3s setup part: Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn.

Step 1: Prepare the K3s Cluster

Docker Desktop setup: in this case, the host machine is Windows 11, so we decided to use Docker with WSL2 to run the containers. Install Docker Desktop using WSL2 (Docker Desktop: The #1 Containerization Tool for Developers | Docker) and make sure WSL2 is selected.

Install K3s on your infrastructure and create a single-node cluster using k3d:

```powershell
# Install choco, kubectl and Helm
Set-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))

choco install kubernetes-cli -y
choco install kubernetes-helm -y
choco install k3d -y

# Open a new PowerShell window, then create the cluster
k3d cluster create

# Delete the default Traefik load balancer as it conflicts with ports 80 and 443 -
# the load balancer can be configured to use other ports if needed
kubectl delete svc traefik -n kube-system
kubectl delete deployment traefik -n kube-system
```

The next two steps are the same as given in Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn:

Step 2: Connect the Kubernetes cluster to Azure Arc

Step 3: Set up the Azure Container Apps extension and environment

You need to skip the CoreDNS setup required for Azure Local, as given in Update CoreDNS.

Step 4: Configure the Storage for SQL and SMB

SQL database (runtime store): Hybrid Logic Apps use a SQL database for runtime operations and run history. In this scenario I used an on-premises SQL Server with SQL authentication. I set up SQL Server 2022 on the Windows host machine, enabled SQL Server authentication, and added a new SQL admin user. Please follow the link for more details.
The SQL connection string can be validated using the following PowerShell script:

```powershell
$connectionString = "Server=<server IP address>;Initial Catalog=<databaseName>;Persist Security Info=False;User ID=<sqluser>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"

try {
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = $connectionString
    $connection.Open()
    Write-Host "✅ Connection successful"
    $connection.Close()
}
catch {
    Write-Host "❌ Connection failed: $($_.Exception.Message)"
}
```

SMB is used as a local file share on the Windows host machine; it is advised to use a new user for the Windows SMB share:

```powershell
$Username = "k3suser"
$Password = ConvertTo-SecureString "<password complex>" -AsPlainText -Force
$FullName = "K3s user"
$Description = "Created via PowerShell"

# Create the user
New-LocalUser -Name $Username -Password $Password -FullName $FullName -Description $Description
Add-LocalGroupMember -Group "Users" -Member $Username
```

Once the user is created, you can use the Windows host machine to create an Artifacts folder and allow read and write access (a sample script for creating and sharing the folder is sketched at the end of this post). Please follow the link for more details.

Step 5: Create your Logic App (Hybrid)

With all prerequisites and infrastructure in place for creating Hybrid Logic Apps, the next step is to build the Logic Apps using the specified connection string and SMB share path. This can be accomplished through the Azure portal, as outlined below. Now you can create Logic Apps workflows using the designer and execute them.
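As a companion to step 4, here is a minimal sketch of creating the Artifacts folder, granting the SMB user access, and sharing it. The folder path and share name are assumptions for illustration only; adjust them and the user name to your environment.

```powershell
# Assumed local path and share name - replace with your own values
$artifactsPath = "C:\HybridLogicApps\Artifacts"
$shareName     = "Artifacts"
$smbUser       = "k3suser"   # the local user created in step 4

# Create the folder if it does not exist
New-Item -Path $artifactsPath -ItemType Directory -Force | Out-Null

# Grant the SMB user modify (read/write) rights on the folder and its children
icacls $artifactsPath /grant "${smbUser}:(OI)(CI)M"

# Expose the folder as an SMB share with change (read/write) access for the user
New-SmbShare -Name $shareName -Path $artifactsPath -ChangeAccess $smbUser
```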
Enforce or Audit Policy Inheritance in API Management

We’re excited to announce a new Azure Policy definition that lets you enforce or audit policy inheritance in Azure API Management. With this capability, platform and governance teams can ensure that API Management policies are always inherited across all policy scopes (operations, APIs, products, and workspaces), strengthening consistency, compliance, and security across your API estate.

Why this matters

In Azure API Management, the <base /> policy element plays a critical role: it ensures that a runtime policy inherits policies defined at a higher scope, such as product, workspace, or all APIs (global). Without <base />, developers can inadvertently (or intentionally) bypass important platform rules, for example:

- Security controls like authentication or IP restrictions
- Operational requirements such as logging, tracing, or rate-limiting
- Business policies such as quota enforcement

The result can be inconsistent behavior, compliance drift, and gaps in governance.

How the new policy helps

With the new Azure Policy definition, you can automatically ensure that <base /> is located at the start of each API Management policy section (<inbound>, <outbound>, <backend>, and <on-error>) across policies configured on operations, APIs, products, and workspaces. You can set the effect parameter to:

- Audit: identify operation, API, product, or workspace policies where <base /> is missing.
- Deny: prevent deployment of policies that do not include <base />.

Get started

To enable this new Azure Policy definition:

1. Navigate to Azure Policy in the Azure portal.
2. Select “Definitions” from the menu and choose “API Management policies should inherit parent scope policies using <base />”.
3. In the policy definition view, select “Assign”.
4. Configure the policy assignment scope, parameter (audit or deny), and other details.

View built-in Azure Policy definitions for API Management.
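For reference, a compliant policy simply keeps <base /> as the first element in every section. Below is a minimal sketch of an API-scope policy that would pass the audit; the rate-limit and set-header entries are illustrative additions, not part of the Azure Policy definition itself.

```xml
<policies>
    <inbound>
        <!-- Inherit global/workspace/product policies before any API-specific logic -->
        <base />
        <rate-limit calls="100" renewal-period="60" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <set-header name="X-Powered-By" exists-action="delete" />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```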
Announcing General Availability: Azure Logic Apps Standard Automated Test Framework

We’re excited to announce the General Availability (GA) of the Azure Logic Apps Standard Automated Test Framework, a major step forward in enabling developers to build, test, and maintain enterprise-grade workflows with confidence and agility.

Automated testing has become a cornerstone of modern development practices, and Logic Apps Standard now offers a robust framework to help you create unit tests for both workflow definitions and workflow runs directly within Visual Studio Code. This framework empowers teams to validate logic, simulate external dependencies, and ensure workflows behave as expected, before they’re deployed to production.

Since the public preview, we’ve listened to your feedback and continued to enhance the framework. With GA, we’re introducing several key improvements that make testing even more powerful and flexible.

What’s New in GA

Support for More Mocked Actions

You can now mock a broader range of built-in and managed connector actions, making it easier to isolate your workflow logic from external systems. With this release we unlocked support for mocking the following actions, which were unavailable during public preview:

- Call workflow in this logic app
- Execute inline code (JavaScript, C#, PowerShell)
- Call functions (Azure Functions, local functions)
- XML operations (transform, parse with schema)
- Liquid operations (JSON to JSON, JSON to text, XML to JSON, XML to text)
- Data Mapper operations

This enhancement allows for more comprehensive and reliable unit tests, giving you more control over your workflow tests, especially in complex integration scenarios.

Access to Workflow Settings for Assertions

The framework now allows you to access and assert against workflow settings, such as parameters and app setting values. This means you can validate not just the behavior of your workflow, but also the environment in which it runs, ensuring logic consistency across different environments.

Inline Script Actions Support

Inline Code actions are now fully supported in test scenarios. JavaScript actions are now executed as part of the test workflow execution, since they are part of the workflow logic. This improvement allows you to validate the logic of those scripts as part of your workflow scenarios. We are working on bringing similar support for C# and PowerShell scripts.

Learn More

To get started with the Azure Logic Apps Standard Automated Test Framework, check out the following Microsoft Learn articles:

- Create unit tests for workflow definitions in Visual Studio Code
- Create unit tests for workflow runs in Visual Studio Code
- Logic Apps Standard Automated Test SDK

Let us know what you think and stay tuned for more enhancements coming soon!
🚀 General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)

We’re excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

What's new

The new UX, previously available in public preview, is now the default experience in the Logic Apps Standard extension. This GA release reflects direct feedback from our integration developer community. We’ve resolved blockers that we heard about from customers and usability issues that impacted performance and stability, including:

- Opening V1 maps in V2: seamlessly open and edit existing maps you have already created with the latest visual capabilities.
- Loading schemas on Mac: addressed schema-related crashes on macOS for a smoother experience.
- Function documentation updates: improved guidance and examples for built-in collection functions that apply to repeating nodes.

Stay connected

We would love to hear your feedback. Please use this form link to let us know if there are any remaining gaps or scenarios that are not yet covered.
Logic Apps Aviators Newsletter - September 25

In this issue:
- Ace Aviator of the Month
- News from our product group
- Community Playbook
- News from our community

Ace Aviator of the Month

September’s Ace Aviator: Kritika Singh, Integration Architect & Sr. Consultant at Capgemini Norge AS

What's your role and title? What are your responsibilities?

I work as an Integration Architect & Sr. Consultant at Capgemini Norge AS. In my role I assist clients in addressing a wide range of integration challenges, with a particular emphasis on modernizing legacy systems, such as BizTalk, by transitioning them to cloud-native Azure iPaaS solutions. I’m responsible for architecting secure and scalable integration landscapes, designing and developing solutions, mentoring team members, and engaging with stakeholders and cross-functional teams. I work extensively with technologies such as Azure Logic Apps, Azure Functions, API Management, Service Bus, App Service Environment (ASEv3), Virtual Networks, and CI/CD pipelines using GitHub. One of my proudest achievements was successfully delivering a complex BizTalk modernization project to Azure that required deep technical expertise and strategic coordination.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?

As part of a distributed team across different locations and countries, my day starts with stand-ups involving both offshore and onshore team members to review progress and assign tasks. I spend time with developers, helping them navigate technical challenges and mentoring them through dedicated sessions; this has helped improve delivery quality and team confidence. I collaborate closely with cross-functional teams and stakeholders to align on requirements and solution design. Throughout the day, I work on resolving issues, improving existing solutions, working on innovation ideas, and managing tasks.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?

I am deeply passionate about technology, having worked with BizTalk throughout my career and, for the past 5+ years, diving into Azure iPaaS. Microsoft products evolve constantly, and that sparks my curiosity to explore, learn, and innovate every day. What truly drives me is the opportunity to give back to the community by sharing my learnings, challenges, and even failures. It’s all about growing together and inspiring others along the way.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?

Curiosity and consistency matter more than perfection. Don’t be afraid to ask questions, experiment, and fail; that’s where the real learning happens. Also, find a community that supports you; sharing your journey, both wins and setbacks, can inspire others and help you grow faster.

What has helped you grow professionally?

My professional growth has been shaped by a blend of curiosity, courage, and the right opportunities. Starting with BizTalk and evolving into Azure iPaaS, I’ve embraced every challenge as a chance to learn. What’s made the biggest difference is having the boldness to take on responsibility, the willingness to take risks, and the drive to keep growing. Sharing my journey with the community has not only helped others but also deepened my own learning.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
If I had a magic wand to enhance Logic Apps, I’d bring in four powerful features to supercharge developer productivity and solution resilience:

1. Seamless version control and rollback: Git-like capabilities built into Logic Apps to track every change, compare versions, and roll back instantly when needed. This would empower teams to experiment confidently and collaborate more effectively without fear of breaking production workflows.
2. Effortless disaster recovery setup: setting up DR should be as simple as a few clicks. A built-in, automated DR configuration for AIS would ensure business continuity, reduce downtime, and give developers peace of mind, especially in mission-critical environments.
3. Native JSON mapper (not Liquid): a visual, intuitive JSON mapping tool would simplify complex data transformations, reduce manual coding, and speed up development. This would be a game-changer for integration scenarios, especially when working with dynamic schemas and APIs.
4. Simplified authorization like claim checks for Logic Apps Standard (beyond EasyAuth): a more developer-friendly authorization setup that minimizes manual configuration and integrates seamlessly with identity providers. This would make securing Logic Apps faster, easier, and more consistent across environments.

News from our product group

Logic Apps Live August 2025
Missed Logic Apps Live in August? You can watch it here. We had a recap on Logic Apps Hybrid, our special guest Kritika Singh talking about her learnings with BizTalk migration to AIS, and updates on Data Mapper GA and Logic Apps Standard Deployment Center.

Logic Apps Community Day 2025
We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star! Call for Speakers is still open until September 07, 2025, so hurry and submit your session!

General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We’re excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

Announcing: Setup CD in Azure Logic Apps Standard with Deployment Center
Looking to automate your Azure Logic Apps code deployments in a faster way? Deployment Center, a built-in feature in Azure Logic Apps Standard, is now available, with built-in support for your VS Code projects, making it easier to deploy Logic Apps from your source control repository. Deployment Center is designed to make deploying, updating, and managing your Logic Apps workflows simple and straightforward.

Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster
Explore how Hybrid Logic Apps run effortlessly on K3s, delivering the power of Hybrid Logic Apps without the complexity and heavy infrastructure demands of a full Kubernetes cluster!

News from our community

What gets returned to the LLM by my Logic App Agent Loop tool?
Video by Michael Stephenson
Michael has been experimenting with Logic Apps Agent Loop and, following a discussion with Kent Weare about the interaction between the workflow and the LLM, aimed to understand what data is returned to the model, since the number of tokens influences cost. He summarizes his findings from that conversation in this brief video.
SOAP 1.2 Calls from Logic Apps – Fixing Unsupported Media & WS-Addressing Errors
Post by Prashant Singh
Struggling with SOAP 1.2 in Azure Logic Apps? Learn how to fix Unsupported Media errors, decode MTOM responses, and handle WS-Addressing headers for seamless integration in this post by Prashant.

Demystifying AI Agent Loops in Logic Apps: The Future of Integration (But Not Everywhere)
Post by Al Ghoniem
Explore how AI Agent Loops enhance Azure Logic Apps for non-deterministic tasks like anomaly detection and IT Ops triage, while knowing when traditional workflows are the better fit.

Can I use AI to create and deploy an Azure Logic Apps with Business Central connector?
Post by Stefano Demiliani
Stefano is testing the boundaries of what AI can do, so you don't have to. He ran a blind test showing that AI can deploy Azure resources well, but struggles with external connectors like Business Central. Learn what worked, what didn't, and why better prompts matter.

How to use ChatGPT Agent Mode with Azure!
Video by Stephen W. Thomas
It looks like August was the month to experiment with Azure resources. Stephen did some research too and shows you how easily ChatGPT-5 Agent Mode can auto-provision resources in Azure. This video demonstrates how to use a single prompt to build a logic app and create a resource group. Follow his video along to see how to get the ChatGPT-5 agent working for you!

Integration Love Story - Andrew Wilson
Video by Ahmed Bayoumy and Robin Wilde
In this special fast-paced episode recorded at INTEGRATE, Ahmed and Robin sit down with the brilliant Andrew, newly awarded Microsoft MVP and Logic Apps Ace Aviator, to talk about his journey, passions, and why integration is the powerhouse behind every digital experience.

Tips for Migrating SAP IDoc Reception Workloads from BizTalk to Azure Logic Apps
Post by Francois Malgreve
Learn how to reuse BizTalk XSLTs in Azure Logic Apps! In this post by Francois, you will learn how to configure the SAP trigger with the right IDoc format and namespace settings, minimizing code changes and easing migration.

Query Azure DevOps work items with Logic App and Managed Identity
Post by Michael Stephenson
Learn how to use a reusable Logic App and a user-assigned managed identity to securely query Azure DevOps work items using WIQL, ideal for building scalable, secure workflows.

Understand Agent Loops in Azure Logic Apps
Video by Srikanth Gunnala
In this video, Srikanth explores Azure Logic Apps AI Agents, also known as Agent Workflows or Agent Loops, and how they're redefining workflow automation with Azure OpenAI. You'll learn what an AI agent is in Azure Logic Apps, how it works, and see a live demo of building an AI-powered, adaptive workflow.

Automate Microsoft Fabric Cost Savings with Logic Apps
Post by Sherry L. Robinson
Learn how to pause and resume Microsoft Fabric capacity using Azure Logic Apps, cutting costs during off-hours with minimal code and seamless integration via the REST API or Resource Manager, in this insightful post by Sherry.

Use Graph API to send Emails in Logic Apps
Post by Şahin Özdemir
In this post, Şahin shows you how to use the Microsoft Graph API with service principals to securely send emails from Logic Apps using app registrations and access policies. This will be quite useful in cases where you can't associate the calls with a user account, which is a requirement for the Office 365 connector.
Reliable B2B Tracking using Premium SKU Integration Account

In the world of enterprise integration, accurate and reliable tracking of B2B transactions is crucial for maintaining compliance, troubleshooting issues, and ensuring smooth business operations. Organizations that rely on Logic Apps for EDI X12, EDIFACT, or AS2 transactions need robust tracking capabilities to monitor their B2B exchanges effectively.

Currently, in Logic Apps Consumption, B2B tracking is powered by Azure Log Analytics, which provides basic telemetry and logging capabilities. However, this approach has some key limitations:

- Limited query capabilities: searching for transactions in Log Analytics can be cumbersome, especially when dealing with large-scale enterprise data.
- Retention and performance issues: Log Analytics is optimized for general telemetry, not for high-volume, structured B2B transaction tracking, making it challenging for organizations with strict compliance requirements.

To address these challenges, we are introducing Reliable B2B Tracking in Logic Apps Standard using a Premium SKU Integration Account. This new feature ensures that all B2B transactions are reliably tracked and ingested into an Azure Data Explorer (ADX) cluster, providing a lossless tracking mechanism with powerful querying and visualization capabilities. With data in ADX, customers can extend their existing Power BI dashboards or easily build custom dashboards on this data if they need detailed analysis of any issue. Additionally, a tracking dashboard is available, enabling customers to monitor, search, and analyze B2B transactions efficiently. This enhancement significantly improves reliability, visibility, and troubleshooting for mission-critical B2B integrations.

How It Works

Reliable B2B Tracking in Logic Apps Standard ensures that every AS2, X12, and EDIFACT transaction is accurately recorded and stored in an Azure Data Explorer (ADX) cluster database instead of relying on Azure Log Analytics, which may drop events. Here's how the system works:

1. Event collection: whenever a B2B transaction occurs, tracking data is generated from the built-in AS2, X12, and EDIFACT actions in Logic Apps Standard.
2. Data ingestion: instead of sending logs to Log Analytics, the tracking data is pushed directly to an Azure Data Explorer (ADX) cluster via the integration account, transactionally, to ensure reliable, lossless storage.
3. Structured storage: ADX provides fast indexing and query capabilities, allowing enterprises to search, filter, and analyze their transactions efficiently.
4. Tracking dashboard: a dedicated B2B monitoring dashboard visualizes transaction flow, helping customers track acknowledgments (997, MDN), detect failures, and troubleshoot issues in real time.

Requirements for Using Reliable B2B Tracking

To enable this feature, customers must meet the following prerequisites:

- Premium SKU Integration Account: this feature is only available with a Premium SKU Integration Account in Logic Apps Standard.
- Built-in AS2, X12, or EDIFACT actions: only transactions processed through Logic Apps Standard using built-in B2B actions will be tracked reliably.
- Azure Data Explorer cluster: customers must provide their own ADX cluster database, where all transaction logs will be stored and queried.

Please note that B2B tracking for EDIFACT transactions is not supported yet and will be available in the near future.

How to Use Reliable B2B Tracking
1. Create a Tracking Store Artifact in the Integration Account

In the integration account, create a tracking store artifact that points to an existing Azure Data Explorer (ADX) cluster database. Currently, only one default tracking store is supported per integration account. The ADX database must be pre-created before setting up the tracking store.

2. Enable or Disable Tracking in the Agreement Settings

B2B tracking is also managed at the agreement level. By default, tracking is enabled for an agreement. To disable tracking, set a setting named TrackingState to Disabled in the send or receive agreement. To enable it again, set the TrackingState value back to Enabled. Please note that this setting can be updated in the JSON view only. For tracking to function correctly, the tracking store in the integration account must be configured and TrackingState must be set to Enabled in the agreement.

Using the Tracking Dashboard

Before using the tracking dashboard, please ensure that some B2B actions have been executed so that tracking data is available in the tracking store. When the B2B tracking dashboard is opened, it displays message overview data for 7 days by default. To change the data scope to a different time interval, use the TimeRange control at the top of the page. After the Message Overview Status dashboard loads, users can drill down into specific message types (AS2 or X12) for a more detailed view. Selecting the AS2 or X12 tabs provides insights into message processing details, including transaction status, acknowledgments, and failures.

Managing Tracking Stores via REST API

Reliable B2B Tracking supports a REST API for managing tracking stores. Users can create, update, delete, and retrieve tracking stores programmatically using the following API endpoints. If users choose to use the REST API to create the tracking store in the integration account, then two ADX database tables named AS2TrackRecords and EdiTrackRecords need to be created manually in the ADX database with a specific schema, and the integration account needs to have 'Ingester' permission on the database.

1. Get All Tracking Stores

Retrieves all tracking stores configured in an integration account.

```
GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores?api-version=2016-06-01
```

Parameters:
- {subscriptionId}: Azure subscription ID.
- {resourceGroupName}: name of the resource group.
- {integrationAccountName}: name of the integration account.

Response: returns a list of tracking stores associated with the integration account. Please note that currently only one tracking store per integration account is supported.

2. Get a Specific Tracking Store

Retrieves details of a specific tracking store.

```
GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01
```

Parameters:
- {trackingstoreName}: name of the tracking store to retrieve.

Response: returns details of the specified tracking store.

3. Create or Update a Tracking Store

Creates a new tracking store or updates an existing one.
```
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01
```

Request body:

```json
{
  "properties": {
    "adxClusterUri": "https://youradxcluster.kusto.windows.net",
    "databaseName": "YourDatabaseName"
  }
}
```

Parameters:
- adxClusterUri: the Azure Data Explorer cluster URI.
- databaseName: the database name within the ADX cluster.

Response: returns the details of the created or updated tracking store. (An Azure CLI sketch of this call appears at the end of this post.)

4. Delete a Tracking Store

Deletes an existing tracking store from the integration account.

```
DELETE https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01
```

Parameters:
- {trackingstoreName}: name of the tracking store to delete.

Response: returns a success response if the tracking store is deleted successfully.

Tracking Database Table Schema

The Azure Data Explorer (ADX) cluster database used for Reliable B2B Tracking stores transaction data in a structured format. AS2 transactions are stored in a table named AS2TrackRecords. X12 and EDIFACT transactions are stored in a table named EdiTrackRecords. These tables enable efficient querying and retrieval of B2B tracking data, providing structured insights into message flow, processing status, and troubleshooting details. Since B2B tracking data is stored in an Azure Data Explorer database, users can leverage Azure Workbooks to create visually rich custom dashboards for analyzing their B2B transactions. If users choose to use the REST API to create the tracking store in the integration account, then these two tables need to be created manually in the ADX database, and the integration account needs to have 'Ingester' permission on the database. For more information, see Reliable B2B Tracking Database Schema.

Future Enhancements

To further improve Reliable B2B Tracking, the following enhancements are planned for future releases:

1. Support for secure access to ADX: users will be able to access the tracking store securely without relying on the public ADX IP address.
2. EDIFACT transaction tracking: in the future, EDIFACT message tracking will be fully supported, enabling detailed monitoring of EDIFACT interchanges, functional groups, and message transactions.
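As a convenience, the create-or-update call shown above can also be scripted with the Azure CLI. A minimal sketch, assuming the request body from earlier has been saved to a local file named trackingstore.json; the subscription, resource group, integration account, and tracking store names are placeholders for your own environment.

```powershell
# Build the ARM URI for the tracking store (all names below are placeholders)
$uri = "https://management.azure.com/subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>" +
       "/providers/Microsoft.Logic/integrationAccounts/<integrationAccountName>" +
       "/groups/default/trackingstores/<trackingstoreName>?api-version=2016-06-01"

# Create or update the tracking store using the JSON body saved to trackingstore.json
az rest --method put --url $uri --body "@trackingstore.json"

# Verify the tracking store configuration
az rest --method get --url $uri
```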
Announcing the availability of TLS 1.3 in Azure API Management in Preview

TLS 1.3 is the latest version of the internet's most deployed security protocol, which encrypts data to provide a secure communication channel between two endpoints. TLS 1.3 support in Azure API Management is planned to roll out during the first week of February 2024. The rollout will happen in stages, which means some regions will get it first as we roll out globally.