Log Ingestion to Azure Log Analytics Workspace with Logic App Standard
Currently, the recommended method for sending logs to Azure Log Analytics is the Azure Log Analytics Data Collector. This is a managed connector that typically requires public access to your Log Analytics Workspace (LAW). Consequently, the connector does not function if your LAW has Virtual Network (VNet) integration, as outlined in the Azure Monitor private link security documentation.

Solution: Logic App Standard for a VNet-Integrated Log Analytics Workspace

To address this limitation, a solution has been developed that uses Logic App Standard to connect directly to the LAW HTTP ingestion endpoint. The relevant API documentation for this endpoint can be found here: Log Analytics REST API | Microsoft Learn.

It's important to note that the current version of this endpoint exclusively supports authentication via a shared key, as detailed in the Log Analytics REST API Reference | Microsoft Learn. Any request to the Log Analytics HTTP Data Collector API must include the Authorization header. To authenticate a request, you must sign it with either the primary or secondary key of the workspace receiving the data, and pass that signature as part of the request.

Implementing Shared Key Authentication with a C# Inline Script

The proposed solution builds a small C# inline script within the Logic App Standard to handle the shared key authentication process.
Sample code for this implementation has been uploaded to my GitHub: LAWLogIngestUsingHttp

```csharp
string dateString = DateTime.UtcNow.ToString("r");
byte[] content = Encoding.UTF8.GetBytes(jsonData);
int contentLength = content.Length;
string method = "POST";
string contentType = "application/json";
string resource = "/api/logs";
string stringToSign = $"{method}\n{contentLength}\n{contentType}\nx-ms-date:{dateString}\n{resource}";
byte[] sharedKeyBytes = Convert.FromBase64String(connection.SharedKey);
using HMACSHA256 hmac = new HMACSHA256(sharedKeyBytes);
byte[] stringToSignBytes = Encoding.UTF8.GetBytes(stringToSign);
byte[] signatureBytes = hmac.ComputeHash(stringToSignBytes);
string signature = Convert.ToBase64String(signatureBytes);
```

HTTP Action Configuration

Subsequently, an HTTP action within the Logic App Standard is configured to call the Log Analytics ingestion endpoint using an HTTP POST method. The endpoint URL follows this format:

https://{WorkspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01

Remember to replace {WorkspaceId} with your actual Log Analytics Workspace ID. The custom table name is passed in the Log-Type header.

Migrate Data Ingestion from Data Collector to Log Ingestion
The HTTP Data Collector API in Log Analytics workspaces is being deprecated and will be fully out of support in September 2026. Data Collector actions in Logic Apps that use already-created API connections (which use the workspace ID and key) will still work against existing custom log tables; however, newly created tables will not be able to ingest data. The connector action will still report success, but no data will be populated in the newly created custom logs. If a new API connection is created for the Data Collector action (using the workspace ID and key), requests will fail with 403 Forbidden.

Users should start using the Log Ingestion API to send data to custom tables, and this document will guide you through using the Log Ingestion API in Logic Apps.

Note: The Azure portal has been updated so that it no longer shows the workspace keys on the Log Analytics workspace page. Az CLI will still return the keys, but, as stated, actions will fail with 403 when those keys are used in a Data Collector action.

Creating the DCE and DCRs

To use the Log Ingestion API, a Data Collection Endpoint (DCE) and a Data Collection Rule (DCR) must be created first. DCE creation is simple: from the Azure portal, search for DCE and create a new one. A DCR can be created either from the DCR page in the Azure portal or while creating the custom log table in the Log Analytics workspace.

You need to upload a sample data file so the custom log table has a schema; it needs to be a JSON array. If the sample log doesn't have a TimeGenerated field, you can easily add it using the mapping function: add the transformation code in the Transformation box, then click Run. Once you complete the DCR creation, you need to get the full DCE endpoint.

Getting the DCE Log Ingestion Full URL

To get the full endpoint URL, follow these steps:
1. Get the DCE log ingestion URL from the DCE overview page.
2. On the DCR page, get the immutable ID for the DCR, then click the JSON view of the DCR resource.
3.
From the JSON view, get the stream name from the streamDeclarations field. The full log ingestion URL is then:

DCE_URL/dataCollectionRules/{immutable_id}/streams/{streamName}?api-version=2023-01-01

It will look similar to: https://mshbou****.westeurope-1.ingest.monitor.azure.com/dataCollectionRules/dcr-7*****4e988bef2995cd52ae/streams/Custom-mshboulLogAPI_CL?api-version=2023-01-01

Granting the Logic App MI the Needed IAM Role

To call the ingestion endpoint using the Logic App's managed identity (MI), grant the MI the "Monitoring Metrics Publisher" role on the DCR resource. To do this, open the DCR, choose Access Control (IAM) from the blade, and grant the Logic App MI the "Monitoring Metrics Publisher" role.

Calling the Log Ingestion Endpoint from Logic Apps

To call the ingestion endpoint from Logic Apps, use the HTTP action. The URI is the full DCE endpoint constructed above. Add the Content-Type header and the JSON body that contains the log data you want to send, and configure the action to authenticate with the Logic App's managed identity. Once executed, the call should succeed with status code 204.

For more details on the Log Ingestion API and the migration, please see our documentation:
Migrate from the HTTP Data Collector API to the Log Ingestion API - Azure Monitor | Microsoft Learn
Logs Ingestion API in Azure Monitor - Azure Monitor | Microsoft Learn

Thanks.

A BizTalk Migration Tool: From Orchestrations to Logic Apps Workflows
As organizations move toward cloud-native architecture, this project addresses one of the most challenging aspects of modernization: converting existing BizTalk artifacts into their Azure Logic Apps equivalents while preserving business logic and integration patterns.

Architecture and Components

The BizTalk Migration Starter is available here: haroldcampos/BizTalkMigrationStarter. It consists of four main projects and a test project, each targeting a specific aspect of BizTalk migration.

BTMtoLMLMigrator - BizTalk Map Converter

The BTMtoLMLMigrator is a tool that converts BizTalk Maps (.btm files) to the Logic Apps Mapping Language (.lml files). BizTalk maps define data transformations between different message formats, using XSLT and functoids to implement complex mapping logic.

Key Capabilities:
- Parses the BizTalk map XML structure to extract transformation logic
- Translates BizTalk functoids (string manipulation, mathematical operations, logical operations, date/time functions, etc.) to equivalent LML syntax
- Preserves source and target schema references
- Generates Logic Apps-compatible Liquid maps that can be directly deployed to Azure Integration Accounts

Core Components:
- BtmParser.cs: Extracts map structure, functoid definitions, and link connections from BTM files
- FunctoidTranslator.cs: Converts BizTalk functoid operations to Logic Apps Maps template equivalents
- LmlGenerator.cs: Generates the final LML output
- BtmMigrator.cs: Orchestrates the entire conversion process
- Models.cs: Defines data structures for representing maps, functoids, and links

To convert a single map:

BTMtoLMLMigrator.exe -btm "C:\BizTalkMaps\OrderToInvoice.btm" -source "C:\Schemas\Order.xsd" -target "C:\Schemas\Invoice.xsd" -output "C:\Output\OrderToInvoice.lml"

To convert all maps in a directory (be mindful of using the right schema names in the BizTalk maps, so the tool does not pick up the wrong schemas):

BTMtoLMLMigrator.exe -batchDir "C:\BizTalkMaps" -schemasDir "C:\Schemas" -outputDir "C:\Output\LMLMaps"

Recommendations and Troubleshooting:
- Make sure to use the real source and destination schemas and the corresponding map.
- Most BizTalk functoids are supported. For those that aren't, such as scripting functoids, the tool adds the code into the .lml file and expects you to redesign that part of the scenario. Currently the Data Mapper does not have a direct function that replaces scripting functoids; we are exploring alternatives.
- Use the GHCP agent to troubleshoot the tool if you run into issues.

ODXtoWFMigrator - Orchestration to Workflow Converter

The ODXtoWFMigrator tackles one of the most complex aspects of BizTalk migration: converting BizTalk Orchestrations (.odx files) to Azure Logic Apps workflow definitions. Orchestrations represent business processes with sequential, parallel, and conditional logic flows. The tool requires the orchestration and bindings files, exported from the BizTalk application.
If you don't have an orchestration (for content-based routing scenarios), the tool uses the bindings file only. To export bindings, go to the BizTalk Administration Console, select your BizTalk application, and export all bindings. A single bindings file can contain multiple orchestrations, so it is important to export all the information available.

Key Capabilities:
- Parses BizTalk orchestration XML to extract process flows, shapes, and connections
- Maps BizTalk orchestration shapes (Receive, Send, Decide, Parallel, Loop, etc.) to Logic Apps actions and control structures
- Generates connector configurations for common integration patterns
- Creates comprehensive migration reports documenting the conversion process and any limitations
- Produces standard Logic Apps JSON workflow definitions ready for deployment

Core Components:
- BizTalkOrchestrationParser.cs: Analyzes orchestration structure and extracts workflow patterns
- LogicAppsMapper.cs: Maps BizTalk orchestration shapes to Logic Apps equivalents
- LogicAppJSONGenerator.cs: Generates the final Logic Apps workflow JSON
- OrchestrationReportGenerator.cs: Creates detailed migration reports
- Schemas/Connectors/connector-registry.json: Registry of connector mappings and configurations

Recommendations and Troubleshooting:
- Most BizTalk shapes are supported. For those that aren't, the tool defaults to Compose actions and injects the code or a comment.
- Most BizTalk adapters are supported. If you need to add support for a new Logic Apps connector or service provider, update the connector-registry.json file by adding the trigger or action following the pattern of the other entries.
- This tool has been tested with multiple patterns and orchestrations.
- Use the GHCP agent to troubleshoot the tool if you run into issues.

The following are some of the supported commands. Please run the command line and review the README files to see all supported commands.
Command Sample Usage:

MIGRATE / CONVERT
- With output file: ODXtoWFMigrator.exe convert "C:\BizTalk\InventorySync.odx" "C:\BizTalk\BindingInfo.xml" "C:\Output\InventorySync.workflow.json"
- With refactored generator: ODXtoWFMigrator.exe migrate "C:\BizTalk\MessageRouter.odx" "C:\BizTalk\BindingInfo.xml" --refactor

BINDINGS-ONLY
- Basic bindings-only: ODXtoWFMigrator.exe bindings-only "C:\BizTalk\ProductionBindings.xml"
- With output directory: ODXtoWFMigrator.exe bindings-only "C:\BizTalk\BindingInfo.xml" "C:\LogicApps\BindingsWorkflows"
- With refactored generator: ODXtoWFMigrator.exe bindings-only "C:\BizTalk\BindingInfo.xml" --refactor

REPORT / ANALYZE
- Basic HTML report: ODXtoWFMigrator.exe report "C:\BizTalk\OrderProcessing.odx"
- With output file: ODXtoWFMigrator.exe report "C:\BizTalk\OrderProcessing.odx" --output "C:\Reports\OrderProcessing_Analysis.html"

BATCH REPORT
- Process directory: ODXtoWFMigrator.exe batch report --directory "C:\BizTalk\Orchestrations"
- Short directory flag: ODXtoWFMigrator.exe batch report -d "C:\BizTalk\ContosoProject\Orchestrations"

BATCH CONVERT
- Basic batch convert: ODXtoWFMigrator.exe batch convert --directory "C:\BizTalk\Orchestrations" --bindings "C:\BizTalk\BindingInfo.xml"
- Alternative migrate syntax: ODXtoWFMigrator.exe batch migrate -d "C:\BizTalk\AllOrchestrations" -b "C:\BizTalk\BindingInfo.xml"
- Specific files: ODXtoWFMigrator.exe batch convert --files "C:\BizTalk\Order.odx,C:\BizTalk\Invoice.odx" --bindings "C:\BizTalk\BindingInfo.xml"
- With output directory: ODXtoWFMigrator.exe batch convert -d "C:\BizTalk\Orchestrations" -b "C:\BizTalk\BindingInfo.xml" -o "C:\LogicApps\Workflows"
- With refactored generator: ODXtoWFMigrator.exe batch convert -d "C:\BizTalk\Orchestrations" -b "C:\BizTalk\BindingInfo.xml" --refactor

GENERATE-PACKAGE
- Basic package generation: ODXtoWFMigrator.exe generate-package "C:\BizTalk\OrderProcessing.odx" "C:\BizTalk\BindingInfo.xml"
- With output directory: ODXtoWFMigrator.exe generate-package "C:\BizTalk\OrderProcessing.odx" "C:\BizTalk\BindingInfo.xml" "C:\Deploy\OrderProcessing"
- With refactored generator: ODXtoWFMigrator.exe package "C:\BizTalk\CloudIntegration.odx" "C:\BizTalk\BindingInfo.xml" --refactor

ANALYZE-ODX / GAP-ANALYSIS
- Analyze directory: ODXtoWFMigrator.exe analyze-odx "C:\BizTalk\LegacyOrchestrations"
- With output report: ODXtoWFMigrator.exe analyze-odx "C:\BizTalk\Orchestrations" --output "C:\Reports\gap_analysis.json"

LEGACY MODE
- Legacy basic: ODXtoWFMigrator.exe "C:\BizTalk\OrderProcessing.odx" "C:\BizTalk\BindingInfo.xml" "C:\Output\OrderProcessing.json"

BTPtoLA - Pipeline to Logic Apps Converter

BTPtoLA handles the conversion of BizTalk pipelines to Logic Apps components. BizTalk pipelines process messages as they enter or leave the messaging engine, performing operations like validation, decoding, and transformation.

Key Capabilities:
- Converts BizTalk receive and send pipelines to Logic Apps processing patterns
- Maps pipeline components (decoders, validators, disassemblers, etc.) to Logic Apps actions
- Preserves pipeline stage configurations and component properties
- Generates appropriate connector configurations for pipeline operations

Core Components:
- Pipeline parsing and analysis logic
- Connector registry (Schemas/Connectors/pipeline-connector-registry.json) for mapping pipeline components
- Logic Apps workflow generation for pipeline equivalents

To convert a receive pipeline:

BTPtoLA.exe -pipeline "C:\Pipelines\ReceiveOrderPipeline.btp" -type receive -output "C:\Output\ReceiveOrderPipeline.json"

To convert a send pipeline:

BTPtoLA.exe -pipeline "C:\Pipelines\SendInvoicePipeline.btp" -type send -output "C:\Output\SendInvoicePipeline.json"

BizTalktoLogicApps.MCP - Model Context Protocol Server

The MCP (Model Context Protocol) server provides a standardized interface for AI-assisted migration workflows. This component enables integration with AI tools and assistants to provide intelligent migration suggestions and automation.
Key Capabilities:
- Exposes migration tools through a standardized MCP interface
- Enables AI-driven migration assistance and recommendations
- Provides tool handlers for map conversion and other migration operations
- Facilitates interactive migration workflows with AI assistance

Core Components:
- McpServer.cs: Main MCP server implementation
- Server/ToolHandlers/MapConversionToolHandler.cs: Handler for map conversion operations
- test-requests.json: Test request definitions for validating MCP functionality

BizTalktoLogicApps.Tests - Test Project

A complete test project ensures the reliability and accuracy of all migration tools, with integration tests that validate end-to-end conversion scenarios.

Key Capabilities:
- Integration tests for BTM to LML conversion across multiple map files
- Schema validation and error handling tests
- Batch processing tests for converting multiple artifacts
- Output verification and quality assurance

Where to upload your data: upload your BizTalk artifacts to the Data directory, and run your tests using Test Explorer.

For a complete demonstration, watch the video below.

Announcing General Availability: Azure Logic Apps Standard Custom Code with .NET 8
We're excited to announce the General Availability (GA) of Custom Code support in Azure Logic Apps Standard with .NET 8. This release marks a significant step forward in enabling developers to build more powerful, flexible, and maintainable integration workflows using familiar .NET tools and practices.

With this capability, developers can now embed custom .NET 8 code directly within their Logic Apps Standard workflows. This unlocks advanced logic scenarios, promotes code reuse, and allows seamless integration with existing .NET libraries and services, making it easier than ever to build enterprise-grade solutions on Azure.

What's New in GA

This GA release introduces several key enhancements that improve the development experience and expand the capabilities of custom code in Logic Apps:

Bring Your Own Packages

Developers can now include and manage their own NuGet packages within custom code projects without having to resolve conflicts with the dependencies used by the language worker host. The update loads the custom code project's assembly dependencies into a separate assembly load context, allowing you to bring any .NET 8-compatible versions of the dependent assemblies your project needs. There are only three exceptions to this rule:

- Microsoft.Extensions.Logging.Abstractions
- Microsoft.Extensions.DependencyInjection.Abstractions
- Microsoft.Azure.Functions.Extensions.Workflows.Abstractions

Dependency Injection Native Support

Custom code now supports native Dependency Injection (DI), enabling better separation of concerns and more testable, maintainable code. This aligns with modern .NET development patterns and simplifies service management within your custom logic.
To enable Dependency Injection, developers will need to provide a StartupConfiguration class, defining the list of dependencies:

```csharp
using Microsoft.Azure.Functions.Extensions.Workflows;
using Microsoft.Extensions.DependencyInjection;

public class StartupConfiguration : IConfigureStartup
{
    /// <summary>
    /// Configures services for the Azure Functions application.
    /// </summary>
    /// <param name="services">The service collection to configure.</param>
    public void Configure(IServiceCollection services)
    {
        // Register the routing and discount services with dependency injection
        services.AddSingleton<IRoutingService, OrderRoutingService>();
        services.AddSingleton<IDiscountService, DiscountService>();
    }
}
```

You will also need to accept those registered services in your custom code class constructor:

```csharp
public class MySampleFunction
{
    private readonly ILogger<MySampleFunction> logger;
    private readonly IRoutingService routingService;
    private readonly IDiscountService discountService;

    public MySampleFunction(ILoggerFactory loggerFactory, IRoutingService routingService, IDiscountService discountService)
    {
        this.logger = loggerFactory.CreateLogger<MySampleFunction>();
        this.routingService = routingService;
        this.discountService = discountService;
    }

    // your function logic here
}
```

Improved Authoring Experience

The development experience has been significantly enhanced with improved tooling and templates. Whether you're using Visual Studio or Visual Studio Code, you'll benefit from streamlined scaffolding, local debugging, and deployment workflows that make building and managing custom code faster and more intuitive. The following user experience improvements were added:

- Local function metadata is kept between VS Code sessions, so you don't receive validation errors when editing workflows that depend on the local functions.
- Projects are also built when the designer starts, so you don't have to manually update references.
- New context menu gestures allow you to create new local functions or build your functions project directly from the explorer area.
- Unified debugging experience, making it easier for you to debug. There is now a single task for debugging custom code and Logic Apps, which makes starting a new debug session as easy as pressing F5.

Learn More

To get started with custom code in Azure Logic Apps Standard, visit the official Microsoft Learn documentation: Create and run custom code in Azure Logic Apps Standard. You can also find example code for dependency injection at wsilveiranz/CustomCode-Dependency-Injection.

Logic Apps Aviators Newsletter - April 2026
In this issue:
- Ace Aviator of the Month
- News from our product group
- News from our community

Ace Aviator of the Month

April 2026's Ace Aviator: Marcelo Gomes

What's your role and title? What are your responsibilities?

I'm an Integration Team Leader (Azure Integrations) at COFCO International, working within the Enterprise Integration Platform. My core responsibility is to design, architect, and operate integration solutions that connect multiple enterprise systems in a scalable, secure, and resilient way. I sit at the intersection of business, architecture, and engineering, ensuring that business requirements are correctly translated into technical workflows and integration patterns.

From a practical standpoint, my responsibilities include:
- Defining integration architecture standards and patterns across the organization
- Designing end-to-end integration solutions using Azure Integration Services
- Owning and evolving the API landscape (via Azure API Management)
- Leading, mentoring, and supporting the integration team
- Driving PoCs, experiments, and technical explorations to validate new approaches
- Acting as a bridge between systems, teams, and business domains, ensuring alignment and clarity

In short, my role is to make sure integrations are not just working, but are well-designed, maintainable, and aligned with business goals.

Can you give us some insights into your day-to-day activities and what a typical day looks like?

My day-to-day work is a balance between technical leadership, architecture, and execution.
A typical day usually involves:
- Working closely with Business Analysts and Product Owners to understand integration requirements, constraints, and expected outcomes
- Translating those requirements into integration flows, APIs, and orchestration logic
- Defining or validating the architecture of integrations, including patterns, error handling, resiliency, and observability
- Guiding developers during implementation, reviewing approaches, and helping them make architectural or design decisions
- Managing and governing APIs through Azure API Management, ensuring consistency, security, and reusability
- Unblocking team members by resolving technical issues, dependencies, or architectural doubts
- Performing estimations, supporting planning, and aligning delivery expectations

I'm also hands-on. I actively build integrations myself, not just to help deliver, but to stay close to the platform, understand real challenges, and continuously improve our standards and practices. I strongly believe technical leadership requires staying connected to the actual implementation.

What motivates and inspires you to be an active member of the Aviators / Microsoft community?

What motivates me is knowledge sharing. A big part of what I know today comes from content shared by others: blog posts, samples, talks, community discussions, and real-world experiences. Most of my learning followed a simple loop: someone shared → I tried it → I broke it → I fixed it → I learned.

For me, learning only really completes its cycle when we share back. Explaining what worked (and what didn't) helps others avoid the same mistakes and accelerates collective growth. Communities like Aviators and the Microsoft ecosystem create a space where learning is practical, honest, and experience-driven, and that's exactly the type of environment I want to contribute to.

Looking back, what advice would you give to people getting into STEM or technology?

My main advice is: start by doing.
Don't wait until you feel ready or confident; you won't. When you start doing, you will fail. And that failure is not a problem; it's part of the learning process. Each failure builds experience, confidence, and technical maturity.

Another important point: ask questions. There is no such thing as a stupid question. Asking questions opens perspectives, challenges assumptions, and often triggers better solutions. Sometimes, a simple question from a fresh point of view can completely change how a problem is solved. Progress in technology comes from curiosity, iteration, and collaboration, not perfection.

What has helped you grow professionally?

Curiosity has been the biggest driver of my professional growth. I like to understand how things work under the hood, not just how to use them. When I'm curious about something, I try it myself, test different approaches, and build my own experience around it.

That hands-on curiosity helps me:
- Develop stronger technical intuition
- Understand trade-offs instead of just following patterns blindly
- Make better architectural decisions
- Communicate more clearly with both technical and non-technical stakeholders

Having personal experience with successes and failures gives me clarity about what I'm really looking for in a solution, and that has been key to my growth.

If you had a magic wand to create a new feature in Logic Apps, what would it be and why?

I'd add real-time debugging with execution control. Specifically, the ability to:
- Pause a running Logic App execution
- Inspect intermediate states, variables, and payloads in real time
- Step through actions one by one, similar to a traditional debugger

This would dramatically improve troubleshooting, learning, and optimization, especially in complex orchestrations. Today, we rely heavily on post-execution inspection, which works, but real-time visibility would be a huge leap forward in productivity and understanding.
For integration engineers, that kind of feature would be a true game-changer.

News from our product group

How to revoke connection OAuth programmatically in Logic Apps

The post shows how to revoke an API connection's OAuth tokens programmatically in Logic Apps, without using the portal. It covers two approaches: invoking the Revoke Connection Keys REST API directly from a Logic App using the 'Invoke an HTTP request' action, and using an Azure AD app registration to acquire a bearer token that authorizes the revoke call from Logic Apps or tools like Postman. Step-by-step guidance includes building the request URL, obtaining tokens with client credentials, parsing the token response, and setting the Authorization header. It also documents required permissions and a least-privilege custom RBAC role.

Introducing Skills in Azure API Center

This article introduces Skills in Azure API Center: registered, reusable capabilities that AI agents can discover and use alongside APIs, models, agents, and MCP servers. A skill describes what it does, its source repository, ownership, and which tools it is allowed to access, providing explicit governance. Teams can register skills manually in the Azure portal or automatically sync them from a Git repository, supporting GitOps workflows at scale. The portal offers discovery, filtering, and lifecycle visibility. Benefits include a single inventory for AI assets, better reuse, and controlled access via Allowed tools. Skills are available in preview with documentation links.

Reliable blob processing using Azure Logic Apps: Recommended architecture

The post explains limitations of the in-app Azure Blob trigger in Logic Apps, which relies on polling and best-effort storage logs that can miss events under load. For mission-critical scenarios, it recommends a queue-based pattern: have the source system emit a message to Azure Storage Queues after each blob upload, then trigger the Logic App from the queue and fetch the blob by metadata.
Benefits include guaranteed triggering, decoupling, retries, and observability. As an alternative, it outlines using Event Grid with single-tenant Logic App endpoints, plus caveats for private endpoints and subscription validation requirements.

Implementing / Migrating the BizTalk Server Aggregator Pattern to Azure Logic Apps Standard

This article shows how to implement or migrate the classic BizTalk Server Aggregator pattern to Azure Logic Apps Standard using a production-ready template available in the Azure portal. It maps BizTalk orchestration concepts (correlation sets, pipelines, MessageBox) to cloud-native equivalents: a stateful workflow, Azure Service Bus as the messaging backbone, CorrelationId-based grouping, and FlatFileDecoding for reusing existing BizTalk XSD schemas with zero refactoring. Step-by-step guidance covers triggering with the Service Bus connector, grouping messages by CorrelationId, decoding flat files, composing aggregated results, and delivering them via HTTP. A side-by-side comparison highlights architectural differences and migration considerations, aligned with BizTalk Server end-of-life timelines.

News from our community

Resilience for Azure IPaaS services
Post by Stéphane Eyskens

Stéphane Eyskens examines resilience patterns for Azure iPaaS workloads and how to design multi-region architectures spanning stateless and stateful services. The article maps strategies across Service Bus, Event Hubs, Event Grid, Durable Functions, Logic Apps, and API Management, highlighting failover models, idempotency, partitioning, and retry considerations. It discusses trade-offs between active-active and active-passive, the role of a governed API front door, and the importance of consistent telemetry for recovery and diagnostics. The piece offers pragmatic guidance for integration teams building high-availability, fault-tolerant solutions on Azure.
From APIs to Agents: Rethinking Integration in the Agentic Era
Post by Al Ghoniem, MBA

This article frames AI agents as a new layer in enterprise integration rather than a replacement for existing platforms. It contrasts deterministic orchestration with agent-mediated behavior, then proposes an Azure-aligned architecture: Azure AI Agent Service as runtime, API Management as the governed tool gateway, Service Bus/Event Grid for events, Logic Apps for deterministic workflows, API Center as registry, and Entra for identity and control. It also outlines patterns (tool-mediated access, hybrid orchestration, event+agent systems, and policy-enforced interaction) plus anti-patterns to avoid.

DevUP Talks 01 - 2026 Q1 trends with Kent Weare
Video by Mattias Lögdberg

Mattias Lögdberg hosts Kent Weare for a concise discussion on early-2026 trends affecting integration and cloud development. The conversation explores how AI is reshaping solution design, where new opportunities are emerging, and how teams can adapt practices for reliability, scalability, and speed. It emphasizes practical implications for developers and architects working with Azure services and modern integration stacks. The episode serves as a quick way to track directional changes and focus on skills that matter as agentic automation and platform capabilities evolve.

Azure Logic Apps as MCP Servers: A Step-by-Step Guide
Post by Stephen W Thomas

Stephen W Thomas shows how to expose Azure Logic Apps (Standard) as MCP servers so AI agents can safely reuse existing enterprise workflows. The guide explains why this matters (reusing logic, tapping 1,400+ connectors, and applying key-based auth) and walks through creating an HTTP workflow, defining JSON schemas, connecting to SQL Server, and generating API keys from the MCP Servers blade. It closes with testing in VS Code, demonstrating how agents invoke Logic Apps tools to query live data with governance intact, without rewriting integration code.
BizTalk to Azure Migration Roadmap: Integration Journey
Post by Sandro Pereira

This roadmap-style article distills lessons from BizTalk-to-Azure migrations into a structured journey. It outlines motivations for moving, capability mapping from BizTalk to Azure Integration Services, and phased strategies that reduce risk while modernizing. Readers get guidance on assessing dependencies, choosing target Azure services, designing hybrid or cloud-native architectures, and sequencing workloads. The post emphasises that migration is not a lift-and-shift but a program of work aligned to business priorities, platform governance, and operational readiness.

BizTalk Adapters to Azure Logic Apps Connectors
Post by Michael Stephenson

Michael Stephenson discusses how organizations migrating from BizTalk must rethink integration patterns when moving to Azure Logic Apps connectors. The post considers what maps well, where gaps and edge cases appear, and how real-world implementations often require re-architecting around AIS capabilities rather than a one-to-one adapter swap. It highlights community perspectives and practical considerations for planning, governance, and operationalizing new designs beyond pure connector parity.

Pro-Code Enterprise AI-Agents using MCP for Low-Code Integration
Video by Sebastian Meyer

This short video demonstrates bridging pro-code and low-code by using the Model Context Protocol (MCP) to let autonomous AI agents interact with enterprise systems via Logic Apps. It walks through the high-level setup (agent, MCP server, and Logic Apps workflows) and shows how to connect to platforms like ServiceNow and SAP. The focus is on practical tool choice and architecture so teams can extend existing integration assets to agent-driven use cases without rebuilding from scratch.
Friday Fact: The Hidden Retry Behavior That Makes Logic Apps Feel Stuck Post by João Ferreira This Friday Fact explains why a Logic App can appear “stuck” when calling unstable APIs: hidden retry policies, exponential backoff, and looped actions can accumulate retries and slow runs dramatically. It lists default behaviors many miss, common causes like throttling, and mitigation steps such as setting explicit retry policies, using Configure run after for failure paths, and introducing circuit breakers for flaky backends. The takeaway: the workflow may not be broken—just retrying too aggressively—so design explicit limits and recovery paths. Your Logic App Is NOT Your Business Process (Here’s Why) Video by Al Ghoniem, MBA This short explainer argues that mapping Logic Apps directly to a business process produces brittle workflows. Real systems require retries, enrichment, and exception paths, so the design quickly diverges from a clean process diagram. The video proposes separating technical orchestration from business visibility using Business Process Tracking. That split yields clearer stakeholder views and more maintainable solutions, while keeping deterministic execution inside Logic Apps. It’s a practical reminder to design for operational reality rather than mirroring a whiteboard flow. BizTalk Server Migration to Azure Integration Services Architecture Guidance Post by Sandro Pereira A brief overview of Microsoft’s architecture guidance for migrating BizTalk Server to Azure Integration Services. The post explains the intent of the guidance, links to sections on reasons to migrate, AIS capabilities, BizTalk vs. AIS comparisons, and service selection. It highlights planning topics such as migration approaches, best practices, and a roadmap, helping teams frame decisions for hybrid or cloud‑native architectures as they modernize BizTalk workloads. 
Logic Apps & Power Automate Action Name to Code Translator Post by Sandro Pereira This post introduces a lightweight utility that converts Logic Apps and Power Automate action names into their code identifiers—useful when writing expressions or searching in Code View. It explains the difference between designer-friendly labels and underlying names (spaces become underscores and certain symbols are disallowed), why this causes friction, and how the tool streamlines the translation. It includes screenshots, usage notes, and the download link to the open-source project, making it a practical time-saver for developers moving between designer and code-based workflows. Logic Apps Consumption CI/CD from Zero to Hero Whitepaper Post by Sandro Pereira This whitepaper provides an end‑to‑end path to automate CI/CD for Logic Apps Consumption using Azure DevOps. It covers solution structure, parameterization, and environment promotion, then shows how to build reliable pipelines for packaging, deploying, and validating Logic Apps. The guidance targets teams standardizing delivery with repeatable patterns and governance. With templates and practical advice, it helps reduce manual steps, improve quality, and accelerate releases for Logic Apps workloads. Logic App Best practices, Tips and Tricks: #2 Actions Naming Convention Post by Sandro Pereira This best‑practices post focuses on action naming in Logic Apps. It explains why consistent, descriptive names improve readability, collaboration, and long‑term maintainability, then outlines rules and constraints on allowed characters. It shows common pitfalls—default names, uneditable trigger/branch labels—and practical tips for renaming while avoiding broken references. The guidance helps teams treat names as living documentation so workflows remain understandable without drilling into each action’s configuration. 
How to Expose and Protect Logic App Using Azure API Management (Whitepaper) Post by Sandro Pereira This whitepaper explains how to front Logic Apps with Azure API Management for governance and security. It covers publishing Logic Apps as APIs, restricting access, enforcing IP filtering, preventing direct calls to Logic Apps, and documenting operations. It also discusses combining multiple Logic Apps under a single API, naming conventions, and how to remove exposed operations safely. The paper provides step‑by‑step guidance and a download link to help teams standardize exposure and protection patterns. Logic apps – Check the empty result in SQL connector Post by Anitha Eswaran This post shows a practical pattern for handling empty SQL results in Logic Apps. Using the SQL connector’s output, it adds a Parse JSON step to normalize the result and then evaluates length() to short‑circuit execution when no rows are returned. Screenshots illustrate building the schema, wiring the content, and introducing a conditional branch that terminates the run when the array is empty. The approach avoids unnecessary downstream actions and reduces failures, providing a reusable, lightweight guard for query‑driven workflows. Azure Logic Apps Is Basically Power Automate on Steroids (And You Shouldn’t Be Afraid of It) Post by Kim Brian Kim Brian explains why Logic Apps feels familiar to Power Automate builders while removing ceilings that appear at scale. The article contrasts common limits in cloud flows with Standard/Consumption capabilities, highlights the designer vs. code‑view model, and calls out built‑in Azure management features such as versioning, monitoring, and CI/CD. It positions Logic Apps as the “bigger sibling” for enterprise‑grade integrations and data throughput, offering more control without abandoning the visual authoring experience. 
Logic Apps CSV Alphabetic Sorting Explained Post by Sandro Pereira Sandro Pereira describes why CSV headers and columns can appear in alphabetical order after deploying Logic Apps via ARM templates. He explains how JSON serialization and array ordering influence CSV generation, what triggers the sorting behavior, and practical workarounds to preserve intended column order. The article helps teams avoid subtle defects in data exports by aligning workflow design and deployment practices with how Logic Apps materializes CSV content at runtime. Azure Logic Apps Translation vs Transformation – Actions, Examples, and Schema Mapping Explained Post by Maheshkumar Tiwari Maheshkumar Tiwari clarifies the difference between translation (format change) and transformation (business logic) in Logic Apps, then maps each to concrete Azure capabilities. Using a purchase‑order scenario, he shows how to decode/encode flat files and EDI, convert XML↔JSON, and apply Liquid/XSLT, Select, Compose, and Filter Array for schema mapping and enrichment. A quick reference table ties common tasks to the right action, helping architects separate concerns so format changes don’t break business rules and workflow design remains maintainable.

How to revoke connection OAuth programmatically in Logic Apps
There are multiple ways to revoke the OAuth consent of an API Connection besides clicking the Revoke button in the portal.

Option 1: Use the "Invoke an HTTP request" action

Get the connection name, then create a Logic App (Consumption), use a trigger of your liking, and add the "Invoke an HTTP request" action. Create a connection on the same tenant that holds the API Connection, add the URL below to the action, and test it:

https://management.azure.com/subscriptions/[SUBSCRIPTION_ID]/resourceGroups/[RESOURCE_GROUP]/providers/Microsoft.Web/connections/[NAME_OF_CONNECTION]/revokeConnectionKeys?api-version=2018-07-01-preview

Your test should be successful.

Option 2: Use an App Registration to fetch the token (this is also how you can do it with Postman or similar tools)

The App Registration should include this permission:

From your Logic App, or Postman, get the bearer token by calling this URL:

https://login.microsoftonline.com/[TENANT_ID]/oauth2/v2.0/token

With this in the body (note the lowercase parameter names, which the token endpoint expects):

client_id=[CLIENT_ID_OF_THE_APP_REG]&client_secret=[CLIENT_SECRET_FROM_APP_REG]&grant_type=client_credentials&scope=https://management.azure.com/.default

And this header:

Content-Type = application/x-www-form-urlencoded

If you use a Logic App for this, add a Parse JSON action, pass the body of the token HTTP action as its input, and use the following schema:

{ "properties": { "access_token": { "type": "string" }, "expires_in": { "type": "integer" }, "ext_expires_in": { "type": "integer" }, "token_type": { "type": "string" } }, "type": "object" }

Finally, add another HTTP action (or make the call from Postman or similar) to call the revoke API. In the header, add an "Authorization" key whose value is "Bearer", followed by a space, followed by the access_token from the Parse JSON output.
https://management.azure.com/subscriptions/[SUBSCRIPTION_ID]/resourceGroups/[RESOURCE_GROUP]/providers/Microsoft.Web/connections/[NAME_OF_CONNECTION]/revokeConnectionKeys?api-version=2018-07-01-preview

If you want to use cURL:

Request the token using the OAuth 2.0 client credentials flow:

curl -X POST \
 -H "Content-Type: application/x-www-form-urlencoded" \
 -d "client_id=[CLIENT_ID_OF_APP_REG]" \
 -d "scope=https://management.azure.com/.default" \
 -d "client_secret=[CLIENT_SECRET_FROM_APP_REG]" \
 -d "grant_type=client_credentials" \
 https://login.microsoftonline.com/[TENANT_ID]/oauth2/v2.0/token

The access_token in the response is your bearer token.

Call the revoke API:

curl -X POST "https://management.azure.com/subscriptions/[SUBSCRIPTION_ID]/resourceGroups/[RESOURCE_GROUP]/providers/Microsoft.Web/connections/[NAME_OF_CONNECTION]/revokeConnectionKeys?api-version=2018-07-01-preview" \
 -H "Authorization: Bearer <ACCESS_TOKEN>" \
 -H "Content-Type: application/json" \
 -d '{"key":"value"}'

If you see the error below, you need to grant the Contributor role to the App Registration on the resource group that contains the API Connection. (If you prefer least privilege, skip to the custom role below.)

{ "error": { "code": "AuthorizationFailed", "message": "The client '[App_reg_client_id]' with object id '[App_reg_object_id]' does not have authorization to perform action 'Microsoft.Web/connections/revokeConnectionKeys/action' over scope '/subscriptions/[subscription_id]/resourceGroups/[resource_group_id]/providers/Microsoft.Web/connections/[connection_name]' or the scope is invalid. If access was recently granted, please refresh your credentials."
} }

For a least-privilege solution, create a custom RBAC role with the actions below, and assign it to the App Registration object ID in the same way as above:

{ "actions": [ "Microsoft.Web/connections/read", "Microsoft.Web/connections/write", "Microsoft.Web/connections/delete", "Microsoft.Web/connections/revokeConnectionKeys/action" ] }

Scaling Logic App Standard for High Throughput Scenarios
In this blog, we will delve into the scaling characteristics of Logic Apps standard and offer valuable insights on designing scalable Logic Apps for high throughput scenarios. This is the first of a blog series dedicated to exploring the scaling aspects of Logic Apps. We also plan to release subsequent blog posts featuring real-world designs and case studies of Logic Apps successfully handling high throughput workloads. Stay tuned for in-depth examples and practical applications!

Reliable blob processing using Azure Logic Apps: Recommended architecture
Understanding the Limitations of In-App Blob Triggers

The Logic App Blob - In-App trigger (https://learn.microsoft.com/en-us/azure/logic-apps/connectors/built-in/reference/azureblob/) is built upon the same architecture as the Azure Function App Blob trigger and therefore inherits similar features and limitations. Notably, the in-app blob trigger uses polling as its core mechanism to detect changes in blob containers (see Polling and latency).

Key behaviors and limitations include:

Polling-Based Detection: The trigger polls the associated container rather than subscribing to events. This hybrid mechanism combines periodic container scans with Azure Storage analytics log inspection.
Batch Scanning: Blobs are scanned in batches of up to 10,000 per interval, using a continuation token to track progress.
Best-Effort Logging: The system relies on Azure Storage logs, which are generated on a best-effort basis. These logs are not guaranteed to capture all blob events.
Event Loss Risk: Under specific conditions — such as high blob throughput, delayed log generation, or log retention settings — some blob creation events may be missed.
Latency and Missed Triggers: Due to the asynchronous and non-deterministic nature of polling and logging, there may be latency in triggering workflows, and in some cases, triggers may not fire at all.

Given these limitations, relying solely on the in-app blob trigger may not be suitable for mission-critical scenarios that require guaranteed blob processing.

Recommended Alternative Approach: Queue-Based Reliable Triggering

To ensure reliable and scalable blob processing without missing any events, we recommend an alternative architecture that introduces explicit event signaling via Azure Storage Queues. This approach provides greater reliability and control, especially in high-throughput or mission-critical scenarios.
For more info on this pattern, please refer to Claim-Check pattern - Azure Architecture Center | Microsoft Learn.

In this approach, the Logic App uses a Storage Queue trigger, and the source system must send a queue message whenever a blob is uploaded. The source system should implement the following steps:

Blob Upload
Upload the file to an Azure Blob Storage container as usual.

Queue Message Creation (Source responsibility)
Immediately after the blob upload completes, the source system constructs a queue message containing relevant blob metadata, such as:
File path or blob name
Filename
Additional context (for example, customer ID or upload timestamp)

Send Message to Azure Storage Queue
The source system sends the constructed metadata message to a pre‑configured Azure Storage Queue. This queue message acts as the explicit trigger for downstream processing.

Trigger Logic App via Queue Message
The Logic App is configured with a Storage Queue trigger and is triggered only when a new message arrives in the queue.

Process Blob via Metadata
Upon receiving the queue message, the Logic App:
Parses the blob metadata (for example, file path and filename)
Uses the “Get blob content” action (or equivalent) to retrieve and process the actual blob content from Azure Blob Storage

Benefits of This Approach

Guaranteed Triggering: The Logic App is reliably triggered by a queue message, ensuring no blob is missed.
Decoupled Workflow: This architecture decouples blob creation from processing logic, enabling better scalability and fault isolation.
Resilience and Retry Support: Azure Storage Queues provide built-in retry capabilities, message visibility timeouts, and dead-lettering to handle transient failures gracefully.
Better Monitoring and Control: You can monitor the queue depth, message age, and processing status to track workflow health and throughput.
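The source-system steps above can be sketched in Python. This is a hedged illustration: the metadata field names and the `build_queue_message` helper are hypothetical (the article does not prescribe a message schema), and the Azure SDK calls that would perform the actual upload and enqueue are shown only as comments:

```python
import json
from datetime import datetime, timezone

def build_queue_message(container: str, blob_path: str, customer_id: str) -> str:
    """Construct the metadata message the source system enqueues after a blob upload.

    Field names here are illustrative, not a required schema -- the Logic App
    just needs enough metadata to locate and process the blob."""
    payload = {
        "container": container,
        "blobPath": blob_path,                      # path the Logic App passes to "Get blob content"
        "fileName": blob_path.rsplit("/", 1)[-1],   # convenience field for routing/logging
        "customerId": customer_id,                  # example of additional business context
        "uploadedAtUtc": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)

# The actual upload + enqueue would use the Azure SDKs, roughly:
#   BlobServiceClient(...).get_blob_client(container, blob_path).upload_blob(data)
#   QueueClient(...).send_message(build_queue_message(container, blob_path, customer_id))

message = build_queue_message("invoices", "2024/05/inv-001.csv", "CUST-42")
print(json.loads(message)["fileName"])  # inv-001.csv
```

Because the queue message carries only metadata (the claim check), the blob itself never travels through the queue, which keeps messages small and lets the Logic App fetch the content on demand.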
Alternative Approach: Event Grid Integration

Another reliable method for handling blob events is using Event Grid with Single-Tenant Logic App workflow endpoints. Using a Single-Tenant Logic App workflow endpoint as an Event Grid subscription allows you to handle events from either system or custom Event Grid topics. However, Event Grid topics can't deliver events to private endpoints. Hence, if your Logic App is enabled with a private endpoint, you can't use the workflow endpoints as Event Grid subscriptions.

In this approach, blob processing is triggered by events. Here, Azure Blob Storage acts as the event publisher, and Event Grid delivers blob lifecycle events to the Logic App endpoint, which then processes the blob.

Steps to Implement Event Grid Integration:

Create Event Grid Subscription
Configure an Event Grid subscription on the Azure Storage account or specific blob container to listen for blob events.

Configure Logic App Endpoint
Configure the Logic App Standard workflow endpoint (HTTP or Event Grid trigger) as the event subscriber. The endpoint must be publicly reachable, or an intermediate component must be used if private networking is required.

Handle Subscription Handshake (Logic App responsibility)
For Single‑Tenant Logic Apps, implement explicit Event Grid subscription validation (handshake) to confirm that the Logic App endpoint accepts events from Event Grid. When an Event Grid subscription is created, Event Grid sends a SubscriptionValidationEvent to the subscriber endpoint.
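The validation handshake can be sketched in plain Python for illustration. In a Logic App, the same logic is implemented with a condition on the event type and a Response action; the function shape here is an assumption for readability, not code from the article:

```python
import json

VALIDATION_EVENT_TYPE = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid_request(body: str):
    """Sketch of the handshake a push-style Event Grid subscriber must implement.

    Returns an (http_status, response_body) pair. Event Grid posts events as a
    JSON array; echoing the validationCode back in a validationResponse field
    with HTTP 200 completes the handshake."""
    events = json.loads(body)
    for event in events:
        if event.get("eventType") == VALIDATION_EVENT_TYPE:
            code = event["data"]["validationCode"]
            return 200, json.dumps({"validationResponse": code})
    # Not a handshake: process blob events here and acknowledge receipt.
    return 200, ""

sample = json.dumps([{
    "eventType": VALIDATION_EVENT_TYPE,
    "data": {"validationCode": "512d38b6-c7b8-40c8-89fe-f46f9e9622b6"},
}])
status, resp = handle_event_grid_request(sample)
print(status, resp)  # 200 {"validationResponse": "512d38b6-c7b8-40c8-89fe-f46f9e9622b6"}
```

Once this response is returned, Event Grid marks the subscription as active and begins delivering real blob events to the same endpoint.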
The Logic App must:
Detect the Microsoft.EventGrid.SubscriptionValidationEvent
Extract the validation code from the request payload
Return an HTTP 200 OK response

For detailed implementation guidance and an example, see: Using Single‑Tenant Logic App Workflow Endpoint as Event Grid Subscription.

Trigger Logic App via Events
Once the subscription is active, Event Grid automatically pushes blob events to the Logic App endpoint whenever a blob is created or updated.

Process Blob Based on Event Data
Upon receiving the event payload, the Logic App:
Parses event data (container name, blob name, blob URL, event type)
Retrieves the blob using actions such as “Read blob content”
Processes the blob according to business logic

Benefits of Event Grid Integration

Event-Driven Architecture: Provides real-time event handling, reducing latency compared to polling mechanisms.
Scalability: Event Grid can handle many events efficiently.
Flexibility: Supports various event sources and custom topics.

Note: If your Logic App is enabled with a private endpoint, refer to how to trigger an Azure Function from Event Grid over a virtual network for alternative configurations.

How to Access a Shared OneDrive Folder in Azure Logic Apps
What is the problem?

A common enterprise automation scenario involves copying files from a OneDrive folder shared by a colleague into another storage service such as SharePoint or Azure Blob Storage using Azure Logic Apps. However, when you configure the OneDrive for Business – “List files in folder” action in a Logic App, you quickly run into a limitation. The folder picker only shows:

Root directory
Subfolders of the authenticated user’s OneDrive

Shared folders do not appear at all, even though you can access them in the OneDrive UI. This makes it seem like Logic Apps cannot work with shared OneDrive folders—but that’s not entirely true.

Why this happens

The OneDrive for Business connector is user‑context scoped. It only enumerates folders that belong to the signed-in user’s drive and does not automatically surface folders that are shared with the user. Even though shared folders are visible under “Shared with me” in the OneDrive UI, they:

Live in a different drive
Have a different driveId
Require explicit identification before Logic Apps can access them

How to access a shared OneDrive folder

There are two supported ways to access a shared OneDrive directory from Logic Apps.

Option 1: Use Microsoft Graph APIs (Delegated permissions)

You can invoke Microsoft Graph APIs directly using:

HTTP with Microsoft Entra ID (preauthorized)
Delegated permissions on behalf of the signed‑in user

This requires:

Admin consent or delegated consent workflows
Additional Entra ID configuration

📘 Reference: HTTP with Microsoft Entra ID (preauthorized) - Connectors | Microsoft Learn

While powerful, this approach adds setup complexity.
Option 2: Use Graph Explorer to configure the OneDrive connector

Instead of calling Graph from Logic Apps directly, you can:

Use Graph Explorer to discover the shared folder metadata
Manually configure the OneDrive action using that metadata

Step-by-step: Using Graph Explorer to access a shared folder

Scenario: A colleague has shared a OneDrive folder named “Test” with me, and I need to process files inside it using a Logic App.

Step 1: List shared folders using Microsoft Graph

In Graph Explorer, run the following request:

GET https://graph.microsoft.com/v1.0/users/{OneDrive shared folder owner username}/drive/root/children

📘 Reference: List the contents of a folder - Microsoft Graph v1.0 | Microsoft Learn

✅ This returns all root-level folders visible to the signed-in user, including folders shared with you. From the response, locate the shared folder. You only need two values:

parentReference.driveId
id (folder ID)

[Graph Explorer snippet showing the request sent to the API to list the files and folders shared by a specific user on the root drive]

Step 2: Configure Logic App “List files in folder” action

In your Logic App:

Add OneDrive for Business → List files in folder
Do not use the folder picker
Manually enter the folder value using this format: {driveId}.{folderId}

Once saved, the action successfully lists files from the shared OneDrive folder.

Step 3: Build the rest of your workflow

After the folder is resolved correctly:

You can loop through files
Copy them to SharePoint
Upload them to Azure Blob Storage
Apply filters, conditions, or transformations

All standard OneDrive actions now work as expected.

Troubleshooting: When Graph Explorer doesn’t help

If you’re unable to find the driveId or folderId via Graph Explorer, there’s a reliable fallback.
Use browser network tracing

Open the shared folder in OneDrive (web)
Open Browser Developer Tools → Network
Look for requests whose query string includes a folderId parameter
In the response payload, extract:
CurrentFolderUniqueId → folder ID
drives/{driveId} from the CurrentFolderSpItemUrl

This method is very effective when Graph results are incomplete or filtered.

Implementing / Migrating the BizTalk Server Aggregator Pattern to Azure Logic Apps Standard
While the article focuses on the migration path from BizTalk Server, the template is equally suited for new (greenfield) implementations: any team looking to implement the Aggregator pattern natively in Azure Logic Apps can deploy it directly from the Azure portal without prior BizTalk experience. The template source code is open source and available in the Azure LogicAppsTemplates GitHub repository. For full details on the original BizTalk implementation, see the BizTalk Server Aggregator SDK sample.

Why is it important?

BizTalk Server end of life has been confirmed, and if you have not started your migration to Logic Apps, you should start soon. This is one of many articles on BizTalk migration. More information can be found here: https://aka.ms/biztalkeolblog.

The migration at a glance: BizTalk orchestration vs. Logic Apps workflow

The BizTalk SDK implements the pattern through an orchestration (Aggregate.odx) that uses correlation sets, receive shapes, loop constructs, and send pipelines. The Logic Apps Standard template replicates the same logic using a stateful workflow with Azure Service Bus and CorrelationId-based grouping.

The BizTalk solution includes the following components:

Aggregate.odx: Main orchestration that collects correlated messages and executes the send pipeline
FFReceivePipeline.btp: Receive pipeline with flat file disassembler
Invoice.xsd: Document schema for invoice messages
InvoiceEnvelope.xsd: Envelope schema for output interchange
PropertySchema.xsd: Property schema with promoted properties for correlation
XMLAggregatingPipeline.btp: Send pipeline to assemble collected messages into XML interchange

The Azure Logic Apps Standard implementation

The Logic Apps Standard workflow replicates the same Aggregator pattern using a stateful workflow with Azure Service Bus as the message source and CorrelationId-based grouping. The template is publicly available in the Azure portal templates gallery.
Figure 2: The “Aggregate messages from Azure Service Bus by CorrelationId” template in the Azure portal templates gallery, published by Microsoft. It receives messages from Service Bus in batches, groups them by CorrelationId, decodes flat files, and responds with the aggregated result via HTTP.

Side-by-side comparison: BizTalk Server vs. Azure Logic Apps

Understanding how each component maps between platforms is essential for a smooth migration (BizTalk Server → Azure Logic Apps Standard):

Messaging infrastructure: MessageBox database (SQL Server) → Azure Service Bus (cloud-native PaaS)
Message source: Receive Port / Receive Location → Service Bus trigger (peekLockQueueMessagesV2)
Message decoding: Receive Pipeline (Flat File Disassembler) → Decode_Flat_File_Invoice action (FlatFileDecoding)
Correlation mechanism: Correlation Sets on promoted properties (DestinationPartnerURI) → CorrelationId from Service Bus message properties
Message accumulation: Loop shape + Message Assignment shapes → ForEach loop + CorrelationGroups dictionary variable
Completion condition: Loop exit (10 messages or 1-minute timeout) → Batch-based: processes all messages in the current batch
Aggregated message construction: Construct Message shape + XMLAggregatingPipeline → Build_Aggregated_Messages ForEach + Compose actions
Result delivery: Send Port (file, HTTP, or other adapter) → HTTP Response, or any other endpoint the business needs
Error handling: Exception Handler shapes + SuspendMessage.odx → Scope + error handler actions
Schema support: BizTalk Flat File Schemas (XSD) → Same XSD schemas in Artifacts/Schemas folder
State management: Orchestration dehydration/rehydration → Stateful workflow with run history

Key architectural differences (BizTalk Server vs. Azure Logic Apps Standard):

Processing model: Convoy pattern (long-running, event-driven) vs. batch-based (processes N messages per trigger)
Scalability: BizTalk Host instances (manual scaling) vs. elastic scale (Azure App Service Plan)
Retry logic: Adapter-level retries vs. built-in HTTP retry policy (3 attempts, 10s interval)
Architecture: Monolithic orchestration vs. decoupled aggregation + downstream processing
Monitoring: BizTalk Admin Console / HAT vs. Azure portal run history + Azure Monitor
Schema reuse: BizTalk project schemas vs. direct XSD reuse in Artifacts/Schemas
Deployment: MSI / BizTalk deployment vs. ARM templates, Azure DevOps, GitHub Actions

How the workflow works

1. Trigger: Receive messages from Service Bus
The workflow uses the built-in Service Bus trigger to retrieve messages in batches from a non-session queue. This is analogous to BizTalk's Receive Location polling the message source.

2. Process and correlate: Group messages by CorrelationId
Each message is processed sequentially (like BizTalk's ordered delivery). The workflow:
Extracts the CorrelationId from Service Bus message properties (equivalent to BizTalk's promoted property used in the Correlation Set)
Decodes flat file content with zero refactoring using the XSD schema (equivalent to BizTalk's Flat File Disassembler pipeline component)
Groups messages into a dictionary keyed by CorrelationId (equivalent to BizTalk's loop + message assignment pattern)

3. Build aggregated output
Once all messages in the batch are processed, the workflow constructs a result object for each correlation group containing the CorrelationId, the message count, and the array of decoded messages.

4. Deliver results
The aggregated output is sent to a target workflow via HTTP POST, following a decoupled architecture pattern. This is analogous to BizTalk's Send Port delivering the result to the destination system. You can substitute this action with another endpoint as needed, depending on your business case.

Azure Service Bus: The cloud-native replacement for BizTalk’s MessageBox

In BizTalk Server, the MessageBox database is the central hub for all message routing, subscription-based delivery, and correlation.
It’s the engine that enables patterns like the Aggregator — messages are published to the MessageBox, and the orchestration subscribes to them based on promoted properties and correlation sets. In Azure Logic Apps Standard, there is no MessageBox equivalent. Instead, Azure Service Bus serves as the cloud-native messaging backbone. Service Bus provides the same publish/subscribe semantics, message correlation (via the built-in CorrelationId property), peek-lock delivery, and reliable queuing — but as a fully managed, elastically scalable PaaS service with no infrastructure to maintain. This is a fundamental shift in architecture: you move from a centralized SQL Server-based message broker (MessageBox) to a distributed, cloud-native messaging service (Service Bus) that scales independently and integrates natively with Logic Apps through the built-in Service Bus connector.

Important: Service Bus is not available on-premises. However, RabbitMQ is available to cover these needs on-premises. RabbitMQ offers a fantastic alternative for customers looking to replicate BizTalk message routing, subscription-based delivery, and correlation.

Decode Flat File Invoice: Reuse your BizTalk schemas with zero refactoring

One of the biggest concerns during any BizTalk migration is: “What happens to our flat file schemas and message formats?”

The workflow template includes a Decode Flat File action (type FlatFileDecoding) that converts positional or delimited flat file content into XML — exactly like BizTalk’s Flat File Disassembler pipeline component. The key advantage: your original BizTalk XSD flat file schemas work as-is. Upload them to the Logic Apps Artifacts/Schemas folder and reference them by name in the workflow — no modifications, no refactoring.
This means:

Your existing message formats don’t change — upstream and downstream systems continue sending and receiving the same flat file messages
Your BizTalk schemas are directly reusable — the same Invoice.xsd from your BizTalk project works seamlessly with the FlatFileDecoding action
Migration effort is significantly reduced — no need to redesign schemas, re-validate message structures, or update trading partner agreements
Time-to-production is faster — focus on workflow logic and connectivity instead of rewriting message definitions

Note that if your data arrives in XML format, use the XML operations instead: Validate, Transform, Parse, and Compose XML with schema. You can find more information at Compose XML using Schemas in Standard Workflows - Azure Logic Apps | Microsoft Learn.

The message with correlation Id

Each message in the Service Bus queue is a flat file invoice in the same positional/delimited text format used in the BizTalk SDK sample. Here's an example:

INVOICE12345
DestinationPartnerURI:http://www.contoso.com?ID=1E1B9646-48CF-41dd-A0C0-1014B1CE5064
BILLTO,US,John Connor,123 Cedar Street,Mill Valley,CA,90952
101-TT Plastic flowers 10 4.99 Fragile handle with care
202-RR Fertilizer 1 10.99 Lawn fertilizer
453-XS Weed killer 1 5.99 Lawn weed killer

The message structure combines positional and delimited fields:

Line 1: Invoice identifier (fixed-length record)
Line 2: Destination partner URI — in BizTalk, this value is promoted as a context property and used in the Correlation Set to group related messages
Line 3: Bill-to header (comma-delimited: country, name, address, city, state, ZIP)
Lines 4–6: Line items (positional fields: item code, description, quantity, unit price, notes)

Why CorrelationId is essential

In BizTalk Server, the orchestration promotes `DestinationPartnerURI` from the message body into a context property and uses it as the Correlation Set to match related messages.
This requires a Property Schema, promoted properties, and pipeline configuration. In Azure Logic Apps Standard, correlation is decoupled from the message body. The `CorrelationId` is a native Azure Service Bus message property: a first-class header set by the message producer when sending to the queue. This means:

No schema changes needed: the flat file content stays exactly the same
No property promotion: Service Bus provides the correlation identifier out of the box
Simpler architecture: the workflow reads `CorrelationId` directly from the message metadata, not from the payload
Producer flexibility: any system sending to Service Bus can set the `CorrelationId` header using standard SDK methods, without modifying the message body

This is why the Aggregator pattern maps so naturally to Service Bus: the correlation mechanism that BizTalk implements through promoted properties and correlation sets is already built into the messaging infrastructure.

Step-by-step guide: Deploy the template from the Azure portal

The “Aggregate messages from Azure Service Bus by CorrelationId” template is publicly available in the Azure Logic Apps Standard templates gallery. Follow these steps to deploy it:

Prerequisites

Before you begin, make sure you have:

An Azure subscription. If you don’t have one, sign up for a free Azure account.
An Azure Logic Apps Standard resource deployed in your subscription. If you need to create one, see Create an example Standard logic app workflow.
An Azure Service Bus namespace with a non-session queue configured.
A flat file XSD schema (for example, Invoice.xsd) ready to upload to the logic app’s Artifacts/Schemas folder.
A target workflow with an HTTP trigger to receive the aggregated results (optional; can be created after deployment).

Step 1: Open the templates gallery

Sign in to the Azure portal.
Navigate to your Standard logic app resource.
On the logic app sidebar menu, select Workflows.
4. On the Workflows page, select + Create to create a new workflow.
5. In the Create a new workflow pane, select Use Template, and then select Create to open the templates gallery.

Step 2: Find the Aggregator template

1. In the templates gallery, use the search box and type "Aggregate" or "Aggregator". Optionally, filter by:
   - Connectors: Select Azure Service Bus
   - Categories: All
2. Locate the template named "Aggregate messages from Azure Service Bus by CorrelationId".
   - The template card shows the labels Workflow and Event as the solution type and trigger type.
   - The template is published by Microsoft.

Step 3: Review the template details

1. Select the template card to open the template overview pane.
2. On the Summary tab, review:
   - Connections included in this template: Azure Service Bus (in-app connector)
   - Prerequisites: Requirements for Azure Service Bus, the flat file schema, and connection configuration
   - Details: Description of the Aggregator enterprise integration pattern implementation
   - Source code: Link to the GitHub repository
3. Select the Workflow tab to preview the workflow design that the template creates. When you are ready, select Use this template.

Step 4: Provide workflow information

1. In the Create a new workflow from template pane, on the Basics tab:
   - Workflow name: Enter a name for your workflow, for example, wf-aggregator-invoices
   - State type: Select Stateful (recommended for aggregation scenarios that require run history and reliable processing)
2. Select Next.

Step 5: Create connections

1. On the Connections tab, create the Azure Service Bus connection:
   - Select Connect next to the Service Bus connection.
   - Provide your Service Bus connection string, or select the managed identity authentication option. For managed identity (recommended), make sure your logic app's managed identity has the Azure Service Bus Data Receiver role on the Service Bus namespace.
2. Select Next.
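The connection configured in Step 5 points at the same queue your producers send to. As a hedged sketch (the connection string, queue name, and helper functions below are illustrative, not part of the template), a producer using the azure-servicebus Python SDK can set CorrelationId without touching the flat file body:

```python
# Hypothetical producer sketch (not part of the template): send flat file
# invoices to the queue with CorrelationId set per destination partner.
# Connection string and queue name are placeholders for illustration.

def build_test_messages() -> list[tuple[str, str]]:
    """Return (correlation_id, body) pairs; pure Python, no SDK needed."""
    partner = "http://www.contoso.com?ID=1E1B9646-48CF-41dd-A0C0-1014B1CE5064"
    body = (
        "INVOICE12345\n"
        f"DestinationPartnerURI:{partner}\n"
        "BILLTO,US,John Connor,123 Cedar Street,Mill Valley,CA,90952\n"
        "101-TT Plastic flowers 10 4.99 Fragile handle with care"
    )
    return [(partner, body)]

def send_test_messages(connection_string: str, queue_name: str) -> None:
    # Requires the azure-servicebus package (pip install azure-servicebus).
    from azure.servicebus import ServiceBusClient, ServiceBusMessage

    with ServiceBusClient.from_connection_string(connection_string) as client:
        with client.get_queue_sender(queue_name) as sender:
            for correlation_id, body in build_test_messages():
                # CorrelationId is a first-class Service Bus property,
                # so the flat file body itself is unchanged.
                sender.send_messages(
                    ServiceBusMessage(body, correlation_id=correlation_id)
                )
```

The same snippet is handy later, in Step 9, for sending test messages with different CorrelationId values.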
Step 6: Configure parameters

1. On the Parameters tab, provide values for the workflow parameters:

   | Parameter | Description | Example value |
   |---|---|---|
   | Azure Service Bus queue name | The queue to monitor for incoming messages | invoice-queue |
   | Maximum batch size | Number of messages per batch (1-100) | 10 |
   | Flat file schema name | XSD schema name in Artifacts/Schemas | Invoice.xsd |
   | Default CorrelationId | Fallback value for messages without a CorrelationId | NO_CORRELATION_ID |
   | Target workflow URL | HTTP endpoint of the downstream workflow | https://your-logicapp.azurewebsites.net/... |
   | Target workflow timeout | HTTP call timeout in seconds | 60 |
   | Enable sequential processing | Maintain message order | true |

2. Select Next.

Step 7: Review and create

1. On the Review + create tab, review all the provided information.
2. Select Create.
3. When the deployment completes, select Go to my workflow.

Step 8: Upload the flat file schema

1. Navigate to your logic app resource in the Azure portal.
2. On the sidebar menu, under Artifacts, select Schemas.
3. Select + Add and upload your Invoice.xsd.
4. Confirm the schema appears in the list.

Note that this scenario uses the Invoice.xsd schema; upload the schema that your own scenario requires.

Step 9: Verify and test

1. On the workflow sidebar, select Designer to review the created workflow.
2. Verify that all actions are configured correctly.
3. Send test messages to your Service Bus queue with different CorrelationId values.
4. Monitor the Run history to verify successful execution and aggregation.

For more information on creating workflows from templates, see Create workflows from prebuilt templates in Azure Logic Apps.

Conclusion

The Aggregator pattern is a cornerstone of enterprise integration, and migrating it from BizTalk Server to Azure Logic Apps Standard doesn't mean starting from scratch.
By using this template, you can:

- Reuse your existing XSD flat file schemas directly from your BizTalk projects
- Replace BizTalk Correlation Sets with CorrelationId-based message grouping via Azure Service Bus
- Deploy in minutes from the Azure portal templates gallery
- Scale elastically with the Azure App Service Plan
- Monitor with Azure-native tools instead of the BizTalk Admin Console

The template is open source and available at:

- GitHub PR: Azure/LogicAppsTemplates#108
- Template name in the Azure portal: "Aggregate messages from Azure Service Bus by CorrelationId"
- Source code: GitHub repository

Whether you're migrating from BizTalk Server or building a new integration solution from scratch, this template gives you a solid, production-ready starting point. I encourage you to try it, customize it for your scenarios, and contribute back to the community.

Resources

- BizTalk Server Aggregator SDK sample
- Create workflows from prebuilt templates in Azure Logic Apps
- Create and publish workflow templates for Azure Logic Apps
- Flat file encoding and decoding in Logic Apps
- Azure Service Bus connector overview
- BizTalk to Azure migration guide
- BizTalk Migration Starter tool