Microsoft BizTalk Server Product Lifecycle Update
For more than 25 years, Microsoft BizTalk Server has supported mission-critical integration workloads for organizations around the world. From business process automation and B2B messaging to connectivity across industries such as financial services, healthcare, manufacturing, and government, BizTalk Server has played a foundational role in enterprise integration strategies. To help customers plan confidently for the future, Microsoft is sharing an update to the BizTalk Server product lifecycle and long-term support timelines. BizTalk Server 2020 will be the final version of BizTalk Server.

Guidance to support long-term planning for mission-critical workloads
This announcement does not change existing support commitments. Customers can continue to rely on BizTalk Server for many years ahead, with a clear and predictable runway to plan modernization at a pace that aligns with their business and regulatory needs.

| Lifecycle phase | End date | What's included |
| --- | --- | --- |
| Mainstream Support | April 11, 2028 | Security and non-security updates and Customer Service & Support (CSS) support |
| Extended Support | April 9, 2030 | CSS support, security updates, and paid support for fixes (*) |
| End of Support | April 10, 2030 | No further updates or support |

(*) Paid Extended Support will be available for BizTalk Server 2020 between April 2028 and April 2030 for customers requiring hotfixes for non-security issues. CSS will continue providing their typical support.

BizTalk Server 2016 is already out of mainstream support, and we recommend those customers evaluate a direct modernization path to Azure Logic Apps.

Continued Commitment to Enterprise Integration
Microsoft remains fully committed to supporting mission-critical integration, including hybrid connectivity, future-ready orchestration, and B2B/EDI modernization. Azure Logic Apps, part of Azure Integration Services (which also includes API Management, Service Bus, and Event Grid), delivers the comprehensive integration platform for the next decade of enterprise connectivity.

Host Integration Server: Continued Support for Mainframe Workloads
Host Integration Server (HIS) has long provided essential connectivity for organizations with mainframe and midrange systems. To ensure continued support for those workloads, Host Integration Server 2028 will ship as a standalone product with its own lifecycle, decoupled from BizTalk Server. This gives customers more flexibility and a longer planning horizon. Recognizing that mainframe modernization customers might be looking to integrate with their mainframes from Azure, Microsoft provides Logic Apps connectors for mainframe and midrange systems, and we are keen on adding more connectors in this space. Let us know about your HIS plans, and whether you require specific features for mainframe and midrange integration from Logic Apps, at: https://aka.ms/lamainframe

Azure Logic Apps: The Successor to BizTalk Server
Azure Logic Apps, part of Azure Integration Services, is the modern integration platform that carries forward what customers value in BizTalk while unlocking new innovation, scale, and intelligence. With 1,400+ out-of-box connectors supporting enterprise, SaaS, legacy, and mainframe systems, organizations can reuse existing BizTalk maps, schemas, rules, and custom code to accelerate modernization while preserving prior investments, including B2B/EDI and healthcare transactions. Logic Apps delivers elastic scalability, enterprise-grade security and compliance, and built-in cost efficiency without the overhead of managing infrastructure.
Modern DevOps tooling, Visual Studio Code support, and infrastructure-as-code (ARM/Bicep) ensure consistent, governed deployments with end-to-end observability using Azure Monitor and OpenTelemetry. Modernizing to Logic Apps also unlocks agentic business processes, enabling AI-driven routing, predictive insights, and context-aware automation without redesigning existing integrations. Logic Apps adapts to business and regulatory needs, running fully managed in Azure, hybrid via Arc-enabled Kubernetes, or, under evaluation, in air-gapped environments. Throughout this lifecycle transition, customers can continue to rely on the BizTalk investments they have made while moving toward a platform ready for the next decade of integration and AI-driven business.

Charting Your Modernization Path
Microsoft remains fully committed to supporting customers through this transition. We recognize that BizTalk systems support highly customized and mission-critical business operations, and modernization requires time, planning, and precision. We plan to provide:
- Proven guidance and recommended design patterns
- A growing ecosystem of tooling supporting artifact reuse
- Unified Support engagements for deep migration assistance
- A strong partner ecosystem specializing in BizTalk modernization
- Potential incentive programs to help facilitate migration for eligible customers (details forthcoming)

Customers can take a phased approach, starting with new workloads while incrementally modernizing existing BizTalk deployments.

We're Here to Help
Migration resources are available today:
- Overview: https://aka.ms/btmig
- Best practices: https://aka.ms/BizTalkServerMigrationResources
- Video series: https://aka.ms/btmigvideo
- Feature request survey: https://aka.ms/logicappsneeds
- Reactor session: Modernizing BizTalk: Accelerate Migration with Logic Apps - YouTube

We encourage customers to engage their Microsoft account team early to assess readiness, identify modernization opportunities, and explore assistance programs.

Your Modernization Journey Starts Now
BizTalk Server has played a foundational role in enterprise integration success for more than two decades. As you plan ahead, Microsoft is here to partner with you every step of the way, ensuring operational continuity today while unlocking innovation tomorrow. To begin your transition, please contact your Microsoft account team or visit our migration hub. Thank you for your continued trust in Microsoft and BizTalk Server. We look forward to partnering closely with you as you plan the future of your integration platforms.

Frequently Asked Questions

Do I need to migrate now?
No. BizTalk Server 2020 is fully supported through April 11, 2028, with paid Extended Support available through April 9, 2030, for non-security hotfixes. CSS will continue providing their typical support. You have a long and predictable runway to plan your transition.

Will there be a new BizTalk Server version?
No. BizTalk Server 2020 is the final version of the product.

What happens after April 9, 2030?
BizTalk Server will reach End of Support, and security updates and technical assistance will no longer be provided. Workloads will continue running, but without Microsoft servicing.

Is paid support available past 2028?
Yes. Paid Extended Support will be available through April 2030 for BizTalk Server 2020 customers looking for non-security hotfixes. CSS will continue to provide their typical support.

What about BizTalk Server 2016 or earlier versions?
Those versions are already out of mainstream support.
We strongly encourage moving directly to Logic Apps rather than upgrading to BizTalk Server 2020.

Will Host Integration Server continue?
Yes. Host Integration Server (HIS) 2028 will be released as a standalone product with its own lifecycle and support commitments.

Can I reuse BizTalk Server artifacts in Logic Apps?
Yes. Most BizTalk maps, schemas, rules, assemblies, and custom code can be reused with minimal effort using Microsoft and partner migration tooling. We welcome feature requests here: https://aka.ms/logicappsneeds

Does modernization require moving fully to the cloud?
No. Logic Apps supports hybrid deployments for scenarios requiring local processing or regulatory compliance, and fully disconnected environments are under evaluation. More information about the hybrid deployment model is available here: https://aka.ms/lahybrid

Does modernization unlock AI capabilities?
Yes. Logic Apps enables AI-driven automation through Agent Loop, improving routing, decisioning, and operational intelligence.

Where do I get planning support?
Your Microsoft account team can assist with assessment and planning. Migration resources are also linked in this announcement to help you get started.

Announcement: General Availability of Logic Apps Hybrid Deployment Model
We are thrilled to announce the General Availability of the Logic Apps Hybrid Deployment Model, a groundbreaking feature that offers unparalleled flexibility and control to our customers. This deployment model allows you to run Logic Apps workloads on customer-managed infrastructure, giving you the option to host your integration solutions on-premises, in a private cloud, or even in a third-party public cloud.

With the Logic Apps Hybrid Deployment Model, you can tailor your integration solutions to meet your specific needs, whether for regulatory compliance, data privacy, or network restrictions. This model ensures that you have the freedom to choose the best environment for your workflows, while still leveraging the powerful capabilities of Azure Logic Apps.

The Hybrid Deployment Model supports a semi-connected architecture, offering local processing of workflows, local storage, and local network access. This means that the data processed by the workflows remains in your local SQL Server, and you have the ability to connect to local networks. Additionally, the built-in connectors execute in your local compute, giving you access to local data sources and higher throughput.

Since we launched the public preview, we have received an overwhelmingly positive response from customers across various industries. Many customers, including those looking to migrate from BizTalk Server, have expressed interest in this offering due to its ability to co-locate integration platforms near key lines of business systems, avoiding dependencies on the public internet to process transactions.

Journey of the Hybrid Deployment Model Feature
At the Integrate 2024 event, we announced the early access preview of the Hybrid Deployment Model for Logic Apps Standard. This initial phase allowed interested parties to nominate themselves for early access and provided valuable feedback on the model's functionality and benefits. Following the private preview, we launched the public preview, which empowered our customers with additional flexibility and control. This phase allowed customers to build and deploy workflows on customer-managed infrastructure, offering the option to run Logic Apps on-premises, in a private cloud, or in a third-party public cloud. The public preview also introduced the semi-connected architecture, enabling local processing of workflows and access to local data sources.

In October 2024, we refreshed the public preview and received an overwhelmingly positive response from customers across various industries. This feedback highlighted the model's ability to meet specific use cases, such as migrating from BizTalk Server and co-locating integration platforms near key lines of business systems. The public preview refresh also emphasized the model's alignment with our promise of providing customers with more options to meet their business needs. We are excited to see how our customers will leverage the Logic Apps Hybrid Deployment Model to meet their business needs and drive innovation. Thank you for your continued support and feedback.

New features in the GA release:

OpenTelemetry support: OpenTelemetry is a vendor-neutral, open-source observability framework for instrumenting, generating, collecting, and exporting telemetry data. Support for OpenTelemetry in the Hybrid Deployment Model ensures seamless logging in semi-connected scenarios and provides the ability to choose any observability platform as a telemetry endpoint. More details here.
To set up the OpenTelemetry capability from the Azure portal, follow these steps:

1. Open the host.json file in the root directory of the SMB file share path configured in your logic app.
2. In the host.json file, at the root level, add the telemetryMode setting with the OpenTelemetry value, for example:

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}
```

When you enable OpenTelemetry in the host.json file, your logic app exports telemetry based on the OpenTelemetry-supported app settings that you define in the environment. Add the app settings below from the portal by navigating to Containers --> Environment variables --> edit and deploy.

| App setting | Description |
| --- | --- |
| OTEL_EXPORTER_OTLP_ENDPOINT | The OpenTelemetry Protocol (OTLP) exporter endpoint URL where the telemetry data is sent. |
| OTEL_EXPORTER_OTLP_HEADERS (optional) | A list of headers to apply to all outgoing data. Commonly used to pass authentication keys or tokens to your observability backend. |

If your OpenTelemetry endpoint requires other OpenTelemetry-related settings, include those in the app settings as well.

Support for Zip deployment through VS Code: Support for Zip deployment from VS Code makes the deployment experience more reliable. This feature uses Microsoft Entra authentication for deployment, so the VS Code machine does not require permissions on the SMB share and you do not need to provide SMB credentials in subsequent deployments. To use Zip deployment, follow these steps:

1. Create an app registration.
2. In the VS Code deployment, provide the Client ID, Object ID, and Client secret values.

If there are any concerns with creating an app registration, you can continue to use the SMB deployment option by choosing "Use SMBDeployment For Hybrid" in the Extensions configuration of VS Code. If you would like to use Zip deployment in an existing logic app, you will need to manually add the app settings as indicated here. The Zip deployment APIs can also be used in CI/CD pipelines for DevOps deployment; we will be publishing another blog with detailed steps on the DevOps process.

Support for more regions: We are pleased to announce the expansion of hybrid deployment support to additional regions, in response to valuable customer feedback. This enhancement aims to better meet the diverse geographic and operational requirements of your businesses. Hybrid deployment is now available in the following regions: Central US, East Asia, East US, North Central US, Southeast Asia, Sweden Central, UK South, West Europe, and West US.

Logic Apps Rules Engine support on Linux containers: In this release, we have added support for the Azure Logic Apps Rules Engine to run on Linux containers, which enables customers to use the Rules Engine capabilities in hybrid Logic Apps.

Improvements for effective scaling and performance: We have introduced a few improvements in the runtime storage and the scaling behaviour aimed at improving performance and achieving effective scaling.
Please refer to the following articles:
- Scaling mechanism in hybrid deployment model for Azure Logic Apps Standard | Microsoft Community Hub
- Hybrid deployment model for Logic Apps - Performance Analysis and Optimization recommendations | Microsoft Community Hub

Diagnostic tool: To assist with troubleshooting environment configuration issues, we have created a troubleshooting tool that helps you review the health of all the components of the hybrid deployment and provides insights. You can find the script in our GitHub repository. Select the troubleshoot.ps1 file, copy it to a folder, and run the script using PowerShell. This script should be run where you have access to kubectl.

References:
- Create Standard logic app workflows for hybrid deployment - Azure Logic Apps | Microsoft Learn
- Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn
- Set up and view enhanced telemetry for Standard workflows - Azure Logic Apps | Microsoft Learn

Announcing: Unleash AI Innovation with a Modern Integration Platform and an API-First Strategy
As AI technologies continue to evolve, they offer businesses a unique opportunity to modernize operations, accelerate innovation, and unlock new growth potential. To stay ahead of the curve, organizations need a comprehensive integration and API strategy that seamlessly connects data, applications, and AI across their entire ecosystem. We're excited to announce the "Unleash AI Innovation with a Modern Integration Platform and an API-First Strategy" event. Over two action-packed days, you'll gain valuable insights from Azure leaders, industry analysts, and enterprise customers about how Azure Integration Services and Azure API Management are driving efficiency and agility and fueling business growth in the AI-powered era.

Why Attend?
From security to development, customer success stories to expert analyst insights, this event will highlight why APIs and integration are critical for success now and in the future.
- Get exclusive industry insights: Gain expert perspectives from IDC's Shari Lava, Azure product leaders, and Forrester consultant Andrew Nadler on the latest trends shaping enterprise integration and API strategies.
- Learn from real-world customer stories: Hear firsthand from organizations like DocuSign, Visa, LyondellBasell, Metcash, Khoj, Brisbane City Council, Moneris, Heineken, Transcard, and CareFirst BlueCross BlueShield on how they are transforming operations with Azure Integration Services and Azure API Management.
- Accelerate your AI and integration strategy: Learn how Azure Logic Apps makes AI-driven automation more accessible than ever, and how Azure API Management empowers businesses to securely scale AI-powered APIs.

Event Highlights

Day 1: Drive Business Growth with a Modern Integration Platform
In today's competitive landscape, businesses must seamlessly connect data, applications, and AI. On Day 1, you'll explore how Azure Integration Services helps organizations break down data silos, unlock real-time insights, and optimize operations. Learn how connected data streams enable smarter, faster decision-making, while AI-powered workflows reduce complexity and drive operational efficiency. We'll also explore how businesses are modernizing legacy systems by migrating from BizTalk and other on-premises integration solutions to Azure Integration Services, providing greater scalability, agility, and business continuity.

Day 2: Power AI and Enterprise Innovation with an API-First Strategy
On Day 2, you'll dive deep into how APIs are the backbone of modern digital ecosystems. APIs enable businesses to scale faster, enhance developer experiences, and create new revenue streams. Learn how Azure API Management helps you secure, manage, and monetize APIs while accelerating AI adoption. You'll also discover best practices for securing and governing APIs across distributed environments, ensuring that your AI-powered ecosystem remains secure, scalable, and compliant.

Streamed Live Across Multiple Time Zones
Join us no matter where you are! We're streaming live across multiple time zones, so you can participate at a time that works best for you.

US/Canada: Reserve your seat today!
- Day 1: Tuesday, 29 April 2025 | 9:00 AM – 12:30 PM PDT
- Day 2: Wednesday, 30 April 2025 | 9:00 AM – 12:30 PM PDT

Australia/New Zealand: Reserve your seat today!
- Day 1: Wednesday, 30 April 2025 | 9:00 AM – 12:30 PM AEDT
- Day 2: Thursday, 1 May 2025 | 9:00 AM – 12:30 PM AEDT

Europe: Reserve your seat today!
- Day 1: Tuesday, 29 April 2025 | 9:00 AM – 12:30 PM BST
- Day 2: Wednesday, 30 April 2025 | 9:00 AM – 12:30 PM BST

Ready to Future-Proof Your Integration and API Strategy?
Don't miss this exclusive opportunity to learn from industry experts, Azure leaders, and top enterprises. Discover how to future-proof your integration and API strategy to drive AI-powered growth and business success.

🚀 General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We're excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

What's new
The new UX, previously available in public preview, is now the default experience in the Logic Apps Standard extension. This GA release reflects direct feedback from our integration developer community. We've resolved blockers reported by customers and usability issues that impacted performance and stability, including:
- Opening V1 maps in V2: Seamlessly open and edit maps you have already created, with the latest visual capabilities.
- Load schemas on Mac: Addressed schema-related crashes on macOS for a smoother experience.
- Function documentation updates: Improved guidance and examples for built-in collection functions that apply to repeating nodes.

Stay connected
We would love to hear your feedback. Please use this form link to let us know if there are any remaining gaps or scenarios that are not yet covered.

Announcing the BizTalk Server 2020 Cumulative Update 7
The BizTalk Server product team has released Cumulative Update 7 for BizTalk Server 2020. Cumulative Update 7 contains all released functional and security fixes for customer-reported issues for BizTalk Server 2020. CU7 also adds support for the following new Microsoft platforms:
- Microsoft Visual Studio 2022
- Microsoft Windows Server 2022
- Microsoft SQL Server 2022
- Microsoft Windows 11

BizTalk Server 2016 is already out of mainstream support and reaches its end of life in 2027. If you are running BizTalk 2016 or earlier versions of the product, you must upgrade to BizTalk Server 2020 CU6 or strongly consider migrating to Azure Logic Apps. Please fill out this survey: https://aka.ms/biztalklogicapps.

More information about CU7:
- This is an optional update, needed only if you require Visual Studio 2022. If you don't need VS 2022, you can continue running on BizTalk Server 2020 CU6.
- CU7 requires that you re-create BizTalk groups for BizTalk Server instances that were already part of a BizTalk group before the installation. Existing BizTalk groups can't have different instances at different cumulative update levels.
- We will provide support to our CU6 and CU7 customers.
- You can obtain the software from the Microsoft 365 admin center or the Visual Studio Subscriber site.

For more information about BizTalk Server 2020 CU7, read the Microsoft Knowledge Base article posted at https://aka.ms/BTS2020CU7KB.

Announcing the BizTalk Server 2020 Cumulative Update 6
The BizTalk Server product team has released Cumulative Update 6 for BizTalk Server 2020. Cumulative Update 6 contains all released functional and security fixes for customer-reported issues for BizTalk Server 2020. CU6 also adds support for the following new Microsoft platforms:
- Microsoft Windows Server 2022
- Microsoft SQL Server 2022
- Microsoft Windows 11

BizTalk Server 2016 is already out of mainstream support and reaches its end of life in 2027. If you are running BizTalk 2016 or earlier versions of the product, you must upgrade to BizTalk Server 2020 CU6 or strongly consider migrating to Azure Logic Apps. Please fill out this survey: https://aka.ms/biztalklogicapps.

More information about CU6:
This cumulative update includes all the product components. However, only the components that are currently installed on the system are updated. CU6 includes fixes in the following areas:
- BizTalk Server Adapters: updates to the WCF-SAP adapter and the SFTP adapter
- BizTalk Server Administration Tools and Management APIs: lost changes to SQL Server Agent jobs

You can obtain the software from the Microsoft Download Center at https://aka.ms/BTS2020CU6. For more information about BizTalk Server 2020 CU6, read the Microsoft Knowledge Base article posted at https://aka.ms/BTS2020CU6KB.

🔁 Public Preview Refresh: More Power to Data Mapper in Azure Logic Apps
We're back with a Public Preview refresh for the Data Mapper in Azure Logic Apps (Standard), bringing forward some long-standing capabilities that are now fully supported in the new UX. In our initial announcement, we introduced a redesigned experience focused on usability, error handling, and improved mapping for complex schemas. As we continue evolving the tool, we're working to bring feature parity with the classic experience while layering in modern enhancements along the way. With this update, several existing capabilities from the legacy Data Mapper are now available in the new preview version, so you can bring your advanced scenarios forward with confidence.

🛠️ Run XSLT Inside Your Data Map
The ability to apply XSLT has long been a powerful feature in Logic Apps, and we're excited to bring Run XSLT support into the new UX. You can now invoke reusable transformation logic from your map, including:
- Enterprise-grade XSLT
- Predefined templates or logic from your BizTalk workflows

How to try it out:
1. Create a new data map: right-click on the MapDefinitions or Maps folder and click Create new data map.
2. Store the XSLT file under Artifacts -> DataMapper/Extension -> InlineXslt.
3. Open the data map and search for Run XSLT in the functions panel.
4. Add the function and select the XSLT file you want to run from the dropdown.
5. Connect it to the desired destination node. In my case, the function simply adds a "Placeholder" value for the Name node at the destination, alongside an "EmployeeType" node. Note that you do not need to connect any source node to the XSLT function, because this is custom XSLT logic applied directly at the destination node. (A minimal sample stylesheet for this scenario is sketched after the feature walkthroughs below.)

Upon testing the map, the right value is generated in the destination schema.

🔍 Execute XPath to Extract Targeted Values
Execute XPath is now supported in the new experience, giving you control to extract specific values from nested XML structures. This function is particularly useful for:
- Accessing attributes and nested elements
- Applying logic based on the structure or content of incoming data

How to try it out:
1. Search for Execute XPath in the functions panel.
2. Add the function and enter the XPath expression you want to evaluate.
3. Map it to the destination node.

Here is what the map will look like: the test payload correctly creates multiple Address nodes at the destination based on the Address node at the source.

🧩 Use Custom XML Functions
Custom XML functions allow you to define and reuse logic across your map. This helps reduce duplication and supports schema-specific transformations. Now that support is available in the new UX, you can:
- Wrap complex logic into manageable components
- Handle schema-specific edge cases with ease

How to try it out:
1. Add the .xml function file under Artifacts -> DataMapper/Extension -> Functions.
2. Open the data map and, under the Utility category of functions, select the new function. In our case, the XML function is called Age.
3. Connect the function input to the Date_of_Birth node at the source and the output to the Age node at the destination.

Test the map and notice that the age is calculated correctly at the destination node.

🌒 Dark Mode Support in VS Code
The new UX now respects Dark Mode in VS Code, giving you a visually cohesive, low-contrast authoring experience that's perfect for long mapping sessions. No extra steps are needed: Dark Mode works automatically based on your VS Code theme settings.
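For reference, here is a minimal sketch of the kind of inline stylesheet the Run XSLT walkthrough above might invoke from Artifacts -> DataMapper/Extension -> InlineXslt. The element names (Employee, Name, EmployeeType) and the FullTime value are hypothetical placeholders, not part of any shipped schema; the point is simply a stylesheet that writes a fixed "Placeholder" Name and adds an EmployeeType node at the destination, as described in the walkthrough.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Hypothetical inline stylesheet for the Run XSLT walkthrough above.
     Element names (Employee, Name, EmployeeType) are illustrative only. -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>

  <!-- Identity template: copy everything not handled by a more specific template. -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- For each Employee, write a fixed "Placeholder" Name and add an EmployeeType node,
       mirroring the destination values described in the walkthrough. -->
  <xsl:template match="Employee">
    <Employee>
      <Name>Placeholder</Name>
      <EmployeeType>FullTime</EmployeeType>
      <!-- Copy the remaining children, skipping the original Name element. -->
      <xsl:apply-templates select="node()[not(self::Name)]"/>
    </Employee>
  </xsl:template>
</xsl:stylesheet>
```

Because the Run XSLT function applies this logic directly at the destination node, no source connection is required, which matches the behaviour described in the walkthrough.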
⚙️ How to Enable the New Experience
If you haven't yet tried the new UX:
1. Open your Logic Apps (Standard) project in VS Code.
2. Go to Logic Apps (Standard) extension → Settings → Data Mapper.
3. Select Version ~2.

You'll find detailed walkthroughs in the initial preview announcement blog.

💬 We'd Love Your Feedback
We're continuously evolving the Data Mapper, and your feedback is key to getting it right, especially as we bring more advanced transformation scenarios into the new experience.
👉 Submit your feedback here
🐛 Found an issue or have a specific feature request? Let us know on GitHub Issues.
Thanks again for being part of the journey. More updates coming soon! 🚀

Hybrid deployment model for Logic Apps - Performance Analysis and Optimization recommendations
A few weeks ago, we announced the Public Preview Refresh release of the Logic Apps hybrid deployment model, which allows customers to run Logic Apps workloads on customer-managed infrastructure. This model provides the flexibility to execute workflows either on-premises or in any cloud environment, thereby offering enhanced control over the operation of logic apps. By utilizing customer-managed infrastructure, organizations can adhere to regulatory compliance requirements and optimize performance according to their specific needs. As customers consider leveraging hybrid environments, understanding the performance of logic apps under various configurations and scenarios becomes critical. This document offers an in-depth performance evaluation of Azure Logic Apps within a hybrid deployment framework. It examines several key factors, such as CPU and memory allocation and scaling mechanisms, providing insights aimed at maximizing the application's efficiency and performance.

Achieving Optimal Logic Apps Performance in Hybrid Deployments
In this section, we explore the key aspects that affect Logic Apps performance when deployed in a hybrid environment. Factors such as the underlying infrastructure of the Kubernetes environment, SQL configuration, and scaling configuration can significantly impact the efficiency of workflows and the overall performance of the applications. The following blog entry provides details of the scaling mechanism of the hybrid deployment model: Scaling mechanism in hybrid deployment model for Azure Logic Apps Standard | Microsoft Community Hub

Configure container resource allocation: When you create a logic app, a default allocation of 0.5 vCPU and 1 GiB of memory is assigned. From the Azure portal, you can modify this allocation from the Container blade (see Create Standard logic app workflows for hybrid deployment - Azure Logic Apps | Microsoft Learn). Currently, the maximum allocation is 2 vCPU and 4 GiB of memory per app; a provision to choose higher allocations is planned for the future. For CPU-intensive or memory-intensive processing, such as custom code executions, select a higher value for these parameters. In the next section, we compare performance with different values of CPU and memory allocation. This allocation also affects the billing calculation of the Logic App resource; refer to the vCPU calculation for more details on the billing impact.

Optimize the node count and size in the Kubernetes cluster: Kubernetes runs application workloads by placing containers into Pods that run on Nodes. A node may be a virtual or physical machine, depending on the cluster. A node pool is a group of nodes that share the same configuration (CPU, memory, networking, OS, maximum number of pods, and so on). You can choose the capacity (cores and memory), minimum node count, and maximum node count for each node pool of the Kubernetes cluster. We recommend allocating higher capacity for CPU-intensive or memory-intensive applications.

Configure scale rule settings: For a Logic App resource, we recommend configuring the maximum and minimum replicas that can be scaled out when a scale event occurs. A higher value for the maximum replicas helps with sudden spikes in the number of application requests. The interval at which the scaler checks for a scaling event and the cooldown period for scaling can also be configured from the Scale blade of the Logic App resource. These parameters impact the scaling pattern.
Optimize the SQL Server configuration: The hybrid deployment model uses Microsoft SQL for runtime storage. As such, many SQL operations are performed throughout the execution of a workflow, and SQL capacity has a significant impact on the performance of the app. The Microsoft SQL Server can be either a SQL Server on Windows or an Azure SQL database. A few recommendations on the SQL configuration for better performance:
- If you are using an Azure SQL database, run it on a SQL elastic pool.
- If you are using SQL Server on Windows, run with at least a 4 vCPU configuration.
- Scale out the SQL Server once its CPU usage hits 60-70% of the total available CPU.

Performance analysis:
For this performance analysis exercise, we used a typical enterprise integration scenario that includes the following components:
- Data transformation: XSLT transformation, validation, and XML parsing actions
- Data routing: File System connector for storing the transformed content in a file share
- Message queuing: RabbitMQ connector for sending the transformation result to a RabbitMQ queue endpoint
- Control operations: For-each loop for looping through multiple records, condition execution, scope, and error-handling blocks
- Request/response: The XML data is transmitted via an HTTP request, and the status is returned as a response

Summary:
For these tests, we used the following environment settings:
- Kubernetes cluster: AKS cluster with Standard D2s v3 nodes (2 vCPU, 8 GiB memory)
- Max replicas: 20
- Cooldown period: 300 seconds
- Polling interval: 30 seconds

With the above environment and settings, we performed multiple application tests with different configurations of SQL Server, resource allocation, and test duration using the Azure Load Testing tool. The following table summarizes the response time, throughput, and total vCPU consumption for each of these configurations. You can check each scenario for detailed information.

| Scenario | SQL | CPU and memory allocation per Logic App | Test duration | Load | 90th percentile response time | Throughput | Total vCPU consumed |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Scenario 1 | SQL General Purpose V2 | 1 vCPU / 2 GiB memory | 10 minutes with 50 users | 503 requests | 68.62 seconds | 0.84/s | 3.42 |
| Scenario 2 | SQL elastic pool, 4000 DTU | 1 vCPU / 2 GiB memory | 10 minutes with 50 users | 1004 requests | 40.74 seconds | 1.65/s | 3 |
| Scenario 3 | SQL elastic pool, 4000 DTU | 2 vCPU / 4 GiB memory | 10 minutes with 50 users | 997 requests | 40.63 seconds | 1.66/s | 4 |
| Scenario 4 | SQL elastic pool, 4000 DTU | 2 vCPU / 4 GiB memory | 30 minutes with 50 users | 3421 requests | 26.6 seconds | 1.9/s | 18.6 |
| Scenario 5 | SQL elastic pool, 4000 DTU | 0.5 vCPU / 1 GiB memory | 30 minutes with 50 users | 3055 requests | 31.38 seconds | 1.7/s | 12.4 |
| Scenario 6 | SQL 2022 Enterprise on Standard D4s v3 VM | 0.5 vCPU / 1 GiB memory | 30 minutes with 50 users | 4105 requests | 27.15 seconds | 2.28/s | 10 |
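As a quick sanity check when reading the table, the reported throughput figures are consistent with dividing the total number of requests (the Load column) by the test duration. This is a back-of-the-envelope relationship inferred from the numbers above, not an official metric definition from the Azure Load Testing tool:

$$
\text{throughput} \;\approx\; \frac{\text{total requests}}{\text{test duration}},
\qquad \text{for example (Scenario 4):}\quad
\frac{3421\ \text{requests}}{30 \times 60\ \text{s}} \approx 1.9\ \text{requests/s}.
$$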
Scenario 1: SQL General Purpose V2 with 1 vCPU and 2 GiB memory - 10-minute test with 50 users
In this scenario, we conducted a load test for 10 minutes with 50 users, with a Logic App configuration of 1 vCPU and 2 GiB memory and an Azure SQL database running on the General Purpose V2 plan. There were 503 requests with multiple records in each payload, achieving a 90th percentile response time of 68.62 seconds and a throughput of 0.84 requests per second.
Scaling: The Kubernetes nodes scaled out to 12 nodes, and in total 3.42 vCPUs were used by the app for the test duration.
SQL metrics: The CPU usage of the SQL server reached 90% quite early and stayed above 90% for the remaining duration of the test. From our backend telemetry, we also observed that the action executions themselves were fast, but there was latency between actions, which indicates SQL bottlenecks.

Scenario 2: SQL elastic pool with 1 vCPU and 2 GiB memory - 10-minute test with 50 users
In this scenario, we conducted a load test for 10 minutes with 50 users, with a Logic App configuration of 1 vCPU and 2 GiB memory and an Azure SQL database running on a SQL elastic pool with 4000 DTU. There were 1004 requests with multiple records in each payload, achieving a 90th percentile response time of 40.74 seconds and a throughput of 1.65 requests per second.
Scaling: The Kubernetes nodes scaled out to 15 nodes, and in total 3 vCPUs were used by the app for the test duration.
SQL metrics: The SQL server's CPU utilization peaked at 2% of the elastic pool.

Scenario 3: SQL elastic pool with 2 vCPU and 4 GiB memory - 10-minute test with 50 users
In this scenario, we conducted a load test for 10 minutes with 50 users, with a Logic App configuration of 2 vCPU and 4 GiB memory and an Azure SQL database running on a SQL elastic pool with 4000 DTU. There were 997 requests with multiple records in each payload, achieving a 90th percentile response time of 40.63 seconds and a throughput of 1.66 requests per second.
Scaling: The Kubernetes nodes scaled out to 21 nodes, and in total 4 vCPUs were used by the app for the test duration.
SQL metrics: The SQL server's CPU utilization peaked at 5% of the elastic pool.

Scenario 4: SQL elastic pool with 2 vCPU and 4 GiB memory - 30-minute test with 50 users
In this scenario, we conducted a load test for 30 minutes with 50 users, with a Logic App configuration of 2 vCPU and 4 GiB memory and an Azure SQL database running on a SQL elastic pool with 4000 DTU. There were 3421 requests with multiple records in each payload, achieving a 90th percentile response time of 26.67 seconds and a throughput of 1.90 requests per second.
Scaling: The Kubernetes nodes scaled out to 20 nodes, and in total 18.6 vCPUs were used by the app for the test duration.
SQL metrics: The SQL server's CPU utilization peaked at 4.7% of the elastic pool.

Scenario 5: SQL elastic pool with 0.5 vCPU and 1 GiB memory - 30-minute test with 50 users
In this scenario, we conducted a load test for 30 minutes with 50 users, with a Logic App configuration of 0.5 vCPU and 1 GiB memory and an Azure SQL database running on a SQL elastic pool with 4000 DTU. There were 3055 requests with multiple records in each payload, achieving a 90th percentile response time of 31.38 seconds and a throughput of 1.70 requests per second.
Scaling: The Kubernetes nodes scaled out to 18 nodes, and in total 12.4 vCPUs were used by the app for the test duration.
SQL metrics: The SQL server's CPU utilization peaked at 8.6% of the elastic pool CPU.

Scenario 6: SQL 2022 Enterprise Gen2 on Windows Server 2022 (Standard D4s v3) with 0.5 vCPU and 1 GiB memory - 30-minute test with 50 users
In this scenario, we conducted a load test for 30 minutes with 50 users, with a Logic App configuration of 0.5 vCPU and 1 GiB memory and the SQL database running on an on-premises SQL 2022 Enterprise Gen2 instance on Windows Server 2022 with a Standard D4s v3 image (4 vCPU and 16 GiB memory). There were 4105 requests with multiple records in each payload, achieving a 90th percentile response time of 27.15 seconds and a throughput of 2.28 requests per second.
Scaling: The Kubernetes nodes scaled out to 8 nodes, and in total 10 vCPUs were used by the app for the test duration.
SQL metrics: The CPU usage of the SQL server went above 90% after a few minutes, and there was latency on a few runs.

Findings and recommendations:
The following are the findings and recommendations from this performance exercise. Keep in mind that this load test was conducted under specific conditions; if you conduct a similar test, the results and findings might vary depending on factors such as workflow complexity, configuration, resource allocation, and network configuration.
- The KEDA scaler performs scale-out and scale-in operations quickly, so the total vCPU usage remains quite low even though the nodes scaled out in the range of 1-20.
- The SQL configuration plays a crucial role in reducing the latency between action executions. For a satisfactory load test, we recommend starting with at least a 4 vCPU configuration on the SQL server and scaling out once the SQL server's CPU usage hits 60-70% of the total available CPU. For critical applications, we recommend a dedicated SQL database for better performance.
- Increasing the dedicated vCPU allocation of the Logic App resource helps with the SAP connector, the Rules Engine, .NET Framework-based custom code operations, and applications with many complex workflows.
- As a general recommendation, regularly monitor performance metrics, adjust configurations to meet evolving requirements, and follow the coding best practices for Logic Apps Standard.

For recommendations to optimize your Azure Logic Apps workloads, consider reviewing the following article: https://techcommunity.microsoft.com/blog/integrationsonazureblog/logic-apps-standard-hosting--performance-tips/3956971
