Announcing General Availability: Azure Logic Apps Standard Automated Test Framework
We’re excited to announce the General Availability (GA) of the Azure Logic Apps Standard Automated Test Framework - a major step forward in enabling developers to build, test, and maintain enterprise-grade workflows with confidence and agility.

Automated testing has become a cornerstone of modern development practices, and Logic Apps Standard now offers a robust framework to help you create unit tests for both workflow definitions and workflow runs directly within Visual Studio Code. This framework empowers teams to validate logic, simulate external dependencies, and ensure workflows behave as expected - before they’re deployed to production.

Since the public preview, we’ve listened to your feedback and continued to enhance the framework. With GA, we’re introducing several key improvements that make testing even more powerful and flexible.

What’s New in GA

Support for More Mocked Actions

You can now mock a broader range of built-in and managed connector actions, making it easier to isolate your workflow logic from external systems. This release unlocks mocking support for the following actions, which were unavailable during public preview:

- Call workflow in this logic app
- Execute inline code (JavaScript, C#, PowerShell)
- Call Functions (Azure Functions, local functions)
- XML Operations (transform, parse with schema)
- Liquid Operations (JSON to JSON, JSON to text, XML to JSON, XML to text)
- Data Mapper operations

This enhancement allows for more comprehensive and reliable unit tests, giving you more control over your workflow tests, especially in complex integration scenarios.

Access to Workflow Settings for Assertions

The framework now allows you to access and assert against workflow settings, such as parameters and app setting values. This means you can validate not just the behavior of your workflow, but also the environment in which it runs - ensuring logic consistency across different environments.

Inline Script Actions Support

Inline Code actions are now fully supported in test scenarios. JavaScript actions are executed as part of the test workflow execution, since they are part of the workflow logic. This improvement allows you to validate the logic of those scripts as part of your workflow scenarios. We are working on bringing similar support for C# and PowerShell scripts.

Learn More

To get started with the Azure Logic Apps Standard Automated Test Framework, check out the following Microsoft Learn articles:

- Create unit tests for workflow definitions in Visual Studio Code
- Create unit tests for workflow runs in Visual Studio Code
- Logic Apps Standard Automated Test SDK

Let us know what you think and stay tuned for more enhancements coming soon!
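The unit tests generated by the framework are standard C# test projects, so once you’ve created them in Visual Studio Code they can also run headlessly, for example in a CI pipeline. Here is a minimal PowerShell sketch, assuming the tests were generated into a Tests folder beside your logic app project (the paths and project name are illustrative):

# Illustrative layout - the extension generates a C# unit test project under a Tests folder
Set-Location "C:\src\MyLogicApp\Tests\MyLogicAppTests"

# Restore dependencies and run all workflow unit tests, emitting a TRX results file
dotnet restore
dotnet test --logger "trx;LogFileName=workflow-tests.trx"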
🚀 General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We’re excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

What's new

The new UX, previously available in public preview, is now the default experience in the Logic Apps Standard extension. This GA release reflects direct feedback from our integration developer community. We’ve resolved blockers that we heard from customers and usability issues that impacted performance and stability, including:

- Opening V1 maps in V2: Seamlessly open and edit maps you have already created, using the latest visual capabilities.
- Loading schemas on Mac: Addressed schema-related crashes on macOS for a smoother experience.
- Function documentation updates: Improved guidance and examples for built-in collection functions that apply to repeating nodes.

Stay connected

We would love to hear your feedback. Please use this form link to let us know about any gaps or scenarios that are not yet covered.
Logic Apps Aviators Newsletter - September 25
In this issue:

- Ace Aviator of the Month
- News from our product group
- Community Playbook
- News from our community

Ace Aviator of the Month

September’s Ace Aviator: Kritika Singh, Integration Architect & Sr. Consultant at Capgemini Norge AS

What's your role and title? What are your responsibilities?

I work as an Integration Architect & Sr. Consultant at Capgemini Norge AS. In my role I assist clients in addressing a wide range of integration challenges, with a particular emphasis on modernizing legacy systems, such as BizTalk, by transitioning them to cloud-native Azure iPaaS solutions. I’m responsible for architecting secure and scalable integration landscapes, designing and developing solutions, mentoring team members, and engaging with stakeholders and cross-functional teams. I work extensively with technologies such as Azure Logic Apps, Azure Functions, API Management, Service Bus, App Service Environment (ASEv3), Virtual Networks, and CI/CD pipelines using GitHub. One of my proudest achievements was successfully delivering a complex BizTalk modernization project to Azure that required deep technical expertise and strategic coordination.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?

As part of a distributed team across different locations and countries, my day starts with stand-ups involving both offshore and onshore team members to review progress and assign tasks. I spend time with developers, helping them navigate technical challenges and mentoring them through dedicated sessions - this has helped improve delivery quality and team confidence. I collaborate closely with cross-functional teams and stakeholders to align on requirements and solution design. Throughout the day, I work on resolving issues, improving existing solutions, exploring innovation ideas, and managing tasks.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?

I am deeply passionate about technology - having worked with BizTalk throughout my career and, for the past 5+ years, diving into Azure iPaaS. Microsoft products evolve constantly, and that sparks my curiosity to explore, learn, and innovate every day. What truly drives me is the opportunity to give back to the community by sharing my learnings, challenges, and even failures. It’s all about growing together and inspiring others along the way.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?

Curiosity and consistency matter more than perfection. Don’t be afraid to ask questions, experiment, and fail - that’s where the real learning happens. Also, find a community that supports you; sharing your journey, both wins and setbacks, can inspire others and help you grow faster.

What has helped you grow professionally?

My professional growth has been shaped by a blend of curiosity, courage, and the right opportunities. Starting with BizTalk and evolving into Azure iPaaS, I’ve embraced every challenge as a chance to learn. What’s made the biggest difference is having the boldness to take on responsibility, the willingness to take risks, and the drive to keep growing. Sharing my journey with the community has not only helped others but also deepened my own learning.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
If I had a magic wand to enhance Logic Apps, I’d bring in four powerful features to supercharge developer productivity and solution resilience:

- Seamless Version Control & Rollback: Git-like capabilities built into Logic Apps - track every change, compare versions, and roll back instantly when needed. This would empower teams to experiment confidently and collaborate more effectively without fear of breaking production workflows.
- Effortless Disaster Recovery Setup: Setting up DR should be as simple as a few clicks. A built-in, automated DR configuration for AIS would ensure business continuity, reduce downtime, and give developers peace of mind - especially in mission-critical environments.
- Native JSON Mapper (not Liquid): A visual, intuitive JSON mapping tool would simplify complex data transformations, reduce manual coding, and speed up development. This would be a game-changer for integration scenarios, especially when working with dynamic schemas and APIs.
- Simplified Authorization like Claim Checks for Logic Apps Standard (beyond EasyAuth): A more developer-friendly authorization setup that minimizes manual configuration and integrates seamlessly with identity providers. This would make securing Logic Apps faster, easier, and more consistent across environments.

News from our product group

Logic Apps Live August 2025
Missed Logic Apps Live in August? You can watch it here. We had a recap on Logic Apps Hybrid, our special guest Kritika Singh talking about her learnings with BizTalk migration to AIS, and updates on Data Mapper GA and Logic Apps Standard Deployment Center.

Logic Apps Community Day 2025
We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star! Call for Speakers is still open until September 07, 2025 - so hurry and submit your session!

General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We’re excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

Announcing: Setup CD in Azure Logic Apps Standard with Deployment Center
Looking to automate your Azure Logic Apps code deployments in a faster way? Deployment Center - a built-in feature in Azure Logic Apps Standard - is now available, with built-in support for your VS Code projects, making it easier to deploy Logic Apps from your source control repository. Deployment Center is designed to make deploying, updating, and managing your Logic Apps workflows simple and straightforward.

Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster
Explore how Hybrid Logic Apps run effortlessly on K3s - delivering the power of Hybrid Logic Apps without the complexity and heavy infrastructure demands of a full Kubernetes cluster!

News from our community

What gets returned to the LLM by my Logic App Agent Loop tool?
Video by Michael Stephenson
Michael has been experimenting with the Logic Apps agent loop and, following a discussion with Kent Weare about the interaction between the workflow and the LLM, aimed to understand what data is returned to the model, since the number of tokens influences cost. He summarizes his findings from that conversation in this brief video.
SOAP 1.2 Calls from Logic Apps – Fixing Unsupported Media & WS-Addressing Errors
Post by Prashant Singh
Struggling with SOAP 1.2 in Azure Logic Apps? Learn how to fix Unsupported Media errors, decode MTOM responses, and handle WS-Addressing headers for seamless integration in this post by Prashant.

Demystifying AI Agent Loops in Logic Apps: The Future of Integration (But Not Everywhere)
Post by Al Ghoniem
Explore how AI agent loops enhance Azure Logic Apps for non-deterministic tasks like anomaly detection and IT Ops triage - while knowing when traditional workflows are the better fit.

Can I use AI to create and deploy an Azure Logic Apps with Business Central connector?
Post by Stefano Demiliani
Stefano is testing the boundaries of what AI can do, so you don’t have to - he ran a blind test showing that AI can deploy Azure resources well, but struggles with external connectors like Business Central. Learn what worked, what didn’t, and why better prompts matter.

How to use ChatGPT Agent Mode with Azure!
Video by Stephen W. Thomas
It looks like August was the month to experiment with Azure resources. Stephen did some research too and shows you how easily ChatGPT-5 Agent Mode can auto-provision resources in Azure. This video demonstrates how to use a single prompt to build a logic app and create a resource group. Follow along with his video to see how to get the ChatGPT-5 agent working for you!

Integration Love Story - Andrew Wilson
Video by Ahmed Bayoumy and Robin Wilde
In this special fast-paced episode recorded at INTEGRATE, Ahmed and Robin sit down with the brilliant Andrew, newly awarded Microsoft MVP and Logic Apps Ace Aviator, to talk about his journey, passions, and why integration is the powerhouse behind every digital experience.

Tips for Migrating SAP IDoc Reception Workloads from BizTalk to Azure Logic Apps
Post by Francois Malgreve
Learn how to reuse BizTalk XSLTs in Azure Logic Apps! In this post by Francois, you will learn how to configure the SAP trigger with the right IDoc format and namespace settings - minimizing code changes and easing migration.

Query Azure DevOps work items with Logic App and Managed Identity
Post by Michael Stephenson
Learn how to use a reusable Logic App and a user-assigned managed identity to securely query Azure DevOps work items using WIQL - ideal for building scalable, secure workflows.

Understand Agent Loops in Azure Logic Apps
Video by Srikanth Gunnala
In this video, Srikanth explores Azure Logic Apps AI agents - also known as agent workflows or agent loops - and how they’re redefining workflow automation with Azure OpenAI. You’ll learn what an AI agent is in Azure Logic Apps, how it works, and see a live demo of building an AI-powered, adaptive workflow.

Automate Microsoft Fabric Cost Savings with Logic Apps
Post by Sherry L. Robinson
Learn how to pause and resume Microsoft Fabric capacity using Azure Logic Apps - cutting costs during off-hours with minimal code and seamless integration via REST API or Resource Manager, in this insightful post by Sherry.

Use Graph API to send Emails in Logic Apps
Post by Şahin Özdemir
In this post, Şahin shows you how to use the Microsoft Graph API with service principals to securely send emails from Logic Apps using app registrations and access policies. This is quite useful in cases where you can’t associate the calls with a user account, which is a requirement for the Office 365 connector.
Hybrid Logic Apps deployment on Rancher K3s Kubernetes cluster
K3s is a lightweight Kubernetes distribution, certified by the Cloud Native Computing Foundation (CNCF) and originally developed by Rancher. It is optimized for on-premises environments with limited resources, making it ideal for edge computing and hybrid scenarios. Unlike a full Kubernetes distribution, K3s reduces overhead while maintaining full Kubernetes API compatibility. This makes K3s a perfect choice for running Azure Logic Apps Standard close to your data sources (e.g., on-prem SQL Server, local file shares) without the heavy infrastructure requirements of a full Kubernetes cluster.

Five steps are needed to set up Hybrid Logic Apps, including the underlying infrastructure, as illustrated in the following diagram. Most of these steps are the same as discussed in the Hybrid Logic Apps documentation (Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn), except for the K3s setup.

Step 1: Prepare the K3s Cluster

Docker Desktop setup: In this case, the host machine runs Windows 11, so I decided to use Docker with WSL2 to run the containers. Install Docker Desktop with WSL2 (Docker Desktop: The #1 Containerization Tool for Developers | Docker) and make sure WSL2 is selected.

Install K3s on your infrastructure and create a single-node cluster using k3d:

# Install choco, kubectl, and Helm
Set-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))
choco install kubernetes-cli -y
choco install kubernetes-helm -y
choco install k3d -y

# Open a new PowerShell window and create the cluster
k3d cluster create

# Delete the default Traefik load balancer, as it conflicts with ports 80 and 443
# (the load balancer can be configured to use other ports if needed)
kubectl delete svc traefik -n kube-system
kubectl delete deployment traefik -n kube-system

The next two steps are the same as given in Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn:

Step 2: Connect the Kubernetes cluster to Azure Arc

Step 3: Set up the Azure Container Apps extension and environment

Note: you need to skip the CoreDNS setup required for Azure Local, as given in Update CoreDNS.

Step 4: Configure Storage for SQL and SMB

SQL Database (Runtime Store): Hybrid Logic Apps use a SQL database for runtime operations and run history. In this scenario I used an on-premises SQL Server with SQL authentication. I set up SQL Server 2022 on the Windows host machine, enabled SQL Server authentication, and added a new SQL admin user. Please follow the link for more details.
The SQL connection string can be validated using the following PowerShell script:

$connectionString = "Server=<server IP address>;Initial Catalog=<databaseName>;Persist Security Info=False;User ID=<sqluser>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;"
try {
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = $connectionString
    $connection.Open()
    Write-Host "✅ Connection successful"
    $connection.Close()
} catch {
    Write-Host "❌ Connection failed: $($_.Exception.Message)"
}

SMB (Artifacts Store): SMB is used as the local file share on the Windows host machine; it is advisable to create a dedicated user for the Windows SMB share:

$Username = "k3suser"
$Password = ConvertTo-SecureString "<password complex>" -AsPlainText -Force
$FullName = "K3s user"
$Description = "Created via PowerShell"

# Create the user
New-LocalUser -Name $Username -Password $Password -FullName $FullName -Description $Description
Add-LocalGroupMember -Group "Users" -Member $Username

Once the user is created, create an Artifacts folder on the Windows host machine and grant the user read and write access (a sketch of this appears at the end of this article). Please follow the link for more details.

Step 5: Create your Logic App (Hybrid)

With all prerequisites and infrastructure in place, the next step is to create the Logic App (Hybrid) using the connection string and SMB share path configured above. This can be accomplished through the Azure portal, as outlined below. You can then create Logic Apps workflows using the designer and execute them.
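To round out Step 4, here is a minimal sketch of creating and sharing the Artifacts folder for the SMB user created above. The folder path and share name are illustrative assumptions - adjust them to your environment:

# Create the Artifacts folder (path and share name are assumptions for illustration)
New-Item -ItemType Directory -Path "C:\Artifacts" -Force

# Share the folder over SMB and grant the dedicated user full access to the share
New-SmbShare -Name "Artifacts" -Path "C:\Artifacts" -FullAccess "$env:COMPUTERNAME\k3suser"

# Grant modify rights on the underlying NTFS folder (inherited by files and subfolders)
icacls "C:\Artifacts" /grant "k3suser:(OI)(CI)M"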
Reliable B2B Tracking using Premium SKU Integration Account
In the world of enterprise integration, accurate and reliable tracking of B2B transactions is crucial for maintaining compliance, troubleshooting issues, and ensuring smooth business operations. Organizations that rely on Logic Apps for EDI X12, EDIFACT, or AS2 transactions need robust tracking capabilities to monitor their B2B exchanges effectively.

Currently, in Logic Apps Consumption, B2B tracking is powered by Azure Log Analytics, which provides basic telemetry and logging capabilities. However, this approach has some key limitations:

- Limited query capabilities: Searching for transactions in Log Analytics can be cumbersome, especially when dealing with large-scale enterprise data.
- Retention and performance issues: Log Analytics is optimized for general telemetry, not for high-volume, structured B2B transaction tracking, making it challenging for organizations with strict compliance requirements.

To address these challenges, we are introducing Reliable B2B Tracking in Logic Apps Standard using a Premium SKU Integration Account. This new feature ensures that all B2B transactions are reliably tracked and ingested into an Azure Data Explorer (ADX) cluster, providing a lossless tracking mechanism with powerful querying and visualization capabilities. With data in ADX, customers can extend their existing Power BI dashboards or easily build custom dashboards on this data if they need detailed analysis of any issue. Additionally, a tracking dashboard is available, enabling customers to monitor, search, and analyze B2B transactions efficiently. This enhancement significantly improves reliability, visibility, and troubleshooting for mission-critical B2B integrations.

How It Works

Reliable B2B Tracking in Logic Apps Standard ensures that every AS2, X12, and EDIFACT transaction is accurately recorded and stored in an Azure Data Explorer (ADX) cluster database instead of relying on Azure Log Analytics, which may drop events. Here’s how the system works:

- Event collection: Whenever a B2B transaction occurs, tracking data is generated by the built-in AS2, X12, and EDIFACT actions in Logic Apps Standard.
- Data ingestion: Instead of sending logs to Log Analytics, the tracking data is pushed transactionally to an Azure Data Explorer (ADX) cluster via the integration account, ensuring reliable, lossless storage.
- Structured storage: ADX provides fast indexing and query capabilities, allowing enterprises to search, filter, and analyze their transactions efficiently.
- Tracking dashboard: A dedicated B2B monitoring dashboard visualizes transaction flow, helping customers track acknowledgments (997, MDN), detect failures, and troubleshoot issues in real time.

Requirements for Using Reliable B2B Tracking

To enable this feature, customers must meet the following prerequisites:

- Premium SKU Integration Account: This feature is only available with a Premium SKU Integration Account in Logic Apps Standard.
- Built-in AS2, X12, or EDIFACT actions: Only transactions processed through Logic Apps Standard using built-in B2B actions will be tracked reliably.
- Azure Data Explorer cluster: Customers must provide their own ADX cluster database, where all transaction logs will be stored and queried.

Please note that B2B tracking for EDIFACT transactions is not supported yet and will be available in the near future.

How to Use Reliable B2B Tracking
1. Create a Tracking Store Artifact in the Integration Account

In the Integration Account, create a tracking store artifact that points to an existing Azure Data Explorer (ADX) cluster database. Currently, only one default tracking store is supported per integration account. The ADX database must be pre-created before setting up the tracking store.

2. Enable or Disable Tracking in the Agreement Settings

B2B tracking is also managed at the agreement level. By default, tracking is enabled for an agreement. To disable tracking, set a setting named TrackingState to Disabled in the send or receive agreement; to enable it again, set TrackingState back to Enabled. Please note that this setting can be updated in the JSON view only. For tracking to function correctly, the tracking store must be configured in the integration account and TrackingState must be set to Enabled in the agreement.

Using the Tracking Dashboard

Before using the tracking dashboard, please ensure that some B2B actions have been executed so that tracking data is available in the tracking store. When the B2B tracking dashboard is opened, it displays message overview data for the last 7 days by default. To change the data scope to a different time interval, use the TimeRange at the top of the page. After the Message Overview Status dashboard loads, users can drill down into specific message types (AS2 or X12) for a more detailed view. Selecting the AS2 or X12 tabs provides insights into message processing details, including transaction status, acknowledgments, and failures.

Managing Tracking Stores via REST API

Reliable B2B Tracking supports a REST API for managing tracking stores. Users can create, update, delete, and retrieve tracking stores programmatically using the following API endpoints. If you choose to use the REST API to create the tracking store in the integration account, two ADX database tables named AS2TrackRecords and EdiTrackRecords must be created manually in the ADX database with a specific schema, and the integration account needs 'Ingester' permission on the database.

1. Get All Tracking Stores

Retrieves all tracking stores configured in an Integration Account.

GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores?api-version=2016-06-01

Parameters:
- {subscriptionId} – Azure subscription ID.
- {resourceGroupName} – Name of the resource group.
- {integrationAccountName} – Name of the integration account.

Response: Returns a list of tracking stores associated with the integration account. Please note that currently only one tracking store per integration account is supported.

2. Get a Specific Tracking Store

Retrieves details of a specific tracking store.

GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01

Parameters:
- {trackingstoreName} – Name of the tracking store to retrieve.

Response: Returns details of the specified tracking store.

3. Create or Update a Tracking Store

Creates a new tracking store or updates an existing one.
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01

Request Body:

{
  "properties": {
    "adxClusterUri": "https://youradxcluster.kusto.windows.net",
    "databaseName": "YourDatabaseName"
  }
}

Parameters:
- adxClusterUri – The Azure Data Explorer cluster URI.
- databaseName – The database name within the ADX cluster.

Response: Returns the details of the created or updated tracking store.

4. Delete a Tracking Store

Deletes an existing tracking store from the integration account.

DELETE https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/integrationAccounts/{integrationAccountName}/groups/default/trackingstores/{trackingstoreName}?api-version=2016-06-01

Parameters:
- {trackingstoreName} – Name of the tracking store to delete.

Response: Returns a success response if the tracking store is deleted successfully.

Tracking Database Table Schema

The Azure Data Explorer (ADX) cluster database used for Reliable B2B Tracking stores transaction data in a structured format. AS2 transactions are stored in a table named AS2TrackRecords. X12 and EDIFACT transactions are stored in a table named EdiTrackRecords. These tables enable efficient querying and retrieval of B2B tracking data, providing structured insights into message flow, processing status, and troubleshooting details. Since B2B tracking data is stored in an Azure Data Explorer database, users can leverage Azure Workbooks to create visually rich custom dashboards for analyzing their B2B transactions. As noted above, if you create the tracking store via the REST API, these two tables must be created manually in the ADX database and the integration account needs 'Ingester' permission on the database. For more information, see Reliable B2B Tracking Database Schema.

Future Enhancements

To further improve Reliable B2B Tracking, the following enhancements are planned for future releases:

1. Secure access to ADX: Users will be able to access the tracking store securely without relying on the public ADX IP address.
2. EDIFACT transaction tracking: EDIFACT message tracking will be fully supported, enabling detailed monitoring of EDIFACT interchanges, functional groups, and message transactions.
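If you prefer scripting these calls, the endpoints above can be driven from PowerShell. The following is a minimal sketch, assuming the Az.Accounts module is installed and you are signed in with Connect-AzAccount; the resource names are placeholders and the request body mirrors the example above (note that newer Az versions may return the token as a SecureString that needs conversion):

# Acquire an ARM bearer token (assumes Connect-AzAccount has been run)
$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com").Token

# Placeholder names - substitute your own values
$subscriptionId = "<subscriptionId>"
$resourceGroup  = "<resourceGroupName>"
$accountName    = "<integrationAccountName>"
$trackingStore  = "<trackingstoreName>"

$uri = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroup" +
       "/providers/Microsoft.Logic/integrationAccounts/$accountName" +
       "/groups/default/trackingstores/${trackingStore}?api-version=2016-06-01"

$body = @{
    properties = @{
        adxClusterUri = "https://youradxcluster.kusto.windows.net"
        databaseName  = "YourDatabaseName"
    }
} | ConvertTo-Json -Depth 3

# Create or update the tracking store (PUT), then read it back (GET)
Invoke-RestMethod -Method Put -Uri $uri -Body $body -ContentType "application/json" -Headers @{ Authorization = "Bearer $token" }
Invoke-RestMethod -Method Get -Uri $uri -Headers @{ Authorization = "Bearer $token" }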
Announcing the availability of TLS 1.3 in Azure API Management in Preview
TLS 1.3 is the latest version of the internet’s most deployed security protocol, which encrypts data to provide a secure communication channel between two endpoints. TLS 1.3 support in Azure API Management is planned to roll out during the first week of February 2024. The rollout will happen in stages, which means some regions will get it first as we roll out globally.
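Once the rollout reaches your instance, you can spot-check that a TLS 1.3 handshake succeeds against your gateway. Here is a minimal PowerShell sketch, assuming PowerShell 7+ on an OS with TLS 1.3 support (for example, Windows 11 or Windows Server 2022); the gateway hostname is a placeholder:

# Hypothetical APIM gateway hostname - replace with your own
$gatewayHost = "contoso.azure-api.net"

# Open a TLS connection pinned to TLS 1.3 and report the negotiated protocol
$tcp = [System.Net.Sockets.TcpClient]::new($gatewayHost, 443)
$ssl = [System.Net.Security.SslStream]::new($tcp.GetStream())
try {
    $ssl.AuthenticateAsClient($gatewayHost, $null,
        [System.Security.Authentication.SslProtocols]::Tls13, $false)
    Write-Host "Negotiated protocol: $($ssl.SslProtocol)"
} finally {
    $ssl.Dispose()
    $tcp.Dispose()
}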
Optimizing Azure DevOps Jira Integration: 5 Practical Use Cases for DevOps Teams
Many teams rely on Azure DevOps (ADO) for development and Jira for project or product management. While each tool is powerful on its own, things often get messy when work items, statuses, and updates live in separate systems. Integrating the two platforms can remove a lot of friction. Below are five common use cases I have seen from real teams, with concrete problems and solutions to make the connection between Jira and Azure DevOps work smoothly.

1. Keeping User Stories and Bugs in Sync

Challenge: Teams use Jira for user requests and Azure DevOps for development tasks. Manually updating both systems is tedious and error-prone.

Solution: Enable two-way synchronization so that changes in Jira automatically reflect in Azure DevOps and vice versa (including comments and status updates). This keeps bugs and stories aligned without duplicate work.

“Before we integrated Jira with Azure DevOps, I spent too much time manually updating task statuses in both systems. Now, with the automatic sync, my team is focused on actual coding work instead of managing project statuses across platforms.” — DevOps Engineer

2. One-Way Sync for Project Management–First Teams

Challenge: Some organizations plan and track everything in Jira but manage code exclusively in Azure DevOps. Developers only need the essentials pushed across.

Solution: Use a one-way sync from Jira → Azure DevOps to bring over metadata like titles, statuses, sprints, and due dates. Developers see the context they need without cluttering both systems with manual updates.

“We rely on Jira for all project planning and management, but the developers need a clean workspace in Azure DevOps. A one-way sync from Jira to ADO helps us keep things efficient and ensures developers always have the latest information without double entry.” — Product Owner

3. Creating Jira Tickets from Azure DevOps Tasks or Bugs

Challenge: External partners or stakeholders may only work in Jira Service Management to manage tickets. Developers in Azure DevOps often need their work mirrored for transparency.

Solution: Configure automated ticket creation in Jira when certain ADO tasks are tagged. Both teams can track progress in their preferred tool without duplicating effort.

“We use Azure DevOps internally, but our external stakeholders only work in Jira. Automating the creation of Jira tickets based on Azure DevOps tasks or bugs has made collaboration seamless and ensured no work is lost in translation.” — DevOps Lead

4. Syncing Epics, Features, and Work Items

Challenge: High-level epics might live in Jira, while features and tasks are managed in Azure DevOps. Without integration, visibility across systems is fragmented.

Solution: Sync epics and features so Jira provides portfolio-level visibility, while Azure DevOps remains the system of record for detailed development work. This keeps roadmaps and execution aligned.

“Tracking epics in Jira while managing the technical work in Azure DevOps used to cause us to lose visibility. Now, everything from high-level epics to individual tasks is in sync, so we always know where we stand.” — Azure DevOps Product Manager

5. Managing Multiple Jira Projects with One Azure DevOps Project

Challenge: Large organizations often run multiple Jira projects (by team or business unit) but only one Azure DevOps project for development. Syncing everything consistently is tough.

Solution: Map multiple Jira projects to a single Azure DevOps project, syncing only the key data (titles, statuses, sprints, custom fields).
This creates a unified development view without losing project-specific details.

“We have multiple teams using different Jira projects, but we consolidate all development work into a single Azure DevOps project. Syncing across these platforms used to be a nightmare, but now everything stays aligned, and we’re able to track all initiatives in one place.” — Azure DevOps Engineer

💬 Have you integrated Jira with Azure DevOps in your team? What worked well, and what challenges did you run into?
Expose REST APIs as MCP servers with Azure API Management and API Center (now in preview)
As AI-powered agents and large language models (LLMs) become central to modern application experiences, developers and enterprises need seamless, secure ways to connect these models to real-world data and capabilities. Today, we’re excited to introduce two powerful preview capabilities in the Azure API Management Platform:

- Expose REST APIs in Azure API Management as remote Model Context Protocol (MCP) servers
- Discover and manage MCP servers using API Center as a centralized enterprise registry

Together, these updates help customers securely operationalize APIs for AI workloads and improve how APIs are managed and shared across organizations.

Unlocking the value of AI through secure API integration

While LLMs are incredibly capable, they are stateless and isolated unless connected to external tools and systems. Model Context Protocol (MCP) is an open standard designed to bridge this gap by allowing agents to invoke tools - such as APIs - via a standardized, JSON-RPC-based interface. With this release, Azure empowers you to operationalize your APIs for AI integration - securely, observably, and at scale.

1. Expose REST APIs as MCP servers with Azure API Management

An MCP server exposes selected API operations to AI clients over JSON-RPC via HTTP or Server-Sent Events (SSE). These operations, referred to as “tools,” can be invoked by AI agents through natural language prompts. With this new capability, you can expose your existing REST APIs in Azure API Management as MCP servers - without rebuilding or rehosting them.

Addressing common challenges

Before this capability, customers faced several challenges when implementing MCP support:

- Duplicated development effort: Building MCP servers from scratch often led to unnecessary work when existing REST APIs already provided much of the needed functionality.
- Security concerns: Malicious servers could impersonate trusted ones (server trust), and self-hosted MCP implementations often had to manage sensitive credentials like OAuth tokens (credential management).
- Registry and discovery: Without a centralized registry, discovering and managing MCP tools was manual and fragmented, making it hard to scale securely across teams.

API Management now addresses these concerns by serving as a managed, policy-enforced hosting surface for MCP tools - offering centralized control, observability, and security.

Benefits of using Azure API Management with MCP

By exposing MCP servers through Azure API Management, customers gain:

- Centralized governance for API access, authentication, and usage policies
- Secure connectivity using OAuth 2.0 and subscription keys
- Granular control over which API operations are exposed to AI agents as tools
- Built-in observability through APIM’s monitoring and diagnostics features

How it works

1. In your API Management instance, navigate to MCP servers.
2. Choose an API: select + Create a new MCP server and pick the REST API you wish to expose.
3. Configure the MCP server: select the API operations you want to expose as tools. These can be all or a subset of your API’s methods.
4. Test and integrate: use tools like MCP Inspector or Visual Studio Code (in agent mode) to connect, test, and invoke the tools from your AI host.

Getting started and availability

This feature is now in public preview and being gradually rolled out to early access customers.
To use the MCP server capability in Azure API Management:

Prerequisites
- Your APIM instance must be on a SKUv1 tier: Premium, Standard, or Basic.
- Your service must be enrolled in the AI Gateway early update group (activation may take up to 2 hours).
- Use the Azure Portal with the feature flag: append ?Microsoft_Azure_ApiManagement=mcp to your portal URL to access the MCP server configuration experience.

Note: Support for SKUv2 and broader availability will follow in upcoming updates. Full setup instructions and test guidance can be found via aka.ms/apimdocs/exportmcp.

2. Centralized MCP registry and discovery with Azure API Center

As enterprises adopt MCP servers at scale, the need for a centralized, governed registry becomes critical. Azure API Center now provides this capability - serving as a single, enterprise-grade system of record for managing MCP endpoints. With API Center, teams can:

- Maintain a comprehensive inventory of MCP servers.
- Track version history, ownership, and metadata.
- Enforce governance policies across environments.
- Simplify compliance and reduce operational overhead.

API Center also addresses enterprise-grade security by allowing administrators to define who can discover, access, and consume specific MCP servers - ensuring only authorized users can interact with sensitive tools.

To support developer adoption, API Center includes:

- Semantic search and a modern discovery UI.
- Easy filtering based on capabilities, metadata, and usage context.
- Tight integration with Copilot Studio and GitHub Copilot, enabling developers to use MCP tools directly within their coding workflows.

These capabilities reduce duplication, streamline workflows, and help teams securely scale MCP usage across the organization.

Getting started

This feature is now in preview and accessible to customers:
- https://aka.ms/apicenter/docs/mcp
- AI Gateway Lab | MCP Registry

3. What’s next

These new previews are just the beginning. We're already working on:

Azure API Management (APIM): Passthrough MCP server support. We’re enabling APIM to act as a transparent proxy between your APIs and AI agents - no custom server logic needed. This will simplify onboarding and reduce operational overhead.

Azure API Center (APIC): Deeper integration with Copilot Studio and VS Code. Today, developers must perform manual steps to surface API Center data in Copilot workflows. We’re working to make this experience more visual and seamless, allowing developers to discover and consume MCP servers directly from familiar tools like VS Code and Copilot Studio.

For questions or feedback, reach out to your Microsoft account team or visit:
- Azure API Management documentation
- Azure API Center documentation

- The Azure API Management & API Center Teams
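Beyond MCP Inspector, a quick way to see what an MCP server advertises is to post a JSON-RPC request to it. The sketch below is illustrative only: the endpoint URL and subscription key header are assumptions (copy the real endpoint from the portal once your MCP server is created), and depending on the server's transport an MCP initialize handshake may be required before tools/list will succeed:

# Hypothetical MCP endpoint exposed by API Management - copy the real URL from the portal
$mcpEndpoint = "https://contoso.azure-api.net/orders-mcp/mcp"

# A minimal JSON-RPC 2.0 request asking the server to list its tools
$request = @{
    jsonrpc = "2.0"
    id      = 1
    method  = "tools/list"
} | ConvertTo-Json

$response = Invoke-RestMethod -Method Post -Uri $mcpEndpoint -Body $request `
    -ContentType "application/json" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = "<your-subscription-key>" }

# Each tool corresponds to an API operation selected during configuration
$response.result.tools | Select-Object name, description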
Logic Apps Community Day 2025
We are bringing Logic Apps Community Day again this year, on October 30, 2025 (Pacific Time), and we want you to join us as we host a full day of learning where you will be the star!

The Logic Apps Community Day is a free event driven by Microsoft, for anyone who wants to learn more about Logic Apps and how it can help to solve real-life integration problems. This year, we want to learn how you have been using AI with Logic Apps, so our themes for the sessions are:

- Creating Intelligent Applications with Logic Apps and AI: Tell us how you have been using Logic Apps features to implement your intelligent application scenarios - from agent loops, to Logic Apps exposed as MCP tools, to improving your intelligent application knowledge in real time - we want to see the scenarios you created!
- Accelerating Logic Apps Developer Velocity with AI: How are you taking advantage of generative AI to make your developer life easier with Logic Apps? From prompts to create test data, to automated creation of maps, unit tests, or custom code, and anything in between. Maybe you have been using prompts, instructions, or chat modes to make your development life easier? We want to see it all!

Call for papers

Within the themes above, we are looking for breakout sessions (around 28 minutes in length) or lightning sessions (around 12 minutes in length). The call for papers is open until September 7, 2025 - 11:00 PM (PST). Selected sessions and speakers will be informed by email by September 10, 2025. Click on the image below to submit your session to Logic Apps Community Day 2025!

We look forward to your proposals!
🚀 New in Azure API Management: MCP in v2 SKUs + external MCP-compliant server support
Your APIs are becoming tools. Your users are becoming agents. Your platform needs to adapt. Azure API Management is becoming the secure, scalable control plane for connecting agents, tools, and APIs - with governance built in.

Today, we’re announcing two major updates to bring the power of the Model Context Protocol (MCP) in Azure API Management to more environments and scenarios:

- MCP support in v2 SKUs - now in public preview
- Expose existing MCP-compliant servers through API Management

These features make it easier than ever to connect APIs and agents with enterprise-grade control - without rewriting your backends.

Why MCP?

MCP is an open protocol that enables AI agents - like GitHub Copilot, ChatGPT, and Azure OpenAI - to discover and invoke APIs as tools. It turns traditional REST APIs into structured, secure tools that agents can call during execution, powering real-time, context-aware workflows.

Why API Management for MCP?

Azure API Management is the single, secure control plane for exposing and governing MCP capabilities - whether from your REST APIs, Azure-hosted services, or external MCP-compliant runtimes. With built-in support for:

- Security using OAuth 2.1, Microsoft Entra ID, API keys, IP filtering, and rate limiting.
- Outbound token injection via Credential Manager with policy-based routing.
- Monitoring and diagnostics using Azure Monitor, Logs, and Application Insights.
- Discovery and reuse with Azure API Center integration.
- A comprehensive policy engine for request/response transformation, caching, validation, header manipulation, throttling, and more.

…you get end-to-end governance for both inbound and outbound agent interactions - with no new infrastructure or code rewrites.

✅ What’s New?

1. MCP support in v2 SKUs

Previously available only in classic tiers (Basic, Standard, Premium), MCP support is now in public preview for v2 SKUs - Basic v2, Standard v2, and Premium v2 - with no prerequisites or manual enablement required. You can now:

- Expose any REST API as an MCP server in v2 SKUs
- Protect it with Microsoft Entra ID, keys, or tokens
- Register tools in Azure API Center

2. Expose existing MCP-compliant servers (pass-through scenario)

Already using tools hosted in Logic Apps, Azure Functions, LangChain, or custom runtimes? Now you can govern those external tool servers by exposing them through API Management. Use API Management to:

- Secure external MCP servers with OAuth, rate limits, and Credential Manager
- Monitor and log usage with Azure Monitor and Application Insights
- Unify discovery with internal tools via Azure API Center

🔗 You bring the tools. API Management brings the governance.

🧭 What’s Next

We’re actively expanding MCP capabilities in API Management:

- Tool-level access policies for granular governance
- Support for MCP resources and prompts to expand beyond tools

📚 Get Started

- 📘 Expose APIs as MCP servers
- 🌐 Connect external MCP servers
- 🔐 Secure access to MCP servers
- 🔎 Discover tools in API Center

Summary

Azure API Management is your single control plane for agents, tools, and APIs - whether you're building internal copilots or connecting external toolchains. This preview unlocks more flexibility, less friction, and a secure foundation for the next wave of agent-powered applications. No new infrastructure. Secure by default. Built for the future.