Unlocking Advanced Data Analytics & AI with Azure NetApp Files object REST API
Azure NetApp Files object REST API enables object access to enterprise file data stored on Azure NetApp Files, without copying, moving, or restructuring that data. This capability allows analytics and AI platforms that expect object storage to work directly against existing NFS-based datasets, while preserving Azure NetApp Files’ performance, security, and governance characteristics.

Azure Arc Jumpstart Template for Hybrid Logic Apps Deployment
Today I’m sharing a new Azure Arc Jumpstart template (Jumpstart Drop) that implements a hybrid deployment of Azure Logic Apps (Standard) on an Azure Arc-enabled AKS cluster, enabling deployment with a single command that sets up a complete, working environment for testing your scenarios.

Jumpstart Drop link: Azure Arc Jumpstart

The hybrid deployment model allows you to run Logic Apps workloads on your own infrastructure, providing you with the option to host your integration solutions on premises, in a private cloud, or even in a third-party public cloud. You can find more information in this article: Set up your own infrastructure for Standard logic app workflows - Azure Logic Apps | Microsoft Learn

Demo video:

Given that Hybrid Logic Apps is designed to support mission-critical, enterprise-level integration scenarios, with provision to configure custom infrastructure across both Azure and Kubernetes environments—the setup can naturally be complex. This Jumpstart template addresses these intricacies by simplifying onboarding and providing an all-in-one script for comprehensive testing and validation.

This Jumpstart template deploys the following components and verifies each deployment stage. You can customize each step to your requirements by modifying the script.
- An AKS cluster as the Kubernetes substrate for the scenario.
- Azure Arc enablement for Kubernetes to onboard the cluster into Azure for centralized governance and operations.
- ACA extension, custom location, and connected environment.
- Azure SQL Server for runtime storage.
- Azure Storage account for SMB artifact storage.
- A Hybrid Logic Apps resource.

Getting started: Visit the Jumpstart URL Azure Arc Jumpstart and follow the steps below.
1. Check prerequisites and permissions (see Prerequisites tab).
2. Download the GitHub repo and run the deployment command (see Quick Start tab).
3. After deployment, run test commands to confirm everything works (see Testing tab).

Feedback and contributions
Jumpstart templates are built for the community and improved with community input. If you try this drop and have suggestions (documentation clarity, additional validation, new scenarios, or fixes), please share feedback through the https://github.com/ShreeDivyaMV/LogicApps_Jumpstart_Templates repository associated with the drop.

References:
- Announcement: General Availability of Logic Apps Hybrid Deployment Model | Microsoft Community Hub
- Create Standard logic app workflows for hybrid deployment - Azure Logic Apps | Microsoft Learn
- https://github.com/ShreeDivyaMV/LogicApps_Jumpstart_Templates
- Azure Arc Jumpstart

Automated Test Framework - Missing Tests in Test Explorer
Have your tests created using the Logic Apps Standard Automated Test Framework disappeared from the Test Explorer in VS Code all of a sudden? The answer is a mismatch between the MSTest versions used.

A recent update in the C# Dev Kit extension changed the minimum requirements for the MSTest library - it now requires a minimum of 3.7.*. The project scaffolding created by the Logic Apps Standard extension uses 3.2.0. The good news is that you can fix this by just changing package versions in your project. Follow the instructions below to have this fixed:

1. Open the .csproj that the extension created (e.g. LogicApps.csproj).
2. Find the ItemGroup containing your package references. It should look like this:

   <ItemGroup>
     <PackageReference Include="MSTest" Version="3.2.0"/>
     <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.8.0"/>
     <PackageReference Include="MSTest.TestAdapter" Version="3.2.0"/>
     <PackageReference Include="MSTest.TestFramework" Version="3.2.0"/>
     <PackageReference Include="Microsoft.Azure.Workflows.WebJobs.Tests.Extension" Version="1.0.0"/>
     <PackageReference Include="coverlet.collector" Version="3.1.2"/>
   </ItemGroup>

3. Replace the package references with this new version:

   <ItemGroup>
     <PackageReference Include="Microsoft.Azure.Workflows.WebJobs.Tests.Extension" Version="1.*" />
     <PackageReference Include="MSTest" Version="4.0.2" />
     <PackageReference Include="coverlet.collector" Version="3.2.0" />
   </ItemGroup>

Once you make this change, restart the VS Code window and rebuild your project - Test Explorer will start showing your tests again. Notice that this package reference list is simplified: some of the previous references are already in the dependency chain, so they don't need to be explicitly added.

ℹ️ We are updating our extension to make sure that new projects are generated with the new values, but you should make these changes manually on existing projects.

Logic Apps Aviators Newsletter - February 2026
In this issue:
- Ace Aviator of the Month
- News from our product group
- News from our community

Ace Aviator of the Month
February 2026’s Ace Aviator: Camilla Bielk

What's your role and title? What are your responsibilities?
I’m a developer and solution architect, working as a consultant (at XLENT) in teams involved across the integration lifecycle—from understanding business needs and building new solutions to operations. Previously I worked a lot with BizTalk and with customers using both Azure and BizTalk. In my current role, I work exclusively with Azure.

Can you give us some insights into your day-to-day activities and what a typical day in your role looks like?
A typical day starts by syncing operational status with the team to ensure everything is running as expected (Logic Apps, Service Bus, Functions, API Management, etc.) and discussing any findings from the previous day. Then I continue with ongoing work, ranging from stakeholder meetings and designing new integration flows to C# development, operations, and maintenance of the existing landscape. I enjoy being involved across the entire chain. Working closely with operational issues provides valuable input into design decisions, and vice versa, which strengthens me as a designer and architect.

What motivates and inspires you to be an active member of the Aviators/Microsoft community?
The openness to feedback and knowledge sharing from both community developers and the product team motivates me to stay active, continuously learn, and become the best consultant I can be. Passing knowledge to new team members is rewarding and completes the circle. Integration conferences are inspiring - networking with amazing people and discussing the technology we use every day. That’s also why we started the Nordic Integration Summit: to give more people the chance to take part in such an inspiring event.

Looking back, what advice do you wish you had been given earlier that you'd now share with those looking to get into STEM/technology?
Your social skills are more valuable than you might think. Teamwork is everything, and understanding business needs—even from non-technical colleagues—is just as important as technology. Listen to your heart and find environments where you can thrive. Programming languages change over time; the real takeaway is mastering common concepts and patterns and developing the ability to learn continuously. Stay curious and embrace new things as they come!

What has helped you grow professionally?
100% of my professional growth comes from working with kind, humble, and collaborative people—teams where knowledge is shared freely in all directions, mistakes are allowed, and learning from them is encouraged.

If you had a magic wand that could create a feature in Logic Apps, what would it be and why?
Better cost transparency—clearly showing which actions drive costs (executions, retries, data volume, logging) with visibility before production deployment. Design-time guidance that warns about expensive operations, anti-patterns, and inefficient configurations.

News from our product group

Introducing Unit Test Agent Profiles for Logic Apps & Data Maps
Focused unit test agent profiles help Logic Apps Standard teams discover workflows and data maps, write reusable specifications, generate typed mocks/test data, and implement MSTest suites against the Automated Test SDK. The approach promotes spec-first testing, consistent artifacts, and reliable validations across scenarios (happy path, error handling, boundaries).
The project demonstrates how GitHub Copilot custom agents and prompts can accelerate unit test authoring while enforcing constraints for maintainability in enterprise integrations.

Automated Test Framework - Missing Tests in Test Explorer
If Logic Apps Standard tests disappear from VS Code’s Test Explorer, the cause is typically an MSTest version mismatch introduced by a recent C# Dev Kit update. The fix is to update package references in the project (MSTest, Test SDK, and related dependencies) to supported versions. After adjusting packages and restarting VS Code, tests reappear and run as expected. The extension is being updated, but existing projects should apply the manual changes to restore a stable testing experience.

Upcoming Agentic Azure Logic Apps Workshops
Free workshops introduce Agentic Business Processes in Azure Logic Apps, including MCP Server integrations and agent loop patterns. Sessions cover connecting the enterprise with MCP Servers from Copilot Studio and building agentic workflows in a day using Logic Apps (Standard). Registration links are provided, with dates set in January. These events help practitioners explore agent tools, orchestration patterns, and practical scenarios for intelligent automation in modern integration landscapes.

News from our community

Enterprise AI ≠ Copilot
Post by Al Ghoniem, MBA
Deploying Copilot is not the same as building an enterprise AI strategy. This article distinguishes between adopting a product versus developing a core organizational capability. It explores why many AI rollouts stall when treated as point solutions and outlines what it takes to embed AI as a strategic, scalable function across the enterprise. For leaders evaluating their AI roadmaps, this piece offers a framework for thinking beyond tools toward sustainable transformation.

BizTalk Server and WinSCP Error
Post by Sandro Pereira
A common SFTP adapter issue resurfaces when BizTalk Server cumulative updates are applied. This troubleshooting guide explains why the WinSCPnet assembly fails to load after upgrading from CU5 to CU6 and provides a version compatibility matrix for BizTalk Server 2020. It includes step-by-step instructions to resolve the error by copying the correct WinSCP binaries to the BizTalk installation folder, along with an automated PowerShell script for streamlined remediation.

More on finding application registrations used by Logic Apps
Post by Mikael Sand
Building on earlier work around API connection discovery, this post extends KQL queries to find Logic Apps that authenticate via Client ID and secret in HTTP actions. Using Azure Resource Graph Explorer, the technique scans workflow parameters to identify application registration references. While the approach assumes parameters are used for credentials, it provides a practical method for auditing which Logic Apps depend on a given app registration across subscriptions.

MCP Servers in Azure Logic Apps Agent Loops (Step-by-Step)
Video by Stephen W. Thomas
This walkthrough demonstrates how to move tool logic out of Logic App Agent loops and into reusable Model Context Protocol servers. The video covers setting up an MCP server, registering tools, and invoking them from within an agent workflow. By decoupling tool definitions from orchestration logic, developers can build modular, maintainable agentic systems that scale across multiple workflows and scenarios.
From Rigid Choreography to Intelligent Collaboration: Agentic Orchestration as the Evolution of SOA
Post by Steef-Jan Wiggers
Integration has evolved from rigid, deterministic workflows to adaptive, goal-driven orchestrations powered by intelligent agents. This article traces the journey from BizTalk-era SOA composites to modern agentic patterns, explaining how AI-enabled orchestrators dynamically plan, self-correct, and communicate intent rather than follow fixed sequences. With examples from Azure Logic Apps Agent Loop, it shows how integration professionals can leverage existing connectors and APIs as tools within reasoning-based architectures.

Azure Integrations That Actually Work in Production
Post by Devarajan Gurusamy
Architecture diagrams often look perfect in presentations but fail under real-world conditions. This article examines what it takes to build Azure integrations that survive production workloads, covering resilience patterns, error handling strategies, and operational considerations that separate proof-of-concepts from enterprise-grade solutions. It offers practical guidance for teams moving beyond demos toward reliable, maintainable integration implementations.

Configuring BizTalk Server Backup Jobs for Azure Blob Storage
Post by Sandro Pereira
Modernizing BizTalk infrastructure can start with small, targeted improvements. This guide shows how to redirect native BizTalk backup jobs to Azure Blob Storage using SQL Server’s BACKUP TO URL feature. It covers generating SAS tokens, creating SQL credentials, updating job steps, and validating backups. The approach improves disaster recovery posture with geo-redundant cloud storage while keeping the BizTalk environment unchanged—no new components or custom code required.

Building an AI Agent with Logic Apps Consumption and Agent Loop
Post by Stephen W. Thomas
Logic Apps Consumption with Agent Loop offers a fast path to practical AI agents, particularly for experimentation and low-volume workloads. This post walks through building a stateful poker-playing agent, demonstrating tool orchestration, agent instructions, and execution tracing. It compares Consumption and Standard tiers, explores cost efficiency, and shows how to offload reasoning to external models. At roughly a dollar for 25 runs, it’s an accessible entry point for agentic development.

Microsoft Agent Framework
Post by Al Ghoniem, MBA
Microsoft’s open-source Agent Framework provides a comprehensive toolkit for building, orchestrating, and deploying AI agents in Python and .NET. It features graph-based workflows with streaming and checkpointing, multi-agent patterns, built-in observability via OpenTelemetry, and support for multiple LLM providers. The repository includes quickstart samples, migration guides from Semantic Kernel and AutoGen, and experimental labs for benchmarking and reinforcement learning.

Storage Account: You do not have permissions to list the data using your user account with Entra ID
Post by Sandro Pereira
Being an Owner of an Azure Storage Account does not automatically grant access to its data. This common confusion trips up many developers working with Logic Apps and blob storage. The article explains the distinction between management roles and data-plane permissions, then walks through assigning Storage Blob Data roles via Access Control (IAM) to resolve the “You do not have permissions” error when authenticating with Microsoft Entra ID.
Using Office 365 Outlook Connector in Logic Apps Standard
Post by Srikanth Gunnala
When configuring Office 365 Outlook (V2) with Managed Identity in Logic Apps Standard, the workflow designer may fail to auto-bind existing API connections—even when runtime works correctly. This post details the issue, explaining how a null referenceName in the workflow JSON causes the trigger to appear unbound. The fix involves manually setting the connection reference name in the workflow definition, after which the designer immediately recognizes the connection.

Inside Logic Apps Standard: Understanding Compute Units (CU) for Storage Scaling
Post by Daniel Jonathan
High-volume Logic Apps can hit Azure Storage throttling limits. This deep-dive explains how Compute Units enable horizontal storage scaling by distributing workflow execution data across up to 32 storage accounts. It covers CU routing via Run ID suffixes, table distribution patterns, and configuration via host.json. Understanding this architecture is essential for building monitoring tools, querying run histories programmatically, or troubleshooting performance at scale.

E59 - Roles & Personas
Video by Sebastian Meyer
This episode recaps highlights from SAP TechEd and Microsoft Ignite, with a focus on the SAP Innovation Guide and Logic Apps announcements. It explores roles and personas in the integration space, discussing how different team members contribute to enterprise integration projects. The conversation provides context for practitioners following both SAP and Microsoft ecosystems and offers insights into recent platform developments relevant to hybrid integration scenarios.

Architecting Trust: Leveraging Microsoft Foundry to Solve AI Governance Challenges
Post by Kent Weare
As organizations scale AI deployments, governance becomes critical. This article explores how Microsoft Foundry addresses common AI governance challenges including model management, access controls, auditability, and compliance. It outlines architectural patterns for building trust into AI systems from the ground up and provides guidance for teams navigating the balance between innovation velocity and responsible AI practices in enterprise environments.

Introducing Unit Test Agent Profiles for Logic Apps & Data Maps
I did some experimentation this week with GitHub Copilot custom agents and custom prompts. The result was a couple of agents to help with the creation of unit tests for Logic Apps Standard workflows and data maps.

These agent profiles are designed to keep test artifacts consistent and maintainable by separating discovery, specification, data generation, and implementation. Specifications serve as the single source of truth; generated code and data follow from the specs.
- Discover: Enumerate workflows/maps, triggers/actions, dependencies, and control flow to build a testability inventory.
- Spec (Speckit): Write reusable test specifications (intent, inputs/mocks, expected outcomes) stored in a predictable location.
- Cases: Design scenario catalogs with categories (Happy Path, Error Handling, Boundary Conditions, etc.).
- Test Data: Generate typed mocks (Logic Apps) or input/expected data files (Data Maps) per scenario.
- Implement: Produce MSTest code against the Automated Test SDK on net8.0, enforcing strict constraints for reliability.
- Project Batch: Run operations across all workflows/maps with pre-checks, scaffolding verification, and summary reporting.

ℹ️ You can find this project on GitHub: https://github.com/wsilveiranz/logicapps-unittest-custom-agent. This is an example of how those agents can be created using what is available in GitHub Copilot, so feel free to fork it and adjust it to your own requirements.

Agent Profiles Available

Logic Apps Workflow Unit Test Author
Specialist agent for authoring Logic Apps Standard workflow unit tests using MSTest (net8.0) and the Automated Test SDK. Produces spec-first test cases, typed mocks, and runnable MSTest implementations.

How It Works
1. Discover workflows: Locate workflow.json, summarize triggers/actions, and identify external dependencies to mock.
2. Spec-first: Write reusable specs capturing intent, mock plan, inputs, and expected outcomes.
3. Create cases: Build scenario catalogs (Happy Path, Error Handling, Boundary Conditions, etc.).
4. Generate test data: Produce typed MockOutput and ActionMock class instances mapped to action names. This step is optional, as implementation will create test data if not available.
5. Implement tests: Generate MSTest code enforcing SDK constraints and required annotations (a minimal sketch appears after the FAQ at the end of this article).
6. Validate: Build and run tests; refine specs and data as needed.
7. Batch (optional): Apply the flow across all workflows once scaffolding is verified.

| Skill | Description | Prompt File |
| --- | --- | --- |
| Discover Workflows | Discover workflow definitions, summarize triggers/actions, and produce a mock inventory. | .github/prompts/la-unit-tests-discover.prompt.md |
| Create Test Cases | Create unit test cases from a workflow definition; define intent, inputs, mock plan, and expected outcomes. | .github/prompts/la-unit-tests-create-cases.prompt.md |
| Write Speckit Specs | Write and maintain Speckit-style specification documents for reusable scenarios. | .github/prompts/la-unit-tests-speckit-specs.prompt.md |
| Implement Tests | Implement MSTest unit tests using the Automated Test SDK with enforced constraints. | .github/prompts/la-unit-tests-implement.prompt.md |
| Generate Test Data | Generate trigger/action mock payloads and typed outputs for success and failure variants. | .github/prompts/la-unit-tests-generate-test-data.prompt.md |
| Project Batch | Execute project-wide operations with scaffolding verification and roll-up reporting. | .github/prompts/la-unit-tests-project-batch.prompt.md |

Logic Apps Data Map Unit Test Author
Specialist agent for authoring Data Map unit tests from LML or XSLT using MSTest (net8.0).
Produces spec-first test cases, sample input/expected data, and MSTest implementations.

How It Works
1. Discover data maps: Locate Artifacts/MapDefinitions/*.lml and Artifacts/Maps/*.xslt; summarize transformations and schema requirements.
2. Spec-first: Write reusable specs capturing intent, input data, expected output, and validation criteria.
3. Create cases: Build scenario catalogs (Happy Path, Empty/Missing, Boundary Values, Special Characters, Large Data, etc.).
4. Generate test data: Produce representative input files and expected output files aligned to schemas and mapping rules. This step is optional, as implementation will create test data if not available.
5. Implement tests: Generate MSTest classes using DataMapTestExecutor.
6. Validate: Build and run tests; refine specs and data as needed.
7. Batch (optional): Apply the flow across all maps; verify project scaffolding first.

| Skill | Description | Prompt File |
| --- | --- | --- |
| Discover Data Maps | Discover LML/XSLT maps and schemas; summarize transformation structure and complexity. | .github/prompts/dm-unit-tests-discover.prompt.md |
| Create Test Cases | Design scenarios covering happy path, edge cases, and error handling. | .github/prompts/dm-unit-tests-create-cases.prompt.md |
| Write Speckit Specs | Write and maintain Speckit-style specs with mapping rules and validations. | .github/prompts/dm-unit-tests-speckit-specs.prompt.md |
| Implement Tests | Implement MSTest suites using DataMapTestExecutor with required defaults. | .github/prompts/dm-unit-tests-implement.prompt.md |
| Generate Test Data | Generate input and expected output files matched to map rules and schemas. | .github/prompts/dm-unit-tests-generate-test-data.prompt.md |
| Project Batch | Execute project-wide operations with scaffolding and summary reporting. | .github/prompts/dm-unit-tests-project-batch.prompt.md |

FAQ

Why Speckit-style specs?
Specs are human-readable contracts that drive consistent test implementation across teams and time. They decouple scenario design from code generation, improving reuse.

Can I run batch operations?
Yes. Use the project-batch profiles to orchestrate discovery and implementation across all workflows/maps once scaffolding is verified.

What if scaffolding is missing?
The Implement and Project Batch agents will stop and report the exact folders/files you need to create (via the Logic Apps Designer’s Create Unit Test command or manual steps).
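To ground the Implement step, here is a minimal sketch of the kind of MSTest class the workflow agent aims to produce. The workflow name (ProcessOrder), the mocked action name (Send_confirmation), and the settings path are hypothetical, and the SDK-facing types (TestExecutor, typed mock classes, TestWorkflowStatus) follow the pattern of the scaffolding generated by the Logic Apps Standard extension's Create Unit Test command; treat this as an illustration and verify the exact names against the scaffolding in your own project.

```csharp
using System.Threading.Tasks;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Sketch of a spec-driven workflow unit test in the shape the agent targets.
// ProcessOrder and Send_confirmation are hypothetical names; the SDK-facing
// types mirror the generated scaffolding and may differ in your project.
[TestClass]
public class ProcessOrderTests
{
    private TestExecutor testExecutor;

    [TestInitialize]
    public void Setup()
    {
        // Points at the test settings file created alongside the generated scaffolding.
        this.testExecutor = new TestExecutor("ProcessOrder/testSettings.config");
    }

    [TestMethod]
    public async Task ProcessOrder_HappyPath_Succeeds()
    {
        // Arrange: typed mocks for the trigger and for the external action being mocked,
        // as described in the spec for this scenario.
        var triggerMock = new ProcessOrderTriggerMock();
        var sendConfirmationMock = new Send_confirmationActionMock();

        // Act: run the workflow in-process against the Automated Test SDK.
        var testRun = await this.testExecutor
            .Create()
            .RunWorkflowAsync(triggerMock: triggerMock, actionMock: sendConfirmationMock);

        // Assert: the run completed and finished successfully.
        Assert.IsNotNull(testRun);
        Assert.AreEqual(TestWorkflowStatus.Succeeded, testRun.Status);
    }
}
```

Additional test methods for the Error Handling and Boundary Conditions categories would reuse the same structure, swapping in failure-variant mocks and asserting the corresponding expected outcomes from the spec.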
Boosting Hybrid Cloud Data Efficiency for EDA: The Power of Azure NetApp Files cache volumes

Electronic Design Automation (EDA) is the foundation of modern semiconductor innovation, enabling engineers to design, simulate, and validate increasingly sophisticated chip architectures. As designs push the boundaries of PPA (power, performance, and area) to meet escalating market demands, the volume of associated design data has surged exponentially, with a single System-on-Chip (SoC) project generating multiple petabytes of data during its development lifecycle, making data mobility and accessibility critical bottlenecks. To overcome these challenges, Azure NetApp Files (ANF) cache volumes are purpose-built to optimize data movement and minimize latency, delivering high-speed access to massive design datasets across distributed environments. By mitigating data gravity, Azure NetApp Files cache volumes empower chip designers to leverage cloud-scale compute resources on demand and at scale, thus accelerating innovation without being constrained by physical infrastructure.

Issue connecting Azure Sentinel GitHub app to Sentinel Instance when IP allow list is enabled
Hi everyone, I’m running into an issue connecting the Azure Sentinel GitHub app to my Sentinel workspace in order to create our CI/CD pipelines for our detection rules, and I’m hoping someone can point me in the right direction.

Symptoms:
- When configuring the GitHub connection in Sentinel, the repository dropdown does not populate.
- There are no explicit errors, but the connection clearly isn’t completing.
- If I disable my organization’s IP allow list, everything works as expected and the repos appear immediately.

I’ve seen that some GitHub Apps automatically add the IP ranges they require to an organization’s allow list. However, from what I can tell, the Azure Sentinel GitHub app does not seem to have this capability, and requires manual allow listing instead.

What I’ve tried / researched:
- Reviewed Microsoft documentation for Sentinel ↔ GitHub integrations
- Looked through Azure IP range and Service Tag documentation
- I’ve seen recommendations to allow list the IP ranges published at https://api.github.com/meta, as many GitHub apps rely on these ranges
- I’ve already tried allow listing multiple ranges from the GitHub meta endpoint, but the issue persists

My questions:
- Does anyone know which IP ranges are used by the Azure Sentinel GitHub app specifically?
- Is there an official or recommended approach for using this integration in environments with strict IP allow lists?
- Has anyone successfully configured this integration without fully disabling IP restrictions?

Any insight, references, or firsthand experience would be greatly appreciated. Thanks in advance!
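For anyone who wants to enumerate exactly what the meta endpoint publishes before adding ranges to an allow list, here is a small sketch. The endpoint and the requirement to send a User-Agent header are standard GitHub API behavior; the particular sections printed ("hooks", "web", "api", "git", "actions") are an assumption about which traffic is relevant, and nothing here is specific to the Sentinel GitHub app.

```csharp
using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Prints the CIDR ranges GitHub publishes at https://api.github.com/meta so they
// can be reviewed before being added to an organization IP allow list.
class GitHubMetaRanges
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // GitHub's REST API rejects requests that have no User-Agent header.
        http.DefaultRequestHeaders.UserAgent.ParseAdd("ip-allowlist-audit");

        var json = await http.GetStringAsync("https://api.github.com/meta");
        using var doc = JsonDocument.Parse(json);

        // The sections to inspect are an assumption; adjust to the traffic you need to allow.
        foreach (var section in new[] { "hooks", "web", "api", "git", "actions" })
        {
            if (!doc.RootElement.TryGetProperty(section, out var ranges)) continue;
            Console.WriteLine($"{section}:");
            foreach (var cidr in ranges.EnumerateArray())
                Console.WriteLine($"  {cidr.GetString()}");
        }
    }
}
```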
Deploy PostgreSQL on Azure VMs with Azure NetApp Files: Production-Ready Infrastructure as Code

PostgreSQL is a popular open‑source cloud database for modern web applications and AI/ML workloads, and deploying it on Azure VMs with high‑performance storage should be simple. In practice, however, using Azure NetApp Files requires many coordinated steps—from provisioning networking and storage to configuring NFS, installing and initializing PostgreSQL, and maintaining consistent, secure, and high‑performance environments across development, test, and production. To address this complexity, we’ve built production‑ready Infrastructure as Code templates that fully automate the deployment, from infrastructure setup to database initialization, ensuring PostgreSQL runs on high‑performance Azure NetApp Files storage from day one.
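After the templates have provisioned the VM and the NFS-backed data directory, a quick connection check confirms the database came up as expected. This is a minimal sketch using the Npgsql client; the host, credentials, and database name are placeholders for whatever your deployment outputs.

```csharp
using System;
using Npgsql;

// Minimal post-deployment smoke test: connect and report the server version.
// Connection details are placeholders; substitute the values your IaC deployment produces.
class PostgresSmokeTest
{
    static void Main()
    {
        var connString = "Host=10.0.0.4;Username=postgres;Password=<password>;Database=postgres";

        using var conn = new NpgsqlConnection(connString);
        conn.Open();

        using var cmd = new NpgsqlCommand("SELECT version();", conn);
        Console.WriteLine(cmd.ExecuteScalar());
    }
}
```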
What's New with Azure NetApp Files VS Code Extension

The latest update to the Azure NetApp Files (ANF) VS Code Extension introduces powerful enhancements designed to simplify cloud storage management for developers. From multi-tenant support to intuitive right-click mounting and AI-powered commands, this release focuses on improving productivity and streamlining workflows within Visual Studio Code. Explore the new features, learn how they accelerate development, and see why this extension is becoming an essential tool for cloud-native applications.

Introducing native Service Bus message publishing from Azure API Management (Preview)
We’re excited to announce a preview capability in Azure API Management (APIM) — you can now send messages directly to Azure Service Bus from your APIs using a built-in policy. This enhancement, currently in public preview, simplifies how you connect your API layer with event-driven and asynchronous systems, helping you build more scalable, resilient, and loosely coupled architectures across your enterprise.

Why this matters
Modern applications increasingly rely on asynchronous communication and event-driven designs. With this new integration:
- Any API hosted in API Management can publish to Service Bus — no SDKs, custom code, or middleware required.
- Partners, clients, and IoT devices can send data through standard HTTP calls, even if they don’t support AMQP natively.
- You stay in full control with authentication, throttling, and logging managed centrally in API Management.
- Your systems scale more smoothly by decoupling front-end requests from backend processing.

How it works
The new send-service-bus-message policy allows API Management to forward payloads from API calls directly into Service Bus queues or topics.

High-level flow
1. A client sends a standard HTTP request to your API endpoint in API Management.
2. The policy executes and sends the payload as a message to Service Bus.
3. Downstream consumers such as Logic Apps, Azure Functions, or microservices process those messages asynchronously.
All configuration happens in API Management — no code changes or new infrastructure are required.

Getting started
You can try it out in minutes:
1. Set up a Service Bus namespace and create a queue or topic.
2. Enable a managed identity (system-assigned or user-assigned) on your API Management instance.
3. Grant the identity the “Service Bus data sender” role in Azure RBAC, scoped to your queue/topic.
4. Add the policy to your API operation:

   <send-service-bus-message queue-name="orders">
     <payload>@(context.Request.Body.As<string>())</payload>
   </send-service-bus-message>

Once saved, each API call publishes its payload to the Service Bus queue or topic. 📖 Learn more.

Common use cases
This capability makes it easy to integrate your APIs into event-driven workflows:
- Order processing – Queue incoming orders for fulfillment or billing.
- Event notifications – Trigger internal workflows across multiple applications.
- Telemetry ingestion – Forward IoT or mobile app data to Service Bus for analytics.
- Partner integrations – Offer REST-based endpoints for external systems while maintaining policy-based control.
Each of these scenarios benefits from simplified integration, centralized governance, and improved reliability. A sketch of a typical downstream consumer follows at the end of this post.

Secure and governed by design
The integration uses managed identities for secure communication between API Management and Service Bus — no secrets required. You can further apply enterprise-grade controls:
- Enforce rate limits, quotas, and authorization through APIM policies.
- Gain API-level logging and tracing for each message sent.
- Use Service Bus metrics to monitor downstream processing.
Together, these tools help you maintain a consistent security posture across your APIs and messaging layer.

Build modern, event-driven architectures
With this feature, API Management can serve as a bridge to your event-driven backbone. Start small by queuing a single API’s workload, or extend to enterprise-wide event distribution using topics and subscriptions. You’ll reduce architectural complexity while enabling more flexible, scalable, and decoupled application patterns.
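As promised above, here is a minimal sketch of a downstream consumer: an Azure Functions handler (isolated worker model) that reads from the same orders queue used in the policy example. The ServiceBusConnection app setting name and the OrderProcessor/ProcessOrder names are illustrative placeholders, not prescribed by the feature.

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

// Consumes messages that the APIM send-service-bus-message policy published to the "orders" queue.
public class OrderProcessor
{
    private readonly ILogger<OrderProcessor> _logger;

    public OrderProcessor(ILogger<OrderProcessor> logger) => _logger = logger;

    [Function("ProcessOrder")]
    public void Run(
        [ServiceBusTrigger("orders", Connection = "ServiceBusConnection")] string message)
    {
        // The payload is whatever the policy forwarded from the original HTTP request body.
        _logger.LogInformation("Received order message: {Message}", message);

        // Downstream processing (validation, persistence, further orchestration) goes here.
    }
}
```

Because the queue decouples the API call from this processing, the function can scale and retry independently of the client-facing API.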
Learn more: Get the full walkthrough and examples in the documentation 👉 here