Discover your next integration inspiration at this year’s Ignite!
Published Nov 15 2023 10:00 AM

Get ready for an exhilarating digital experience at Microsoft Ignite 2023. This year's event is a deep dive into the cutting-edge world of AI and cloud technology. And if you're eager to explore the transformative potential of Azure Integration Services, you're in for a treat. From seamless application and data integration to powerful workflow automation, Azure Integration Services is poised to revolutionize how businesses operate.


In this blog we’ll explore the major announcements for Azure Integration Services coming out of this year’s Ignite event. Join us online for a dynamic journey during which you'll gain invaluable insights, connect with industry experts, and uncover the full potential of Azure Integration Services. 


Azure Integration Environment  

We are excited to introduce a new Azure service that provides a unified experience to help customers effectively manage and monitor their integration resources. This service comprises two core components: Azure Integration Environment and Business Process Tracking, both currently in public preview.


Azure Integration Environment enables customers to organize their resources logically, providing business context and reducing overall complexity. The model is flexible enough to let organizations use integration environments in a way that aligns with their internal standards and principles. For some organizations, this may mean grouping integration environments by traditional landscapes such as development, test, staging, user acceptance testing, and production. Others may decide to group resources by business unit or organization, such as Finance, Marketing, Operations, or Corporate Services. Regardless of your organization’s structure, you should find the flexibility needed to address your requirements. 


Business Process Tracking, on the other hand, allows organizations to establish business context for the transactions processed by Azure Logic Apps. For instance, it can be used to provide business stakeholders with insights into complex processes, such as order processing spanning multiple logic apps and workflows. Business Process Tracking simplifies this process, making it accessible and understandable. 


Azure Logic Apps 

Azure Logic Apps, a versatile, cloud-based integration service, is an invaluable component for any organization seeking to create modern, interconnected applications and workflows. With its wide range of capabilities, it empowers businesses to streamline processes and automate tasks with ease. At this year’s Ignite, we’ll be introducing an array of new features, starting with an exciting new AI assistant to take developer productivity to the next level. 


Azure Logic Apps workflow assistant  

Bringing the power of AI to Azure Logic Apps for the first time, this workflow assistant answers any questions you have about Azure Logic Apps from directly within the designer. Currently in public preview, this chat interface provides access to Azure Logic Apps documentation and best practices without requiring you to navigate documentation or search online forums.


By harnessing Azure OpenAI and ChatGPT, the workflow assistant queries diverse knowledge sources related to Azure Logic Apps, including Microsoft Learn, connector schemas, and tech community blogs, which are processed into a vectorized format and made accessible through a backend system.  
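Under the hood, this follows the retrieval-augmented generation pattern: knowledge sources are embedded as vectors, the chunks most similar to your question are retrieved, and those chunks ground the model's answer. A toy sketch of the retrieval step, using bag-of-words vectors and made-up documentation snippets rather than the real embeddings and backend:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a term-frequency vector. The actual assistant uses
    # learned embeddings produced via Azure OpenAI.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical knowledge chunks standing in for Microsoft Learn docs,
# connector schemas, and blog posts.
chunks = [
    "Use the HTTP connector to call REST endpoints from a workflow",
    "Configure retry policies to handle transient connector failures",
    "The Service Bus connector sends and receives queue messages",
]

def retrieve(question: str, k: int = 1) -> list:
    # Rank all chunks by similarity to the question; keep the top k.
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

print(retrieve("how do I handle connector failures with retry policies"))
```

The retrieved chunk is then supplied to the chat model as grounding context, which is what keeps the assistant's answers anchored to Azure Logic Apps documentation.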


In addition to discovering curated information, there are many ways the Azure Logic Apps workflow assistant can help: 

  • Describe workflows built by other developers, helping you understand shared workflows 
  • Recommend and compare connectors from the over 1,000 options in Azure Logic Apps, as well as best practices for their use  
  • Recommend patterns for workflows and integration applications, with responses that you can leverage for error handling, testing, and other optimizations 
  • Provide step-by-step guidance on how to build workflows based on scenarios, including which connectors to use, how to configure them, and how to process the data 
  • Offer contextual responses based on your workflow, allowing you to ask specific questions about how to accomplish something in the context of an action 

With the workflow assistant at your side during development, all the information you require is readily available, presented in a curated manner and without the need for context-switching. 


.NET Framework Custom Code for Azure Logic Apps (Standard)  

We are pleased to announce the general availability of .NET Framework custom code for Azure Logic Apps (Standard). This capability allows customers to extend their low-code solutions with the power of custom code. Once a developer has written and compiled their .NET Framework code, it can be called from a built-in action within a workflow.  


In addition, this new capability provides the following benefits: 

  • A no-cliffs extensibility capability to our low-code offering, giving developers the flexibility and control needed to solve the toughest integration problems 
  • No additional service plans required: deploy your custom code alongside your workflows 
  • A local debug experience, in VS Code, which allows you to step through your workflows and code in the same debugging session 
  • Support for BizTalk migration scenarios, allowing customers to lift and shift their custom .NET Framework investments from on-premises to the cloud 

Application Insights Enhancements for Azure Logic Apps (Standard)  

We are also excited to announce the general availability of Application Insights enhancements for Azure Logic Apps (Standard). This new capability gives you a simplified experience for discovering insights about your Logic Apps workflows, plus more control over filtering events at the source, reducing your storage costs. 


The investments that we have made in this area are opt-in, meaning that your existing Logic Apps will continue to use the v1 schema. When you enable the new schema, also referred to as v2, your telemetry is emitted in a new structure, so if you have built custom reports against the v1 schema, you will need to update those reports to take advantage of the new data structures. To learn how to enable the v2 schema, please refer to the in-product documentation. 


Azure Logic Apps Standard VS Code Extension Installation Enhancements 

Azure Logic Apps Standard empowers developers to build workflows locally using a VS Code Extension that closely mirrors the online experience provided within the portal. This enhancement bridges the gap between the development of Azure Logic Apps Standard workflows and the familiar tools and processes that developers and integration specialists are accustomed to. And now, we are simplifying the onboarding process, eliminating a series of prerequisites that were previously required before developers could dive into this experience. 


Now in public preview starting with version 2.84.1, the installation process for the Azure Logic Apps Standard VS Code extension is significantly simplified. The extension automatically identifies and installs any required dependencies, such as other extensions and frameworks, including the C# extension essential for debugging.


Furthermore, developers can conveniently install all the required frameworks in a portable binaries folder. With this feature, the extension manages these requirements on your behalf, eliminating the need to manually install or track the individual framework versions supported by Logic Apps. 


Azure Logic Apps' New Data Mapper for Visual Studio Code 

Data Mapper is Azure Logic Apps' new visual data transformation tool, now generally available as a Visual Studio Code extension for creating and editing maps between source and target schemas. To exchange messages between different XML or JSON formats in an Azure Logic Apps workflow, especially when the source and target schemas have different structures, the data must be transformed from one format to another. Data Mapper serves this purpose by helping you visually create these mappings while also handling transformations in a single tool.  


Data Mapper provides a modernized experience for XSLT authoring and transformation that includes drag-and-drop gestures, a prebuilt functions library, and manual testing. You can use the extension to create maps for XML to XML, JSON to JSON, XML to JSON, and JSON to XML transformations. Once created, these maps can be called from workflows in your logic app project, both locally in Visual Studio Code and when deployed to Azure. 
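To illustrate the kind of shape change such a map performs, here is a hand-written XML-to-JSON transformation in Python. The element and field names are made up; in practice Data Mapper generates the transform for you visually, without writing code like this:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical source document with an order in one schema.
source_xml = """
<Order>
  <Id>1042</Id>
  <Customer>Contoso</Customer>
  <Total>99.50</Total>
</Order>
"""

def order_xml_to_json(xml_text: str) -> str:
    # Map each source element onto a differently shaped (and differently
    # named) target schema, converting types along the way.
    root = ET.fromstring(xml_text)
    target = {
        "orderId": int(root.findtext("Id")),
        "customerName": root.findtext("Customer"),
        "orderTotal": float(root.findtext("Total")),
    }
    return json.dumps(target)

print(order_xml_to_json(source_xml))
```

Every field rename and type conversion above corresponds to a connection you would draw between the source and target schemas on the Data Mapper canvas.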


Azure API Management 

Azure API Management enables organizations to publish, secure, and analyze APIs and make them accessible to internal teams, external partners, or developers. Following are the updates to the service we’ll be presenting at this year’s Ignite: 


Microsoft Copilot for Azure now has policy authoring capabilities for Azure API Management  

Microsoft Copilot for Azure, now in public preview, introduces policy authoring capabilities for Azure API Management. With Copilot for Azure, you can easily create policies that match your specific requirements without knowing the syntax, or have already-configured policies explained to you. This proves particularly useful for handling complex policies with multiple requirements. Simply instruct Copilot for Azure (preview), in the context of the API Management policy editor, to generate policy definitions, copy the results into the editor, and make any necessary adjustments. Ask questions to gain insight into different options, modify the provided policy, or clarify a policy you already have. Explore more about this new capability here. 


Azure API Management's Credential Manager 

Credential Manager in Azure API Management, formerly known as API Management Authorizations, has been renamed and is now available with feature enhancements for user-delegated permissions. This central repository within API Management is dedicated to managing, storing, and controlling access to your API access tokens. It helps create secure, seamless connections among your services; stored credentials can then be used at API runtime through the <get-authorization-context> policy. With Credential Manager, teams can provide a more seamless experience for handling API access tokens. 
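For illustration, a policy fragment along these lines retrieves a stored credential and attaches its access token to the outgoing request. The provider and authorization names here are hypothetical; consult the policy reference for the exact attributes your connection requires:

```xml
<inbound>
    <base />
    <!-- Fetch the stored credential from Credential Manager -->
    <get-authorization-context
        provider-id="github-01"
        authorization-id="auth-01"
        context-variable-name="auth-context"
        identity-type="managed" />
    <!-- Attach the retrieved access token to the backend call -->
    <set-header name="Authorization" exists-action="override">
        <value>@("Bearer " + ((Authorization)context.Variables.GetValueOrDefault("auth-context"))?.AccessToken)</value>
    </set-header>
</inbound>
```

This keeps raw tokens out of your API consumers' hands: callers authenticate to API Management, and the gateway injects the managed credential at runtime.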


Defender for APIs 

Defender for APIs, a new offering as part of Microsoft Defender for Cloud – a cloud-native application protection platform (CNAPP), is now generally available. Natively integrating with Azure API Management, it helps organizations to prevent, detect, and respond to API threats with an integrated cloud security context. With this tool, security teams can gain visibility into their business-critical APIs on Azure, understand their security posture, prioritize vulnerability fixes, and detect and respond to active runtime threats within minutes – using machine-learning powered anomalous and suspicious API usage detections.   


Azure Service Bus 

Partitioned namespaces for Azure Service Bus Premium 

Also, at Ignite we’ll be announcing general availability of partitioned namespaces for Azure Service Bus Premium. This feature allows for the utilization of partitioning in the premium messaging tier. As a result, the overall throughput of a partitioned entity is no longer constrained by the performance of a single message broker. In addition to the partitioned namespaces feature, we are also introducing an updated Service Level Agreement (SLA). When you create a Premium namespace with the new partitioned entities feature enabled in a region where Availability Zones are available, we will provide an SLA of 99.99%. 
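Conceptually, a partitioned entity spreads messages across multiple message brokers by hashing a partition key, so related messages keep their ordering on a single partition while total throughput scales across all of them. A simplified sketch of that idea (not the actual Service Bus algorithm, and the partition count here is illustrative):

```python
import hashlib

PARTITION_COUNT = 4  # illustrative; real partition counts are set at namespace creation

def assign_partition(partition_key: str) -> int:
    # Stable hash so the same key always lands on the same partition,
    # preserving ordering for related messages (e.g. one session).
    digest = hashlib.sha256(partition_key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % PARTITION_COUNT

# Messages for the same order hash to the same partition; different
# orders can land on different brokers and be processed in parallel.
keys = ["order-17", "order-17", "order-42", "order-99"]
print([assign_partition(k) for k in keys])
```

Because no single broker has to carry every message, the throughput ceiling of the namespace rises with the number of partitions rather than being capped by one broker's performance.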


Azure Event Grid 

Built to handle high-scale message routing from any source to any destination for any application, Azure Event Grid now supports additional capabilities that help customers capitalize on growing industry scenarios. A key part of this new functionality is the ability to deliver publish-subscribe messaging at scale, now Generally Available. This enables flexible consumption patterns for easy utilization of data over HTTP and MQTT protocols. 


Reflecting the growing demand for connectivity, integration, and analytics between Internet of Things (IoT) devices and cloud-based services, Azure Event Grid's new MQTT broker feature is now Generally Available. This enables bi-directional communication between MQTT clients at scale, facilitating one-to-many, many-to-one, and one-to-one messaging patterns using MQTT v3.1.1 and MQTT v5 protocols. These capabilities allow IoT devices from manufacturing plants, automobiles, retail stores, and more to send data to and receive data from Azure services and third-party services. To process the data further, users can route IoT data to Azure services such as Azure Event Hubs, Azure Functions, and Azure Logic Apps. Data can also be routed to third-party services via webhooks. 
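The one-to-many fan-out works through topic subscriptions: each subscriber registers a topic filter, where `+` matches exactly one topic level and `#` matches all remaining levels, and the broker delivers every published message whose topic matches. A small sketch of MQTT-style filter matching (illustrative only; the broker performs this server-side, and edge cases from the MQTT spec are omitted):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    # MQTT-style matching: '+' matches exactly one level; '#' matches
    # all remaining levels and must be the filter's last segment.
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

# Hypothetical factory telemetry topics.
print(topic_matches("plant1/+/temperature", "plant1/line3/temperature"))  # True
print(topic_matches("plant1/#", "plant1/line3/pressure"))                 # True
print(topic_matches("plant1/+/temperature", "plant2/line1/temperature"))  # False
```

A single publish to `plant1/line3/temperature` can thus reach every dashboard subscribed to `plant1/#` as well as an alerting service subscribed to `plant1/+/temperature`, which is the one-to-many pattern in practice.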

To enable event-driven architectures, pull delivery through namespace topics is now Generally Available. This allows customers to process events from highly secure environments without configuring a public endpoint, while controlling the rate and volume of messages consumed and supporting much larger throughput. 


Event Grid namespaces also now support the ability to push events (in Public Preview) to Azure Event Hubs at high scale through a namespace topic subscription. This enables more distributed applications that send discrete events to ingestion pipelines.  


To help customers scale to meet the demands of these new scenarios, Event Grid has also increased the maximum number of Throughput Units available in an Event Grid namespace to 40, providing more capacity for data-intensive workloads. 


Join Us for Azure Integration Services Day: Your Opportunity for Direct Insights and Live Q&A! 

Are you interested in hearing about these announcements directly from our Azure product team? Do you have questions you'd like to have answered in real-time? Don't miss out on Azure Integration Services Day on November 30, 2023, from 11:00 AM to 4:30 PM PT. Secure your spot by registering now for the event. 

Version history
Last update:
‎Nov 15 2023 10:19 AM