Azure Logic Apps
Azure Logic Apps Running Anywhere – Runtime Deep Dive
For the recently released Azure Logic Apps (Preview) extension for Visual Studio Code, the Logic Apps team redesigned the Azure Logic Apps runtime architecture for portability and performance. This new design lets you build and run logic apps locally in Visual Studio Code and then deploy them to various hosting environments such as Logic Apps Preview in Azure and Docker containers. The redesigned Logic Apps runtime also brings a new extensibility framework that allows more flexibility for writing custom code and building your own custom connectors. This is the first in a series of blog posts from the team providing an in-depth view of various aspects of the redesigned runtime.
Announcement: Azure Logic Apps' New Data Mapper for Visual Studio Code (Preview)

We are excited to announce the public preview of Data Mapper, Azure Logic Apps' new data mapping extension, available now in Visual Studio Code. This release offers a modern, unified experience for XSLT mapping and transformation in a single tool.
OpenTelemetry in Azure Logic Apps (Standard and Hybrid)

Why OpenTelemetry?

As modern applications become more distributed and complex, robust observability is no longer optional; it is essential. Organizations need a consistent way to understand how workflows are performing, trace failures, and optimize end-to-end execution. OpenTelemetry provides a unified, vendor-agnostic framework for collecting telemetry data (logs, metrics, and traces) across different services and infrastructure layers. It simplifies monitoring and makes it easier to integrate with a variety of observability backends such as Azure Monitor, Grafana Tempo, Jaeger, and others.

For Logic Apps, especially when deployed in hybrid or on-premises scenarios, OpenTelemetry is a powerful addition that elevates diagnostic capabilities beyond the default Application Insights telemetry.

What is OpenTelemetry?

OpenTelemetry (OTel) is an open-source observability framework under the Cloud Native Computing Foundation (CNCF) that provides a unified standard for generating, collecting, and exporting telemetry data such as logs, metrics, and traces. By abstracting away vendor-specific instrumentation and enabling interoperability across various tools and platforms, OpenTelemetry empowers developers and operators to gain deep visibility into distributed systems, regardless of the underlying infrastructure or language stack.

In the context of Azure Logic Apps, OpenTelemetry support enables standardized, traceable telemetry that can integrate seamlessly with a wide range of observability solutions. This helps teams monitor, troubleshoot, and optimize workflows with more precision and flexibility.

How to Configure from Visual Studio Code?

To configure OpenTelemetry for a Logic App (Standard) project from Visual Studio Code:

1. Locate the host.json file in the root of your Logic App project.
2. Enable OpenTelemetry by adding "telemetryMode": "OpenTelemetry" at the root level of the file:

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}
```

3. Define the following application settings in local.settings.json or within your CI/CD deployment pipeline (a sketch of local.settings.json follows below):
   - OTEL_EXPORTER_OTLP_ENDPOINT: The OTLP exporter endpoint URL where the telemetry data should be sent.
   - OTEL_EXPORTER_OTLP_HEADERS (optional): A list of headers to apply to all outgoing data. This is commonly used to pass authentication keys or tokens to your observability backend.

If your endpoint requires additional OpenTelemetry-related settings, include those in the application settings as well. Refer to the official OTLP Exporter Configuration documentation for details.
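For illustration, a minimal local.settings.json with these settings in place might look like the sketch below. Only the OpenTelemetry-related entries are shown; keep your project's existing Values alongside them. The endpoint URL and header value are placeholders for your own observability backend, not real values.

```json
{
  "IsEncrypted": false,
  "Values": {
    "OTEL_EXPORTER_OTLP_ENDPOINT": "https://otel.example.com:4318",
    "OTEL_EXPORTER_OTLP_HEADERS": "Authorization=Bearer <your-api-key>"
  }
}
```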
How to Configure OpenTelemetry from Azure Portal? – Standard Logic Apps

To enable OpenTelemetry support for a Standard Logic App hosted using either a Workflow Standard Plan or App Service Environment v3, follow the steps below.

1. Update the host.json File

- In the Azure portal, navigate to your Standard Logic App resource.
- In the left-hand menu, under Development Tools, select Advanced Tools > Go. This opens the Kudu console.
- In Kudu, from the Debug Console menu, select CMD, and navigate to site > wwwroot.
- Locate and open the host.json file in a text editor.
- Add the same "telemetryMode": "OpenTelemetry" configuration shown above at the root level of the file, then save and close the editor.

2. Configure App Settings for Telemetry Export

- Still within your Logic App resource, go to Settings > Environment Variables and select App settings.
- Add the following key-value pairs:

| App Setting | Description |
| --- | --- |
| OTEL_EXPORTER_OTLP_ENDPOINT | The OTLP (OpenTelemetry Protocol) endpoint URL where telemetry data will be exported. For example: https://otel.your-observability-platform.com |
| OTEL_EXPORTER_OTLP_HEADERS | (Optional) Any custom headers required by your telemetry backend, such as an Authorization token (e.g., Authorization=Bearer <key>). |

- Select Apply to save the configuration.

How to Configure OpenTelemetry from Azure Portal? – Hybrid Logic Apps

To enable OpenTelemetry support for a Standard Logic App that uses the Hybrid hosting option, follow the steps below. This configuration enables telemetry collection and export from an on-premises deployment, using environment variables and local file system access.

1. Modify host.json on the SMB Share

- On your on-premises file share (SMB), navigate to the root directory of your Logic App project.
- Locate the host.json file, add the following configuration to enable OpenTelemetry, and save the file:

```json
{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}
```

2. Configure Environment Variables in Azure Portal

- Go to the Azure portal and navigate to your Standard Logic App (Hybrid) resource.
- From the left-hand menu, select Settings > Containers, then click Edit and deploy.
- In the Edit a container pane, select Environment variables, then click Add to define the following:

| Name | Source | Value | Description |
| --- | --- | --- | --- |
| OTEL_EXPORTER_OTLP_ENDPOINT | Manual | <OTLP-endpoint-URL> | The OTLP exporter endpoint URL where telemetry should be sent. Example: https://otel.yourbackend.com |
| OTEL_EXPORTER_OTLP_HEADERS (Optional) | Manual | <OTLP-headers> | Custom headers (e.g., Authorization=Bearer <token>) required by your observability backend. |

- Once you've added all the necessary settings, click Save.

Example of Endpoint Configuration and How to Check Logs

To export telemetry data using OpenTelemetry, configure the following environment variables in your Logic App's application settings or container environment:

| Name | Source | Value | Description |
| --- | --- | --- | --- |
| OTEL_EXPORTER_OTLP_ENDPOINT | Manual entry | https://otel.kloudmate.com:4318 | The OTLP receiver endpoint for your observability backend. |
| OTEL_EXPORTER_OTLP_HEADERS | Manual entry | Authorization=<your-api-key> | Used to authenticate requests to the telemetry backend. |
| OTEL_EXPORTER_OTLP_PROTOCOL | Manual entry | http/protobuf | Protocol used for exporting telemetry (KloudMate supports gRPC/HTTP). |

In this example, we are using KloudMate as the destination for telemetry data. Once correctly configured, your Logic App will begin exporting telemetry data to KloudMate.

Limitations and Troubleshooting Steps

Current limitations:
- Supported trigger types for OpenTelemetry in Logic Apps are HTTP, Service Bus, and Event Hub.
- Exporting metrics is not currently supported.

Troubleshooting steps:
- No traces received: Validate the OTEL_EXPORTER_OTLP_ENDPOINT URL and port availability (see the connectivity sketch after the references below). Ensure outbound traffic to the observability backend is permitted.
- Authentication issues: Review and correct the header values in OTEL_EXPORTER_OTLP_HEADERS.

References

Set up and view enhanced telemetry for Standard workflows - Azure Logic Apps | Microsoft Learn
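As a quick way to run the endpoint-reachability check from the troubleshooting steps, the following PowerShell sketch tests outbound connectivity to the example OTLP endpoint used above; substitute your own host name and port.

```powershell
# Check that the OTLP endpoint's host and port are reachable from this machine.
# The host name and port below are the example KloudMate values from the table above.
Test-NetConnection -ComputerName "otel.kloudmate.com" -Port 4318
```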
Organizing logic apps workflows with Logic Apps Standard

One of the common asks from customers requesting guidance for Logic Apps Standard is "How many workflows can I host in a Logic Apps Standard application?" Coming from the Logic Apps Consumption paradigm, where developers simply organize logic apps in resource groups, deploy them individually using ARM templates, and use the Logic Apps engine provided by the platform, this is a pertinent question: users want to know how much value for money they can expect from the hosting service they are subscribing to and, most importantly, how and when they should worry about scaling up or scaling out.
Grant Graph API Permission to Managed Identity Object

Managed identities provide an identity for applications to use when connecting to resources that support Azure Active Directory (Azure AD) authentication. We can access the Graph API either by using a service principal object in Azure or by using a managed identity. For a service principal, we can grant API permissions to the service principal object in Azure, but in the case of a managed identity, there is no option in the portal to grant Graph API permissions to the managed identity object. Hence we need to use the PowerShell script below to grant a Graph API permission (application permission) to the managed identity object. In this blog, we will see how to grant a Graph API permission to the managed identity object.

Note: To grant Graph API permissions you need to be a Global Administrator in Azure Active Directory.

The following parameters need to be modified to match your resources:

- TenantID: the tenant ID of your subscription.
- GraphAppId: this parameter is optional and does not need to be changed. It is the well-known GUID of the Graph API.
- DisplayNameOfMSI: your Logic App name. Because the managed identity is created with the same name as the resource on which it is enabled, you can provide the Logic App name.
- Permissions: the appropriate Graph API permission; see https://docs.microsoft.com/en-us/graph/permissions-reference. Note: these are application permissions.

PowerShell script:

```powershell
$TenantID = "provide the tenant ID"
$GraphAppId = "00000003-0000-0000-c000-000000000000"   # Well-known app ID of the Graph API
$DisplayNameOfMSI = "Provide the Logic App name"
$PermissionName = "Directory.Read.All"

# Install the module
Install-Module AzureAD

Connect-AzureAD -TenantId $TenantID

# Find the managed identity's service principal by the Logic App's name
$MSI = Get-AzureADServicePrincipal -Filter "displayName eq '$DisplayNameOfMSI'"
Start-Sleep -Seconds 10

# Find the Graph service principal and the app role that matches the requested permission
$GraphServicePrincipal = Get-AzureADServicePrincipal -Filter "appId eq '$GraphAppId'"
$AppRole = $GraphServicePrincipal.AppRoles |
    Where-Object { $_.Value -eq $PermissionName -and $_.AllowedMemberTypes -contains "Application" }

# Assign the app role (application permission) to the managed identity
New-AzureAdServiceAppRoleAssignment -ObjectId $MSI.ObjectId -PrincipalId $MSI.ObjectId `
    -ResourceId $GraphServicePrincipal.ObjectId -Id $AppRole.Id
```

Execute the PowerShell script to grant the appropriate Graph API permission to the managed identity object. Once the script has run, you will see the Graph API permission added to the managed identity.
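Note that the AzureAD PowerShell module is on a deprecation path. As a rough equivalent, a sketch using the Microsoft Graph PowerShell SDK might look like the following; the cmdlets come from the Microsoft.Graph.Applications module, and the display name and permission are the same example values used above.

```powershell
# Sketch: the same role assignment using the Microsoft Graph PowerShell SDK instead of AzureAD.
Install-Module Microsoft.Graph.Applications
Connect-MgGraph -Scopes "Application.Read.All", "AppRoleAssignment.ReadWrite.All"

$displayNameOfMSI = "Provide the Logic App name"
$permissionName = "Directory.Read.All"

# Look up the managed identity's service principal and the Microsoft Graph service principal
$msi = Get-MgServicePrincipal -Filter "displayName eq '$displayNameOfMSI'"
$graphSp = Get-MgServicePrincipal -Filter "appId eq '00000003-0000-0000-c000-000000000000'"

# Pick the app role (application permission) that matches the permission name
$appRole = $graphSp.AppRoles |
    Where-Object { $_.Value -eq $permissionName -and $_.AllowedMemberTypes -contains "Application" }

# Assign the app role to the managed identity
New-MgServicePrincipalAppRoleAssignment -ServicePrincipalId $msi.Id `
    -PrincipalId $msi.Id -ResourceId $graphSp.Id -AppRoleId $appRole.Id
```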
Azure Logic Apps - Authenticate with managed identity for Azure AD OAuth-based connectors

Azure Logic Apps currently supports managed identities for specific built-in triggers and actions. This blog post announces preview support for using your logic app's managed identity to authenticate to Azure AD OAuth-based managed connector triggers and actions.
Integrate Azure Open AI in Teams Channel via Logic App

Summary

In Azure, we can now deploy Open AI with the chatgpt-35-turbo model as an Azure resource, so this blog introduces how to integrate Azure Open AI into a Teams channel without using a bot.

Prerequisites

1. An Azure Open AI resource with a deployment that uses "chatgpt-35-turbo".
2. An Azure Storage Account with a blob container named "gpt". To enable multi-turn conversations, we need to provide the whole conversation history (channel replies) in each Open AI API call. Logic Apps has no built-in action to get all the replies of a channel message, so we use blob storage to save the conversation history keyed by message ID.
3. Prepare the Teams channel ID and group ID in advance; you can get them by right-clicking the channel and selecting "Get link to channel". Once we have the URL, we need to URL-decode it (or just replace %3a with ':' and %40 with '@'; otherwise the Logic App cannot get the correct channel ID) to obtain the actual channel ID and group ID.

Mechanism

1. The Logic App monitors keywords posted in a specific channel via the trigger "When keywords are mentioned". In my case, I'm using "ChatGPT" as the keyword, so any message or reply that contains "ChatGPT" triggers the Logic App.
2. If this is a new message, the Logic App creates a new blob for saving the conversation history, named after the message ID (the ReplyToId in the Teams connector response). Otherwise, it reads the existing blob and reuses the conversation.
3. Compose the content as the Open AI payload, send the request, and post the response to the specific Teams channel (a PowerShell sketch of the equivalent REST call is included at the end of this post).
4. Update the blob with the latest conversation.

Template and Parameters Explanation

The sample template link: Drac-Zhang/LogicApp_For_Teams_OpenAI_Integration (github.com)

Parameters of the template:

| Parameter Name | Comments |
| --- | --- |
| openai_apikey | The API key of the Open AI resource; it can be found under Open AI -> Keys and Endpoints. |
| openai_endpoint | The Open AI API endpoint, in the format https://[Open AI resource Name].openai.azure.com/openai/deployments/[Deployment name]/chat/completions?api-version=2023-03-15-preview. The version must be 2023-03-15-preview, since previous versions use a different payload JSON schema. |
| teams_channel_keyword | The keyword you would like to trigger the Logic App with; not case sensitive. |
| teams_channel_id | See prerequisite No. 3. |
| teams_group_id | See prerequisite No. 3. |
| storage_account_name | The storage account name for saving conversation history. |
| storage_account_accesskey | The access key for the storage account. |

Known Issues

- After deployment, the Teams API connection needs to be authorized manually.
- The Logic App needs to be disabled and re-enabled to register the webhook on the Teams side.
- If the response from Open AI contains the keyword, the Logic App will be triggered again.
- If the response contains double quotes, the next reply will fail due to invalid JSON format, but this can easily be fixed with a replace expression.

Reference: How to work with the ChatGPT and GPT-4 models (preview) - Azure OpenAI Service | Microsoft Learn
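To make the payload composed in step 3 of the Mechanism section more concrete, here is a short PowerShell sketch of the equivalent REST call made outside of Logic Apps. The resource name, deployment name, and API key are placeholders, and the messages array would normally carry the conversation history stored in the "gpt" blob container.

```powershell
# Sketch: call the Azure Open AI chat completions endpoint directly.
# Placeholders: replace the resource name, deployment name, and API key with your own values.
$endpoint = "https://<openai-resource-name>.openai.azure.com/openai/deployments/<deployment-name>/chat/completions?api-version=2023-03-15-preview"
$headers = @{ "api-key" = "<openai-api-key>" }

# The messages array carries the conversation history (here, a single user turn for illustration).
$body = @{
    messages = @(
        @{ role = "user"; content = "ChatGPT, what is Azure Logic Apps?" }
    )
} | ConvertTo-Json -Depth 5

$response = Invoke-RestMethod -Method Post -Uri $endpoint -Headers $headers -Body $body -ContentType "application/json"

# The assistant's reply is the text that the Logic App posts back to the Teams channel.
$response.choices[0].message.content
```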