Organizing logic apps workflows with Logic Apps Standard
One of the most common asks from customers requesting guidance for Logic Apps Standard is: "How many workflows can I host in a Logic Apps Standard application?" Coming from the Logic Apps Consumption paradigm, where developers simply organize logic apps in resource groups, deploy them individually using ARM template scripts, and use the Logic Apps engine provided by the platform, this is a pertinent question: users would like to know how much value for money they can expect from the hosting service they are subscribing to and, most importantly, how and when they should worry about scaling up or scaling out.

Introducing the New Azure Logic Apps Designer for Consumption: Faster, Smoother, and More Reliable
We are excited to announce that the new designer for Consumption is now generally available for all users. Here's what you need to know about this major update and what's coming next.

Typical Storage access issues troubleshooting
We get a big number of cases where the Storage Account connection is failing, and we often see that customers are not aware of the troubleshooting steps they can take to accelerate the resolution of this issue. As such, we've compiled some common scenarios and the usual troubleshooting steps we ask you to take. Always remember that if you have made changes to your infrastructure, consider rolling them back to confirm that they are not the root cause. Even a small change that apparently has no effect may cause downtime for your application.

Common messages

The errors shown in the portal when the Storage Account connectivity is down are very similar, and they may not correctly indicate the cause. Error messages that surface in the portal for Logic Apps Standard include:

System.Private.CoreLib: Access to the path 'C:\home\site\wwwroot\host.json' is denied.
Cannot reach host runtime. Error details, Code: 'BadRequest', Message: 'Encountered an error (InternalServerError) from host runtime.'
System.Private.CoreLib: The format of the specified network name is invalid. : 'C:\\home\\site\\wwwroot\\host.json'.
System.Private.CoreLib: The user name or password is incorrect. : 'C:\home\site\wwwroot\host.json'.
Microsoft.Windows.Azure.ResourceStack: The SSL connection could not be established, see inner exception.
System.Net.Http: The SSL connection could not be established, see inner exception.
System.Net.Security: Authentication failed because the remote party has closed the transport stream.
Unexpected error occurred while loading workflow content and artifacts.

These errors don't really indicate what the root cause is, but a broken connection with the Storage Account is a very common one.

What to verify?

There are four major components to verify in these cases:

Logic App environment variables and network settings
Storage Account networking settings
Network settings
DNS settings

Logic App environment variables and network

From an app settings point of view, there is not much to verify, but these steps are important and sometimes overlooked. At this time, all or nearly all Logic Apps have been migrated to the dotnet Functions_Worker_Runtime (under the Environment variables tab), but this is good to confirm. It's also good to confirm that your Platform setting is set to 64 Bit (under the Configuration tab > General settings). We've seen some deployments using old templates that set this to 32 Bit, which doesn't make full use of the available resources.

Check that the Logic App has one of the following environment variables set: WEBSITE_CONTENTOVERVNET set to 1, WEBSITE_VNET_ROUTE_ALL set to 1, or vnetRouteAllEnabled set to 1. Configure virtual network integration with application and configuration routing. - Azure App Service | Microsoft Learn

These settings can also be replaced with the UI setting in the Virtual network tab, by selecting "Content storage" under configuration routing. For better understanding, vnetContentShareEnabled takes precedence: if it is set (true/false), WEBSITE_CONTENTOVERVNET is ignored; only if vnetContentShareEnabled is null is WEBSITE_CONTENTOVERVNET taken into account. A quick way to inspect these values from the command line is shown below.
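For example, here is a minimal Az PowerShell sketch for that check; the Az.Websites module is assumed, and the resource group and app names are hypothetical placeholders:

# A minimal sketch, assuming hypothetical resource names.
Connect-AzAccount
$app = Get-AzWebApp -ResourceGroupName 'rg-integration' -Name 'my-logicapp-standard'

# $false means the worker process runs in 64 bits
$app.SiteConfig.Use32BitWorkerProcess

# The runtime and VNet routing settings discussed above
$app.SiteConfig.AppSettings |
    Where-Object { $_.Name -in 'FUNCTIONS_WORKER_RUNTIME', 'WEBSITE_CONTENTOVERVNET', 'WEBSITE_VNET_ROUTE_ALL' } |
    Format-Table Name, Value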
Also keep this in mind: Storage considerations for Azure Functions | Microsoft Learn

WEBSITE_CONTENTAZUREFILECONNECTIONSTRING and AzureWebJobsStorage hold the connection string exactly as in the Storage Account. Website_contentazurefileconnectionstring | App settings reference for Azure Functions | Microsoft Learn Azurewebjobsstorage | App settings reference for Azure Functions | Microsoft Learn

WEBSITE_CONTENTSHARE holds the file share name. Website_contentshare | App settings reference for Azure Functions | Microsoft Learn

These are the first points to validate.

Storage Account settings

If all these are matching and properly configured and the Logic App is still in error, we move to the next step: validating the Storage Account network settings.

When the Storage Account does not have VNet integration enabled, there should be no issues, because the connection is made through the public endpoints. Even so, you must ensure that at least "Allow storage account key access" is enabled, because at this time the Logic App depends on the access key to connect to the Storage Account. Although you can set AzureWebJobsStorage to run with Managed Identity, you can't fully disable storage account key access for Standard logic apps that use the Workflow Service Plan hosting option. However, with the ASE v3 hosting option, you can disable storage account key access after you finish the steps to set up managed identity authentication. Create example Standard workflow in Azure portal - Azure Logic Apps | Microsoft Learn

If this setting is enabled, you must check whether the Storage Account is behind a firewall. Access may be enabled for selected networks or fully disabled; both options require Service Endpoints or Private Endpoints to be configured. Deploying Standard Logic App to Storage Account behind Firewall using Service or Private Endpoints | Microsoft Community Hub

So check the Networking tab under the Storage Account and confirm the following:

If you select the "selected networks" option, confirm that the VNet is the same one the Logic App is integrated with. Your Logic App and Storage Account may be hosted in different VNets, but you must ensure there is full connectivity between them: they must be peered, with HTTPS and SMB traffic allowed (more on this in the Network section). You can select "Disabled" network access as well.

You should also confirm that the file share is created. Usually it is created automatically with the creation of the Logic App, but if you use Terraform or ARM, the deployment may not create the file share and you must do it manually.

Confirm that all 4 Private Endpoints are created and approved (File, Table, Queue and Blob). All these resources are used for different components of the Logic App. This is not fully documented, as it is internal engine documentation and not publicly available. For Azure Functions, the runtime base, it is partially documented in this article: Storage considerations for Azure Functions | Microsoft Learn

If a Private Endpoint is missing, create it and link it to the VNet as a shared resource. Not having all Private Endpoints created may result in runtime errors, connection errors, or trigger failures. For example, if a workflow does not generate its URL even though it saves correctly, the Table and Queue Private Endpoints may be missing, as we've seen many times with customers. A way to verify the file share and the Private Endpoint approvals from PowerShell is sketched below.
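Here is a hedged sketch of that verification; the Az.Storage and Az.Network modules are assumed, and the resource names are hypothetical:

# A sketch, assuming hypothetical resource names.
$sa = Get-AzStorageAccount -ResourceGroupName 'rg-integration' -Name 'mylogicappstorage'

# The file share named in WEBSITE_CONTENTSHARE should exist
Get-AzRmStorageShare -ResourceGroupName 'rg-integration' -StorageAccountName 'mylogicappstorage'

# All four private endpoint connections (blob, file, queue, table) should be Approved
Get-AzPrivateEndpointConnection -PrivateLinkResourceId $sa.Id |
    Select-Object Name, @{ n = 'Status'; e = { $_.PrivateLinkServiceConnectionState.Status } }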
You can read a bit more about the integration of the Logic App with a firewall-secured Storage Account and the needed configuration in these articles: Secure traffic between Standard workflows and virtual networks - Azure Logic Apps | Microsoft Learn Deploy Standard logic apps to private storage accounts - Azure Logic Apps | Microsoft Learn

You can use the Kudu console (Advanced Tools tab) to further troubleshoot the connection with the Storage Account by using some network troubleshooting commands. If the Kudu console is not available, we recommend using a VM in the same VNet as the Logic App to mimic the scenario.

Nslookup [hostname or IP] [DNS HOST IP]
TCPPing [hostname or IP]:[PORT]
Test-NetConnection [hostname] -Port [PORT]

If you have a custom DNS, the nslookup command will not return the results from your DNS unless you specify the IP address as a parameter. Instead, you can use the nameresolver command, which uses the VNet DNS settings to check the endpoint name resolution.

nameresolver [endpoint hostname or IP address]

Networking Related Commands for Azure App Services | Microsoft Community Hub

Vnet configuration

Configuring a Private Endpoint for the Logic App will not affect traffic to the Storage Account, because the Private Endpoint covers only inbound traffic. The storage communication is considered outbound traffic, as it's the Logic App that actively communicates with the Storage Account. Secure traffic between Standard workflows and virtual networks - Azure Logic Apps | Microsoft Learn

The link between these resources must therefore not be interrupted. Keep in mind that the Logic App uses both the HTTPS and SMB protocols to communicate with the Storage Account, meaning that traffic on ports 443 and 445 needs to be fully allowed in your VNet. If you have a Network Security Group associated with the Logic App subnet, you need to confirm that its rules allow this traffic; you may need to explicitly create rules for it:

Source port | Destination port | Source | Destination | Protocol | Purpose
* | 443 | Subnet integrated with Standard logic app | Storage account | TCP | Storage account
* | 445 | Subnet integrated with Standard logic app | Storage account | TCP | Server Message Block (SMB) file share

If you have forced routing to a Network Virtual Appliance (i.e., a firewall), you must also ensure that this resource is not filtering or blocking the traffic. TLS inspection, if enabled on your firewall, must be disabled for the Logic App traffic. In short, this is because the firewall replaces the certificate in the message, so the Logic App does not recognize the returned certificate and the message is invalidated. You can read more about TLS inspection here: Azure Firewall Premium features | Microsoft Learn

DNS

If you are using Azure DNS, this section should not apply, because all records are created automatically when you create the resources. If you're using a custom DNS, however, the IP address of a new Azure resource (e.g., a Storage Private Endpoint) won't be registered in your DNS, so you must register it manually. You must ensure that all A records are created and maintained, and that they point to the correct IP and name. If there are mismatches, you may see communication severed between the Logic App and other resources, such as the Storage Account. So double-check all DNS records and confirm that everything is in its proper state and place.
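From a VM in the same VNet, a quick PowerShell pass over the checks above might look like this; the storage account name is a hypothetical placeholder:

# A sketch to run from a VM inside the VNet (hypothetical storage account name).
# HTTPS (443) and SMB (445) must both be reachable.
Test-NetConnection 'mylogicappstorage.blob.core.windows.net' -Port 443
Test-NetConnection 'mylogicappstorage.file.core.windows.net' -Port 445

# With Private Endpoints, the name should resolve through the privatelink record to the private IP
Resolve-DnsName 'mylogicappstorage.file.core.windows.net'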
If you continue to have issues after verifying all these steps, I suggest you open a case with us so that we can validate what else may be happening: either a step may have been missed, or some other issue may be occurring.

Download Logic App content for Consumption and Standard Logic App in the Portal
It's common to see customers needing to download the JSON contents of their Logic Apps, either to keep a copy of the code or to initiate CI/CD. The methods to download this content are very simple and accessible with a single button. We will only cover the Portal method to extract the workflows' JSON content, leaving aside the other available methods (Visual Studio, VS Code, PowerShell, etc.).

Running PowerShell inline with Az commands - Logic App Standard
With the availability of the inline "Execute Powershell code" action, a few questions have been brought to us, such as how to execute Az commands with this action.

First, let's talk about requirements. As the documentation states, we need to use a Logic App Standard workflow, and when we add the action, it creates two files in the workflow folder for reference: a requirements.psd1 file and an execute_powershell_code.ps1 file. Add and run PowerShell in Standard workflows - Azure Logic Apps | Microsoft Learn

We can import private modules and public modules, but these aren't imported automatically: we need to specify the modules we need. To make our lives easier, the great folks in the Product Group already gave us some hints on how to import them. For our purposes, we will import only the Az module. The requirements file will need to look like this:

# This file enables modules to be automatically managed by the Functions service.
# See https://aka.ms/functionsmanageddependency for additional information.
@{
    # For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
    # Uncomment the next line and replace the MAJOR_VERSION, e.g., 'Az' = '5.*'
    'Az' = '10.*'
}

With the module imported, we can now face the script itself. This demo is a very simplistic script that lists the resource groups in the subscriptions. The example script is as follows:

$action = Get-TriggerOutput
Connect-AzAccount -Identity
$results = Get-AzSubscription | ForEach-Object {
    $subscriptionName = $_.Name
    Set-AzContext -SubscriptionId $_.SubscriptionId
    (Get-AzResourceGroup).ResourceGroupName | ForEach-Object {
        [PSCustomObject]@{
            Subscription  = $subscriptionName
            ResourceGroup = $_
        }
    }
}
Push-WorkflowOutput -Output $results

The trick to making this work is that we're connecting to Azure with the managed identity (Connect-AzAccount -Identity). The Logic App already has a system-assigned managed identity out of the box that allows you to connect to the Az environment, but you will need to assign it permissions as well. For my example, I assigned the managed identity the Contributor role on the subscription, as I was listing all resource groups in it; you may restrict this as needed, of course (see the sketch after the summary below).

With the test, you can see that this simple script executes quite nicely, taking about 45 seconds to complete (this may vary with your environment and the script complexity). Keep in mind that this is running on a WS1 plan, so it may be a bit slow. Once it caches the request, it's quite fast.

So, to summarize, the steps taken to achieve proper execution for Az commands with the inline PowerShell action were:

Add the action
Import the Az module in the requirements file
Assign the proper role to the Logic App Managed Identity
Create the script
Test!
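For the role assignment step, a minimal Az PowerShell sketch from an administrator's session might look like this; the resource names are hypothetical placeholders, and you should scope the role as tightly as your scenario allows:

# A sketch, assuming hypothetical resource names and subscription-level scope.
$principalId = (Get-AzWebApp -ResourceGroupName 'rg-integration' -Name 'my-logicapp-standard').Identity.PrincipalId
New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName 'Contributor' `
    -Scope "/subscriptions/$((Get-AzContext).Subscription.Id)"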
Automating Logic Apps connections to Dynamics 365 using Bicep

I recently worked with a customer to show the ease of integration between Logic Apps and the Dataverse as part of Dynamics 365 (D365). The flows of integration we looked at included:

Inbound: D365 updates pushed in near real-time into a Logic Apps HTTP trigger.
Outbound: A Logic App sending HTTP requests to retrieve data from D365.

The focus of this short post will be on the outbound use case, showing how to use the Microsoft Dataverse connector with Bicep automation.

A simple use case

The app shown here couldn't be much simpler: it's a Timer recurrence which uses the List Rows action to retrieve data from D365. Here's a snip from an execution: Impressed? 🤣

Setting this up by clicking through the Azure Portal is fairly simple. The connector example uses a Service Principal to authenticate the Logic App to D365 (OAuth being an alternative), so several parameters are needed. Additionally, you'll be required to configure an Environment parameter for D365, which is the URL of the target environment, e.g. https://meaingful-url-for-your-org.crm.dynamics.com. Configuring the Service Principal may be the most troublesome part; it is outside the scope of this Bicep automation and would be considered a separate task per-environment. This page may help you complete the required identity creation.

So... what about the Bicep?

You can see the Bicep files in the GitHub repository here. We have to deploy 2 resources:

resource laworkflow 'Microsoft.Logic/workflows@2019-05-01' = { } ...
resource commondataserviceApiConnection 'Microsoft.Web/connections@2016-06-01' = { } ...

The first Microsoft.Logic/workflows resource deploys the app configuration, and the second Microsoft.Web/connections resource deploys the Dataverse connection used by the app.

The Bicep for such a simple example took some trial and error to get right, and the documentation is far from clear, something I will try to get improved. In hindsight it seems straightforward; these snippets outline where I struggled.

A snip from the connections resource:

resource commondataserviceApiConnection 'Microsoft.Web/connections@2016-06-01' = {
  name: 'commondataservice'
  ...
  properties: {
    displayName: 'la-to-d365-commondataservice'
    api: {
      id: '/subscriptions/${subscription().subscriptionId}/providers/Microsoft.Web/locations/${location}/managedApis/commondataservice'
      ...

The property at path properties.api.id is all-important here. Now looking at the workflows resource:

resource laworkflow 'Microsoft.Logic/workflows@2019-05-01' = {
  name: logicAppName
  ...
  parameters: {
    '$connections': {
      value: {
        commondataservice: {
          connectionName: 'commondataservice'
          connectionId: resourceId('Microsoft.Web/connections', 'commondataservice')
          id: commondataserviceApiConnection.properties.api.id
        }
      }
    }
    ...

Here we see the important parameters for the connection configuration, creating the relationship between the resources:

connectionName: references the name of the connection as specified in the resource.
connectionId: uses the Bicep resourceId function to obtain the deployed Azure resource ID.
id: references the properties.api.id value specified earlier.

So fairly simple, but understanding which value is required where isn't straightforward, and that's where documentation improvement is needed.

Secret Management

An extra area I looked at was improved secret management in Bicep. Values required for the Service Principal must be handled securely, so how do you achieve this?
The approach I took was to use the az.getSecret Bicep function within the .bicepparam file, allowing a secret to be read from an Azure Key Vault at deployment time. This has the advantage of separating the main template file from the parameters it uses. The Key Vault used here is pre-provisioned to store the Service Principal secrets and is not deployed as part of this Bicep code.

using './logicapps.bicep'
...
param commondataserviceEnvironment = getSecret(
  readEnvironmentVariable('AZURE_KV_SUBSCRIPTION_ID'),
  readEnvironmentVariable('AZURE_KV_RESOURCE_GROUP'),
  readEnvironmentVariable('AZURE_KV_NAME'),
  'commondataserviceClientSecret')

This example obtains the commondataserviceClientSecret parameter value from the Key Vault at the given Subscription, Resource Group, Key Vault name, and secret name. You must grant Azure Resource Manager access to the Key Vault, enabled by the setting shown below:

The Subscription ID, Resource Group name, and Key Vault name are read from environment variables using the readEnvironmentVariable function, showing another possibility for configuration alongside an individual .bicepparam file per-environment.
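To tie the parameter file and environment variables together, here is a hedged sketch of running such a deployment with Az PowerShell; the environment variable values and resource group name are hypothetical, and a recent Az module plus the Bicep CLI are assumed for .bicepparam support:

# A sketch, assuming hypothetical values and Az PowerShell with Bicep parameter file support.
$env:AZURE_KV_SUBSCRIPTION_ID = '00000000-0000-0000-0000-000000000000'
$env:AZURE_KV_RESOURCE_GROUP  = 'rg-shared-keyvault'
$env:AZURE_KV_NAME            = 'kv-d365-secrets'

New-AzResourceGroupDeployment -ResourceGroupName 'rg-d365-integration' `
    -TemplateFile './logicapps.bicep' `
    -TemplateParameterFile './logicapps.bicepparam'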
In Summary

While this was a very simple Logic Apps use case, I hope it ties together the areas of connector automation, configuration, and security, helping you accelerate the time to a working solution. Happy integrating!

Trigger workflows in Standard logic apps with Easy Auth

#Edits June 18, 2024
Easy Auth can be configured from the UX on a Standard Logic App in the "Authentication" blade under the Settings group. Apps that already have Auth Settings V2 configured will also display the details in this blade.
#EndEdits

For single-tenant Azure Logic Apps, we're adding the capability in Standard logic apps to set up Azure Active Directory (Azure AD) authorization policies. When a Standard logic app workflow starts with the Request trigger, which handles inbound HTTP calls, the logic app expects each received inbound request to present an access token that includes the Azure AD policies. This authentication also permits requests for the location header from an asynchronous workflow. As with a Consumption logic app, you can specify the claim types and values that the logic app expects in the access token presented by each inbound request.

Sometimes called "Easy Auth", this capability is an Azure Functions provision for authenticating access with a managed identity and is available through Azure App Service's built-in authentication and authorization capabilities. Easy Auth makes authenticating workflow invocations possible through triggers. Rather than use Shared Access Signature (SAS) tokens, you can use Easy Auth as a more secure authentication method that doesn't require access token regeneration. Basically, Easy Auth provides all the advantages available when you use a managed identity for authentication. For more information, review Authentication and authorization in Azure App Service and Azure Functions.

Meanwhile, to set up authorization policies, you can call the Auth Settings V2 API by using an HTTP client such as Postman. For more information about the Swagger description, review Auth Settings V2 - WebApps REST API. This article shows how to enable and use Easy Auth this way for authenticating calls sent to the Request trigger in a Standard logic app workflow.

Enable Easy Auth on the Request trigger

This section provides more information about calling the Auth Settings V2 API. To call the API, use the following HTTP request:

PUT https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Web/sites/{logicAppName}/config/authsettingsV2?api-version=2021-02-01

When you call the Auth Settings V2 API, replace all the property placeholder values, such as {subscriptionId}, with the actual values you want to use. However, in the request body, keep the following properties unchanged:

"globalValidation": {
    "requireAuthentication": true,
    "unauthenticatedClientAction": "AllowAnonymous"
}

Note: Easy Auth is managed by App Service and, for an incoming request, it is a hop that comes before the Logic Apps (LA) runtime. When Easy Auth is enabled for a Logic App Standard, all incoming requests are validated against the policies in your V2 auth settings. If you set "unauthenticatedClientAction": "Return401", requests that fail Easy Auth are not routed to the LA runtime and fail with a 401 from App Service; you will also observe a broken portal experience with Return401. When you set it to "AllowAnonymous", all calls (failed and successful) are routed to the LA runtime, which knows whether the request failed or passed Easy Auth and processes it accordingly. For example, to get run histories, we authenticate on a SAS specific to that run, generated based on the Logic Apps access keys.
The LA runtime will know that such a request failed Easy Auth, but it will be processed successfully as it has a valid SAS. The underlying App Service platform has no knowledge of validating other auth schemes like SAS.

The following list has more information about the specific properties that you use:

- identityProviders.azureActiveDirectory.openIdIssuer: The token issuer for your Azure AD.
- identityProviders.azureActiveDirectory.clientId: The ID for your AAD App Registration. This will be augmented as an allowed audience.
- identityProviders.azureActiveDirectory.validation.allowedAudiences: An array with the allowed audience values for the token.
- identityProviders.azureActiveDirectory.validation.defaultAuthorizationPolicy.allowedPrincipals.identities: An array with the object IDs for the Azure AD identities, such as a user or group.

The following example, which is attached at the end of this article, shows a sample payload to include as the PUT request body:

{
  "id": "/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Web/sites/{logicAppName}/config/authsettingsV2",
  "name": "authsettingsV2",
  "type": "Microsoft.Web/sites/config",
  "location": "{locationOfLogicApp}",
  "tags": {},
  "properties": {
    "platform": {
      "enabled": true,
      "runtimeVersion": "~1"
    },
    "globalValidation": {
      "requireAuthentication": true,
      "unauthenticatedClientAction": "AllowAnonymous"
    },
    "identityProviders": {
      "azureActiveDirectory": {
        "enabled": true,
        "registration": {
          "openIdIssuer": "{issuerId}",
          "clientId": "{clientId}"
        },
        "login": {
          "disableWWWAuthenticate": false
        },
        "validation": {
          "jwtClaimChecks": {},
          "allowedAudiences": [
            "{audience1}",
            "{audience2}"
          ],
          "defaultAuthorizationPolicy": {
            "allowedPrincipals": {
              "identities": [
                "{ObjectId of AAD app1}",
                "{ObjectId of AAD app2}"
              ]
            }
          }
        }
      },
      "facebook": { "enabled": false, "registration": {}, "login": {} },
      "gitHub": { "enabled": false, "registration": {}, "login": {} },
      "google": { "enabled": false, "registration": {}, "login": {}, "validation": {} },
      "twitter": { "enabled": false, "registration": {} },
      "legacyMicrosoftAccount": { "enabled": false, "registration": {}, "login": {}, "validation": {} },
      "apple": { "enabled": false, "registration": {}, "login": {} }
    },
    "login": {
      "routes": {},
      "tokenStore": {
        "enabled": false,
        "tokenRefreshExtensionHours": 72.0,
        "fileSystem": {},
        "azureBlobStorage": {}
      },
      "preserveUrlFragmentsForLogins": false,
      "cookieExpiration": {
        "convention": "FixedTime",
        "timeToExpiration": "08:00:00"
      },
      "nonce": {
        "validateNonce": true,
        "nonceExpirationInterval": "00:05:00"
      }
    },
    "httpSettings": {
      "requireHttps": true,
      "routes": {
        "apiPrefix": "/.auth"
      },
      "forwardProxy": {
        "convention": "NoProxy"
      }
    }
  }
}
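If you prefer scripting over Postman, a minimal Az PowerShell sketch could send the payload above (saved here as a hypothetical authsettingsV2.json file), using the same placeholders:

# A sketch, assuming the payload above is saved as authsettingsV2.json.
$path = "/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Web/sites/{logicAppName}/config/authsettingsV2?api-version=2021-02-01"
Invoke-AzRestMethod -Method PUT -Path $path -Payload (Get-Content -Raw './authsettingsV2.json')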
Call the Request trigger with Azure AD OAuth

To call the Request trigger in your workflow using Azure AD OAuth, send a request to the callback or invoke URL by passing the Authorization header, but not the SAS tokens in the query parameters, using the following syntax:

POST https://{logicAppName}.azurewebsites.net:443/api/{workflowName}/triggers/manual/invoke?api-version=2020-05-01-preview

For example, in your HTTP client, which is Postman here, make the following call (a scripted alternative is sketched at the end of this article).

Troubleshoot errors

If the following errors happen, try the suggested resolutions:

"The request should have a valid Authorization header with "Bearer" scheme and the "WEBSITE_AUTH_ENABLED" appsetting set to true on the logicapp."

This error means that your authentication token failed authorization. You can ignore the part about the WEBSITE_AUTH_ENABLED app setting: you don't need to update this value on your logic app, and this part of the message is going to be fixed. Make sure that you've entered the necessary property values as specified in the Easy Auth setup section.

"The request has both SAS authentication scheme and Bearer authorization scheme. Only one scheme should be used."

You can't use both SAS and Bearer authorization scheme tokens at the same time. Use only one token: both tokens are valid, which causes confusion when calling the Request trigger.
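Putting the pieces together, here is a hedged sketch of the OAuth call described above, using the same placeholders; it assumes the token can be acquired for one of the allowed audiences you configured, which depends on your AAD app registration:

# A sketch; replace the placeholders with your own values.
# Acquire a token for an allowed audience, then call the trigger with a Bearer header (no SAS query parameters).
$token = (Get-AzAccessToken -ResourceUrl '{audience1}').Token
Invoke-RestMethod -Method Post `
    -Uri 'https://{logicAppName}.azurewebsites.net/api/{workflowName}/triggers/manual/invoke?api-version=2020-05-01-preview' `
    -Headers @{ Authorization = "Bearer $token" }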