Azure Functions
What's the secret sauce for getting Functions API to work with static web site?
I'm brand new, got my first Azure static web site up and running so that's good! Now I need to create some images in code and that's fighting me tooth and nail. The code to generate the image looks like this:

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;
using SkiaSharp;
using System.Diagnostics;
using System.IO;
using System.Net;

namespace Api
{
    public class GenerateImage
    {
        private readonly ILogger _logger;

        public GenerateImage(ILoggerFactory loggerFactory)
        {
            Debug.WriteLine($"GenerateImage.GenerateImage()");
            _logger = loggerFactory.CreateLogger<GenerateImage>();
        }

        // http://localhost:7071/api/image/124 works
        [Function("GenerateImage")]
        public HttpResponseData Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "image/{id}")] HttpRequestData req,
            string id)
        {
            int width = 200, height = 100;
            Debug.WriteLine($"GenerateImage.Run() [id={id}]");

            using var bitmap = new SKBitmap(width, height);
            using var canvas = new SKCanvas(bitmap);
            canvas.Clear(SKColors.LightBlue);

            var paint = new SKPaint { Color = SKColors.Black, TextSize = 24, IsAntialias = true };
            canvas.DrawText($"ID: {id}", 10, 50, paint);

            using var ms = new MemoryStream();
            bitmap.Encode(ms, SKEncodedImageFormat.Png, 100);
            ms.Position = 0;

            var response = req.CreateResponse(HttpStatusCode.OK);
            response.Headers.Add("Content-Type", "image/png");
            response.Headers.Add("Cache-Control", "public, max-age=86400"); // 1 day
            // response.Body = ms;
            ms.CopyTo(response.Body);
            return response;
        }
    }
}
```

and if I navigate to http://localhost:7071/api/image/124 (for example) it happily generates an image with the number 124 in it. But if I add the HTML tag `<img src="/api/image/123" alt="Generated Image">` to one of my other web pages, it says there's no such page. Apparently this is because my web pages are coming from my web site and it's at https://localhost:7154 and it doesn't know how to contact the Functions API. My staticwebapp.config.json looks like this:

```json
{
  "routes": [
    {
      "route": "/api/*",
      "allowedRoles": [ "anonymous" ]
    }
  ],
  "navigationFallback": {
    "rewrite": "/index.html",
    "exclude": [ "/api/*" ]
  }
}
```

What am I missing?

Announcing Early Preview: BYO Remote MCP Server on Azure Functions
If you’ve already built Model Context Protocol (MCP) servers with the MCP SDKs and wished you could turn them into world class Remote MCP servers using a hyperscale, serverless platform, then this one’s for you! We’ve published samples showing how to host bring‑your-own (BYO) Remote MCP servers on Azure Functions, so you can run the servers you’ve already built with the MCP SDKs—Python, Node, and .NET—with minimal changes and full serverless goodness.

Why this is exciting

- Keep your code. If you’ve already implemented servers with the MCP SDKs (Python, Node, .NET), deploy them to Azure Functions as remote MCP servers with just one line of code change.
- Serverless scale when you need it. Functions on the Flex Consumption plan handles bursty traffic, scales out and back to zero automatically, and gives you serverless billing.
- Secure by default. Your remote server endpoint is protected with function keys out of the box, with the option to layer on Azure API Management for an added authorization flow.

BYO vs. Functions Remote MCP extension—pick the path that fits

The BYO option complements the existing Azure Functions MCP extension:

- Build and host with the Functions MCP extension: You can build stateful MCP servers with the MCP tool trigger and binding and host them on Functions. Support for SSE is available today, with streamable HTTP coming soon.
- Host a BYO remote MCP server (this announcement): If you already have a server built with the MCP SDKs, or you prefer those SDKs’ ergonomics, host it as‑is on Functions and keep your current codebase.

Either way, you benefit from Functions’ serverless platform: secure access & auth, burst scale, event-driven scale from 0 to N, and pay-for-what-you‑use.

What’s supported in this early preview

- Servers built with the Python, Node, and .NET SDKs
- Local debugging with func start in Visual Studio or Visual Studio Code; deployment with the Azure Developer CLI (azd up) to get your remote MCP server quickly deployed to Azure Functions
- Stateless servers using the streamable HTTP transport, with guidance coming soon for stateful servers
- Hosting on the Flex Consumption plan

Try it now!

- Python: https://github.com/Azure-Samples/mcp-sdk-functions-hosting-python
- Node: https://github.com/Azure-Samples/mcp-sdk-functions-hosting-node
- .NET: https://github.com/Azure-Samples/mcp-sdk-functions-hosting-dotnet

Each repo includes a sample weather MCP server implemented with the MCP SDK for that language. You’ll find instructions on how to run the server locally with Azure Functions Core Tools and deploy with azd up in minutes. Once deployed, you can connect to the remote server from an MCP client. The samples use Visual Studio Code, but other clients like Claude can also be used.

Provide feedback to shape the feature

Tell us what you need next - identity flows, diagnostics, more languages, or any other features. Your feedback will shape how we take this early preview to the next level!

Announcing a flexible, predictable billing model for Azure SRE Agent
Billing for Azure SRE Agent will start on September 1, 2025. Announced at Microsoft Build 2025, Azure SRE Agent is a pre-built AI agent for root cause analysis, uptime improvement, and operational cost reduction. Learn more about the billing model and example scenarios.

🚀 New in Azure API Management: MCP in v2 SKUs + external MCP-compliant server support
Your APIs are becoming tools. Your users are becoming agents. Your platform needs to adapt. Azure API Management is becoming the secure, scalable control plane for connecting agents, tools, and APIs — with governance built in. -------------------------------------------------------------------------------------------------------------------------------------------------------------------- Today, we’re announcing two major updates to bring the power of the Model Context Protocol (MCP) in Azure API Management to more environments and scenarios: MCP support in v2 SKUs — now in public preview Expose existing MCP-compliant servers through API Management These features make it easier than ever to connect APIs and agents with enterprise-grade control—without rewriting your backends. Why MCP? MCP is an open protocol that enables AI agents—like GitHub Copilot, ChatGPT, and Azure OpenAI—to discover and invoke APIs as tools. It turns traditional REST APIs into structured, secure tools that agents can call during execution — powering real-time, context-aware workflows. Why API Management for MCP? Azure API Management is the single, secure control plane for exposing and governing MCP capabilities — whether from your REST APIs, Azure-hosted services, or external MCP-compliant runtimes. With built-in support for: Security using OAuth 2.1, Microsoft Entra ID, API keys, IP filtering, and rate limiting. Outbound token injection via Credential Manager with policy-based routing. Monitoring and diagnostics using Azure Monitor, Logs, and Application Insights. Discovery and reuse with Azure API Center integration. Comprehensive policy engine for request/response transformation, caching, validation, header manipulation, throttling, and more. …you get end-to-end governance for both inbound and outbound agent interactions — with no new infrastructure or code rewrites. ✅ What’s New? 1. MCP support in v2 SKUs Previously available only in classic tiers (Basic, Standard, Premium), MCP support is now in public preview for v2 SKUs — Basic v2, Standard v2, and Premium v2 — with no pre-requisites or manual enablement required. You can now: Expose any REST API as an MCP server in v2 SKUs Protect it with Microsoft Entra ID, keys or tokens Register tools in Azure API Center 2. Expose existing MCP-compliant servers (pass-through scenario) Already using tools hosted in Logic Apps, Azure Functions, LangChain or custom runtimes? Now you can govern those external tool servers by exposing them through API Management. Use API Management to: Secure external MCP servers with OAuth, rate limits, and Credential Manager Monitor and log usage with Azure Monitor and Application Insights Unify discovery with internal tools via Azure API Center 🔗 You bring the tools. API Management brings the governance. 🧭 What’s Next We’re actively expanding MCP capabilities in API Management: Tool-level access policies for granular governance Support for MCP resources and prompts to expand beyond tools 📚 Get Started 📘 Expose APIs as MCP servers 🌐 Connect external MCP servers 🔐 Secure access to MCP servers 🔎 Discover tools in API Center Summary Azure API Management is your single control plane for agents, tools and APIs — whether you're building internal copilots or connecting external toolchains. This preview unlocks more flexibility, less friction, and a secure foundation for the next wave of agent-powered applications. No new infrastructure. Secure by default. 
Built for the future.

Announcing Native Azure Functions Support in Azure Container Apps
Azure Container Apps is introducing a new, streamlined method for running Azure Functions directly in Azure Container Apps (ACA). This integration allows you to leverage the full features and capabilities of Azure Container Apps while benefiting from the simplicity of auto-scaling provided by Azure Functions. With the new native hosting model, you can deploy Azure Functions directly onto Azure Container Apps using the Microsoft.App resource provider by setting the “kind=functionapp” property on the container app resource. You can deploy Azure Functions using ARM templates, Bicep, the Azure CLI, and the Azure portal. Get started today and explore the complete feature set of Azure Container Apps, including multi-revision management, easy authentication, metrics and alerting, health probes, and much more. To learn more, visit: https://aka.ms/fnonacav2

Important Changes to App Service Managed Certificates: Is Your Certificate Affected?
Overview As part of an upcoming industry-wide change, DigiCert, the Certificate Authority (CA) for Azure App Service Managed Certificates (ASMC), is required to migrate to a new validation platform to meet multi-perspective issuance corroboration (MPIC) requirements. While most certificates will not be impacted by this change, certain site configurations and setups may prevent certificate issuance or renewal starting July 28, 2025. Update (August 5, 2025) We’ve published a Microsoft Learn documentation titled App Service Managed Certificate (ASMC) changes – July 28, 2025 that contains more in-depth mitigation guidance and a growing FAQ section to support the changes outlined in this blog post. While the blog currently contains the most complete overview, the documentation will soon be updated to reflect all blog content. Going forward, any new information or clarifications will be added to the documentation page, so we recommend bookmarking it for the latest guidance. What Will the Change Look Like? For most customers: No disruption. Certificate issuance and renewals will continue as expected for eligible site configurations. For impacted scenarios: Certificate requests will fail (no certificate issued) starting July 28, 2025, if your site configuration is not supported. Existing certificates will remain valid until their expiration (up to six months after last renewal). Impacted Scenarios You will be affected by this change if any of the following apply to your site configurations: Your site is not publicly accessible: Public accessibility to your app is required. If your app is only accessible privately (e.g., requiring a client certificate for access, disabling public network access, using private endpoints or IP restrictions), you will not be able to create or renew a managed certificate. Other site configurations or setup methods not explicitly listed here that restrict public access, such as firewalls, authentication gateways, or any custom access policies, can also impact eligibility for managed certificate issuance or renewal. Action: Ensure your app is accessible from the public internet. However, if you need to limit access to your app, then you must acquire your own SSL certificate and add it to your site. Your site uses Azure Traffic Manager "nested" or "external" endpoints: Only “Azure Endpoints” on Traffic Manager will be supported for certificate creation and renewal. “Nested endpoints” and “External endpoints” will not be supported. Action: Transition to using "Azure Endpoints". However, if you cannot, then you must obtain a different SSL certificate for your domain and add it to your site. Your site relies on *.trafficmanager.net domain: Certificates for *.trafficmanager.net domains will not be supported for creation or renewal. Action: Add a custom domain to your app and point the custom domain to your *.trafficmanager.net domain. After that, secure the custom domain with a new SSL certificate. If none of the above applies, no further action is required. How to Identify Impacted Resources? To assist with the upcoming changes, you can use Azure Resource Graph (ARG) queries to help identify resources that may be affected under each scenario. Please note that these queries are provided as a starting point and may not capture every configuration. Review your environment for any unique setups or custom configurations. 
Scenario 1: Sites Not Publicly Accessible This ARG query retrieves a list of sites that either have the public network access property disabled or are configured to use client certificates. It then filters for sites that are using App Service Managed Certificates (ASMC) for their custom hostname SSL bindings. These certificates are the ones that could be affected by the upcoming changes. However, please note that this query does not provide complete coverage, as there may be additional configurations impacting public access to your app that are not included here. Ultimately, this query serves as a helpful guide for users, but a thorough review of your environment is recommended. You can copy this query, paste it into Azure Resource Graph Explorer, and then click "Run query" to view the results for your environment. // ARG Query: Identify App Service sites that commonly restrict public access and use ASMC for custom hostname SSL bindings resources | where type == "microsoft.web/sites" // Extract relevant properties for public access and client certificate settings | extend publicNetworkAccess = tolower(tostring(properties.publicNetworkAccess)), clientCertEnabled = tolower(tostring(properties.clientCertEnabled)) // Filter for sites that either have public network access disabled // or have client certificates enabled (both can restrict public access) | where publicNetworkAccess == "disabled" or clientCertEnabled != "false" // Expand the list of SSL bindings for each site | mv-expand hostNameSslState = properties.hostNameSslStates | extend hostName = tostring(hostNameSslState.name), thumbprint = tostring(hostNameSslState.thumbprint) // Only consider custom domains (exclude default *.azurewebsites.net) and sites with an SSL certificate bound | where tolower(hostName) !endswith "azurewebsites.net" and isnotempty(thumbprint) // Select key site properties for output | project siteName = name, siteId = id, siteResourceGroup = resourceGroup, thumbprint, publicNetworkAccess, clientCertEnabled // Join with certificates to find only those using App Service Managed Certificates (ASMC) // ASMCs are identified by the presence of the "canonicalName" property | join kind=inner ( resources | where type == "microsoft.web/certificates" | extend certThumbprint = tostring(properties.thumbprint), canonicalName = tostring(properties.canonicalName) // Only ASMC uses the "canonicalName" property | where isnotempty(canonicalName) | project certName = name, certId = id, certResourceGroup = tostring(properties.resourceGroup), certExpiration = properties.expirationDate, certThumbprint, canonicalName ) on $left.thumbprint == $right.certThumbprint // Final output: sites with restricted public access and using ASMC for custom hostname SSL bindings | project siteName, siteId, siteResourceGroup, publicNetworkAccess, clientCertEnabled, thumbprint, certName, certId, certResourceGroup, certExpiration, canonicalName Scenario 2: Traffic Manager Endpoint Types For this scenario, please manually review your Traffic Manager profile configurations to ensure only “Azure Endpoints” are in use. We recommend inspecting your Traffic Manager profiles directly in the Azure portal or using relevant APIs to confirm your setup and ensure compliance with the new requirements. Scenario 3: Certificates Issued to *.trafficmanager.net Domains This ARG query helps you identify App Service Managed Certificates (ASMC) that were issued to *.trafficmanager.net domains. 
In addition, it also checks whether any web apps are currently using those certificates for custom domain SSL bindings. You can copy this query, paste it into Azure Resource Graph Explorer, and then click "Run query" to view the results for your environment. // ARG Query: Identify App Service Managed Certificates (ASMC) issued to *.trafficmanager.net domains // Also checks if any web apps are currently using those certificates for custom domain SSL bindings resources | where type == "microsoft.web/certificates" // Extract the certificate thumbprint and canonicalName (ASMCs have a canonicalName property) | extend certThumbprint = tostring(properties.thumbprint), canonicalName = tostring(properties.canonicalName) // Only ASMC uses the "canonicalName" property // Filter for certificates issued to *.trafficmanager.net domains | where canonicalName endswith "trafficmanager.net" // Select key certificate properties for output | project certName = name, certId = id, certResourceGroup = tostring(properties.resourceGroup), certExpiration = properties.expirationDate, certThumbprint, canonicalName // Join with web apps to see if any are using these certificates for SSL bindings | join kind=leftouter ( resources | where type == "microsoft.web/sites" // Expand the list of SSL bindings for each site | mv-expand hostNameSslState = properties.hostNameSslStates | extend hostName = tostring(hostNameSslState.name), thumbprint = tostring(hostNameSslState.thumbprint) // Only consider bindings for *.trafficmanager.net custom domains with a certificate bound | where tolower(hostName) endswith "trafficmanager.net" and isnotempty(thumbprint) // Select key site properties for output | project siteName = name, siteId = id, siteResourceGroup = resourceGroup, thumbprint ) on $left.certThumbprint == $right.thumbprint // Final output: ASMCs for *.trafficmanager.net domains and any web apps using them | project certName, certId, certResourceGroup, certExpiration, canonicalName, siteName, siteId, siteResourceGroup Ongoing Updates We will continue to update this post with any new queries or important changes as they become available. Be sure to check back for the latest information. Note on Comments We hope this information helps you navigate the upcoming changes. To keep this post clear and focused, comments are closed. If you have questions, need help, or want to share tips or alternative detection methods, please visit our official support channels or the Microsoft Q&A, where our team and the community can assist you.22KViews1like1CommentBuilding a TOTP Authenticator App on Azure Functions and Azure Key Vault
Two-factor authentication (2FA) has become a cornerstone of modern digital security, serving as a crucial defense against unauthorized access and account compromises. While many organizations rely on popular authenticator apps like Microsoft Authenticator, there's significant value in understanding how to build and customize your own TOTP (Time-based One-Time Password) solution. This becomes particularly relevant for those requiring specific customizations, enhanced security controls, or seamless integration with existing systems. In this blog, I'll walk through building a TOTP authenticator application using Azure's modern cloud services. Our solution demonstrates using Azure Functions for server-side operations with Azure Key Vault for secrets management. A bonus section covers integrating with Azure Static Web Apps for the frontend. The solution supports the standard TOTP protocol (RFC 6238), ensuring compatibility with services like GitHub and Microsoft's own authentication systems. While this implementation serves as a proof of concept rather than a production-ready system, it provides a solid foundation for understanding how authenticator apps work under the hood. By walking through the core components - from secret management to token generation - it will share valuable insights into both authentication systems and cloud architecture. This knowledge proves especially valuable for teams considering building custom authentication solutions or those looking to better understand the security principles behind 2FA. Understanding TOTP Time-based One-Time Password (TOTP) is an algorithm that generates temporary passwords based on a shared secret key and the current time. The foundation of TOTP lies in its use of a shared secret key. When a user first enables 2FA with a service like GitHub, a unique secret key is generated. This key is then encoded into a QR code that the user scans with their authenticator app. This initial exchange of the secret is the only time it's transmitted between the service and the authenticator. For example, a service will provide a QR code that looks like this: On decoding that, we see that the text encoded within this QR code is: otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS When we break down this URI: otpauth:// specifies this is an OTP authentication totp/ indicates this is time-based (as opposed to counter-based HOTP) Test%20Token is the account name (URL encoded) secret=2FASTEST is the shared secret key issuer=2FAS identifies the service providing the 2FA Once the user scans this, the secret is shared with the app and both the service and authenticator app use it in combination with the current time to generate codes. The process divides time into 30-second intervals. For each interval, the current Unix timestamp is combined with the secret key using a cryptographic hash function (HMAC-SHA1), which produces a consistent 6-digit code that both sides can generate independently. Security in TOTP comes from several key design principles. The short 30-second validity window means that even if an attacker intercepts a code, they have very limited time to use it. The one-way nature of the hash function means that even with a valid code, an attacker cannot work backwards to discover the secret key. Additionally, since the system relies on UTC time, it works seamlessly across different time zones. Most services implement a small amount of time drift tolerance. 
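To make that generation step concrete, here is a minimal sketch of RFC 6238 code generation in Node.js using only the built-in crypto module. It is an illustration rather than production code: the base32 decoder is simplified and does no validation, the secret is the example value from the URI shown earlier, and it is not the otp package used later in this post.

```javascript
const crypto = require("crypto");

// Decode an RFC 4648 base32 secret (as shared in the otpauth:// URI) into bytes.
function base32Decode(input) {
  const alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ234567";
  let bits = "";
  for (const char of input.replace(/=+$/, "").toUpperCase()) {
    bits += alphabet.indexOf(char).toString(2).padStart(5, "0");
  }
  const bytes = [];
  for (let i = 0; i + 8 <= bits.length; i += 8) {
    bytes.push(parseInt(bits.slice(i, i + 8), 2));
  }
  return Buffer.from(bytes);
}

// Generate a 6-digit TOTP code for the current 30-second window (RFC 6238).
function generateTotp(base32Secret, timeStepSeconds = 30, digits = 6) {
  // Number of completed time steps since the Unix epoch.
  const counter = Math.floor(Date.now() / 1000 / timeStepSeconds);

  // The counter is encoded as an 8-byte big-endian value.
  const counterBuffer = Buffer.alloc(8);
  counterBuffer.writeBigUInt64BE(BigInt(counter));

  // HMAC-SHA1 over the counter, keyed with the shared secret.
  const hmac = crypto
    .createHmac("sha1", base32Decode(base32Secret))
    .update(counterBuffer)
    .digest();

  // Dynamic truncation: take 4 bytes at an offset derived from the last nibble.
  const offset = hmac[hmac.length - 1] & 0x0f;
  const binary =
    ((hmac[offset] & 0x7f) << 24) |
    (hmac[offset + 1] << 16) |
    (hmac[offset + 2] << 8) |
    hmac[offset + 3];

  // Reduce to the requested number of digits and left-pad with zeros.
  return (binary % 10 ** digits).toString().padStart(digits, "0");
}

// Example using the secret from the sample URI above ("2FASTEST").
console.log(generateTotp("2FASTEST"));
```

The service and the authenticator each run this same computation independently, which is why the codes match without any network communication, and running it over adjacent time windows is what makes the drift tolerance described next possible.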
Since device clocks may not be perfectly synchronized, services typically accept codes from adjacent 30 second time windows. This provides a balance between security and usability, ensuring that slight time differences don't prevent legitimate authentication attempts while maintaining the security benefits of time-based codes. TOTP has become the de facto standard for two-factor authentication across the internet. Its implementation in RFC 6238 ensures compatibility between different services and authenticator apps. This means that whether you're using Google Authenticator, Microsoft Authenticator, or building your own solution like we are, the underlying mechanics remain the same, providing a consistent and secure experience for users. Architecture Our TOTP authenticator is built with security and scalability in mind, leveraging Azure's managed services to handle sensitive authentication data. The system consists of two main components: the web frontend, the backend API, and the secret storage. Backend API: Implemented as Azure Functions, our backend provides endpoints for managing TOTP secrets and generating tokens. We use Azure Functions because they provide excellent security features through managed identities, automatic HTTPS enforcement, and built-in scaling capabilities. The API will contain endpoints for adding new 2FA accounts and retrieving tokens. Secret storage: Azure Key Vault serves as our secure storage for TOTP secrets. This choice provides several crucial benefits: hardware-level encryption for secrets, detailed access auditing, and automatic key rotation capabilities. Azure Key Vault's managed identity integration with Azure Functions ensures secure, certificate-free access to secrets, while its global redundancy guarantees high availability. Prerequisites To follow along this blog, you'll need the following: Azure subscription: You will need an active subscription to host the services we will use. Make sure you have appropriate permissions to create and manage resources. If you don't have one, you can sign up here: https://azure.microsoft.com/en-us/pricing/purchase-options/azure-account Visual Studio Code: For the development environment, install Visual Studio Code. Other IDEs are available, though we will be benefiting from the extensions within this IDE. Download VS Code here: https://code.visualstudio.com/ VS Code Azure extensions (optional): There are many different ways to deploy to Azure Static Web Apps and Azure Functions, but having one-click deploy functionality inside our IDE is extremely useful. To install on VS Code, head to Extensions > Search Azure Static Web Apps > Click Install and do the same for the Azure Functions extension. Building the app Deploying the resources We will need to create at least an Azure Key Vault resource, and if you want to test the Function in the cloud (not just locally) then an Azure Function App too. I've attached the Azure CLI commands to deploy these resources, though it can be done through the portal if that's more comfortable. 
Firstly, create an Azure Key Vault resource: az keyvault create \ --name <your-kv-name> \ --resource-group <your-rg> \ --location <region> Enable RBAC for your Azure Key Vault: az keyvault update \ --name <your-kv-name> \ --enable-rbac-authorization true Create new Azure Function App: az functionapp create \ --name <app-name> \ --storage-account <storage-name> \ --consumption-plan-location <region> \ --runtime node \ --runtime-version 18 \ --functions-version 4 Set Azure Key Vault name environment variable in Azure Function App: az functionapp config appsettings set \ --name <app-name> \ --resource-group <your-rg> \ --settings "KEY_VAULT_NAME=<your-kv-name>" Grant your Azure Function App's managed identity access to Azure Key Vault: az role assignment create \ --assignee-object-id <function-app-managed-identity> \ --role "Key Vault Secrets Officer" \ --scope /subscriptions/<subscription-id>/resourceGroups/<rg-name>/providers/Microsoft.KeyVault/vaults/<your-kv-name> Building the API The backend of our authenticator app serves as the secure foundation for managing 2FA secrets and generating TOTP tokens. While it might be tempting to handle TOTP generation entirely in the frontend (as some authenticator apps do), our cloud-based approach offers several advantages. By keeping secrets server-side, we can provide secure backup and recovery options, implement additional security controls, and protect against client-side vulnerabilities. The backend API will have two key responsibilities which the frontend will trigger: Securely store new account secrets Generating valid TOTP tokens on demand First, we need to create an Azure Functions project in VS Code. The creation wizard will ask you to create a trigger, so let's start with (1) and create a trigger for processing new accounts: Go to Azure tab > Click the Azure Functions icon > Click Create New Project > Choose a folder > Choose JavaScript > Choose Model V4 > Choose HTTP trigger > Provide a name ('accounts') > Click Open in new window. Let's make a few modifications to this base Function: Ensure that the only allowed HTTP method is POST, as there is no need to support both and we will make use of the request body allowed in POST requests. Clear everything inside that function to make way for our upcoming code. Now, let's work forward from this adjusted base: const { app } = require("@azure/functions"); app.http("accounts", { methods: ["POST"], authLevel: "anonymous", handler: async (request, context) => { } }); This accounts endpoint will be responsible for securely storing new TOTP secrets when users add accounts to their authenticator. Here's what we need this endpoint to do: Receive the new account details: the TOTP secret, account name and issuer (extracted from the QR code on the frontend) Validate the request, ensuring proper formatting of all fields and that the user is authenticated Store the secret in Azure Key Vault with appropriate metadata Return success/failure status to allow the frontend to update accordingly. First, let's validate the incoming request data. When setting up two-factor authentication, services provide a QR code containing a URI in the otpauth:// format. This standardized format includes all the information we need to set up TOTP authentication. Assuming the frontend has decoded the QR code and sent us the resulting data, let's add some code to parse and validate this URI format. We'll use JavaScript's built-in URL class to handle the parsing, which will also take care of URL encoding/decoding for us. 
Add the following code to the function: // First, ensure we have a JSON payload let requestBody; try { requestBody = await request.json(); } catch (error) { context.log('Error parsing request body:', error); return { status: 400, jsonBody: { error: 'Invalid request format', details: 'Request body must be valid JSON containing a TOTP URI' } }; } // Check for the URI in the request const { uri } = requestBody; if (!uri || typeof uri !== 'string') { return { status: 400, jsonBody: { error: 'Missing or invalid TOTP URI', details: 'Request must include a "uri" field containing the TOTP setup URI' } }; } This first section of code handles the basic validation of our incoming request data. We start by attempting to parse the request body as JSON using request.json(), wrapping it in a try-catch block to handle any parsing failures gracefully. If the parsing fails, we return a 400 Bad Request status with a clear error message. After successfully parsing the JSON, we check for the presence of a uri field in the request body and ensure it's a string value. This validation ensures we have the minimum required data before we attempt to parse the actual TOTP URI in the next step. Let's now move on to parsing and validating the TOTP URI itself. This URI should contain all the important information: the type of OTP (TOTP in our case), the account name, the secret key, and optionally the issuer. Here's an example of a valid URI which would be provided by services: otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS To parse this, add the following code after our initial validation: // Parse and validate the TOTP URI try { const totpUrl = new URL(uri); // Validate it's a TOTP URI if (totpUrl.protocol !== "otpauth:") { throw new Error("URI must use otpauth:// protocol"); } if (totpUrl.host !== "totp") { throw new Error("URI must be for TOTP authentication"); } // Extract the components const accountName = decodeURIComponent(totpUrl.pathname.split("/")[1]); const secret = totpUrl.searchParams.get("secret"); const issuer = totpUrl.searchParams.get("issuer"); // Validate required components if (!secret) { throw new Error("Missing secret in URI"); } // Store the parsed data for the next step const validatedData = { accountName, secret, issuer: issuer || accountName, // Fall back to account name if issuer not specified }; ... } catch (error) { context.log("Error validating TOTP URI:", error); return { status: 400, jsonBody: { error: "Invalid TOTP URI", details: error.message, }, }; } We use JavaScript's built-in URL class to do the heavy lifting of parsing the URI components. We first verify this is actually a TOTP URI by checking the protocol and path. Then we extract the three key pieces of information: the account name (from the path), the secret key, and the issuer (both from the query parameters). We validate that the essential secret is present and store all this information in a validatedData object. Now that we have our TOTP data properly validated and parsed, let's move on to setting up our Azure Key Vault integration. Firstly, we must install the required Azure SDK packages: npm install azure/identity azure/keyvault-secrets Now we can add the Azure Key Vault integration to our function. 
Add these imports at the top of your file: const { DefaultAzureCredential } = require('@azure/identity'); const { SecretClient } = require('@azure/keyvault-secrets'); const { randomUUID } = require('crypto'); // Initialize Key Vault client const credential = new DefaultAzureCredential(); const vaultName = process.env.KEY_VAULT_NAME; const vaultUrl = `https://${vaultName}.vault.azure.net`; const secretClient = new SecretClient(vaultUrl, credential); This code sets up our connection to Azure Key Vault using Azure's managed identity authentication. The DefaultAzureCredential will automatically handle authentication when deployed to Azure, and our vault name comes from an environment variable to keep it configurable. Be sure to go and set the KEY_VAULT_NAME variable inside of your local.settings.json file. Now let's add the code to store our TOTP secret in Azure Key Vault. Add this after our URI validation: // Create a unique name for this secret const secretName = `totp-${Date.now()}`; // Store the secret in Key Vault with metadata try { await secretClient.setSecret(secretName, validatedData.secret, { contentType: 'application/json', tags: { accountName: validatedData.accountName, issuer: validatedData.issuer, type: 'totp-secret' } }); context.log(`Stored new TOTP secret for account ${validatedData.accountName}`); return { status: 201, jsonBody: { message: 'TOTP secret stored successfully', secretName: secretName, accountName: validatedData.accountName, issuer: validatedData.issuer } }; } catch (error) { context.error('Error storing secret in Key Vault:', error); return { status: 500, jsonBody: { error: 'Failed to store TOTP secret' } }; } When storing the secret, we use setSecret with three important parameters: A unique name generated using a UUID (totp-${randomUUID()}). This ensures each secret has a globally unique identifier with no possibility of collisions, even across distributed systems. The resulting name looks like totp-123e4567-e89b-12d3-a456-426614174000. The actual TOTP secret we extracted from the URI. Metadata about the secret, including: contentType marking this as JSON data tags containing the account name and issuer, which helps us identify the purpose of each secret without needing to retrieve its actual value A type tag marking this specifically as a TOTP secret. If the storage succeeds, we return a 201 Created status with details about the stored secret (but never the secret itself). The returned secretName is particularly important as it will be used later when we need to retrieve this secret to generate TOTP codes. Now that we can securely store TOTP secrets, let's create our second endpoint that generates the 6-digit codes. This endpoint will: Retrieve a secret from Azure Key Vault using its unique ID Generate a valid TOTP code based on the current time Return the code along with its remaining validity time Follow the same setup steps as earlier, and ensure you have an empty function. I've named it tokens and set it as a GET request: app.http('tokens', { methods: ['GET'], authLevel: 'anonymous', handler: async (request, context) => { } }); Let's add the code to validate the query parameter and retrieve the secret from Azure Key Vault. 
A valid request will look like this: /api/tokens?id=totp-123e4567-e89b-12d3-a456-426614174000 We want to ensure the ID parameter exists and has the correct format: // Get the secret ID from query parameters const secretId = request.query.get('id'); // Validate the secret ID format if (!secretId || !secretId.startsWith('totp-')) { return { status: 400, jsonBody: { error: 'Invalid or missing secret ID. Must be in format: totp-{uuid}' } }; } This code first checks if we have a properly formatted secret ID in our query parameters. The ID should start with totp- and be followed by a UUID, matching the format we used when storing secrets in our first endpoint. If the ID is missing or invalid, we return a 400 Bad Request with a helpful error message. Now if the ID is valid, we should attempt to retrieve the secret from Azure Key Vault: try { // Retrieve the secret from Key Vault const secret = await secretClient.getSecret(secretId); ... } catch (error) { context.error('Error retrieving secret:', error); return { status: 500, jsonBody: { error: 'Failed to retrieve secret' } }; } If anything goes wrong during this process (like the secret doesn't exist or we have connection issues), we log the error and return a 500 Internal Server Error. Now that we have the secret from Azure Key Vault, let's add the code to generate the 6-digit TOTP code. First, install otp package: npm install otp Then add this import at the top of your file: const OTP = require('otp'); Now let's generate a 6-digit TOTP using this library from the data retrieved from Azure Key Vault: const totp = new OTP({ secret: secret.value }); // Generate the current token const token = totp.totp(); // Calculate remaining seconds in this 30-second window const timeRemaining = 30 - (Math.floor(Date.now() / 1000) % 30); return { status: 200, jsonBody: { token, timeRemaining } }; Let's break down exactly how this code generates our 6-digit TOTP code. When we generate a TOTP code, we're using our stored secret key to create a unique 6-digit number that changes every 30 seconds. The OTP library handles this through several steps behind the scenes. First, when we create a new OTP instance with new OTP({ secret: secret.value }), we're setting up a TOTP generator with our base32-encoded secret (like 'JBSWY3DPEHPK3PXP') that we retrieved from Azure Key Vault. When we call totp(), the library takes our secret and combines it with the current time to generate a code. It takes the current Unix timestamp, divides it by 30 to get the current time window, then uses this value and our secret in an HMAC-SHA1 operation. The resulting hash is then dynamically truncated to give us exactly 6 digits. This is why anyone with the same secret will generate the same code within the same 30-second window. To help users know when the current code will expire, we calculate timeRemaining by finding out how far we are into the current 30-second window and subtracting that from 30. This gives users a countdown until the next code will be generated. With both our endpoints complete, we now have a functional backend for our TOTP authenticator. The first endpoint securely stores TOTP secrets in Azure Key Vault, generating a unique ID for each one. The second endpoint uses these IDs to retrieve secrets and generate valid 6-digit TOTP codes on demand. 
This server-side approach offers several advantages over traditional authenticator apps: our secrets are securely stored in Azure Key Vault rather than on user devices, we can easily back up and restore access if needed, and we can add additional security controls around code generation. Testing First, we'll need to run the functions locally using the Azure Functions Core Tools. Open your terminal in the project directory and run: func start I'm using a website designed to check if your 2FA app is working correctly. It creates a valid QR code, and also calculates the TOTP on their end so you can compare results. I highly recommend using this alongside me to test our solution: https://2fas.com/check-token/ It will present you with a QR code. You can scan it in your frontend, though you can copy/paste the below which is the exact same value: otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS Now let's test our endpoints sequentially using curl (or Postman if you prefer). My functions started on port 7071, be sure to check yours before you send the request. Let's start with adding the above secret to Azure Key Vault: curl -X POST http://localhost:7071/api/accounts \ -H "Content-Type: application/json" \ -d '{ "uri": "otpauth://totp/Test%20Token?secret=2FASTEST&issuer=2FAS" }' This should return a response containing the generated secret ID (your UUID will be different): { "message": "TOTP secret stored successfully", "secretName": "totp-f724efb9-a0a7-441f-86c3-2cd36647bfcf", "accountName": "Test Token", "issuer": "2FAS" } Sidenote: If you head to Azure Key Vault in the Azure portal, you can see the saved secret: Now we can use this secretName to generate TOTP codes: curl http://localhost:7071/api/tokens?id=totp-550e8400-e29b-41d4-a716-446655440000 The response will include a 6-digit code and the remaining time until it expires: { "token": "530868", "timeRemaining": 26 } To prove that this is accurate, quickly look again at the website, and you should see the exact same code and a very similar time remaining: This confirms that your code is valid! You can keep generating new codes and checking them - remember that the code changes every 30 seconds, so be quick when testing and validating. Bonus: Frontend UI While not the focus of this blog, as bonus content I've put together a React component which provides a functional interface for our TOTP authenticator. This component allows users to upload QR codes provided by other services, processes them to extract the TOTP URI, sends it to our backend for storage, and then displays the generated 6-digit code with a countdown timer. Here's how it looks: As you can see, I've followed a similar style to other known and modern authenticator apps. I recommend writing your own code for the user interface, as it's highly subjective. 
However, the following is the full React component in case you can benefit from it: import React, { useState, useEffect, useCallback } from "react"; import { Shield, UserCircle, Plus, Image as ImageIcon } from "lucide-react"; import jsQR from "jsqr"; const TOTPAuthenticator = () => { const [secretId, setSecretId] = useState(null); const [token, setToken] = useState(null); const [timeRemaining, setTimeRemaining] = useState(null); const [localTimer, setLocalTimer] = useState(null); const [error, setError] = useState(null); const [isPasting, setIsPasting] = useState(false); useEffect(() => { let timerInterval; if (timeRemaining !== null) { setLocalTimer(timeRemaining); timerInterval = setInterval(() => { setLocalTimer((prev) => { if (prev <= 0) return timeRemaining; return prev - 1; }); }, 1000); } return () => clearInterval(timerInterval); }, [timeRemaining]); const processImage = async (imageData) => { try { const img = new Image(); img.src = imageData; await new Promise((resolve, reject) => { img.onload = resolve; img.onerror = reject; }); const canvas = document.createElement("canvas"); const context = canvas.getContext("2d"); canvas.width = img.width; canvas.height = img.height; context.drawImage(img, 0, 0); const imgData = context.getImageData(0, 0, canvas.width, canvas.height); const code = jsQR(imgData.data, canvas.width, canvas.height); if (!code) { throw new Error("No QR code found in image"); } const response = await fetch( "http://localhost:7071/api/accounts", { method: "POST", headers: { "Content-Type": "application/json" }, body: JSON.stringify({ uri: code.data }), } ); const data = await response.json(); if (!response.ok) throw new Error(data.error); setSecretId(data.secretName); setToken({ issuer: data.issuer, accountName: data.accountName, code: "--", }); setError(null); } catch (err) { setError(err.message); } finally { setIsPasting(false); } }; const handlePaste = useCallback(async (e) => { e.preventDefault(); setIsPasting(true); setError(null); try { const items = e.clipboardData.items; const imageItem = Array.from(items).find((item) => item.type.startsWith("image/") ); if (!imageItem) { throw new Error("No image found in clipboard"); } const blob = imageItem.getAsFile(); const reader = new FileReader(); reader.onload = async (event) => { await processImage(event.target.result); }; reader.onerror = () => { setError("Failed to read image"); setIsPasting(false); }; reader.readAsDataURL(blob); } catch (err) { setError(err.message); setIsPasting(false); } }, []); const handleDrop = useCallback(async (e) => { e.preventDefault(); setIsPasting(true); setError(null); try { const file = e.dataTransfer.files[0]; if (!file || !file.type.startsWith("image/")) { throw new Error("Please drop an image file"); } const reader = new FileReader(); reader.onload = async (event) => { await processImage(event.target.result); }; reader.onerror = () => { setError("Failed to read image"); setIsPasting(false); }; reader.readAsDataURL(file); } catch (err) { setError(err.message); setIsPasting(false); } }, []); const handleDragOver = (e) => { e.preventDefault(); }; useEffect(() => { let interval; const fetchToken = async () => { try { const response = await fetch( `http://localhost:7071/api/tokens?id=${secretId}` ); const data = await response.json(); if (!response.ok) throw new Error(data.error); setToken((prevToken) => ({ ...prevToken, code: data.token, })); setTimeRemaining(data.timeRemaining); const nextFetchDelay = data.timeRemaining * 1000 || 30000; interval = setTimeout(fetchToken, 
nextFetchDelay); } catch (err) { setError(err.message); interval = setTimeout(fetchToken, 30000); } }; if (secretId) { fetchToken(); } return () => clearTimeout(interval); }, [secretId]); if (!secretId) { return ( <div className="w-[416px] max-w-full mx-auto bg-white rounded-xl shadow-md overflow-hidden"> <div className="bg-[#0078D4] p-4 text-white flex items-center gap-2"> <Shield className="mt-px" size={24} /> <h2 className="text-xl font-semibold m-0">My Authenticator</h2> </div> <div className="p-6"> <div className={`w-full p-10 border-2 border-dashed border-gray-300 rounded-lg text-center cursor-pointer transition-all duration-200 ${ isPasting ? "bg-gray-100" : "bg-white" }`} onPaste={handlePaste} onDrop={handleDrop} onDragOver={handleDragOver} tabIndex={0} > <ImageIcon size={32} className="text-gray-600 mx-auto" /> <p className="text-gray-600 mt-3 text-sm"> {isPasting ? "Processing..." : "Paste or drop QR code here"} </p> </div> {error && <div className="text-red-600 text-sm mt-2">{error}</div>} </div> </div> ); } return ( <div className="w-[416px] max-w-full mx-auto bg-white rounded-xl shadow-md overflow-hidden"> <div className="bg-[#0078D4] p-4 text-white flex items-center gap-2"> <Shield className="mt-px" size={24} /> <h2 className="text-xl font-semibold m-0">My Authenticator</h2> </div> <div className="flex items-center p-4 border-b"> <div className="bg-gray-100 rounded-full w-10 h-10 flex items-center justify-center mr-4"> <UserCircle size={24} className="text-gray-600" /> </div> <div className="flex-1"> <h3 className="text-base font-medium text-gray-800 m-0"> {token?.issuer || "--"} </h3> <p className="text-sm text-gray-600 mt-1 m-0"> {token?.accountName || "--"} </p> </div> <div className="text-right"> <p className="text-2xl font-medium text-gray-800 m-0 mb-0.5"> {token?.code || "--"} </p> <p className="text-xs text-gray-600 m-0"> {localTimer || "--"} seconds </p> </div> </div> <div className="p-6"> <div className={`w-full p-10 border-2 border-dashed border-gray-300 rounded-lg text-center cursor-pointer transition-all duration-200 ${ isPasting ? "bg-gray-100" : "bg-white" }`} onPaste={handlePaste} onDrop={handleDrop} onDragOver={handleDragOver} tabIndex={0} > <ImageIcon size={32} className="text-gray-600 mx-auto" /> <p className="text-gray-600 mt-3 text-sm"> {isPasting ? "Processing..." : "Paste or drop QR code here"} </p> </div> </div> {error && <div className="text-red-600 text-sm mt-2">{error}</div>} </div> ); }; export default TOTPAuthenticator; For deployment, I recommend Azure Static Web Apps because it offers built-in authentication, global CDN distribution, and seamless integration with our Azure Functions backend. Summary In this blog, we've built a TOTP authenticator that demonstrates both the inner workings of two-factor authentication and modern cloud architecture. We've demystified how TOTP actually works - from the initial QR code scanning and secret sharing, to the time-based algorithm that generates synchronized 6-digit codes. By implementing this ourselves using Azure services like Azure Key Vault and Azure Functions, we've gained deep insights into both the security protocol and cloud-native development. While this implementation focuses on the core TOTP functionality, it serves as a foundation that you can build upon with features like authenticated multi-user support, backup codes, or audit logging. 
Whether you're interested in authentication protocols, cloud architecture, or both, this project provides hands-on experience with real-world security implementations. The complete source code for this project is available on my GitHub repository: https://github.com/stephendotgg/azure-totp-authenticator Thanks for reading! Hopefully this has helped you understand TOTP and Azure services better.1.8KViews1like1CommentAnnouncing the public preview launch of Azure Functions durable task scheduler
We are excited to roll out the public preview of the Azure Functions durable task scheduler. This new Azure-managed backend is designed to provide high performance, improve reliability, reduce operational overhead, and simplify the monitoring of your stateful orchestrations. If you missed the initial announcement of the private preview, see this blog post. Durable Task Scheduler Durable functions simplifies the development of complex, stateful, and long-running apps in a serverless environment. It allows developers to orchestrate multiple function calls without having to handle fault tolerance. It's great for scenarios like orchestrating multiple agents, distributed transactions, big data processing, batch processing like ETL (extract, transform, load), asynchronous APIs, and essentially any scenario that requires chaining function calls with state persistence. The durable task scheduler is a new storage provider for durable functions, designed to address the challenges and gaps identified by our durable customers with existing bring-your-own storage options. Over the past few months, since the initial limited early access launch of the durable task scheduler, we’ve been working closely with our customers to understand their requirements and ensure they are fully supported in using the durable task scheduler successfully. We’ve also dedicated significant effort to strengthening the fundamentals – expanding regional availability, solidifying APIs, and ensuring the durable task scheduler is reliable, secure, scalable, and can be leveraged from any of the supported durable functions programming languages. Now, we’re excited to open the gates and make the durable task scheduler available to the public. Some notable capabilities and enhancements over the existing “bring your own storage” options include: Azure Managed Unlike the other existing storage providers for durable functions, the durable task scheduler offers dedicated resources that are fully managed by Azure. You no longer need to bring your own storage account for storing orchestration and entity state, as it is completely built in. Looking ahead, the roadmap includes additional operational capabilities, such as auto-purging old execution history, handling failover, and other Business Continuity and Disaster Recovery (BCDR) capabilities. Superior Performance and Scalability Enhanced throughput for processing orchestrations and entities, ideal for demanding and high-scale applications. Efficiently manages sudden bursts of events, ensuring reliable and quick processing of your orchestrations across your function app instances. The table below compares the throughput of the durable task scheduler provider and the Azure Storage provider. The function app used for this test runs onone to four Elastic Premium EP2 instances. The orchestration code was written in C# using the .NET Isolated worker model on NET 8. The same app was used for all storage providers, and the only change was the backend storage provider configuration. The test is triggered using an HTTP trigger which starts 5,000 orchestrations concurrently. The benchmark used a standard orchestrator function calling five activity functions sequentially, each returning a "Hello, {cityName}!" string. This specific benchmark showed that the durable task scheduler is roughly five times faster than the Azure Storage provider. Orchestration Debugging and Management Dashboard Simplify the monitoring and management of orchestrations with an intuitive out-of-the-box UI. 
It offers clear visibility into orchestration errors and lifecycle events through detailed visual diagrams, providing essential information on exceptions and processing times. It also enables interactive orchestration management, allowing you to perform ad hoc actions such as suspending, resuming, raising events, and terminating orchestrations. Monitor the inputs and outputs between orchestration and activities. Exceptions surfaced making it easy to identify where and why an orchestration may have failed. Security Best Practices Uses identity-based authentication with Role-Based Access Control (RBAC) for enterprise-grade authorization, eliminating the need for SAS tokens or access keys. Local Emulator To simplify the development experience, we are also launching a durable task scheduler emulator that can be run as a container on your development machine. The emulator supports the same durable task scheduler runtime APIs and stores data in local memory, enabling a completely offline debugging experience. The emulator also allows you to run the durable task scheduler management dashboard locally. Pricing Plan We’re excited to announce the initial launch of the durable task scheduler with a Dedicated, fixed pricing plan. One of the key pieces of feedback that we’ve consistently received from customers is the desire for more upfront billing transparency. To address this, we’ve introduced a fixed pricing model with the option to purchase a specific amount of performance and storage through an abstraction called a Capacity Unit (CU). A single CU provides : Single tenancy with dedicated resources for predictable performance Up to 2,000 work items* dispatched per second 50GB of orchestration data storage A Capacity Unit (CU) is a measure of the resources allocated to your durable task scheduler. Each CU represents a pre-allocated amount of CPU, memory, and storage resources. A single CU guarantees the dispatch of a certain number of work items and provides a defined amount of storage. If additional performance and/or storage are needed, more CUs can be purchased*. A Work Item is a message dispatched by the durable task scheduler to your application, triggering the execution of orchestrator, activity, or entity functions. The number of work items that can be dispatch per second is determined by the Capacity Units allocated to the durable task scheduler. For detailed instructions on determining the number of work items your applications needs and the number of CUs you should purchase, please refer to the guidance provided here. *At the beginning of the public preview phase, schedulers will be temporarily limited to a single CU. *Billing for the durable task scheduler will begin on May 1st, 2025. Under the Hood The durable functions team has been continuously evolving the architecture of the backends that persist the state of orchestrations and entities; the durable task scheduler is the latest installment in this series, and it includes both the most successful characteristics of its predecessors, as well as some significant improvements of its own. In the next paragraph, we shed some light on what is new. Of course, it is not necessary to understand these internal implementation details, and they are subject to change as we will keep improving and optimizing the design. Like the MSSQL provider, the durable task scheduler uses a SQL database as the storage foundation, to provide robustness and versatility. 
Like the Netherite provider, it uses a partitioned design to achieve scale-out, and a pipelining optimization to boost the partition persistence. Unlike the previous backends, however, the durable task scheduler runs as a service, on its own compute nodes, to which workers are connected by GRPC. This significantly improves latency and load balancing. It strongly isolates the workflow management logic from the user application, allowing them to be scaled separately. What can we expect next for the durable task scheduler? One of the most exciting developments is the significant interest we’ve received in leveraging the durable task scheduler across other Azure compute offerings beyond Azure Functions, such as Azure Container Apps (ACA) and Azure Kubernetes Service (AKS). As we continue to enhance the integration with durable functions, we have also integrated the durable task SDKs, which are the underlying technology behind the durable task framework and durable functions, to support the durable task scheduler directly. We refer to these durable task sdks as the “portable SDKs” because they are a client-only SDK that connects directly to the durable task scheduler, where the managed orchestration engine resides, eliminating any dependency on the underlying compute platform, hence the name “portable”. By utilizing the portable SDKs to author your orchestrations as code, you can deploy your orchestrations across any Azure compute offering. This allows you to leverage the durable task scheduler as the backend, benefiting from its full set of capabilities. If you would like to discuss this further with our team or are interested in trying out the portable SDK yourself, please feel free to reach out to us at DurableTaskScheduler@microsoft.com . We welcome your questions and feedback. We've also received feedback from customers’ requesting a versioning mechanism to facilitate zero downtime deployments. This feature would enable you to manage breaking workflow changes by allowing all in-flight orchestrations using the older version to complete, while switching new orchestrations to the updated version. This is already in development and will be available in the near future. Lastly, we are in the process of introducing critical enterprise features under the category of Business Continuity and Disaster Recovery (BCDR). We understand the importance of these capabilities as our customers rely on the durable task scheduler for critical production scenarios. Get started with the durable task scheduler Migrating to the durable task scheduler from an existing durable function application is a quick process. The transition is purely configuration changes, meaning your existing orchestrations and business logic remain unchanged. The durable task scheduler is provided through a new Azure resource known as a scheduler. Each scheduler can contain one or multiple task hubs, which are sub-resources. A task hub, an established concept within durable functions, is responsible for managing the state and execution of orchestrations and activities. Think of a task hub as a logical way to separate your applications that require orchestrations execution. A Durable Task scheduler resource in the Azure Portal includes a task hub named dts-github-agent One you have created a scheduler and task hub(s), simply add the library package to your project and update your host.json to point your function app to the durable task scheduler endpoint and task hub. That’s all there is to it. 
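To give a feel for that configuration step, here is a rough sketch of what the host.json change can look like. The property names (the storageProvider type and the connection setting name) and the connection string format are assumptions for illustration, not the authoritative schema, so verify them against the getting started page mentioned below.

```json
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "hubName": "my-taskhub",
      "storageProvider": {
        "type": "azureManaged",
        "connectionStringName": "DURABLE_TASK_SCHEDULER_CONNECTION_STRING"
      }
    }
  }
}
```

The referenced app setting would then hold the scheduler endpoint and authentication mode, for example something of the form Endpoint=https://<scheduler-name>.<region>.durabletask.io;Authentication=ManagedIdentity (again, an illustrative format rather than a guaranteed one).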
With the correct authentication configuration applied, your applications can fully leverage the capabilities of the durable task scheduler. For more detailed information on how to migrate to or start using the durable task scheduler, visit our getting started page here.

Get in touch to learn more
We are always interested in engaging with both existing and potential new customers. If any of the above interests you, if you have any questions, or if you simply want to discuss your scenarios and explore how you can leverage the durable task scheduler, feel free to reach out to us anytime. Our line is always open: DurableTaskScheduler@microsoft.com.

Call Function App from Azure Data Factory with Managed Identity Authentication
Integrating Azure Function Apps into your Azure Data Factory (ADF) workflows is a common practice. To strengthen security beyond function API keys, using managed identity authentication is strongly recommended. Because many existing guides have become outdated as Azure services have changed, this article provides a comprehensive, up-to-date walkthrough of configuring managed identity in ADF to securely call Function Apps. The same approach can be adapted to other Azure services that need to call Function Apps with managed identity authentication.

The high-level process is:

1. Enable Managed Identity on Data Factory
2. Configure Microsoft Entra Sign-in on Azure Function App
3. Configure Linked Service in Data Factory
4. Assign Permissions to the Data Factory in Azure Function

Step 1: Enable Managed Identity on Data Factory
In the Data Factory portal, go to Managed Identities and enable a system-assigned managed identity.

Step 2: Configure Microsoft Entra Sign-in on Azure Function App
1. In the Function App portal, enable Authentication and choose "Microsoft" as the identity provider.
2. Add an app registration to the app; it can be an existing one, or you can let the platform create a new app registration.
3. Next, allow ADF as a client application that may authenticate to the function app. This step is a new requirement not covered in older guides; if these settings are not correct, a 403 response is returned. Add the Application ID of the ADF managed identity under Allowed client applications and the Object ID of the ADF managed identity under Allowed identities. If requests should only be allowed from specific tenants, add the Tenant ID of the managed identity in the last box.
4. This setting controls the response the function app returns for unauthenticated requests. Choose "HTTP 401 Unauthorized: recommended for APIs", since a sign-in page is not feasible for API calls from ADF.
5. Click Next and use the default permission option.
6. Once everything is set, click "Add" to complete the configuration. Copy the generated Application (client) ID; it is used in Data Factory to handle authorization.

Step 3: Configure Linked Service in Data Factory
1. To use an Azure Function activity in a pipeline, follow the steps here: Create an Azure Function activity with UI.
2. Then edit an existing Azure Function linked service or create a new one.
3. Change the authentication method to System Assigned Managed Identity, and paste the client ID of the function app identity provider copied in Step 2 into Resource ID. This step is required; authorization fails without it. (A sketch of the resulting linked service JSON appears after the references below.)

Step 4: Assign Permissions to the Data Factory in Azure Function
1. In the function app portal, go to Access control (IAM) and add a new role assignment.
2. Assign the Reader role.
3. Assign the Data Factory's managed identity to that role.

After everything is set, verify that the function app can be called from Azure Data Factory successfully.

References:
https://prodata.ie/2022/06/16/enabling-managed-identity-authentication-on-azure-functions-in-data-factory/
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
https://docs.azure.cn/en-us/app-service/overview-authentication-authorization
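For reference, and relating to Step 3 above, here is a rough sketch of how the linked service definition might look in JSON once the managed identity option is selected. The property names (functionAppUrl, authentication, resourceId) and the authentication value are assumptions about how the ADF UI serializes this choice, and <your-function-app> and the client ID placeholder are hypothetical values; verify the exact shape against the JSON view of your own linked service:

{
  "name": "AzureFunctionLinkedService",
  "properties": {
    "type": "AzureFunction",
    "typeProperties": {
      "functionAppUrl": "https://<your-function-app>.azurewebsites.net",
      "authentication": "SystemAssignedManagedIdentity",
      "resourceId": "<application-client-id-from-step-2>"
    }
  }
}

The key point is that Resource ID takes the Application (client) ID of the identity provider's app registration from Step 2, not the function app's Azure resource ID; ADF presumably uses it as the token audience when requesting a token with its managed identity.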