Azure Database for MySQL bindings for Azure Functions (General Availability)
We’re thrilled to announce the general availability (GA) of Azure Database for MySQL input and output bindings for Azure Functions: a powerful way to build event-driven, serverless applications that integrate seamlessly with your MySQL databases.

Key Capabilities

With this GA release, your applications can use:

- Input bindings, which let your function retrieve data from a MySQL database without writing any connection or query logic.
- Output bindings, which let your function insert or update data in a MySQL table without writing explicit SQL commands.

In addition, you can combine the input and output bindings in the same function to implement read-modify-write patterns. For example, retrieve a record, update a field, and write it back, all without managing connections or writing SQL.

These bindings are fully supported for both the in-process and isolated worker models, giving you flexibility in how you build and deploy your Azure Functions.

How It Works

Azure Functions bindings abstract away the boilerplate code required to connect to external services. With the MySQL input and output bindings, you can now declaratively connect your serverless functions to your Azure Database for MySQL database with minimal configuration. You configure the bindings using attributes in C#, decorators in Python, or annotations in JavaScript/Java. Under the hood, the bindings use the MySql.Data.MySqlClient library and support Azure Database for MySQL Flexible Server.

Getting Started

To use the bindings, install the appropriate NuGet or npm package:

# For the isolated worker model (C#)
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.MySql

# For the in-process model (C#)
dotnet add package Microsoft.Azure.WebJobs.Extensions.MySql

Then configure your function with a connection string and binding metadata. Full samples for all supported programming frameworks are available in our GitHub repository.

Here is a sample C# in-process function that retrieves a user by ID, increments their login count, and saves the updated record back to the MySQL database. This read-modify-write pattern is handy for lightweight data transformations, such as modifying status fields or updating counters and timestamps.

public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int LoginCount { get; set; }
}

public static class UpdateLoginCountFunction
{
    [FunctionName("UpdateLoginCount")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "user/{id}/login")] HttpRequest req,
        // Input binding: retrieves the user whose id matches the {id} route parameter
        [MySql("SELECT * FROM users WHERE id = @id",
            CommandType = System.Data.CommandType.Text,
            Parameters = "@id={id}",
            ConnectionStringSetting = "MySqlConnectionString")] User user,
        // Output binding: objects added to the collector are written to the users table
        [MySql("users", ConnectionStringSetting = "MySqlConnectionString")] IAsyncCollector<User> userCollector,
        ILogger log)
    {
        if (user == null)
        {
            return new NotFoundObjectResult("User not found.");
        }

        // Modify the user object
        user.LoginCount += 1;

        // Write the updated user back to the database
        await userCollector.AddAsync(user);

        return new OkObjectResult($"Login count updated to {user.LoginCount} for user {user.Name}.");
    }
}
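The same pattern also works in the isolated worker model, where the output binding is applied to the function's return value instead of an IAsyncCollector. The following is a minimal sketch rather than an authoritative sample: it reuses the User class above and assumes that the Microsoft.Azure.Functions.Worker.Extensions.MySql package exposes MySqlInput/MySqlOutput attributes shaped like their Azure SQL counterparts; check the GitHub samples for the exact API.

using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Azure.Functions.Worker.Extensions.MySql; // assumed namespace of the isolated-worker MySQL extension

public static class UpdateLoginCountIsolated
{
    [Function("UpdateLoginCountIsolated")]
    // Output binding on the return value: the returned User is written back to the users table.
    [MySqlOutput("users", "MySqlConnectionString")]
    public static User Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "user/{id}/login")] HttpRequestData req,
        // Input binding: retrieves the user whose id matches the {id} route parameter.
        // Assumption: the attribute constructor mirrors the Azure SQL bindings (commandText, connectionStringSetting, parameters).
        [MySqlInput("SELECT * FROM users WHERE id = @id",
            "MySqlConnectionString",
            parameters: "@id={id}")] User user)
    {
        // Read-modify-write: increment the counter and return the updated record.
        user.LoginCount += 1;
        return user;
    }
}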
Learn More

- Azure Functions MySQL Bindings
- Azure Functions

Conclusion

With input and output bindings for Azure Database for MySQL now generally available, building serverless apps on Azure with MySQL has never been simpler or more efficient. By eliminating the need for manual connection management and boilerplate code, these bindings let you focus on what matters most: building scalable, event-driven applications with clean, maintainable code. Whether you're building real-time dashboards, automating workflows, or syncing data across systems, these bindings unlock new levels of productivity and performance. We can’t wait to see what you’ll build with them.

If you have any feedback or questions about the information provided above, please leave a comment below or email us at AskAzureDBforMySQL@service.microsoft.com. Thank you!

Deploying Logic Apps Standard with Managed Identity and private networking
I was working with a customer who needed to implement some automation tasks to support their application. The automation tasks would be driven by data in their Azure SQL database. As most of their developers were busy tackling their backlog, I thought, "What if we could use Logic Apps to build a no-code solution with their operations team?"

The first step was, of course, to deploy the Logic App. Because the customer runs fully private networking for all services (Azure SQL, Storage, etc.), we would deploy Logic Apps Standard, which runs on the Azure App Service runtime, similar to Function Apps. Like a Function App, the Logic App uses a storage account in the background. However, when deploying through the portal, the deployment failed with a 403 (Forbidden) error. Of course! The customer had disabled shared key access to storage accounts, using Entra ID exclusively.

As in our post about setting up an Azure Container Instance to use Managed Identity to connect to an Azure Container Registry, we'd have to write a bit of Bicep to use a User Assigned Managed Identity that has rights to Azure Storage and SQL. I created the User Assigned Managed Identity and granted it the following roles on the storage account:

- Storage Account Contributor
- Storage Blob Data Contributor
- Storage Queue Data Contributor
- Storage Table Data Contributor

While that solved the access issue and allowed the deployment to complete, the Logic App showed a runtime error in the portal, stating that it was "Unable to load the proper Managed Identity." As it turns out, we need to explicitly tell the Logic App in the App Service configuration that we are using Managed Identity as our authentication mechanism, and which Managed Identity we want to use. Once I added that to the configuration, the Managed Identity error went away, but I was still getting a runtime error.

Looking in the log stream, I could see many errors trying to reach the storage account's queue and table endpoints. Because the customer uses all private networking, we needed to set up a private endpoint and an associated private DNS entry for all three storage account endpoints: blob, queue, and table. Once I added those private endpoints and the corresponding settings to my App Service configuration, my Logic App deployed and ran successfully. I've added the Bicep code for the Logic App service here.

One final "gotcha": if you look at my Bicep, you will note that I am specifying both the User Assigned Managed Identity and a System Assigned Managed Identity. The reason is that, when using a SQL connector, Managed Identity was not listed as an option for authentication. I was stumped by this at first, but then I noticed that it was an option in a portal-deployed Logic App. The difference was that the portal deployment adds a System Assigned Managed Identity. Once I added one to my Bicep, the Managed Identity option showed up on the SQL connector. It appears that the connector looks for the presence of a System Assigned Managed Identity to toggle that authentication option, but you can still use your User Assigned Managed Identity for SQL authentication.
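To illustrate the shape of that configuration, here is a minimal Bicep sketch, not the full template referenced above: resource names, plan, and subnet are placeholders, and it shows only the identity-related pieces (both identities on the Logic App Standard site plus the identity-based storage settings that the runtime uses instead of a shared-key connection string).

param location string = resourceGroup().location
param appServicePlanId string      // placeholder: resource ID of a Workflow Standard plan
param integrationSubnetId string   // placeholder: subnet used for VNet integration
param uamiName string = 'uami-logicapp'        // placeholder: pre-created User Assigned Managed Identity
param storageAccountName string = 'stlogicapp' // placeholder: storage account behind private endpoints

resource uami 'Microsoft.ManagedIdentity/userAssignedIdentities@2023-01-31' existing = {
  name: uamiName
}

resource storage 'Microsoft.Storage/storageAccounts@2023-01-01' existing = {
  name: storageAccountName
}

resource logicApp 'Microsoft.Web/sites@2023-01-01' = {
  name: 'logic-automation-tasks'   // placeholder name
  location: location
  kind: 'functionapp,workflowapp'  // Logic Apps Standard site kind
  identity: {
    // Both identities: the UAMI for storage/SQL access, and a System Assigned
    // identity so the SQL connector offers the Managed Identity authentication option.
    type: 'SystemAssigned, UserAssigned'
    userAssignedIdentities: {
      '${uami.id}': {}
    }
  }
  properties: {
    serverFarmId: appServicePlanId
    virtualNetworkSubnetId: integrationSubnetId
    siteConfig: {
      vnetRouteAllEnabled: true    // route outbound traffic through the VNet to reach the private endpoints
      appSettings: [
        // Identity-based storage connection for the runtime, in place of AzureWebJobsStorage with a key
        { name: 'AzureWebJobsStorage__accountName', value: storage.name }
        { name: 'AzureWebJobsStorage__credential', value: 'managedidentity' }
        { name: 'AzureWebJobsStorage__clientId', value: uami.properties.clientId }
        { name: 'FUNCTIONS_EXTENSION_VERSION', value: '~4' }
        { name: 'APP_KIND', value: 'workflowApp' }
      ]
    }
  }
}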