Azure Functions has introduced a suite of capabilities designed to simplify the creation of serverless solutions. As customers develop more demanding workloads that require rapid cold starts, dynamic scaling, and enhanced security, we are excited to announce a new set of features that meets those needs.
We are also witnessing the continued growth of Azure OpenAI solutions as customers transition from proofs of concept to production-ready applications. Our integration with Azure OpenAI has been updated to facilitate building these production-ready applications on Azure Functions.
This blog highlights the preview and general availability features that were recently released or are launching now at Ignite.
General availability announcements
Azure Functions Flex Consumption
The Flex Consumption plan is a new Azure Functions hosting plan that builds on the Consumption pay-per-second serverless billing model, with automatic scale down to zero when not in use for cost efficiency. It provides more flexibility and customizability without compromising on existing capabilities. The Flex Consumption plan also offers seamless integration with your virtual network at no extra cost, ensuring secure and private communication with no considerable impact on your app’s scale-out performance. Customers can now build serverless Functions-as-a-Service (FaaS) solutions on Azure Functions with burst scale, higher throughput, improved reliability, better performance, and additional security, on their own terms.
New capabilities include fast and large elastic scale, instance size selection, private networking, longer running executions, and concurrency control. Customers can run their serverless enterprise apps with event-driven scale and negligible cold-start latency using always-ready instances on Flex Consumption.
Learn more about the release on the accompanying blog.
.NET 9 support
Azure Functions now supports .NET 9 for applications using the isolated worker model. You can now use the improved performance of .NET 9, write your code with the new language features of C# 13, and more. Support is being enabled for Windows and Linux on the Consumption, Elastic Premium, and App Service plan hosting options.
You can start using .NET 9 with Functions projects using the isolated worker model by changing the target framework and updating to the 2.0.0 versions of Microsoft.Azure.Functions.Worker and Microsoft.Azure.Functions.Worker.Sdk.
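As a minimal sketch of what an upgraded project can look like (the class and function names here are illustrative, and the project is assumed to also reference Microsoft.Azure.Functions.Worker.Extensions.Http for the HTTP trigger), an HTTP-triggered function on the isolated worker model targeting .NET 9 might look like this:

```csharp
// Assumes the .csproj sets <TargetFramework>net9.0</TargetFramework> and references
// the 2.0.0 versions of Microsoft.Azure.Functions.Worker and Microsoft.Azure.Functions.Worker.Sdk.
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Hosting;

// Program.cs: bootstrap the isolated worker host.
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .Build();

host.Run();

// A simple HTTP-triggered function running on .NET 9.
public class HelloDotNet9
{
    [Function(nameof(HelloDotNet9))]
    public HttpResponseData Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req)
    {
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.WriteString("Hello from Azure Functions on .NET 9!");
        return response;
    }
}
```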
For more information about .NET 9 on the isolated worker model, see the developer guide.
Redis extension for Azure Functions
The Redis extension for Azure Functions is now generally available. The extension can be used as a trigger in Azure Functions, allowing Redis to initiate a serverless workflow. This functionality can be highly useful in data architectures such as a write-behind cache, or in any event-based architecture.
The Redis extension also supports Azure Functions input and output bindings. The input binding facilitates data retrieval from a cache, and this retrieved data can be passed into a function as an input parameter. The output binding allows users to change keys in a cache.
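As a rough sketch of the trigger scenario on the isolated worker model (the app setting name "redisConnectionString" and the channel name "orders" are placeholders), a pub/sub-triggered function might look like this:

```csharp
// Assumes a reference to Microsoft.Azure.Functions.Worker.Extensions.Redis.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class RedisSamples
{
    // Runs whenever a message is published to the "orders" channel.
    // "redisConnectionString" is an app setting holding the cache connection string.
    [Function(nameof(OnOrderPublished))]
    public void OnOrderPublished(
        [RedisPubSubTrigger("redisConnectionString", "orders")] string message,
        FunctionContext context)
    {
        context.GetLogger(nameof(OnOrderPublished))
               .LogInformation("Received order message: {message}", message);
    }
}
```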
For more information, please reference the documentation.
Azure Functions now supports .NET 8 on the in-process model
The in-process model for .NET applications on Azure Functions now supports .NET 8. Initial support covers Windows apps outside of App Service Environment deployments. Please see this GitHub thread for the latest status regarding Linux and App Service Environment deployments.
To date, apps using the in-process model in the Azure Functions V4 runtime have run on .NET 6, and support for .NET 6 ends on 12 November 2024. These apps can upgrade to .NET 8 either by migrating to .NET 8 on the isolated worker model or by leveraging this new in-process option for .NET 8.
As a reminder, support for the in-process model for .NET on Azure Functions ends 10 November 2026. Function authors who choose this in-process option for .NET 8 should still begin planning a migration to the isolated worker model before that date.
For more info: https://aka.ms/azure-functions/dotnet/net8-in-process
PowerShell 7.4
You can now develop apps using PowerShell 7.4 locally and deploy them to all Azure Functions plans.
To learn more about the release, see What’s New in PowerShell 7.4. To learn more about what has changed from PowerShell 7.2 to 7.4, also see What's New in PowerShell 7.3.
Note that PowerShell 7.4 includes breaking changes. To upgrade your function apps today, see the PowerShell 7.4 migration guide.
Public preview announcements
Azure OpenAI triggers and bindings update
Azure Functions now supports creating an Azure OpenAI resource and an optional Azure AI Search vector store as part of the Flex Consumption portal experience for building intelligent applications. This makes it easier to build intelligent applications for text completion, chat with assistant calling, or retrieval-augmented generation (RAG) when you want to use your own company data with Azure OpenAI.
You can build intelligent apps using the native Azure OpenAI SDKs or use the updated Azure Functions OpenAI bindings to jumpstart development of the following scenarios:
Chat assistants
- Input binding to chat with LLMs.
- Output binding to retrieve chat history from persisted storage.
- Skills trigger to extend capabilities of OpenAI LLM through natural language.
Retrieval Augmented Generation (Bring your own data for semantic search)
- Data ingestion with Functions bindings.
- Automatic chunking and embeddings creation.
- Store embeddings in a vector database, including Azure AI Search, Azure Cosmos DB for MongoDB, and Azure Data Explorer.
- Binding that takes prompts, retrieves documents, sends them to the OpenAI LLM, and returns the response to the user.
Text completion for content summarization and creation
- Input binding that takes prompt or content and returns response from LLM.
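For example, here is a minimal sketch of the text completion scenario using the isolated worker OpenAI extension (the route, prompt template, and class names are illustrative, and exact namespaces and attribute properties may differ between preview versions):

```csharp
// Assumes references to Microsoft.Azure.Functions.Worker.Extensions.OpenAI
// and Microsoft.Azure.Functions.Worker.Extensions.Http.
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Azure.Functions.Worker.Extensions.OpenAI.TextCompletion;

public class WhoIsFunction
{
    // The TextCompletionInput binding sends the templated prompt to the configured
    // Azure OpenAI deployment and hands the model's response to the function.
    [Function(nameof(WhoIs))]
    public static HttpResponseData WhoIs(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "whois/{name}")] HttpRequestData req,
        [TextCompletionInput("Who is {name}?")] TextCompletionResponse response)
    {
        var result = req.CreateResponse(HttpStatusCode.OK);
        result.WriteString(response.Content);
        return result;
    }
}
```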
These bindings have been improved in the latest preview to support managed identity and to use the latest underlying SDKs, along with multiple fixes based on customer feedback.
Learn more about the updated preview.
Azure Functions projects can be used as part of .NET Aspire orchestrations with preview support
.NET Aspire is an opinionated stack that simplifies development of distributed applications in the cloud. With this preview support, you can now enlist Azure Functions projects that use .NET on the isolated worker model into Aspire orchestrations. This equips teams to author event-driven functions as part of their microservice solutions in Aspire. Those functions can then take advantage of .NET Aspire’s features around service discovery, OpenTelemetry, and more.
The Aspire.Hosting.Azure.Functions NuGet package allows you to reference Functions projects from Aspire app host projects. You can then use additional Aspire integrations to configure triggers and bindings for Azure Blobs, Azure Queues, Azure Event Hubs, and Azure Service Bus.
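As a rough sketch of the app host wiring under this preview (the project name Projects.MyFunctionApp and the resource names are placeholders, and the preview API surface may change), the orchestration could look like this:

```csharp
// Aspire app host (Program.cs) that orchestrates a Functions project.
// Assumes references to Aspire.Hosting.Azure.Storage and the
// Aspire.Hosting.Azure.Functions preview package.
var builder = DistributedApplication.CreateBuilder(args);

// An Azure Storage resource that runs as the local emulator during development.
var storage = builder.AddAzureStorage("storage").RunAsEmulator();
var queues = storage.AddQueues("queues");

// Enlist the Functions project (isolated worker model) into the orchestration
// and let Aspire supply the queue connection used by its triggers and bindings.
builder.AddAzureFunctionsProject<Projects.MyFunctionApp>("functions")
       .WithReference(queues);

builder.Build().Run();
```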
This integration can be used by projects targeting .NET 8 and .NET 9. Please note that when deploying the Aspire solution to Azure, Functions projects are deployed as Azure Container Apps resources without event-driven scaling.
Node.js 22 support
Azure Functions now supports Node.js 22 in preview. You can now develop functions using Node.js 22 locally and deploy them to all Azure Functions plans on Linux and Windows.
Learn more about upgrading your function apps to Node.js 22.
Early access preview
Durable Task Scheduler for Azure Durable Functions
The Durable Task Scheduler is a fully managed backend for Azure Durable Functions that enhances performance, reliability, and ease of monitoring of stateful orchestrations.
Learn more about the features of Durable Task Scheduler and how to sign up for early access.
We would love to hear feedback on these new capabilities and your overall experience with Azure Functions so we can make sure we meet all your needs. You can click the “Send us your feedback” button on the overview page of your function app.
Thanks for all your feedback, from the Azure Functions Team.