Azure Container Apps
Building the Agentic Future
As a business built by developers, for developers, Microsoft has spent decades making it faster, easier and more exciting to create great software. And developers everywhere have turned everything from BASIC and the .NET Framework, to Azure, VS Code, GitHub and more into the digital world we all live in today. But nothing compares to what's on the horizon as agentic AI redefines both how we build and the apps we're building.

In fact, the promise of agentic AI is so strong that market forecasts predict we're on track to reach 1.3 billion AI agents by 2028. Our own data, from 1,500 organizations around the world, shows agent capabilities have jumped as a driver for AI applications from near last to a top three priority when comparing deployments earlier this year to applications being defined today. Of those organizations building AI agents, 41% chose Microsoft to build and run their solutions, significantly more than any other vendor. But within software development the opportunity is even greater, with approximately 50% of businesses intending to incorporate agentic AI into software engineering this year alone.

Developers face a fascinating yet challenging world of complex agent workflows, a constant pipeline of new models, new security and governance requirements, and the continued pressure to deliver value from AI, fast, all while contending with decades of legacy applications and technical debt. This week at Microsoft Build, you can see how we're making this future a reality with new AI-native developer practices and experiences, by extending the value of AI across the entire software lifecycle, and by bringing critical AI, data, and toolchain services directly to the hands of developers, in the most popular developer tools in the world.

Agentic DevOps

AI has already transformed the way we code, with 15 million developers using GitHub Copilot today to build faster. But coding is only a fraction of the developer's time.
Extending agents across the entire software lifecycle means developers can move faster from idea to production, boost code quality, and strengthen security, while removing the burden of low-value, routine, time-consuming tasks. We can even address decades of technical debt and keep apps running smoothly in production. This is the foundation of agentic DevOps—the next evolution of DevOps, reimagined for a world where intelligent agents collaborate with developer teams and with each other.

Agents introduced today across GitHub Copilot and Azure operate like a member of your development team, automating and optimizing every stage of the software lifecycle, from performing code reviews and writing tests to fixing defects and building entire specs. Copilot can even collaborate with other agents to complete complex tasks like resolving production issues. Developers stay at the center of innovation, orchestrating agents for the mundane while focusing their energy on the work that matters most.

Customers like EY are already seeing the impact: "The coding agent in GitHub Copilot is opening up doors for each developer to have their own team, all working in parallel to amplify their work. Now we're able to assign tasks that would typically detract from deeper, more complex work, freeing up several hours for focus time." - James Zabinski, DevEx Lead at EY

You can learn more about agentic DevOps and the new capabilities announced today from Amanda Silver, Corporate Vice President of Product, Microsoft Developer Division, and Mario Rodriguez, Chief Product Officer at GitHub. And be sure to read more from GitHub CEO Thomas Dohmke about the latest with GitHub Copilot.
At Microsoft Build, see agentic DevOps in action in the following sessions, available both in-person May 19-22 in Seattle and on-demand:

- BRK100: Reimagining Software Development and DevOps with Agentic AI
- BRK113: The Agent Awakens: Collaborative Development with GitHub Copilot
- BRK118: Accelerate Azure Development with GitHub Copilot, VS Code & AI
- BRK131: Java App Modernization Simplified with AI
- BRK102: Agent Mode in Action: AI Coding with Vibe and Spec-Driven Flows
- BRK101: The Future of .NET App Modernization Streamlined with AI

New AI Toolchain Integrations

Beyond these new agentic capabilities, we're also releasing new integrations that bring key services directly to the tools developers are already using. From the 150 million GitHub users to the 50 million monthly users of the VS Code family, we're making it easier for developers everywhere to build AI apps.

If GitHub Copilot changed how we write code, Azure AI Foundry is changing what we can build. And the combination of the two is incredibly powerful. Now we're bringing leading models from Azure AI Foundry directly into your GitHub experience and workflow, with a new native integration. GitHub Models lets you experiment with leading models from OpenAI, Meta, Cohere, Microsoft, Mistral and more. Test and compare performance while building models directly into your codebase, all within GitHub. You can compare model performance and price side by side, pick the best fit, and swap models with a simple, unified API. And keeping with our enterprise commitment, teams can set guardrails so model selection is secure, responsible, and in line with your team's policies.

Meanwhile, new Azure Native Integrations gives developers seamless access to a curated set of 20 software services from Datadog, New Relic, Pinecone, Pure Storage Cloud and more, directly through the Azure portal, SDK, and CLI.
With Azure Native Integrations, developers get the flexibility to work with their preferred vendors across the AI toolchain with simplified single sign-on and management, while staying in Azure. Today, we are pleased to announce the addition of even more developer services:

Arize AI: Arize's platform provides essential tooling for AI and agent evaluation, experimentation, and observability at scale. With Arize, developers can easily optimize AI applications through tools for tracing, prompt engineering, dataset curation, and automated evaluations. Learn more.

LambdaTest HyperExecute: LambdaTest HyperExecute is an AI-native test execution platform designed to accelerate software testing. It enables developers and testers to run tests up to 70% faster than traditional cloud grids by optimizing test orchestration and observability and by streamlining TestOps to expedite release cycles. Learn more.

Mistral: Mistral and Microsoft announced a partnership today, which includes integrating Mistral La Plateforme as part of Azure Native Integrations. Mistral La Plateforme provides pay-as-you-go API access to Mistral AI's latest large language models for text generation, embeddings, and function calling. Developers can use this AI platform to build AI-powered applications with retrieval-augmented generation (RAG), fine-tune models for domain-specific tasks, and integrate AI agents into enterprise workflows.

MongoDB (Public Preview): MongoDB Atlas is a fully managed cloud database that provides scalability, security, and multi-cloud support for modern applications. Developers can use it to store and search vector embeddings, implement retrieval-augmented generation (RAG), and build AI-powered search and recommendation systems. Learn more.

Neon: Neon Serverless Postgres is a fully managed, autoscaling PostgreSQL database designed for instant provisioning, cost efficiency, and AI-native workloads.
Developers can use it to rapidly spin up databases for AI agents, store vector embeddings with pgvector, and scale AI applications seamlessly. Learn more.

Java and .NET App Modernization

Shipping to production isn't the finish line—and maintaining legacy code shouldn't slow you down. Today we're announcing comprehensive resources to help you successfully plan and execute app modernization initiatives, along with new agents in GitHub Copilot to help you modernize at scale, in a fraction of the time. In fact, customers like Ford China are seeing breakthrough results, reducing their Java migration effort by up to 70% by using GitHub Copilot to automate middleware code migration tasks.

Microsoft's App Modernization Guidance applies decades of enterprise apps experience to help you analyze production apps and prioritize modernization efforts, while applying best practices and technical patterns to ensure success. And now GitHub Copilot transforms the modernization process, handling code assessments, dependency updates, and remediation across your production Java and .NET apps (support for mainframe environments is coming soon!). It generates and executes update plans automatically, while giving you full visibility, control, and a clear summary of changes. You can even raise modernization tasks in GitHub Issues from our proven Azure Migrate service to assign to developer teams. Your apps are more secure, maintainable, and cost-efficient, faster than ever.

Learn how we're reimagining app modernization for the era of AI with the new App Modernization Guidance and the modernization agent in GitHub Copilot to help you modernize your complete app estate.

Scaling AI Apps and Agents

Sophisticated apps and agents need an equally powerful runtime. And today we're advancing our complete portfolio, from serverless with Azure Functions and Azure Container Apps, to the control and scale of Azure Kubernetes Service.
At Build we're simplifying how you deploy, test, and operate open-source and custom models on Kubernetes through the Kubernetes AI Toolchain Operator (KAITO); making it easy to run inference on AI models with the flexibility, autoscaling, pay-per-second pricing, and governance of Azure Container Apps serverless GPUs; helping you create real-time, event-driven workflows for AI agents by integrating Azure Functions with Azure AI Foundry Agent Service; and much, much more.

The platform you choose to scale your apps has never been more important. With new integrations with Azure AI Foundry, advanced automation that reduces developer overhead, and simplified operations, security and governance, Azure's app platform can help you deliver the sophisticated, secure AI apps your business demands. To see the full slate of innovations across the app platform, check out: Powering the Next Generation of AI Apps and Agents on the Azure Application Platform.

Tools that keep pace with how you need to build

This week we're also introducing new enhancements to our tooling to help you build as fast as possible and explore what's next with AI, all directly from your editor. GitHub Copilot for Azure brings Azure-specific tools into agent mode in VS Code, keeping you in the flow as you create, manage, and troubleshoot cloud apps. Meanwhile, the Azure Tools for VS Code extension pack brings everything you need to build apps on Azure using GitHub Copilot to VS Code, making it easy to discover and interact with the cloud services that power your applications.

Microsoft's gallery of AI App Templates continues to expand, helping you rapidly move from concept to production app, deployed on Azure. Each template includes a fully working application, complete with app code, AI features, infrastructure as code (IaC), configurable CI/CD pipelines with GitHub Actions, and an application architecture, ready to deploy to Azure.
These templates reflect the most common patterns and use cases we see across our AI customers, from getting started with AI agents to building GenAI chat experiences with your enterprise data, and they help you learn best practices such as keyless authentication. Learn more by reading the latest on Build Apps and Agents with Visual Studio Code and Azure.

Building the agentic future

The emergence of agentic DevOps, the new wave of development powered by GitHub Copilot, and the new services launching across Microsoft Build will be transformative. But just as we've seen over the first 50 years of Microsoft's history, the real impact will come from the global community of developers. You all have the power to turn these tools and platforms into advanced AI apps and agents that make every business move faster, operate more intelligently, and innovate in ways that were previously impossible. Learn more and get started with GitHub Copilot.

Reimagining App Modernization for the Era of AI
This blog highlights the key announcements and innovations from Microsoft Build 2025. It focuses on how AI is transforming the software development lifecycle, particularly in app modernization. Key topics include the use of GitHub Copilot for accelerating development and modernization, the introduction of the Azure SRE agent for managing production systems, and the launch of the App Modernization Guidance to help organizations modernize their applications with AI-first design. The blog emphasizes a strategic approach to modernization, aiming to reduce complexity, improve agility, and deliver measurable business outcomes.
Azure Functions – Build 2025

With Microsoft Build underway, the team is excited to provide an update on the latest releases in Azure Functions this year. Customers are leveraging Azure Functions to build AI solutions, thanks to its serverless capabilities that scale on demand and its native integration for processing real-time data. The newly launched capabilities enable the creation of AI and agentic applications with enhanced offerings, built-in security, and a pay-as-you-go model:

- Real-time retrieval augmented generation, making organizational data accessible through semantic search
- Native event-driven tool function calling with the AI Foundry Agent Service
- Hosted Model Context Protocol servers
- Support for Flex Consumption plans, including zone redundancy, more regions, and larger instance sizes
- Enhanced security for applications through managed identity and networking support across all Azure Functions plans
- Durable Functions to develop deterministic agentic solutions, providing control over agent processes with built-in state management for automatic retries and complex orchestration patterns, including human approvals. Read more about using durable for agents in this blog.

Azure Functions has made significant investments over the past couple of years to simplify the development of secure, scalable, and intelligent applications. Learn more about the scenarios and capabilities in the documentation.

Building AI apps

General availability announcements

Azure Functions integration with Azure AI Foundry Agent Service

Integrating Azure Functions with the AI Foundry Agent Service enables you to build intelligent, event-driven applications that are scalable, secure, and cost-efficient. Azure Functions act as custom tools that AI agents can call to execute business logic, access secure systems, or process data dynamically in response to events like HTTP requests or queue messages.
This integration allows for modular AI workflows, where agents can reason through tasks and trigger specific functions as needed—ideal for scenarios like customer support, document processing, or automated insights—without the need to manage infrastructure. Learn more.

Public preview announcements

Remote Model Context Protocol (MCP)

Model Context Protocol (MCP) is a way for apps to provide capabilities and context to a large language model. A key feature of MCP is the ability to define tools that AI agents can leverage to accomplish whatever tasks they've been given. MCP servers can be run locally, but remote MCP servers are important for sharing tools that work at cloud scale. The preview triggers and bindings let you build remote MCP tools using server-sent events (SSE) with Azure Functions. Azure Functions lets you author focused, event-driven logic that scales automatically in response to demand. You just write code reflecting the unique requirements of your tools, and Functions takes care of the rest. Learn more.

Azure OpenAI trigger and bindings preview update

The Azure OpenAI extension has been updated to support managed identity, the latest OpenAI SDK, Azure Cosmos DB for NoSQL as a vector store, and customer feedback improvements:

- Retrieval augmented generation (bring your own data for semantic search): data ingestion with Functions bindings, automatic chunking and embeddings creation, embeddings storage in vector databases including AI Search, Cosmos DB for MongoDB, Cosmos DB for NoSQL, and Azure Data Explorer, and a binding that takes a prompt, retrieves documents, sends them to the OpenAI LLM, and returns the result to the user.
- Text completion for content summarization and creation: an input binding that takes a prompt and returns the response from the LLM.
- Chat assistants: input and output bindings to chat with LLMs, an output binding to retrieve chat history from persisted storage, and a skills trigger that is registered and called by the LLM through natural language.

Learn more.
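The "automatic chunking and embeddings creation" step above can be pictured with a short sketch. The window and overlap sizes below are illustrative, not the extension's actual defaults:

```python
# Sketch of fixed-size chunking with overlap, the kind of preprocessing the
# Azure OpenAI extension performs before creating embeddings. Sizes are
# illustrative only.

def chunk_text(text: str, chunk_size: int = 512, overlap: int = 64) -> list:
    """Split text into windows of chunk_size characters, each sharing
    `overlap` characters with its predecessor, so content cut at a window
    boundary still appears intact in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk would then be embedded and written to the configured vector store (AI Search, Cosmos DB, Azure Data Explorer, and so on); the overlap keeps sentences that straddle a boundary retrievable.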
Flex Consumption

General availability announcements

New regions for Azure Functions Flex Consumption

Beyond the already generally available regions, you can now create Flex Consumption apps in the following regions: Australia Southeast, Brazil South, Canada Central, Central India, Central US, France Central, Germany West Central, Italy North, Japan East, Korea Central, North Central US, Norway East, South Africa North, South India, Spain Central, UAE North, UK West, West Central US, West Europe, and West US. Pricing for each region will be available by July 1st. To learn more, see View currently supported regions.

Public preview announcements

Azure Functions Flex Consumption now supports availability zones and 512 MB instances

You can now enable availability zones for your Flex Consumption apps during or after creation. You can also choose the 512 MB instance memory size. Availability zones are physically separate groups of datacenters within each Azure region. When one zone fails, services can fail over to one of the remaining zones. When availability zones are enabled, instances are distributed across availability zones for increased reliability. The availability zones preview is initially available in the following regions: Australia East, Canada Central, Central India, East Asia, Germany West Central, Italy North, Norway East, South Africa North, Sweden Central, West US 3, UAE North, and UK South. To learn more about availability zones, see Reliability in Azure Functions.

Moreover, Azure Functions now allows you to choose 512 MB in addition to 2048 MB and 4096 MB as the instance memory size for your Flex Consumption apps. This enables you to further cost-optimize apps that require fewer resources, and allows your apps to scale out further within the default quota. To learn more about instance sizes, see Flex Consumption plan instance memory.
Azure Functions on Azure Container Apps

General availability announcements

We are excited to introduce a new, streamlined method for running Azure Functions directly in Azure Container Apps (ACA). This powerful integration allows you to leverage the full features and capabilities of Azure Container Apps while benefiting from the simplicity of auto-scaling provided by Azure Functions.

For customers who want to deploy and manage their Azure Functions using the native capabilities of Azure Container Apps, we have recently released the ability to use Azure Functions on an Azure Container Apps environment to deploy your multitype services to a cloud-native solution designed for centralized management and serverless scale. The Azure Functions host, runtime, extensions, and function apps can be developed and deployed as containers using familiar Functions tooling, including Core Tools, the Azure CLI, the portal, and code-to-cloud with GitHub Actions and Azure DevOps tasks, into the Container Apps compute environment. This enables centralized networking, observability, and configuration boundaries for multitype application development when building microservices. Azure Functions on Azure Container Apps can be integrated with Dapr, scaled using KEDA, and provisioned to a highly performant serverless plan. This allows you to maximize productivity with a serverless container service built for microservices, robust autoscaling, and fully managed infrastructure. Learn more.

Triggers and Bindings

General availability announcements

Azure SQL trigger for Azure Functions

You can now build application logic in function apps on the Consumption plan that scales to zero and back up, driven by data from an Azure SQL database. The Azure SQL trigger for Azure Functions allows you to use nearly any SQL database enabled with change tracking to develop and scale event-driven applications using Azure Functions.
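The change-processing side of the trigger is ordinary code. A plain-Python sketch of a handler that tallies the changes delivered in the trigger payload (in a real app this body would sit under the Python v2 `@app.sql_trigger(...)` decorator; the numeric operation codes below are illustrative, so check the SqlChangeOperation enum in the docs for the authoritative values):

```python
import json

# Plain-Python handler logic for an Azure SQL trigger. The binding delivers
# a JSON array of change records, each carrying an Operation code and the
# changed Item. The code-to-name mapping here is illustrative.
OPERATION_NAMES = {0: "Insert", 1: "Update", 2: "Delete"}

def summarize_changes(changes_json: str) -> dict:
    """Count change events per operation type from the trigger payload."""
    counts = {"Insert": 0, "Update": 0, "Delete": 0}
    for change in json.loads(changes_json):
        name = OPERATION_NAMES.get(change.get("Operation"))
        if name:
            counts[name] += 1
    return counts
```

Because the payload is just JSON, the same handler logic is easy to unit-test without any Azure dependency.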
Invoking an Azure Function from changes to an Azure SQL table is now possible through the Azure SQL trigger in all Azure Functions plans and supported languages. The trigger is compatible with Azure SQL Database, Azure SQL Managed Instance, and SQL Server. With input and output bindings for SQL already generally available, you can quickly write Azure Functions that read from and write to your databases. Together, the triggers and input/output bindings in the SQL extension for Azure Functions give you improved efficiency with low-code/no-code database interactions, and enable those looking to migrate their applications to Azure to participate in modern architectures. Learn more.

Bind to Blob Storage types from the Azure SDK for Python

Azure Functions triggers and bindings enable you to easily integrate event and data sources with function applications. This feature enables you to use types from service SDKs and frameworks, providing more capability beyond what is currently offered. Specifically, SDK type bindings for Azure Blob Storage enable the following key scenarios:

- Downloading and uploading blobs of large sizes, reducing current memory limitations and gRPC limits
- Improved performance when using blobs with Azure Functions

To learn more, see SDK type bindings for Azure Blob Storage in Python.

Azure Functions support for HTTP streams in Python

Azure Functions support for HTTP streams in Python is now generally available. With this feature, customers can stream HTTP requests to and responses from their function apps, using the FastAPI request and response APIs exposed by the framework. Previously, the amount of data that could be transmitted in an HTTP request was limited by the SKU instance memory size.
With HTTP streaming, large amounts of data can be processed with chunking. This feature enables new scenarios, including processing large data, streaming OpenAI responses, and delivering dynamic content. You can leverage this feature for use cases where real-time exchange and interaction between client and server over HTTP connections is needed. Additionally, FastAPI response types are supported with this feature. To learn more, see HTTP streams in Azure Functions using Python.

Public preview announcements

Bind to types in Azure Functions for Azure Service Bus, Azure Cosmos DB and Azure Event Hubs in Python

Azure Functions triggers and bindings enable you to easily integrate event and data sources with function applications. This feature enables you to use types from service SDKs and frameworks, providing more capability beyond what is currently offered.

Azure Service Bus: You can now interact with the ServiceBusReceivedMessage type from the SDK, offering more advanced functionality compared to the previous ServiceBusMessage type. To learn more, see SDK type bindings for Service Bus in Python.

Azure Cosmos DB: SDK type bindings for Azure Cosmos DB let you interact with Cosmos DB resources seamlessly (databases, containers, and documents), reducing current memory limitations and gRPC limits, with improved performance when using Cosmos DB with Azure Functions. To learn more, see SDK type bindings for Cosmos DB in Python.

Azure Event Hubs: Azure Event Hubs SDK type bindings enable you to use types from service SDKs and frameworks, providing more capability beyond what is currently offered. To learn more, see SDK type bindings for Event Hubs in Python.

SDK type bindings in Azure Functions for Azure Blob Storage in Java

SDK type bindings for Azure Blob Storage enable the following key scenarios:

- Downloading and uploading blobs of large sizes, reducing current memory limitations and gRPC limits
- Enabling advanced operations, including partial reads, parallel uploads, and direct property manipulation
- Improved performance when using blobs with Azure Functions

To learn more, see SDK type bindings for Azure Blob Storage in Java.

Language updates

General availability announcements

Python 3.12

You can now develop functions using Python 3.12 locally and deploy them to all Azure Functions plans. Python 3.12 builds on the performance enhancements first released with Python 3.11 and adds several performance and language readability features in the interpreter. You can now take advantage of these new features and enhancements when creating serverless applications on Azure Functions. Learn more.

Azure Functions support for Java 21 LTS

Azure Functions support for Java 21 is now generally available. You can now develop apps using Java 21 locally and deploy them to all Azure Functions plans on Linux and Windows. For more info, see Updating your app to Java 21, Learn more about Java 21, and More information about Azure Functions supported languages.

Durable Functions

General availability announcements

Durable Functions v3 in Azure Functions

The Durable Functions extension v3 in Azure Functions is now generally available. Major improvements in this new major version include improved cost efficiency for usage of Azure Storage v2 accounts and an upgrade to the latest Azure Storage SDKs, as well as the .NET Framework used by the extension. For more info: https://learn.microsoft.com/azure/azure-functions/durable/durable-functions-versions

Public preview announcements

Durable task scheduler

The durable task scheduler is a new storage provider for Durable Functions. It is designed to address the challenges and gaps identified by our customers with existing bring-your-own-storage options.
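The orchestration model Durable Functions provides (deterministic, generator-style workflows with automatic retries, with state persisted by a backend such as the durable task scheduler) can be pictured with a self-contained sketch. This is plain Python for illustration only; real orchestrators use the durable-functions SDK and yield activity calls through its context object:

```python
# Toy orchestration: the generator yields (activity_name, args) requests and
# a driver runs each activity with automatic retries. This mirrors only the
# control-flow shape; Durable Functions adds replay-safe persisted state.

def order_workflow():
    payment_ref = yield ("charge_card", {"amount": 42})
    yield ("send_receipt", {"ref": payment_ref})

def run(orchestrator, activities, max_attempts=3):
    """Drive an orchestrator generator, retrying each activity on failure."""
    gen = orchestrator()
    result = None
    log = []  # (activity, attempts-used) pairs, for inspection
    try:
        while True:
            name, args = gen.send(result)
            for attempt in range(1, max_attempts + 1):
                try:
                    result = activities[name](**args)
                    log.append((name, attempt))
                    break
                except Exception:
                    if attempt == max_attempts:
                        raise
    except StopIteration:
        return log
```

A real orchestrator also gets durable timers, human-approval patterns, and crash recovery via replay for free; this sketch only shows the sequencing-and-retry shape the runtime manages for you.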
Over the past few months, since the initial limited early access launch of the durable task scheduler, we've been working closely with our customers to understand their requirements and ensure they are fully supported in using the durable task scheduler successfully. We've also strengthened the fundamentals by:

- Expanding regional availability
- Finalizing APIs
- Ensuring high reliability, scalability, and built-in security
- Adding support for all Durable Functions programming languages

This is the preferred managed backend solution for customers who require high performance, enhanced monitoring of stateful orchestrations, or find managing bring-your-own storage accounts too cumbersome. It is the ideal choice for stateful functions (Durable Functions) in Azure Functions. Learn more.

GitHub Copilot for Azure to develop your functions in VS Code

With the GitHub Copilot for Azure extension, you can now generate complete Azure Functions code just by describing what you want—directly in VS Code. Using GitHub Copilot in agent mode, simply prompt it with your desired app logic, and it writes Functions code that follows Azure Functions best practices automatically. This means you don't have to start from scratch—GitHub Copilot ensures your functions use the latest programming models, event-driven triggers, secure auth defaults, and recommended development patterns, so you can build scalable, production-ready apps faster and with confidence. Coming soon: Azure Functions deployment and infrastructure best practices, including Bicep generation, to streamline your entire Functions development lifecycle. Install GitHub Copilot for Azure to try it out today.

Managed identity support during application creation

Azure Functions continues to invest in best practices to ensure customers can provide built-in security for their applications. Support for using managed identity is available for working with the required storage account used by Azure Functions as well as supported extensions.
You can now configure managed identity directly in the portal when creating a function app to reduce the need for secrets. You can learn more about security in Azure Functions in the documentation.

OpenTelemetry Support in Azure Functions Public Preview

We're excited to announce significant improvements to OpenTelemetry support in Azure Functions, expanding on the limited preview announced last year. These enhancements deliver better observability, improved performance, and more detailed insights into your function executions, helping you diagnose issues faster and optimize your applications more effectively. These updates make it easier to monitor and troubleshoot your serverless apps with clearer, more relevant insights. To get started, enable OpenTelemetry in your function app and check out the latest documentation.

We would love to hear feedback on these new capabilities and your overall experience with Functions so we can make sure we meet all your needs. You can click on the "Send us your feedback" button from the overview page of your function app. Thanks for all your feedback from the Azure Functions Team.
New Networking Capabilities in Azure Container Apps

Azure Container Apps is your go-to fully managed serverless container service that enables you to deploy and run containerized applications with per-second billing and autoscaling without having to manage infrastructure. Today, Azure Container Apps is thrilled to announce several new enterprise capabilities that will take the flexibility, security, and manageability of your containerized applications to the next level. These capabilities include premium ingress, rule-based routing, private endpoints, Azure Arc integration, and planned maintenance. Let's dive into the advanced networking features that Azure Container Apps has introduced.

Public Preview: Premium Ingress in Azure Container Apps

Azure Container Apps now supports premium ingress in public preview. This feature brings environment-level ingress configuration options, with the highlight being customizable ingress scaling. This capability supports the scaling of the ingress proxy, allowing you to better handle higher-demand workloads, such as large performance tests. By configuring your ingress proxy to run on workload profiles, you can scale out more ingress instances to manage the load. Keep in mind, running the ingress proxy on a workload profile will incur associated costs. But wait, there's more! This release also includes other ingress-related settings to boost your application's flexibility, such as termination grace period, idle request timeout, and header count. To learn more, please visit https://aka.ms/aca/ingress-config.

Public Preview: Rule-Based Routing in Azure Container Apps

Next up, we have rule-based routing, now in public preview. This feature is all about giving you greater flexibility and composability for your Azure Container Apps. It simplifies your architecture for microservice applications, A/B testing, blue-green deployments, and more.
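Conceptually, the environment's ingress matches each incoming request's host and path against an ordered rule list and forwards it to the first matching app. A sketch of the idea (the rule shape here is invented for illustration and is not the feature's actual configuration schema):

```python
# First-match routing: the first rule whose host equals the request host and
# whose path prefix matches wins. Rule format is illustrative only.

RULES = [
    {"host": "shop.contoso.com", "path_prefix": "/api", "target": "api-app"},
    {"host": "shop.contoso.com", "path_prefix": "/", "target": "web-app"},
]

def route(host, path, rules=RULES):
    for rule in rules:
        if host == rule["host"] and path.startswith(rule["path_prefix"]):
            return rule["target"]
    return None  # unmatched requests get no target
```

Rule order matters: the broad `/` rule is listed last so the more specific `/api` rule can win first.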
With rule-based routing, you can direct incoming HTTP traffic to different apps within your Container Apps environment based on the requested host name or path. This includes support for custom domains! No need to set up a separate reverse proxy like NGINX anymore. Just provide routing rules for your environment, and incoming traffic will automatically be routed to the specified target apps. To learn more, please visit https://aka.ms/aca/rule-based-routing.

Generally Available: Private Endpoints in Azure Container Apps

We're also excited to announce that private endpoints are now generally available for workload profile environments in Azure Container Apps. This means you can connect to your Container Apps environment using a private IP address in your Azure Virtual Network, eliminating exposure to the public internet and securing access to your applications. Plus, you can connect directly from Azure Front Door to your workload profile environments over a private link instead of the public internet. Today, you can enable Private Link to the container apps origin for Azure Front Door through the Azure CLI and Azure portal. TCP support is now available too! This feature is supported for both Consumption and Dedicated plans in workload profile environments. Whether you have new or existing environments, you can leverage this capability without needing to re-provision your environment. Additionally, this capability introduces the public network access setting, allowing you to configure Azure networking policies. GA pricing will go into effect on July 1, 2025. To learn more, please visit https://aka.ms/aca/private-endpoints.

What else is going on with Azure Container Apps at Build 2025?

There's a lot happening at Build 2025! Azure Container Apps has numerous sessions and other features being launched. For a complete overview, check out our what's-new blog: https://aka.ms/aca/whats-new-blog-build-2025.
For feedback, feature requests, or questions about Azure Container Apps, visit our GitHub page. We look forward to hearing from you!

New Observability & Debugging Capabilities for Azure Container Apps
Azure Container Apps gives you a strong foundation for monitoring and debugging, with built-in features that provide a holistic view of your container app’s health throughout its application lifecycle. As applications grow, developers need even deeper visibility and faster ways to troubleshoot issues. That’s why we’re excited to announce new observability and debugging features that help you further monitor your environment and identify root causes faster.

Generally Available: OpenTelemetry agent in Azure Container Apps

The OpenTelemetry agent in Azure Container Apps is now generally available. This feature enables you to use open-source standards to send your app’s data without setting up the OpenTelemetry collector yourself. You can use the managed agent to choose where to send logs, metrics, and traces. Once enabled, the agent runs in your Container Apps environment and automatically collects and exports telemetry data. You can send data to Azure Monitor Application Insights (logs, traces), Datadog (metrics, logs, traces), or any generic OTLP-configured endpoint (logs, metrics, traces). You can configure and manage the agent today using the Azure portal, ARM templates, Bicep, Terraform, or the Azure CLI. To learn more, visit https://aka.ms/aca/otel.

Generally Available: Aspire Dashboard in Azure Container Apps

The .NET Aspire dashboard in Azure Container Apps is now generally available. Access live data about your project and containers in the cloud to evaluate the performance of your applications and debug errors with comprehensive logs, metrics, traces, and more. In addition, we now support the newest version of the Aspire dashboard (9.2) in Azure Container Apps! Rollout has begun, with global availability expected in the coming weeks. This update includes a new way to visualize your app’s resources, the ability to pause/resume telemetry, and more. Check out the Aspire 9.2 release notes for details.
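The collect-once, export-anywhere pattern behind the managed OpenTelemetry agent can be illustrated with a small sketch. The class and destination names below are hypothetical stand-ins for illustration, not the agent’s actual implementation; in practice the destinations are set in your environment’s configuration, and your app only emits telemetry:

```python
# Toy sketch of the managed-agent pattern: the app emits each signal once,
# and the agent fans it out to every configured destination for that signal.
# Destination names are illustrative; the real agent is configured on the
# Container Apps environment, not in application code.

class TelemetryAgent:
    def __init__(self):
        # One destination list per signal type the agent handles.
        self.destinations = {"logs": [], "metrics": [], "traces": []}

    def add_destination(self, signal, exporter):
        self.destinations[signal].append(exporter)

    def export(self, signal, record):
        # Fan out: every configured destination receives the same record.
        for exporter in self.destinations[signal]:
            exporter(record)

shipped = []
agent = TelemetryAgent()
# e.g. traces sent to both Application Insights and a generic OTLP endpoint
agent.add_destination("traces", lambda r: shipped.append(("app-insights", r)))
agent.add_destination("traces", lambda r: shipped.append(("otlp", r)))

agent.export("traces", {"span": "GET /orders", "duration_ms": 12})
print(shipped)
```

The design point this illustrates: because the fan-out lives in the agent, switching or adding destinations is a configuration change, with no code change in the app itself.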
To learn more about the Aspire dashboard on Azure Container Apps, visit https://aka.ms/aca/dashboard.

Azure SRE Agent

We are excited to be part of the public preview of the Azure SRE Agent! The Azure SRE Agent is an AI-powered tool that helps cloud developers reduce the cost of operations while improving uptime. It can respond to production alerts, autonomously mitigate issues, and determine root cause analysis (RCA) with minimal developer and operator intervention. Azure SRE Agent continuously monitors application health and performance for production applications on Azure, to build context and provide insights for faster troubleshooting. By working with SWE agent capabilities in GitHub Copilot, the SRE agent can proactively identify application issues and rapidly drive to resolution with development teams and agent services. The public preview is expected to roll out in CY25 H1. To learn more about Azure SRE Agent, visit https://aka.ms/Build25/blog/SREAgent.

New Diagnose and Solve Dashboard for Container App Environments

Enhance your troubleshooting experience with new dashboards and detectors on the Diagnose and Solve blade for Azure Container Apps. The new Container App Environment Health dashboard provides a comprehensive overview of your apps, jobs, replicas, node count, and CPU usage over time. With these insights, you can effectively manage the health, performance, and resource utilization of your apps and jobs. The dashboard is available on your Container App Environment’s Diagnose & Solve blade, under the “Container App Environment Health” troubleshooting category. The new Container App detectors enable you to diagnose and resolve issues such as container create failures, container exit events, health probe failures, image pull failures, KEDA scaler failures, storage mount failures, incorrect target port settings, and more.
The new detectors are available on your Container App Diagnose & Solve blade, under the “Availability and Performance” troubleshooting category. To learn more, look through the troubleshooting categories in Microsoft Learn at https://learn.microsoft.com/azure/container-apps/troubleshoot-container-create-failures.

New Container App Queries in Log Analytics

We’ve added a new set of Kusto queries for Azure Container App users, now available in the Query Hub. These queries can help you identify common failure patterns and issues across your apps. Examples include most frequent error messages, failed revision provisions, insufficient quota, and more. You can see these queries by navigating to the Logs blade and filtering the queries by resource type: Container Apps. To learn more, visit the log monitoring document in Microsoft Learn.

What else is going on with Azure Container Apps at Build 2025?

There’s a lot happening at Build 2025! Azure Container Apps has numerous sessions and other features being launched. For a complete overview, check out our what’s new blog at https://aka.ms/aca/whats-new-blog-build-2025.

Wrap Up

Your feedback is incredibly important to us, so after exploring the new observability features, let us know your thoughts! If there’s a feature that you’d love to see next, we encourage you to up-vote it. To keep an eye on what’s coming up, don’t forget to check out our roadmap. We’re also posting monthly updates on our GitHub! You can see the latest announcement and past announcements here.

Powering the Next Generation of AI Apps and Agents on the Azure Application Platform
Generative AI is already transforming how businesses operate, with organizations seeing an average return of 3.7x for every $1 of investment [The Business Opportunity of AI, IDC study commissioned by Microsoft]. Developers sit at the center of this transformation, and their need for speed, flexibility, and familiarity with existing tools is driving the demand for application platforms that integrate AI seamlessly into their current development workflows. To fully realize the potential of generative AI in applications, organizations must provide developers with frictionless access to AI models, frameworks, and environments that enable them to scale AI applications. We see this in action at organizations like Accenture, Assembly Software, Carvana, Coldplay (Pixel Lab), Global Travel Collection, Fujitsu, healow, Heineken, Indiana Pacers, NFL Combine, Office Depot, Terra Mater Studios (Red Bull), and Writesonic.

Today, we’re excited to announce new innovations across the Azure Application Platform to meet developers where they are and help enterprises accelerate their AI transformation. The Azure App Platform offers managed Kubernetes (Azure Kubernetes Service), serverless (Azure Container Apps and Azure Functions), PaaS (Azure App Service), and integration (Azure Logic Apps and API Management). Whether you’re modernizing existing applications or creating new AI apps and agents, Azure provides a developer-centric App Platform, seamlessly integrated with Visual Studio, GitHub, and Azure AI Foundry, and backed by a broad portfolio of fully managed databases, from Azure Cosmos DB to Azure Database for PostgreSQL and Azure SQL Database.

Innovate faster with AI apps and agents

In today’s fast-evolving AI landscape, the key to staying competitive is being able to move from AI experimentation to production quickly and easily.
Whether you’re deploying open-source AI models or integrating with any of the 1900+ models in Azure AI Foundry, the Azure App Platform provides a streamlined path for building and scaling AI apps and agents.

Kubernetes AI Toolchain Operator (KAITO) for AKS add-on (GA) and Azure Arc extension (preview) simplifies deploying, testing, and operating open-source and custom models on Kubernetes. Automated GPU provisioning, pre-configured settings, workspace customization, real-time deployment tracking, and built-in testing interfaces significantly reduce infrastructure overhead and accelerate AI development. Visual Studio Code integration enables developers to quickly prototype, deploy, and manage models. Learn more.

Serverless GPU integration with AI Foundry Models (preview) offers a new deployment target for easy AI model inferencing. Azure Container Apps serverless GPU offers unparalleled flexibility to run any supported model. It features automatic scaling, pay-per-second pricing, robust data governance, and built-in enterprise networking and security support, making it an ideal solution for scalable and secure AI deployments. Learn more.

Azure Functions integration with AI Foundry Agent Service (GA) enables you to create real-time, event-driven workflows for AI agents without managing infrastructure. This integration enables agents to securely invoke Azure Functions to execute business logic, access systems, or process data on demand. It unlocks scalable, cost-efficient automation for intelligent applications that respond dynamically to user input or events. Learn more.

Azure Functions enriches Azure OpenAI extension (preview) to automate embeddings for real-time RAG, semantic search, and function calling with built-in support for AI Search, Azure Cosmos DB for MongoDB, and Azure Data Explorer vector stores. Learn more.
Azure Functions MCP extension adds support for instructions and monitoring (preview), making it easier to build and operate remote MCP servers at cloud scale. With this update, developers can deliver richer AI interactions by providing capabilities and context to large language models directly from Azure Functions. This enables AI agents to both call functions and respond intelligently with no separate orchestration layer required. Learn more.

Harnessing AI to drive intelligent business processes

As AI continues to grow in adoption, its ability to automate complex business process workflows becomes increasingly valuable. Azure Logic Apps empowers organizations to build, orchestrate, and monitor intelligent, agent-driven workflows.

Logic Apps agent loop orchestrates agentic business processes (preview) with goal-based automation using AI-powered reasoning engines such as OpenAI’s GPT-4o or GPT-4.1. Instead of building fixed flows, users can define the desired outcomes, and the Agent loop action in Logic Apps figures out the steps dynamically. With 1400+ out-of-the-box connectors to various enterprise systems and SaaS applications, and full observability, Logic Apps enables you to rapidly deliver on all business process needs with agentic automation. Learn more.

Enable intelligent data pipelines for RAG using Logic Apps (preview) with new native integrations with Azure Cosmos DB and Azure AI Search. Teams can ingest content into vector stores and databases through low-code templates. No custom code required. This enables AI agents to ground responses in proprietary data, improving relevance and accuracy for real business outcomes. Learn more.

Empower AI agents to act with Logic Apps in AI Foundry (preview) across enterprise systems using low-code automation. Prebuilt connectors and templates simplify integration with Microsoft and third-party services, from databases to SaaS apps.
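The goal-based pattern behind agent loop (define the desired outcome, let a reasoning engine pick the next step) can be sketched as a simple loop. Here a canned rule-based function stands in for a model such as GPT-4o, and the connector and field names are hypothetical; nothing below is Logic Apps’ actual implementation:

```python
# Toy sketch of a goal-based agent loop: a "reasoner" repeatedly chooses
# the next action from the available connectors until the goal is met.
# The reasoner here is canned logic standing in for an LLM; the connector
# names and data shapes are invented for illustration.

def reasoner(goal, state):
    """Decide the next (action, argument) toward the goal, or None if done."""
    if "order" not in state:
        return ("lookup_order", goal["order_id"])
    if not state.get("notified"):
        return ("send_email", state["order"]["customer"])
    return None  # goal satisfied

CONNECTORS = {
    # Each connector returns a (state_key, value) pair to merge into state.
    "lookup_order": lambda oid: ("order", {"id": oid, "customer": "ada@example.com"}),
    "send_email": lambda addr: ("notified", True),
}

def agent_loop(goal):
    state, steps = {}, []
    while (decision := reasoner(goal, state)) is not None:
        action, arg = decision
        key, value = CONNECTORS[action](arg)
        state[key] = value
        steps.append(action)
    return steps, state

steps, state = agent_loop({"order_id": "A-42"})
print(steps)  # ['lookup_order', 'send_email']
```

The contrast with a fixed flow is the point: the sequence of steps is not authored anywhere; it emerges from the reasoner inspecting the current state against the goal on every iteration.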
This gives developers and business users a faster way to orchestrate intelligent actions, automate complex workflows, and operationalize AI across the organization. Learn more.

Scale AI innovation across your enterprise

As AI adoption grows, so does the need for visibility and control over how models are accessed and utilized. Azure API Management helps you achieve this with advanced tools that ensure governance, security, and efficient management of your AI APIs.

Expanded AI Gateway capabilities in Azure API Management (GA) give organizations deeper control, observability, and governance for generative AI workloads. Key additions include LLM Logging for prompts, completions, and token usage insights; session-aware load balancing to maintain context in multi-turn chats; robust guardrails through integration with the Azure AI Content Safety service; and direct onboarding of models from Azure AI Foundry. Customers can also now apply GenAI-specific policies to AWS Bedrock model endpoints, enabling unified governance across multi-cloud environments. Learn more.

Azure API Management support for Model Context Protocol (preview) makes it easy to expose existing APIs as secure, agent-compatible endpoints. You can apply gateway policies such as authentication, rate limiting, caching, and authorization to protect MCP servers. This ensures consistent, centralized policy enforcement across all your MCP-enabled APIs. With minimal effort, you can transform APIs into AI-ready services that integrate seamlessly with autonomous agents. Learn more.

Azure API Center introduces private MCP registry and streamlined discovery (preview), giving organizations full control over which services are discoverable. Role-Based Access Control (RBAC) allows teams to manage who can find, use, and update MCP servers based on organizational roles. Developers can now discover and consume MCP-enabled APIs directly through the API Center portal.
These updates improve governance and simplify the developer experience for AI agent development. Learn more.

Simplify operations for AI apps and agents in production

Moving AI applications from proof-of-concept to production requires an environment that scales securely, cost-effectively, and reliably. The Azure App Platform continues to evolve with enhancements that remove operational friction, so you can deploy your AI apps and agents and scale with confidence.

App Service Premium v4 Plan (preview) delivers up to 25% better performance and up to 24% cost savings over the previous generation—ideal for scalable, secure web apps. App Service Premium v4 helps modernize both Windows and Linux applications with better performance, security, and DevOps integration. It now offers a more cost-effective solution for customers seeking a fully managed PaaS, reducing infrastructure overhead while supporting today’s demanding AI applications. Learn more.

AKS security dashboard (GA) provides unified visibility and automated remediation powered by Microsoft Defender for Containers—helping operations stay ahead of threats and compliance needs without leaving the Azure portal. Learn more.

AKS Long-Term Support (GA) introduces 2-year support for all versions of Kubernetes after 1.27, in addition to the standard community-supported versions. This extended support model enables teams to reduce upgrade frequency and complexity, ensure platform stability, and provide greater operational flexibility. Learn more.

Dynamic service recommendations for AKS (preview) streamlines the process of selecting and connecting services to your Azure Kubernetes Service cluster by offering tailored Azure service recommendations directly in the Azure portal. It uses in-portal intelligence to suggest the right services based on your usage patterns, making it easier to choose what’s best for your workloads. Learn more.
Azure Functions Flex Consumption adds support for availability zones and smaller instance sizes (preview) to improve reliability and resiliency for critical workloads. The new 512 MB memory option helps customers fine-tune resource usage and reduce costs for lightweight functions. These updates are available in Australia East, East Asia, Sweden Central, and UK South, and can be enabled on both new and existing Flex Consumption apps. Learn more.

Join us at Microsoft Build, May 19-22

The future of AI applications is here, and it’s powered by Azure. From APIs to automation, from web apps to Kubernetes, and from cloud to edge, we’re building the foundation for the next era of intelligent software. Whether you’re modernizing existing systems or pioneering the next big thing in AI, Azure gives you the tools, performance, and governance to build boldly. Our platform innovations are designed to simplify your path, remove operational friction, and help you scale with confidence. Explore the various breakout, demo, and lab sessions at Microsoft Build, May 19-22, to dive deeper into these Azure App Platform innovations. We can’t wait to see what you will build next!

Unlocking new AI workloads in Azure Container Apps
Announcing new features to support AI workloads, including improved integrations for deploying Foundry models to Azure Container Apps, the general availability of Dedicated GPUs, and the private preview of GPU-powered dynamic sessions.

Announcing Workflow in Azure Container Apps with the Durable task scheduler – Now in Preview!
We are thrilled to announce durable workflow capabilities in Azure Container Apps with the Durable task scheduler (preview). This new feature brings powerful workflow capabilities to Azure Container Apps, enabling developers to build and manage complex, durable workflows as code with ease.

What is Workflow and the Durable task scheduler?

If you’ve missed the initial announcement of the Durable task scheduler, please see these existing blog posts: https://aka.ms/dts-early-access and https://aka.ms/dts-public-preview

In summary, the Durable task scheduler is a fully managed backend for durable execution. Durable execution is a fault-tolerant approach to running code, designed to handle failures gracefully through automatic retries and state persistence. It is built on three core principles:

- Incremental Execution: Each operation is executed independently and in order.
- State Persistence: The output of each step is saved to ensure progress is not lost.
- Fault Tolerance: If a step fails, the operation is retried from the last successful step, skipping previously completed steps.

Durable execution is especially advantageous for scenarios that require stateful chaining of operations, commonly known as workflows or orchestrations. A few scenarios include:

- Transactions
- Order processing workflows
- Infrastructure management
- Deployment pipelines
- AI/ML and data engineering
- Data processing pipelines and ETL
- Intelligent applications with AI agent orchestrations

Workflow in Azure Container Apps

The Durable task scheduler features a managed workflow engine responsible for scheduling workflow execution and persisting workflow state. Additionally, it includes an out-of-the-box monitoring and management dashboard, making it easy for developers to debug and manage workflows on demand. You can author your workflows as code using the Durable Task SDKs, which currently support .NET, Python, and Java. Support for JavaScript and Go is on the roadmap.
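The three principles above can be sketched in plain Python: execute steps in order, persist each step’s output, and on retry replay persisted results instead of re-executing completed work. This is a toy illustration of durable execution, not the Durable Task SDK:

```python
# Toy illustration of durable execution: each completed step's output is
# persisted in a store, so a retried run skips completed steps and resumes
# from the point of failure. A sketch of the principle, not the SDK.

def run_durably(steps, store, fail_at=None):
    """Run `steps` in order, persisting each result in `store`.
    Returns the indices of steps actually executed on this attempt."""
    executed = []
    for i, step in enumerate(steps):
        if i in store:            # state persistence: skip completed work
            continue
        if i == fail_at:          # simulate a crash mid-workflow
            raise RuntimeError(f"crashed at step {i}")
        store[i] = step()         # incremental execution
        executed.append(i)
    return executed

steps = [lambda: "reserve-inventory", lambda: "charge-card", lambda: "ship"]
store = {}

try:
    run_durably(steps, store, fail_at=2)   # first attempt fails at step 2
except RuntimeError:
    pass

# Fault tolerance: the retry resumes from the last successful step,
# so only step 2 actually runs on the second attempt.
print(run_durably(steps, store))  # [2]
```

In the real service, the persistence and replay shown here are handled by the managed workflow engine, which is what lets your orchestration code survive process restarts and infrastructure failures.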
The Durable Task SDKs are lightweight, unopinionated, and designed to be portable across compute environments. To get started with the Durable Task Scheduler on Azure Container Apps:

1. Import the Durable Task SDK for your preferred language and author your workflows.
2. Provision a Durable Task Scheduler resource in your Azure environment.
3. Connect your application to the Durable Task Scheduler backend for workflow orchestration and state persistence.

Note: The Durable task scheduler is also available with Durable Functions that are deployed to Azure Container Apps. For more information on choosing the right workflow framework, please see this document.

Key Benefits of using the Durable task scheduler for workflow task execution:

- Azure Managed: The Durable Task Scheduler provides dedicated resources that are fully managed by Azure. Orchestration and entity state management are completely built in.
- High Performance: The Durable Task Scheduler offers superior performance, efficiently managing high orchestration and task scheduling.
- Scalability: Manage sudden bursts of events with the ability to auto-scale your container app replicas using a built-in scaler, ensuring reliable and efficient processing of orchestration work items across your container app workers.
- Simplified Monitoring: With the built-in monitoring dashboard, developers can easily track the progress of their workflows, view activity durations, and manage workflow instances.
- Ease of Use: Author workflows as code using the Durable Task SDKs or Azure Durable Functions and connect directly to the Durable Task Scheduler backend.
- Security Best Practices: Uses identity-based authentication with Role-Based Access Control (RBAC) for enterprise-grade authorization, eliminating the need for SAS tokens or access keys.
- Versioning: Version workflows to support iterative changes without compatibility issues, enabling zero-downtime deployments. (Currently available in the .NET SDK; support for other SDKs is coming soon.)
- Scheduling: Trigger workflows on a recurring interval, ideal for time-based automation. (Currently available in the .NET SDK; support for other SDKs is coming soon.)
- Disaster Recovery: Ensure workflows can recover gracefully from disasters, such as outages. (Coming soon.)

Get Started Today

For more on the workflow capabilities using the Durable Task Scheduler in Azure Container Apps, see the official documentation here. To get started with workflow in Azure Container Apps, visit the quickstarts here. For more Azure Container Apps updates at Build 2025, refer to this blog: https://aka.ms/aca/whats-new-blog-build-2025

What's new in Azure Container Apps at Build'25
Azure Container Apps is a fully managed serverless container service that runs microservices and containerized applications on Azure. It provides built-in autoscaling, including scale to zero, and offers a simplified developer experience with support for multiple programming languages and frameworks, including special features built for .NET and Java. Container Apps also provides many advanced networking and monitoring capabilities, offering seamless deployment and management of containerized applications without the need to manage underlying infrastructure.

Following the features announced at Ignite’24, we've continued to innovate and enhance Azure Container Apps. We announced the general availability of Serverless GPUs, enabling seamless AI workloads with automatic scaling, optimized cold start, per-second billing, and reduced operational overhead. We added a preview of JavaScript code interpreter support for Dynamic Sessions for applications that require the execution of potentially malicious JavaScript code, such as code provided by end users. Furthermore, we partnered with Aqua Security to enhance the security of Azure Container Apps, offering comprehensive image scanning, runtime protection, and supply chain security. These advancements ensure that Azure Container Apps remains a trusted platform for running scalable, secure, and resilient containerized applications. The features we're announcing at Build’25, including new Serverless GPU integrations and many new networking and observability capabilities that enterprises care about, further deepen this commitment.

Running AI workloads on Azure Container Apps

Azure Container Apps efficiently supports AI workloads with features like serverless GPUs with NVIDIA NIM integration, dynamic sessions with Hyper-V isolation, and integrations for enhanced performance and scalability. We are furthering this feature set by announcing new capabilities and integrations for Azure Container Apps.
Deploy Foundry Models on Serverless GPUs for Inferencing

Azure Container Apps now provides an integration with Foundry Models, which allows you to deploy ready-to-use AI models directly during container app creation. This integration supports serverless APIs with pay-as-you-go billing and managed compute with pay-per-GPU pricing, providing flexibility in deploying Foundry models.

Announcing General Availability of Dedicated GPUs

Dedicated GPUs in Azure Container Apps are now generally available and simplify AI application development and deployment by reducing management overhead. They offer built-in support for key components like the latest CUDA driver, turnkey networking, and security features, allowing you to focus on your AI application code.

Early access to Serverless GPU in Dynamic Sessions

Serverless GPU in Azure Container Apps Dynamic Sessions is now available as an early access feature, enabling you to run untrusted AI-generated code at scale within compute sandboxes protected by Hyper-V isolation. This feature supports a GPU-powered Python code interpreter to better handle AI workloads. Microsoft Dev Box offers an integration with Serverless GPU in Dynamic Sessions through the Dev Box GPU Shell feature.

Advanced Networking capabilities

Azure Container Apps offers many networking capabilities including custom VNet integration, private endpoints, user-defined routes, NAT Gateway support, and peer-to-peer encryption. We are extending these capabilities by offering new controls and features to support more nuanced network architectures.

Announcing General Availability of Private Endpoints

Private Endpoints for Azure Container Apps, now generally available, allow customers to connect to their Container Apps environment using a private IP address in their Azure Virtual Network. This eliminates exposure to the public internet and secures access to their applications.
Additionally, customers can connect directly from Azure Front Door to their workload profile environments over a private link instead of the public internet.

New Premium Ingress capabilities

The new premium ingress feature in Azure Container Apps allows for customizable ingress scaling, enabling better handling of higher demand workloads like large performance tests. It introduces environment-level ingress configuration options, including termination grace period, idle request timeout, and header count.

Announcing Public Preview of rule-based routing

We are adding a rule-based routing feature in Azure Container Apps that allows you to direct incoming HTTP traffic to different apps within your Container Apps environment based on the requested host name or path. This simplifies your architecture for microservice applications, A/B testing, blue-green deployments, and more, without needing a separate reverse proxy.

Observability and debugging capabilities

Azure Container Apps provides several built-in observability features that give you a holistic view of your container app’s health throughout its application lifecycle, and help you monitor and diagnose the state of your app to improve performance and respond to trends and critical problems. We are extending these existing capabilities by introducing new observability and debugging features.

Announcing General Availability of OpenTelemetry Collector

The OpenTelemetry agent in Azure Container Apps is now generally available, allowing developers to use open-source standards to send app data without setting up the collector themselves. The managed agent collects and exports telemetry data to various endpoints, including Azure Monitor Application Insights, Datadog, and any generic OTLP-configured endpoint.
Announcing General Availability of Aspire dashboard

The .NET Aspire dashboard in Azure Container Apps is now generally available, providing live data about your project and containers in the cloud to evaluate performance and debug errors with comprehensive logs, metrics, and traces. In addition, we now support the newest version of Aspire (9.2), which includes new visualization features and the ability to pause/resume telemetry, and will be globally available in the coming weeks.

New Diagnose and Solve dashboard

The new Diagnose and Solve dashboard for Azure Container Apps provides a comprehensive overview of app health, performance, and resource utilization, with insights into apps, jobs, replicas, node count, and CPU usage over time. It also includes new detectors to diagnose and resolve issues such as container create failures, health probe failures, and image pull failures.

Integration with Azure SRE agent

Azure Container Apps integrates seamlessly with the Azure SRE agent to enhance operational efficiency and application uptime. By continuously monitoring application health and performance, the SRE agent provides valuable insights and autonomously responds to production alerts, mitigating issues with minimal intervention. This integration allows developers to leverage the SRE agent to monitor Azure Container Apps resources, from Container App Environments to Apps to Revisions to Replicas, ensuring faster troubleshooting and proactive issue resolution.

Enhanced Enterprise capabilities

In addition to these announcements, we are introducing several enhanced Enterprise features to Azure Container Apps.

Announcing General Availability of Azure Container Apps on Arc-enabled Kubernetes

The ability to run Azure Container Apps on your own Azure Arc-enabled Kubernetes clusters (AKS and AKS-HCI) is now generally available.
This allows developers to leverage Azure Container Apps features while IT administrators maintain corporate compliance by hosting applications in hybrid environments.

Announcing General Availability of Planned Maintenance in Azure Container Apps

Planned Maintenance for Azure Container Apps is now generally available, allowing you to control when non-critical updates are applied to your environment. This helps minimize downtime and impact on applications. Critical updates are applied as needed to ensure security and reliability compliance.

Announcing Public Preview of workflow capabilities with Durable Task Scheduler

The new advanced pro-code workflow feature in Azure Container Apps, leveraging the durable task scheduler, is now in public preview. With the durable task scheduler in Container Apps, you can create reliable workflows as code, leveraging state persistence and fault-tolerant execution. These containerized workflows enhance scalability, reliability, and streamlined monitoring for administration of complex workflows.

Native Azure Functions in Azure Container Apps

The new, streamlined method for running Azure Functions natively in Azure Container Apps allows customers to leverage the full features and capabilities of Azure Container Apps while benefiting from the simplicity of auto-scaling provided by Azure Functions. With the new native hosting model, customers can deploy Azure Functions directly onto Azure Container Apps with the same experience as deploying other containerized applications. Customers also get the complete feature set of Azure Container Apps with this new deployment experience, including multi-revision management, easy authentication, metrics and alerting, health probes, and many more.
Azure Container Apps at Build’25 conference

Also, if you're at Build, come see us at the following sessions:

- Breakout 182: Better Microservices Development using Azure Container Apps
- Breakout 190: Secure Next-Gen AI Apps with Azure Container Apps Serverless GPUs
- Lab 341: Agentic AI Inferencing with Azure Container Apps
- Community Table Talk 457: App Reliability, Azure Container Apps, & Serverless GPUs
- Breakout 186: Earth’s Defense with Hera: AI Agents Battle Planet Extinction Threats
- Breakout 187: Event-Driven Architectures: Serverless Apps That Slay at Scale
- Breakout 201: Innovate, deploy, & optimize your apps without infrastructure hassles
- Breakout 117: Use VS Code to build AI apps and agents
- Breakout 185: Maximizing efficiency in cloud-native app design
- Demo 544: Building Resilient Cloud-Native Microservices

Or come talk to us at the Serverless booth at the Expert Meet-up area at the Hub!

Wrapping up

As always, we invite you to visit our GitHub page for feedback, feature requests, or questions about Azure Container Apps, where you can open a new issue or up-vote existing ones. If you’re curious about what we’re working on next, check out our roadmap. We look forward to hearing from you!

Diagnose Web App Issues Instantly—Just Drop a Screenshot into Conversational Diagnostics
It’s that time of year again—Microsoft Build 2025 is here! And in the spirit of pushing boundaries with AI, we’re thrilled to introduce a powerful new preview feature in Conversational Diagnostics.

📸 Diagnose with a Screenshot

No more struggling to describe a tricky issue or typing out long explanations. With this new capability, you can simply paste, upload, or drag a screenshot into the chat. Conversational Diagnostics will analyze the image, identify the context, and surface relevant diagnostics for your selected Azure Resource—all in seconds. Whether you're debugging a web app or triaging a customer issue, this feature helps you move from problem to insight faster than ever. Thank you!