Microservices
Build and Modernize Intelligent Java apps at Scale
Java on Microsoft Azure Java customers and developers are constantly exploring how they can bring their Java applications to the cloud. Some are looking to modernize existing applications, while others are building new cloud-native solutions from scratch. With these changes, they need a platform that lets them keep working the way they know, without sacrificing control or performance. That’s where Microsoft Azure comes in. As a company, Microsoft is committed to making Java developers as efficient and productive as possible, empowering them to use any tool, framework, and application server on any operating system. Microsoft Azure makes it easy to work with the tools and frameworks Java developers already know and love. Whether using IntelliJ, Eclipse, or VS Code, or managing dependencies with Maven or Gradle, developers can keep using their preferred setup. Azure supports trusted Java application servers and popular open-source tools like Spring Boot, JBoss EAP, and WebLogic, making the transition to the cloud seamless and natural. Scaling on Azure is designed with simplicity and security in mind. Developers can count on built-in tools for monitoring, automation, data support, and caching, along with robust security features. With Azure’s flexible services they can scale confidently, manage costs, and build resilient applications that meet business needs. Azure provides everything Java developers need to build and modernize their applications at scale, letting them do so on their own terms. Tooling for Java app migration and modernization priorities Moving your Java applications to the cloud is easier with the right tools. Azure offers a full set of solutions for every type of migration, whether you are rehosting, re-platforming, refactoring, or rearchitecting. These tools work together to help you transition smoothly, allowing you to work faster, more efficiently, and with greater insight. With Azure, you can achieve meaningful results for your business as you modernize your applications. Azure Migrate and Partner-built Solutions Azure Migrate is a key resource in this process. It provides a holistic view of your server and application estate and generates a cloud-readiness report. With app centricity, you can now assess applications at a portfolio level rather than server by server. This makes it easier for IT decision-makers to plan migrations on a larger scale while aligning with business priorities. In addition to Azure Migrate, you can leverage several partner-built solutions such as CAST, Unify, Dr. Migrate, and others to support additional use cases and scenarios. Azure Migrate application and code assessment For developers, Azure Migrate’s app and code assessment tool (AppCAT) offers in-depth code scanning for Java applications. With this tool, you can assess code changes needed to run your apps in the cloud right from within your preferred terminals, like Bash. GitHub Copilot Chat integration further simplifies the planning process, making it easy to explore modernization options through a conversational approach. AppCAT is especially useful for detailed assessments for refactoring and rearchitecting. GitHub Copilot upgrade assistant for Java A major advancement in this toolkit is the new GitHub Copilot upgrade assistant for Java. Upgrading Java code, runtimes, frameworks, and dependencies can be time-consuming, but with the upgrade assistant, you can streamline the process significantly. 
Start with your local Java project, receive an AI-powered upgrade strategy, and let Copilot handle the bulk of the work. This powerful tool helps you modernize faster, allowing you to focus on building intelligent applications at scale with confidence. Ready to save time upgrading Java? You can apply for the waitlist to the Technical Preview right here – aka.ms/GHCP-for-Java. This early access is open to a limited number of customers, so we encourage you to sign up soon and share your feedback! Deploy and Scale Java Apps on Azure The Java ecosystem is diverse, encompassing technologies like Java SE, Jakarta EE, Spring, and various application servers. Whatever your Java workload – whether building a monolithic app or a cloud-native microservice – Azure provides a comprehensive platform to support it. Azure offers multiple deployment paths to help meet your specific project goals. For those migrating existing Java applications, infrastructure-as-a-service (IaaS) options like Azure Virtual Machines allow you to lift and shift applications without significant re-architecture. Meanwhile, container options, such as Azure Kubernetes Service (AKS), Azure Container Apps and Azure Red Hat OpenShift, make it easier to manage Java applications in containers. Fully managed platform-as-a-service (PaaS) offerings, like Azure App Service, provide out-of-the-box scalability, DevOps integration, and automation for streamlined management. The following diagram shows recommended Azure services for every Java application type deployed as source or binaries: The following diagram shows the recommended Azure services for every Java application type deployed as containers: Building on Azure's reputation as a versatile platform for various applications, we now turn our focus to three specific offerings that demonstrate this flexibility. Today, we highlight JBoss EAP on Azure App Service, Java on Azure Container Apps, and WebSphere Liberty on Azure Kubernetes Service and how to quickly bring your apps to production with Landing Zone Accelerator. We will also walk you through how to build and modernize intelligent Java apps at scale with the latest AI tools and models. JBoss EAP on Azure App Service Azure App Service offers a fully managed platform with specific enhancements for Java, making it an excellent choice for running enterprise Java applications. Recently, several updates have been introduced to bring even greater value to Java developers using JBoss EAP on App Service: Reduced Licensing Costs: Licensing fees for JBoss EAP on App Service have been cut by over 60%, making it more accessible to a wider range of users. Free Tier Availability: A new free tier is available for those interested in testing the service without an upfront cost, providing an easy entry point for trials and evaluation. Affordable Paid Tiers: Lower-cost paid tiers of App Service Plan for JBoss EAP have been introduced, catering to businesses seeking a cost-effective, production-ready environment. Bring Your Own License Support: Soon, customers will be able to apply existing Red Hat volume licenses to further reduce operational costs, adding flexibility for organizations already invested in Red Hat JBoss EAP. These updates provide significant savings, making JBoss EAP on App Service a smart choice for those looking to optimize costs while running Java applications on a reliable, managed platform. 
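To make the JBoss EAP option concrete, here is a minimal sketch of the kind of standard Jakarta EE code that runs unchanged on JBoss EAP on App Service: a small JAX-RS resource packaged in a WAR. It is illustrative only and assumes JBoss EAP 8 / Jakarta EE namespaces (on EAP 7 the imports would be javax.ws.rs); the class names and path are made up, and both classes are shown in one listing purely for brevity.

```java
import jakarta.ws.rs.ApplicationPath;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.Application;
import jakarta.ws.rs.core.MediaType;

// Activates JAX-RS under /api with no web.xml needed.
@ApplicationPath("/api")
class RestActivator extends Application { }

// A minimal resource, reachable at /api/status once the WAR is deployed to JBoss EAP.
@Path("/status")
public class StatusResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public String status() {
        // A plain JSON string keeps the example dependency-free.
        return "{\"status\":\"ok\"}";
    }
}
```

Because nothing here is Azure-specific, the same WAR can move between an on-premises JBoss EAP installation and App Service without code changes, which is the point of the lift-and-shift path described above.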
Java on Azure Container Apps Azure Container Apps is a popular serverless platform for Java developers who want to deploy and scale containerized applications with minimal management overhead. Designed for microservices, APIs, and event-driven workloads, Azure Container Apps makes it simple to scale applications from zero up to millions of requests, adapting dynamically to meet real-time demand. Azure Container Apps includes several features tailored specifically for Java: Managed Components for Java: With built-in Spring Cloud services like Service Registry and Config Server, managing Java applications is straightforward. These components simplify service registration, discovery, and configuration management. Enhanced Java Monitoring: Azure Monitor provides Java-specific insights, giving developers visibility into their applications and enabling proactive management with detailed metrics. Effortless Scaling: Container Apps can scale down to zero during periods of low demand and scale out as traffic grows, helping optimize costs. The platform also supports GPU-enabled workloads, perfect for AI-powered Java applications. This fully managed platform supports a range of Java frameworks and runtimes, from Spring Boot to Quarkus to Open Liberty and beyond. With built-in DevOps, secure networking, role-based access, and pay-as-you-go pricing, Azure Container Apps offers a powerful and flexible foundation to build, deploy, and monitor any Java application type. WebSphere Liberty on Azure Kubernetes Service IBM's WebSphere is one of the most widely used middleware platforms globally, especially in large enterprises. Many organizations rely on WebSphere Traditional applications, which have strong market penetration in enterprise environments. As IBM focuses on cloud-native solutions, it is encouraging organizations to migrate from WebSphere Traditional to WebSphere Liberty - a more modern, cloud-native Java runtime. With Azure Kubernetes Service, this migration becomes straightforward and manageable, allowing organizations to bring existing WebSphere Traditional apps into a more flexible, scalable environment. Why Azure Kubernetes Service? AKS provides a powerful platform for running containerized Java applications without the complexity of setting up and maintaining Kubernetes yourself. It’s a fully managed Kubernetes service, integrated end-to-end with Azure’s foundational infrastructure, CI/CD, registry, monitoring, and managed services. Because AKS is based on vanilla Kubernetes, all Kubernetes tools work, and there’s no risk of lock-in. AKS offers global availability, enterprise-grade security, automated upgrades, and compliance, making it a reliable choice for organizations aiming to modernize WebSphere applications. Competitive pricing and cost optimization make AKS even more attractive. Why Transform to WebSphere Liberty? WebSphere Liberty, along with Open Liberty, offers compatibility with WebSphere Traditional, creating an easy migration path. Liberty is a lightweight, modular runtime that’s better suited for cloud-native applications. It reduces resource costs, requiring less memory and CPU than WebSphere Traditional and has quicker startup times. Liberty also embraces modern standards, like Jakarta EE Core Profile and MicroProfile, making it ideal for cloud-native applications. Organizations can even re-purpose existing WebSphere Traditional licenses, significantly reducing migration costs. Running WebSphere Liberty on Azure Kubernetes Service is simple and flexible. 
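As a taste of those modern standards, the sketch below shows a MicroProfile Health liveness check, the kind of lightweight, cloud-native code Liberty is built for, and something AKS can wire directly to a Kubernetes liveness probe. This is an illustrative example rather than part of the original article; it assumes MicroProfile Health with Jakarta EE 9+ namespaces, and the class and check names are invented.

```java
import jakarta.enterprise.context.ApplicationScoped;
import org.eclipse.microprofile.health.HealthCheck;
import org.eclipse.microprofile.health.HealthCheckResponse;
import org.eclipse.microprofile.health.Liveness;

// With Liberty's MicroProfile Health feature enabled, this check is served at /health/live,
// which a Kubernetes liveness probe on AKS can poll.
@Liveness
@ApplicationScoped
public class AppLivenessCheck implements HealthCheck {

    @Override
    public HealthCheckResponse call() {
        // Report the process as live; a real check might inspect thread pools or critical resources.
        return HealthCheckResponse.named("app-liveness").up().build();
    }
}
```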
IBM and Microsoft have certified Liberty on AKS, providing a reliable path for enterprises to move their WebSphere applications to the cloud. With a solution template available in the Azure Marketplace, you can deploy WebSphere Liberty on AKS in a few clicks. This setup works with both new and existing AKS clusters, as well as any container registry, allowing you to deploy quickly and scale as needed. By combining WebSphere Liberty with AKS, you gain the agility of containers and Kubernetes, along with the robust features of a cloud-native runtime on a trusted enterprise platform. Build Right and Fast! Build Your Java or Spring Apps Environment: Development, Test, or Production in Just 15-30 Minutes with Landing Zone Accelerator! To ensure the scalability and quality of your cloud journey, we re-introduce Landing Zone Accelerators, specifically designed for Azure app destinations such as App Service, Azure Container Apps, and Azure Kubernetes Service. An accelerator allows you to establish secure, compliant, and scalable development, test, or production environments within 15-30 minutes. Adhering to Azure's best practices and embedding security by default, a Landing Zone Accelerator ensures that your cloud transition is not only swift but also robust and scalable. It paves the way for both application and platform teams to thrive in the cloud environment. From realizing cost efficiency to streamlining your migration and modernization journey to ensuring the scalability of your cloud operations, our goal is to demonstrate how your cloud transition can drive innovation and efficiency, and accelerate business value. The Landing Zone Accelerators for App Service, Azure Container Apps, and Azure Kubernetes Service represent an authoritative, proven, and prescriptive infrastructure-as-code solution, designed to assist enterprise customers in establishing a robust environment for deploying Java, Spring, and polyglot apps. They not only expedite the deployment process but also provide a comprehensive design framework, allowing for the clear planning and designing of Azure environments based on established standards. Build Intelligent Java Apps at Scale Today, many enterprise applications are built with Java. As AI grows in popularity and delivers greater business outcomes, Java developers wonder how to integrate it with their apps. Python is popular for AI - particularly for model building, deploying and fine-tuning LLMs, and data handling - but moving an app to a new language can be complex and costly. Instead, Java developers can use Azure to combine their Java apps with AI, building intelligent apps without needing to master Python. Azure makes it simple to bring AI into your existing Java applications. Many customers are already using the Azure platform to add intelligence to their Java apps, delivering more value to their businesses. Whether starting fresh or modernizing existing systems, Azure provides the tools needed to build powerful, intelligent applications that scale. Modernize and Build New Intelligent Apps with Azure. Wherever you are in your cloud journey, Azure helps you modernize and build intelligent apps. Azure offers app platform services, data handling at scale, and AI tools that make it easy to create applications that deliver meaningful business value. Intelligent apps can drive growth, amplify team capabilities, and improve productivity. With Azure, you can bring your Java apps into the future and stay ahead of the competition. The Right Tools for Intelligent Java Apps. 
Building intelligent applications requires a strong foundation. Azure provides essential services like a robust cloud platform, scalable data solutions, and AI tools, including pretrained models and responsible AI practices. These tools ensure your apps are efficient, scalable, and aligned with best practices. Azure offers several key services for this: Azure AI Studio: A one-stop platform for experimenting and deploying AI solutions. It provides tools for model benchmarking, solution testing, and monitoring, making it easy to develop use cases like customer segmentation and predictive maintenance. Azure OpenAI Service: With access to advanced AI models like GPT-4, this service is ideal for content generation, summarization, and semantic search. Build chatbots, create marketing content, or add AI-driven features to your Java apps. Azure Machine Learning: An end-to-end platform for building and deploying machine learning models. It supports various use cases such as recommendation systems, predictive analytics, and anomaly detection. MLOps capabilities ensure your models are continuously improved and managed. Azure AI Search: Uses retrieval-augmented generation (RAG) technology for powerful search capabilities. Enhance user experience with intelligent search options, helping users quickly find relevant information. Azure Cosmos DB: A globally distributed, multi-model database service ideal for high-performance, low-latency applications. It offers turnkey global distribution, automatic scalability, and integration with other Azure services, making it a strong choice for intelligent apps that handle large amounts of data. Azure Database for PostgreSQL with PGVector: This managed PostgreSQL service now includes the PGVector extension, designed for handling vector embeddings in AI applications. It’s a valuable tool for applications requiring fast, similarity-based searches and supports applications involving recommendation engines, semantic search, and personalization. Azure AI Infrastructure: Provides high-performance infrastructure for AI workloads. Whether training large models or performing real-time inference, Azure’s AI infrastructure meets demanding needs. Get Started with AI in Java. If you are a Java app developer, now is a great time to start integrating AI into your apps. Spring developers can use Spring AI for quick integration, and developers using Quarkus or Jakarta EE or any other app type can take advantage of LangChain4j. You can also use Microsoft Azure AI client libraries for Java. No matter what your framework is, Azure has the tools to help you add intelligence to your applications. Meet the Java team at the Microsoft Ignite 2024 Come meet the Java team at Microsoft Ignite 2024! Join our breakout session, "Java on Azure: Modernize and scale enterprise Java applications on Azure" BRK147, for a close look at the newest ways to build, scale, and modernize Java apps on Azure. In this session, our engineers and product experts will share the latest updates and tools for Java developers. You’ll learn about cost-saving options, new cloud tools, and how to add smart features to your apps. This is a session for all Java developers, whether you're moving apps to the cloud or building cloud-native apps from scratch. Everyone can join - either in person at Ignite or virtually from anywhere in the world. The virtual option is free, so you can attend without leaving your desk. 
Don’t miss the chance to connect with the Java team, ask questions, and get tips to make your Java apps succeed on Azure! Start Today! Join Us at Java + AI Events Worldwide. Sign Up for upcoming Java and AI events like JDConf 2025 and JavaOne 2025. You’ll also find our developer advocates sharing insights and tips at Java conferences and events around the world. Begin framing your app migration plans with resources to guide you through each step. Get started here – aka.ms/Start-Java. Explore the docs and deploy your first Java or Spring app in the cloud. Follow the quick steps here – aka.ms/Java-Hub. Use our tools and information to build a plan and show your leaders the benefits of Java app modernization. Get the details here – azure.com/Java. Start building, planning, and exploring Azure for Java today!
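Before moving on, here is a minimal sketch of the "Azure AI client libraries for Java" route mentioned in the Get Started with AI in Java section above: a console program that sends a chat request to an Azure OpenAI deployment. It assumes the azure-ai-openai client library (still in preview at the time of writing) and uses placeholder environment variables and a made-up deployment name.

```java
import com.azure.ai.openai.OpenAIClient;
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.ai.openai.models.ChatCompletions;
import com.azure.ai.openai.models.ChatCompletionsOptions;
import com.azure.ai.openai.models.ChatRequestMessage;
import com.azure.ai.openai.models.ChatRequestSystemMessage;
import com.azure.ai.openai.models.ChatRequestUserMessage;
import com.azure.core.credential.AzureKeyCredential;

import java.util.List;

public class ChatSample {
    public static void main(String[] args) {
        // Endpoint, key, and deployment name are placeholders for your own Azure OpenAI resource.
        String endpoint = System.getenv("AZURE_OPENAI_ENDPOINT");
        String key = System.getenv("AZURE_OPENAI_KEY");
        String deployment = "my-chat-deployment"; // the name you gave your model deployment

        OpenAIClient client = new OpenAIClientBuilder()
                .endpoint(endpoint)
                .credential(new AzureKeyCredential(key))
                .buildClient();

        // Build a simple two-message conversation.
        List<ChatRequestMessage> messages = List.of(
                new ChatRequestSystemMessage("You summarize support tickets in one sentence."),
                new ChatRequestUserMessage("Customer reports intermittent 502 errors after the last deployment."));

        ChatCompletions completions =
                client.getChatCompletions(deployment, new ChatCompletionsOptions(messages));

        // Print the first choice returned by the service.
        System.out.println(completions.getChoices().get(0).getMessage().getContent());
    }
}
```

The same call works inside a Spring, Quarkus, or Jakarta EE application; frameworks such as Spring AI and LangChain4j layer higher-level abstractions (prompt templates, RAG pipelines, tool calling) on top of this kind of client.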
Introducing Serverless GPUs on Azure Container Apps
We're excited to announce the public preview of Azure Container Apps Serverless GPUs accelerated by NVIDIA. This feature provides customers with NVIDIA A100 GPUs and NVIDIA T4 GPUs in a serverless environment, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks. Serverless GPUs accelerate the speed of your AI development team by allowing you to focus on your core AI code and less on managing infrastructure when using NVIDIA accelerated computing. They provide an excellent middle layer option between Azure AI Model Catalog's serverless APIs and hosting models on managed compute. They provide full data governance as your data never leaves the boundaries of your container while still providing a managed, serverless platform from which to build your applications. Serverless GPUs are designed to meet the growing demands of modern applications by providing powerful NVIDIA accelerated computing resources without the need for dedicated infrastructure management. "Azure Container Apps' serverless GPU offering is a leap forward for AI workloads. Serverless NVIDIA GPUs are well suited for a wide array of AI workloads from real-time inferencing scenarios with custom models to fine-tuning. NVIDIA is also working with Microsoft to bring NVIDIA NIM microservices to Azure Container Apps to optimize AI inference performance." - Dave Salvator, Director, Accelerated Computing Products, NVIDIA Key benefits of serverless GPUs Scale-to-zero GPUs: Support for serverless scaling of NVIDIA A100 and T4 GPUs. Per-second billing: Pay only for the GPU compute you use. Built-in data governance: Your data never leaves the container boundary. Flexible compute options: Choose between NVIDIA A100 and T4 GPUs. Middle-layer for AI development: Bring your own model on a managed, serverless compute platform. Scenarios Whether you choose to use NVIDIA A100 or T4 GPUs will depend on the types of apps you're creating. The following are a couple of example scenarios. For each scenario with serverless GPUs, you pay only for the compute you use with per-second billing, and your apps will automatically scale in and out from zero to meet the demand. NVIDIA T4 Real-time and batch inferencing: Using custom open-source models with fast startup times, automatic scaling, and a per-second billing model, serverless GPUs are ideal for dynamic applications that don't already have a serverless API in the model catalog. NVIDIA A100 Compute intensive machine learning scenarios: Significantly speed up applications that implement fine-tuned custom generative AI models, deep learning, or neural networks. High performance computing (HPC) and data analytics: Applications that require complex calculations or simulations, such as scientific computing and financial modeling as well as accelerated data processing and analysis among massive datasets. Get started with serverless GPUs Serverless GPUs are now available for workload profile environments in West US 3 and Australia East regions with more regions to come. You will need to have quota enabled on your subscription in order to use serverless GPUs. By default, all Microsoft Enterprise Agreement customers will have one quota. If additional quota is needed, please request it here. Note: In order to achieve the best performance with serverless GPUs, use an Azure Container Registry (ACR) with artifact streaming enabled for your image tag. Follow steps here to enable artifact streaming on your ACR. 
From the portal, you can select to enable GPUs for your Consumption app in the container tab when creating your Container App or your Container App Job. You can also add a new consumption GPU workload profile to your existing Container App environment through the workload profiles UX in portal or through the CLI commands for managing workload profiles. Deploy a sample Stable Diffusion app To try out serverless GPUs, you can use the stable diffusion image which is provided as a quickstart during the container app create experience: In the container tab select the Use quickstart image box. In the quickstart image dropdown, select GPU hello world container. If you wish to pull the GPU container image into your own ACR to enable artifact streaming for improved performance, or if you wish to manually enter the image, you can find the image at mcr.microsoft.com/k8se/gpu-quickstart:latest. For full steps on using your own image with serverless GPUs, see the tutorial on using serverless GPUs in Azure Container Apps. Learn more about serverless GPUs With serverless GPUs, Azure Container Apps now simplifies the development of your AI applications by providing scale-to-zero compute, pay-as-you-go pricing, reduced infrastructure management, and more. To learn more, visit:
Using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
Tutorial: Generate images using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
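Once a custom model is running on serverless GPUs, other services call it over plain HTTP. The sketch below is illustrative rather than part of the announcement: it shows a Java client posting a request to a hypothetical inference endpoint exposed by a container app. The URL and JSON payload are placeholders for whatever model server you package in your image.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class GpuInferenceClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical ingress URL of a container app hosting a custom model on serverless GPUs.
        String endpoint = "https://my-inference-app.example.azurecontainerapps.io/score";

        // A minimal JSON payload; the real schema depends entirely on the model server you package.
        String payload = "{\"prompt\": \"a watercolor painting of a lighthouse at dawn\"}";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        // The first request after a scale-from-zero event may take longer while a GPU replica starts.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode());
        System.out.println("Body: " + response.body());
    }
}
```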
Connect Privately to Azure Front Door with Azure Container Apps
Azure Container Apps is a fully managed serverless container service that enables you to deploy and run containerized applications with per-second billing and autoscaling without having to manage infrastructure. The service also provides support for a number of enhanced networking capabilities to address security and compliance needs such as network security groups (NSGs), Azure Firewall, and more. Today, Azure Container Apps is excited to announce public preview for another key networking capability, private endpoints for workload profile environments. This feature allows customers to connect to their Container Apps environment using a private IP address in their Azure Virtual Network, thereby eliminating exposure to the public internet and securing access to their applications. With the introduction of private endpoints for workload profile environments, you can now also establish a direct connection from Azure Front Door to your Container Apps environment via Private Link. By enabling Private Link for an Azure Container Apps origin, customers benefit from an extra layer of security that further isolates their traffic from the public internet. Currently, you can configure this connectivity through CLI (portal support coming soon). In this post, we will give a brief overview of private endpoints on Azure Container Apps and the process of privately connecting it to Azure Front Door. Getting started with private endpoints on Azure Container Apps Private endpoints can be enabled either during the creation of a new environment or within an existing one. For new environments, you simply navigate to the Networking tab, disable public network access, and enable private endpoints. To manage the creation of private endpoints in an existing environment, you can use the new Networking blade, which is also in public preview. Since private endpoints use a private IP address, the endpoint for a container app is inaccessible through the public internet. This can be confirmed by the lack of connectivity when opening the application URL. If you prefer using CLI, you can find further guidance in enabling private endpoints at Use a private endpoint with an Azure Container Apps environment (preview). Adding container apps as a private origin for Azure Front Door With private endpoints, you can securely connect them to Azure Front Door through Private Link as well. The current process involves CLI commands that guide you in enabling an origin for Private Link and approving the private endpoint connection. Once approved, Azure Front Door assigns a private IP address from a managed regional private network, and you can verify the connectivity between your container app and Azure Front Door. For a detailed tutorial, please navigate to Create a private link to an Azure Container App with Azure Front Door (preview). Troubleshooting Having trouble testing the private endpoints? After creating a private endpoint for a container app, you can build and deploy a virtual machine to test the private connection. With no public inbound ports, this virtual machine would be associated with the virtual network defined during creation of the private endpoint. After creating the virtual machine, you can connect via Bastion and verify the private connectivity. You may find outlined instructions at Verify the private endpoint connection. Conclusion The public preview of private endpoints and private connectivity to Azure Front Door for workload profile environments is a long-awaited feature in Azure Container Apps. 
We encourage you to implement private endpoints for enhanced security and look forward to your feedback on this experience at our GitHub page. Additional Resources To learn more, please visit the following links to official documentation:
Networking in Azure Container Apps environment - Private Endpoints
Use a private endpoint with an Azure Container Apps environment
Create a private link to an Azure Container App with Azure Front Door (preview)
What is a private endpoint?
What is Azure Private Link?
The Power of Conversational Diagnostics (Public Preview) and Diagnostic Workflows
We are thrilled to introduce our latest innovation: the integration of Conversational Diagnostics (Public Preview) with sophisticated Diagnostic Workflows. This powerful combination makes the diagnostic process both intuitive and efficient. For this announcement we will use an Azure Web App hosted on Linux. The Benefits of Diagnostic Workflows Introduced last year, Diagnostic Workflows empower users to tackle complex problems through an intuitive, tree-like interface. This design helps users navigate different paths to diagnose and solve issues effectively, reducing clutter and focusing on essential diagnostics. By providing clear explanations for each decision path, Diagnostic Workflows facilitate a deeper understanding of the diagnostic process, which is particularly beneficial for helping new users ramp up quickly. Diagnostic Workflows are available from the Azure Portal under Diagnose and solve problems. Conversational Diagnostics: A New Era Building on the introduction of Conversational Diagnostics (Preview) from last year, we have now combined it with Diagnostic Workflows to create an integrated, natural language-based solution. This system comprehends user intent, selects the appropriate Diagnostic Workflow, and keeps users engaged by providing real-time updates and actionable insights through chat. This transparency helps build user trust and guides them efficiently towards resolving their issues. Finally, Conversational Diagnostics surfaces the grounded data that Generative AI used to produce its solutions, empowering users to verify the conclusions. Exciting Developments Ahead The integration of Conversational Diagnostics and Diagnostic Workflows marks a significant advancement in diagnostic capabilities. We are excited to announce that this feature will be available as part of Ignite 2024, and we welcome your feedback to help us continue improving this experience. Stay tuned for more updates and get ready to experience the transformative power of Generative AI-driven diagnostics firsthand. Thank you!
Unlock New AI and Cloud Potential with .NET 9 & Azure: Faster, Smarter, and Built for the Future
.NET 9, now available to developers, marks a significant milestone in the evolution of the .NET platform, pushing the boundaries of performance, cloud-native development, and AI integration. This release, shaped by contributions from over 9,000 community members worldwide, introduces thousands of improvements that set the stage for the future of application development. With seamless integration with Azure and a focus on cloud-native development and AI capabilities, .NET 9 empowers developers to build scalable, intelligent applications with unprecedented ease. Expanding Azure PaaS Support for .NET 9 With the release of .NET 9, a comprehensive range of Azure Platform as a Service (PaaS) offerings now fully support the platform’s new capabilities, including the latest .NET SDK for any Azure developer. This extensive support allows developers to build, deploy, and scale .NET 9 applications with optimal performance and adaptability on Azure. Additionally, developers can access a wealth of architecture references and sample solutions to guide them in creating high-performance .NET 9 applications on Azure’s powerful cloud services: Azure App Service: Run, manage, and scale .NET 9 web applications efficiently. Check out this blog to learn more about what's new in Azure App Service. Azure Functions: Leverage serverless computing to build event-driven .NET 9 applications with improved runtime capabilities. Azure Container Apps: Deploy microservices and containerized .NET 9 workloads with integrated observability. Azure Kubernetes Service (AKS): Run .NET 9 applications in a managed Kubernetes environment with expanded ARM64 support. Azure AI Services and Azure OpenAI Services: Integrate advanced AI and OpenAI capabilities directly into your .NET 9 applications. Azure API Management, Azure Logic Apps, Azure Cognitive Services, and Azure SignalR Service: Ensure seamless integration and scaling for .NET 9 solutions. These services provide developers with a robust platform to build high-performance, scalable, and cloud-native applications while leveraging Azure’s optimized environment for .NET. Streamlined Cloud-Native Development with .NET Aspire .NET Aspire is a game-changer for cloud-native applications, enabling developers to build distributed, production-ready solutions efficiently. Available in preview with .NET 9, Aspire streamlines app development, with cloud efficiency and observability at its core. The latest updates in Aspire include secure defaults, Azure Functions support, and enhanced container management. Key capabilities include: Optimized Azure Integrations: Aspire works seamlessly with Azure, enabling fast deployments, automated scaling, and consistent management of cloud-native applications. Easier Deployments to Azure Container Apps: Designed for containerized environments, .NET Aspire integrates with Azure Container Apps (ACA) to simplify the deployment process. Using the Azure Developer CLI (azd), developers can quickly provision and deploy .NET Aspire projects to ACA, with built-in support for Redis caching, application logging, and scalability. Built-In Observability: A real-time dashboard provides insights into logs, distributed traces, and metrics, enabling local and production monitoring with Azure Monitor. With these capabilities, .NET Aspire allows developers to deploy microservices and containerized applications effortlessly on ACA, streamlining the path from development to production in a fully managed, serverless environment. 
Integrating AI into .NET: A Seamless Experience In our ongoing effort to empower developers, we’ve made integrating AI into .NET applications simpler than ever. Our strategic partnerships, including collaborations with OpenAI, LlamaIndex, and Qdrant, have enriched the AI ecosystem and strengthened .NET’s capabilities. This year alone, usage of Azure OpenAI services has surged to nearly a billion API calls per month, illustrating the growing impact of AI-powered .NET applications. Real-World AI Solutions with .NET: .NET has been pivotal in driving AI innovations. From internal teams like Microsoft Copilot creating AI experiences with .NET Aspire to tools like GitHub Copilot, developed with .NET to enhance productivity in Visual Studio and VS Code, the platform showcases AI at its best. KPMG Clara is a prime example, developed to enhance audit quality and efficiency for 95,000 auditors worldwide. By leveraging .NET and scaling securely on Azure, KPMG implemented robust AI features aligned with strict industry standards, underscoring .NET and Azure as the backbone for high-performing, scalable AI solutions. Performance Enhancements in .NET 9: Raising the Bar for Azure Workloads .NET 9 introduces substantial performance upgrades with over 7,500 merged pull requests focused on speed and efficiency, ensuring .NET 9 applications run optimally on Azure. These improvements contribute to reduced cloud costs and provide a high-performance experience across Windows, Linux, and macOS. To see how significant these performance gains can be for cloud services, take a look at what past .NET upgrades achieved for Microsoft’s high-scale internal services: Bing achieved a major reduction in startup times, enhanced efficiency, and decreased latency across its high-performance search workflows. Microsoft Teams improved efficiency by 50%, reduced latency by 30–45%, and achieved up to 100% gains in CPU utilization for key services, resulting in faster user interactions. Microsoft Copilot and other AI-powered applications benefited from optimized runtime performance, enabling scalable, high-quality experiences for users. Upgrading to the latest .NET version offers similar benefits for cloud apps, optimizing both performance and cost-efficiency. For more information on updating your applications, check out the .NET Upgrade Assistant. For additional details on ASP.NET Core, .NET MAUI, NuGet, and more enhancements across the .NET platform, check out the full Announcing .NET 9 blog post. Conclusion: Your Path to the Future with .NET 9 and Azure .NET 9 isn’t just an upgrade—it’s a leap forward, combining cutting-edge AI integration, cloud-native development, and unparalleled performance. Paired with Azure’s scalability, these advancements provide a trusted, high-performance foundation for modern applications. Get started by downloading .NET 9 and exploring its features. Leverage .NET Aspire for streamlined cloud-native development, deploy scalable apps with Azure, and embrace new productivity enhancements to build for the future. For additional insights on ASP.NET, .NET MAUI, NuGet, and more, check out the full Announcing .NET 9 blog post. Explore the future of cloud-native and AI development with .NET 9 and Azure—your toolkit for creating the next generation of intelligent applications.
Operationalize AI apps innovation at scale by modernizing apps and data on Microsoft Azure
This blog explores how modernizing apps and data on Microsoft Azure can help operationalize AI applications at scale, providing businesses with the tools and infrastructure needed to thrive.
Azure App Service Logging: How to Monitor Your Web Apps in Real-Time
As a developer, having visibility into the behavior of your applications is crucial to maintaining the reliability and performance of your software. Luckily, Azure App Service provides two powerful logging features to help you monitor your web apps in real-time: App Service Logs and Log Stream. In this blog post, we'll explore how to configure these features for both Windows and Linux Web Apps in Azure App Service.
Azure Kubernetes Service Baseline - The Hard Way, Part Deux
Have you suffered through our blog named Azure Kubernetes Service Baseline - The Hard Way? Well, it's time for some more hard work as we bring you the next episode "Azure Kubernetes Service Baseline - The Hard Way, Part Deux" which looks into securing the workloads in your Kubernetes cluster even further using Workload Identity, Network Policies and Microsoft Defender for Containers.