Azure Container Apps
Easily deploy .NET apps to Azure Container Apps with default configuration for data protection
The Azure Container Apps and .NET teams have made it easier than ever to deploy your .NET application by supporting automatic configuration for data protection. This support is currently available as an opt-in feature in the Container Apps API version 2024-02-02-preview. This blog post will discuss the feature and what it enables, how to determine if your application is correctly configured, and how to enable configuration for data protection across a variety of .NET versions.

Build and Modernize Intelligent Java apps at Scale
Java on Microsoft Azure

Java customers and developers are constantly exploring how they can bring their Java applications to the cloud. Some are looking to modernize existing applications, while others are building new cloud-native solutions from scratch. With these changes, they need a platform that lets them keep working the way they know, without sacrificing control or performance. That’s where Microsoft Azure comes in. As a company, Microsoft is committed to making Java developers as efficient and productive as possible, empowering them to use any tool, framework, and application server on any operating system.

Microsoft Azure makes it easy to work with the tools and frameworks Java developers already know and love. Whether using IntelliJ, Eclipse, or VS Code, or managing dependencies with Maven or Gradle, developers can keep using their preferred setup. Azure supports trusted Java application servers and popular open-source tools like Spring Boot, JBoss EAP, and WebLogic, making the transition to the cloud seamless and natural. Scaling on Azure is designed with simplicity and security in mind. Developers can count on built-in tools for monitoring, automation, data support, and caching, along with robust security features. With Azure’s flexible services they can scale confidently, manage costs, and build resilient applications that meet business needs. Azure provides everything Java developers need to build and modernize their applications at scale, letting them do so on their own terms.

Tooling for Java app migration and modernization priorities

Moving your Java applications to the cloud is easier with the right tools. Azure offers a full set of solutions for every type of migration, whether you are rehosting, re-platforming, refactoring, or rearchitecting. These tools work together to help you transition smoothly, allowing you to work faster, more efficiently, and with greater insight.
With Azure, you can achieve meaningful results for your business as you modernize your applications.

Azure Migrate and Partner-built Solutions

Azure Migrate is a key resource in this process. It provides a holistic view of your server and application estate and generates a cloud-readiness report. With app centricity, you can now assess applications at a portfolio level rather than server by server. This makes it easier for IT decision-makers to plan migrations on a larger scale while aligning with business priorities. In addition to Azure Migrate, you can leverage several partner-built solutions such as CAST, Unify, Dr. Migrate, and others to support additional use cases and scenarios.

Azure Migrate application and code assessment

For developers, Azure Migrate’s app and code assessment tool (AppCAT) offers in-depth code scanning for Java applications. With this tool, you can assess code changes needed to run your apps in the cloud right from within your preferred terminals, like Bash. GitHub Copilot Chat integration further simplifies the planning process, making it easy to explore modernization options through a conversational approach. AppCAT is especially useful for detailed assessments for refactoring and rearchitecting.

GitHub Copilot upgrade assistant for Java

A major advancement in this toolkit is the new GitHub Copilot upgrade assistant for Java. Upgrading Java code, runtimes, frameworks, and dependencies can be time-consuming, but with the upgrade assistant, you can streamline the process significantly. Start with your local Java project, receive an AI-powered upgrade strategy, and let Copilot handle the bulk of the work. This powerful tool helps you modernize faster, allowing you to focus on building intelligent applications at scale with confidence. Ready to save time upgrading Java? You can apply for the waitlist to the Technical Preview right here – aka.ms/GHCP-for-Java.
This early access is open to a limited number of customers, so we encourage you to sign up soon and share your feedback!

Deploy and Scale Java Apps on Azure

The Java ecosystem is diverse, encompassing technologies like Java SE, Jakarta EE, Spring, and various application servers. Whatever your Java workload – whether building a monolithic app or a cloud-native microservice – Azure provides a comprehensive platform to support it. Azure offers multiple deployment paths to help meet your specific project goals. For those migrating existing Java applications, infrastructure-as-a-service (IaaS) options like Azure Virtual Machines allow you to lift and shift applications without significant re-architecture. Meanwhile, container options, such as Azure Kubernetes Service (AKS), Azure Container Apps, and Azure Red Hat OpenShift, make it easier to manage Java applications in containers. Fully managed platform-as-a-service (PaaS) offerings, like Azure App Service, provide out-of-the-box scalability, DevOps integration, and automation for streamlined management.

The following diagram shows recommended Azure services for every Java application type deployed as source or binaries. The following diagram shows the recommended Azure services for every Java application type deployed as containers.

Building on Azure's reputation as a versatile platform for various applications, we now turn our focus to three specific offerings that demonstrate this flexibility. Today, we highlight JBoss EAP on Azure App Service, Java on Azure Container Apps, and WebSphere Liberty on Azure Kubernetes Service, and how to quickly bring your apps to production with Landing Zone Accelerator. We will also walk you through how to build and modernize intelligent Java apps at scale with the latest AI tools and models.

JBoss EAP on Azure App Service

Azure App Service offers a fully managed platform with specific enhancements for Java, making it an excellent choice for running enterprise Java applications.
Recently, several updates have been introduced to bring even greater value to Java developers using JBoss EAP on App Service:
- Reduced Licensing Costs: Licensing fees for JBoss EAP on App Service have been cut by over 60%, making it more accessible to a wider range of users.
- Free Tier Availability: A new free tier is available for those interested in testing the service without an upfront cost, providing an easy entry point for trials and evaluation.
- Affordable Paid Tiers: Lower-cost paid tiers of App Service Plan for JBoss EAP have been introduced, catering to businesses seeking a cost-effective, production-ready environment.
- Bring Your Own License Support: Soon, customers will be able to apply existing Red Hat volume licenses to further reduce operational costs, adding flexibility for organizations already invested in Red Hat JBoss EAP.
These updates provide significant savings, making JBoss EAP on App Service a smart choice for those looking to optimize costs while running Java applications on a reliable, managed platform.

Java on Azure Container Apps

Azure Container Apps is a popular serverless platform for Java developers who want to deploy and scale containerized applications with minimal management overhead. Designed for microservices, APIs, and event-driven workloads, Azure Container Apps makes it simple to scale applications from zero up to millions of requests, adapting dynamically to meet real-time demand. Azure Container Apps includes several features tailored specifically for Java:
- Managed Components for Java: With built-in Spring Cloud services like Service Registry and Config Server, managing Java applications is straightforward. These components simplify service registration, discovery, and configuration management.
- Enhanced Java Monitoring: Azure Monitor provides Java-specific insights, giving developers visibility into their applications and enabling proactive management with detailed metrics.
- Effortless Scaling: Container Apps can scale down to zero during periods of low demand and scale out as traffic grows, helping optimize costs. The platform also supports GPU-enabled workloads, perfect for AI-powered Java applications.
This fully managed platform supports a range of Java frameworks and runtimes, from Spring Boot to Quarkus to Open Liberty and beyond. With built-in DevOps, secure networking, role-based access, and pay-as-you-go pricing, Azure Container Apps offers a powerful and flexible foundation to build, deploy, and monitor any Java application type.

WebSphere Liberty on Azure Kubernetes Service

IBM's WebSphere is one of the most widely used middleware platforms globally, especially in large enterprises. Many organizations rely on WebSphere Traditional applications, which have strong market penetration in enterprise environments. As IBM focuses on cloud-native solutions, it is encouraging organizations to migrate from WebSphere Traditional to WebSphere Liberty - a more modern, cloud-native Java runtime. With Azure Kubernetes Service, this migration becomes straightforward and manageable, allowing organizations to bring existing WebSphere Traditional apps into a more flexible, scalable environment.

Why Azure Kubernetes Service?

AKS provides a powerful platform for running containerized Java applications without the complexity of setting up and maintaining Kubernetes yourself. It’s a fully managed Kubernetes service, integrated end-to-end with Azure’s foundational infrastructure, CI/CD, registry, monitoring, and managed services. Because AKS is based on vanilla Kubernetes, all Kubernetes tools work, and there’s no risk of lock-in. AKS offers global availability, enterprise-grade security, automated upgrades, and compliance, making it a reliable choice for organizations aiming to modernize WebSphere applications. Competitive pricing and cost optimization make AKS even more attractive.

Why Transform to WebSphere Liberty?
WebSphere Liberty, along with Open Liberty, offers compatibility with WebSphere Traditional, creating an easy migration path. Liberty is a lightweight, modular runtime that’s better suited for cloud-native applications. It reduces resource costs, requiring less memory and CPU than WebSphere Traditional, and has quicker startup times. Liberty also embraces modern standards, like Jakarta EE Core Profile and MicroProfile, making it ideal for cloud-native applications. Organizations can even re-purpose existing WebSphere Traditional licenses, significantly reducing migration costs.

Running WebSphere Liberty on Azure Kubernetes Service is simple and flexible. IBM and Microsoft have certified Liberty on AKS, providing a reliable path for enterprises to move their WebSphere applications to the cloud. With a solution template available in the Azure Marketplace, you can deploy WebSphere Liberty on AKS in a few clicks. This setup works with both new and existing AKS clusters, as well as any container registry, allowing you to deploy quickly and scale as needed. By combining WebSphere Liberty with AKS, you gain the agility of containers and Kubernetes, along with the robust features of a cloud-native runtime on a trusted enterprise platform.

Build Right and Fast!

Build your Java or Spring apps environment - development, test, or production - in just 15-30 minutes with Landing Zone Accelerator! To ensure the scalability and quality of your cloud journey, we re-introduce Landing Zone Accelerators, specifically designed for Azure app destinations such as App Service, Azure Container Apps, and Azure Kubernetes Service. An accelerator allows you to establish secure, compliant, and scalable development, test, or production environments within 15-30 minutes. Adhering to Azure's best practices and embedding security by default, a Landing Zone Accelerator ensures that your cloud transition is not only swift but also robust and scalable.
It paves the way for both application and platform teams to thrive in the cloud environment. From realizing cost efficiency to streamlining your migration and modernization journey to ensuring the scalability of your cloud operations, our goal is to demonstrate how your cloud transition can drive innovation and efficiency and accelerate business value. The Landing Zone Accelerators for App Service, Azure Container Apps, and Azure Kubernetes Service represent an authoritative, proven, and prescriptive infrastructure-as-code solution, designed to assist enterprise customers in establishing a robust environment for deploying Java, Spring, and polyglot apps. It not only expedites the deployment process but also provides a comprehensive design framework, allowing for the clear planning and designing of Azure environments based on established standards.

Build Intelligent Java Apps at Scale

Today, many enterprise applications are built with Java. As AI grows in popularity and delivers greater business outcomes, Java developers wonder how to integrate it with their apps. Python is popular for AI - particularly for model building, deploying and fine-tuning LLMs, and data handling - but moving an app to a new language can be complex and costly. Instead, Java developers can use Azure to combine their Java apps with AI, building intelligent apps without needing to master Python. Azure makes it simple to bring AI into your existing Java applications. Many customers are already using the Azure platform to add intelligence to their Java apps, delivering more value to their businesses. Whether starting fresh or modernizing existing systems, Azure provides the tools needed to build powerful, intelligent applications that scale.

Modernize and Build New Intelligent Apps with Azure

Wherever you are in your cloud journey, Azure helps you modernize and build intelligent apps.
Azure offers app platform services, data handling at scale, and AI tools that make it easy to create applications that deliver meaningful business value. Intelligent apps can drive growth, amplify team capabilities, and improve productivity. With Azure, you can bring your Java apps into the future and stay ahead of the competition.

The Right Tools for Intelligent Java Apps

Building intelligent applications requires a strong foundation. Azure provides essential services like a robust cloud platform, scalable data solutions, and AI tools, including pretrained models and responsible AI practices. These tools ensure your apps are efficient, scalable, and aligned with best practices. Azure offers several key services for this:
- Azure AI Studio: A one-stop platform for experimenting and deploying AI solutions. It provides tools for model benchmarking, solution testing, and monitoring, making it easy to develop use cases like customer segmentation and predictive maintenance.
- Azure OpenAI Service: With access to advanced AI models like GPT-4, this service is ideal for content generation, summarization, and semantic search. Build chatbots, create marketing content, or add AI-driven features to your Java apps.
- Azure Machine Learning: An end-to-end platform for building and deploying machine learning models. It supports various use cases such as recommendation systems, predictive analytics, and anomaly detection. MLOps capabilities ensure your models are continuously improved and managed.
- Azure AI Search: Uses retrieval-augmented generation (RAG) technology for powerful search capabilities. Enhance user experience with intelligent search options, helping users quickly find relevant information.
- Azure Cosmos DB: A globally distributed, multi-model database service ideal for high-performance, low-latency applications.
It offers turnkey global distribution, automatic scalability, and integration with other Azure services, making it a strong choice for intelligent apps that handle large amounts of data.
- Azure Database for PostgreSQL with PGVector: This managed PostgreSQL service now includes the PGVector extension, designed for handling vector embeddings in AI applications. It’s a valuable tool for applications requiring fast, similarity-based searches and supports applications involving recommendation engines, semantic search, and personalization.
- Azure AI Infrastructure: Provides high-performance infrastructure for AI workloads. Whether training large models or performing real-time inference, Azure’s AI infrastructure meets demanding needs.

Get Started with AI in Java

If you are a Java app developer, now is a great time to start integrating AI into your apps. Spring developers can use Spring AI for quick integration, and developers using Quarkus or Jakarta EE or any other app type can take advantage of LangChain4j. You can also use the Microsoft Azure AI client libraries for Java. No matter what your framework is, Azure has the tools to help you add intelligence to your applications.

Meet the Java team at Microsoft Ignite 2024

Come meet the Java team at Microsoft Ignite 2024! Join our breakout session, "Java on Azure: Modernize and scale enterprise Java applications on Azure" (BRK147), for a close look at the newest ways to build, scale, and modernize Java apps on Azure. In this session, our engineers and product experts will share the latest updates and tools for Java developers. You’ll learn about cost-saving options, new cloud tools, and how to add smart features to your apps. This is a session for all Java developers, whether you're moving apps to the cloud or building cloud-native apps from scratch. Everyone can join - either in person at Ignite or virtually from anywhere in the world. The virtual option is free, so you can attend without leaving your desk.
Don’t miss the chance to connect with the Java team, ask questions, and get tips to make your Java apps succeed on Azure!

Start Today! Join Us at Java + AI Events Worldwide

Sign up for upcoming Java and AI events like JDConf 2025 and JavaOne 2025. You’ll also find our developer advocates sharing insights and tips at Java conferences and events around the world.
- Begin framing your app migration plans with resources to guide you through each step. Get started here – aka.ms/Start-Java.
- Explore the docs and deploy your first Java or Spring app in the cloud. Follow the quick steps here – aka.ms/Java-Hub.
- Use our tools and information to build a plan and show your leaders the benefits of Java app modernization. Get the details here – azure.com/Java.
Start building, planning, and exploring Azure for Java today!

From Code to Cloud: Deploy Your Java Apps to Azure in Just 2 Steps!
Microsoft Azure is a great destination for Java applications, and our goal is to help Java developers easily onboard their applications to Azure. We understand that onboarding Java apps to Azure is not only about deploying Java apps but also includes the provisioning of dependent services like databases and messaging services, as well as compute services like Azure Kubernetes Service, Azure App Service, Azure Container Apps, etc. Java developers new to Azure often use either the Azure CLI or the Azure Portal to provision necessary Azure services. This requires learning many different CLI commands, different Portal experiences, and even different infrastructure-as-code languages such as Bicep or Terraform. As my colleague Julia Muiruri wrote last week, Azure Developer CLI (AZD) helps developers quickly and efficiently onboard their applications to Azure. Today we are announcing the private preview of a set of new features designed specifically for Java developers. These features are implemented in the AZD CLI and exposed via a Visual Studio Code extension. Register now to try out these features and provide feedback. The new features in AZD simplify Azure onboarding for Java developers by providing a solution that requires only 2 steps to provision and deploy your application to Azure.

New Java features in AZD

This diagram illustrates the new features in AZD that simplify the developer experience for provisioning and deploying Java applications to Azure. Open your existing Java project in your preferred IDE. If you are using Maven, right click (the first click) on the root pom.xml and choose “generate Azure deployment scripts”. Our tools will generate azure.yaml, which declares all the necessary Azure services for your Java project and any connection information your Java application needs to connect to Azure services (such as a database or a messaging service). Next, right click (the second click) on azure.yaml and select “deploy to Azure”.
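To make the flow above concrete, here is a rough sketch of what a generated azure.yaml might look like. The project name, module paths, and service names are hypothetical, and the exact fields AZD emits depend on your project - treat this as an illustration of the file's shape, not authoritative output:

```yaml
# Hypothetical azure.yaml sketch for a Maven multi-module app targeting
# Azure Container Apps. Names and paths are illustrative only.
name: todo-java-app
services:
  web:
    project: ./web        # Maven module with the web front end
    language: java
    host: containerapp    # deploy this service to Azure Container Apps
  api:
    project: ./api        # Maven module with the API service
    language: java
    host: containerapp
```

Because the file is application-centric, each entry describes a service in your repository and where it should run, while AZD derives the underlying infrastructure from it.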
As a result, all required Azure services will be provisioned automatically and your Java application will be deployed to the appropriate Azure compute service. To achieve this simplicity, AZD offers these key capabilities:
- An application-centric azure.yaml as the source definition to describe your application. There is no need to use different CLI commands, Bicep, or Terraform scripts.
- Generation of azure.yaml from any existing Java repository. You don’t need to write azure.yaml, since AZD can analyze your Java project to auto-detect resources and bindings between applications and dependent services.
- Generation of the necessary Bicep files from azure.yaml. If you need to view the Bicep files, AZD will synthesize these files for you to review.

Getting started

Let’s look at how to deploy a simple Spring application to Azure. The sample application is a to-do list management app composed of three microservices: web, api, and email services. When a user adds or updates a to-do item on the web page, the to-do item will be stored into MongoDB by the api service. For to-do items that are “due”, the api service will send messages into Azure Service Bus. These messages will be consumed by the email service to send an email to you. This diagram shows the overall architecture and interaction relationship among the three services and their backing Azure services: MongoDB and Azure Service Bus. Putting it all together, to deploy this Spring application to Azure we need to:
- Provision Azure Cosmos DB for MongoDB API.
- Provision Azure Service Bus.
- Provision the compute service to run the app, such as Azure Container Apps.
- Deploy all three microservices onto the compute service with correct bindings to the DB or message service.
Now let’s see how to do all this with just 2 clicks: Open VS Code with this Spring project. Right click on the root pom.xml and choose “Generate Azure deployment Script”.
azure.yaml will be generated under the root path with all necessary deployment artifacts declared inside. Right click on azure.yaml and choose “Package, Provision and Deploy (up)” to trigger the overall provisioning and deployment. After deployment, you’ll see your dedicated resource group in the Azure Portal. In the resource group, you’ll see:
- MongoDB: provisioned as an Azure Cosmos DB service and running
- Azure Service Bus: provisioned as Azure Service Bus and running
- 3 apps: deployed as Azure Container Apps, all running inside a single Azure Container Apps environment

How to enroll in the private preview

Are you a Java developer new to Azure looking to onboard your project or your team’s? Do you need to quickly set up an Azure environment for demonstrations, or aim to reduce development environment maintenance effort as the platform team? AZD might be the ideal solution for you. We encourage you to give it a try today! The new features are now in private preview. If interested, please register by scanning the QR code below or visiting https://aka.ms/azd-java-preview directly. We'll contact you within days. Your feedback will help us improve the public preview and GA releases.

Seamlessly Integrating Azure KeyVault with Jarsigner for Enhanced Security
Dive into the world of enhanced security with our step-by-step guide on integrating Azure KeyVault with Jarsigner. Whether you're a beginner or an experienced developer, this guide will walk you through the process of securely signing your Java applications using Azure's robust security features. Learn how to set up, execute, and verify digital signatures with ease, ensuring your applications are protected in an increasingly digital world. Join us to boost your security setup now!

Introducing Serverless GPUs on Azure Container Apps
We're excited to announce the public preview of Azure Container Apps Serverless GPUs accelerated by NVIDIA. This feature provides customers with NVIDIA A100 GPUs and NVIDIA T4 GPUs in a serverless environment, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks. Serverless GPUs accelerate the speed of your AI development team by allowing you to focus on your core AI code and less on managing infrastructure when using NVIDIA accelerated computing. They provide an excellent middle-layer option between Azure AI Model Catalog's serverless APIs and hosting models on managed compute. They provide full data governance, as your data never leaves the boundaries of your container, while still providing a managed, serverless platform from which to build your applications. Serverless GPUs are designed to meet the growing demands of modern applications by providing powerful NVIDIA accelerated computing resources without the need for dedicated infrastructure management.

“Azure Container Apps' serverless GPU offering is a leap forward for AI workloads. Serverless NVIDIA GPUs are well suited for a wide array of AI workloads from real-time inferencing scenarios with custom models to fine-tuning. NVIDIA is also working with Microsoft to bring NVIDIA NIM microservices to Azure Container Apps to optimize AI inference performance.” - Dave Salvator, Director, Accelerated Computing Products, NVIDIA

Key benefits of serverless GPUs
- Scale-to-zero GPUs: Support for serverless scaling of NVIDIA A100 and T4 GPUs.
- Per-second billing: Pay only for the GPU compute you use.
- Built-in data governance: Your data never leaves the container boundary.
- Flexible compute options: Choose between NVIDIA A100 and T4 GPUs.
- Middle layer for AI development: Bring your own model on a managed, serverless compute platform.

Scenarios

Whether you choose to use NVIDIA A100 or T4 GPUs will depend on the types of apps you're creating.
The following are a couple of example scenarios. For each scenario with serverless GPUs, you pay only for the compute you use with per-second billing, and your apps will automatically scale in and out from zero to meet the demand.

NVIDIA T4
- Real-time and batch inferencing: Using custom open-source models with fast startup times, automatic scaling, and a per-second billing model, serverless GPUs are ideal for dynamic applications that don't already have a serverless API in the model catalog.

NVIDIA A100
- Compute-intensive machine learning scenarios: Significantly speed up applications that implement fine-tuned custom generative AI models, deep learning, or neural networks.
- High performance computing (HPC) and data analytics: Applications that require complex calculations or simulations, such as scientific computing and financial modeling, as well as accelerated data processing and analysis among massive datasets.

Get started with serverless GPUs

Serverless GPUs are now available for workload profile environments in the West US 3 and Australia East regions, with more regions to come. You will need to have quota enabled on your subscription in order to use serverless GPUs. By default, all Microsoft Enterprise Agreement customers will have one quota. If additional quota is needed, please request it here. Note: In order to achieve the best performance with serverless GPUs, use an Azure Container Registry (ACR) with artifact streaming enabled for your image tag. Follow the steps here to enable artifact streaming on your ACR. From the portal, you can select to enable GPUs for your Consumption app in the container tab when creating your Container App or your Container App Job. You can also add a new consumption GPU workload profile to your existing Container App environment through the workload profiles UX in the portal or through the CLI commands for managing workload profiles.
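Beyond the portal and CLI, a GPU workload profile can also be declared in infrastructure-as-code. The Bicep sketch below shows the general shape of a workload profile environment with a serverless GPU profile added; the environment name, location, API version, and especially the GPU workloadProfileType string are assumptions for illustration and should be checked against the current documentation:

```bicep
// Hedged sketch: a Container Apps environment with a serverless GPU
// workload profile. The 'Consumption-GPU-NC8as-T4' profile type name is
// an assumption based on preview naming - verify before use.
resource env 'Microsoft.App/managedEnvironments@2024-02-02-preview' = {
  name: 'my-environment'   // hypothetical environment name
  location: 'westus3'      // one of the preview regions mentioned above
  properties: {
    workloadProfiles: [
      {
        name: 'Consumption'
        workloadProfileType: 'Consumption'
      }
      {
        name: 'gpu'
        workloadProfileType: 'Consumption-GPU-NC8as-T4' // serverless T4 profile
      }
    ]
  }
}
```

A container app deployed into this environment would then reference the GPU profile by its name ('gpu' here) to run on serverless GPU compute.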
Deploy a sample Stable Diffusion app

To try out serverless GPUs, you can use the stable diffusion image which is provided as a quickstart during the container app create experience:
- In the container tab, select the Use quickstart image box.
- In the quickstart image dropdown, select GPU hello world container.
If you wish to pull the GPU container image into your own ACR to enable artifact streaming for improved performance, or if you wish to manually enter the image, you can find the image at mcr.microsoft.com/k8se/gpu-quickstart:latest. For full steps on using your own image with serverless GPUs, see the tutorial on using serverless GPUs in Azure Container Apps.

Learn more about serverless GPUs

With serverless GPUs, Azure Container Apps now simplifies the development of your AI applications by providing scale-to-zero compute, pay-as-you-go pricing, reduced infrastructure management, and more. To learn more, visit:
- Using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
- Tutorial: Generate images using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn

Connect Privately to Azure Front Door with Azure Container Apps
Azure Container Apps is a fully managed serverless container service that enables you to deploy and run containerized applications with per-second billing and autoscaling without having to manage infrastructure. The service also provides support for a number of enhanced networking capabilities to address security and compliance needs, such as network security groups (NSGs), Azure Firewall, and more. Today, Azure Container Apps is excited to announce the public preview of another key networking capability: private endpoints for workload profile environments. This feature allows customers to connect to their Container Apps environment using a private IP address in their Azure Virtual Network, thereby eliminating exposure to the public internet and securing access to their applications. With the introduction of private endpoints for workload profile environments, you can now also establish a direct connection from Azure Front Door to your Container Apps environment via Private Link. By enabling Private Link for an Azure Container Apps origin, customers benefit from an extra layer of security that further isolates their traffic from the public internet. Currently, you can configure this connectivity through the CLI (portal support coming soon). In this post, we will do a brief overview of private endpoints on Azure Container Apps and the process of privately connecting it to Azure Front Door.

Getting started with private endpoints on Azure Container Apps

Private endpoints can be enabled either during the creation of a new environment or within an existing one. For new environments, you simply navigate to the Networking tab, disable public network access, and enable private endpoints. To manage the creation of private endpoints in an existing environment, you can use the new Networking blade, which is also in public preview. Since private endpoints use a private IP address, the endpoint for a container app is inaccessible through the public internet.
This can be confirmed by the lack of connectivity when opening the application URL. If you prefer using the CLI, you can find further guidance on enabling private endpoints at Use a private endpoint with an Azure Container Apps environment (preview).

Adding container apps as a private origin for Azure Front Door

With private endpoints, you can securely connect them to Azure Front Door through Private Link as well. The current process involves CLI commands that guide you in enabling an origin for Private Link and approving the private endpoint connection. Once approved, Azure Front Door assigns a private IP address from a managed regional private network, and you can verify the connectivity between your container app and Azure Front Door. For a detailed tutorial, please navigate to Create a private link to an Azure Container App with Azure Front Door (preview).

Troubleshooting

Having trouble testing the private endpoints? After creating a private endpoint for a container app, you can build and deploy a virtual machine to test the private connection. With no public inbound ports, this virtual machine would be associated with the virtual network defined during creation of the private endpoint. After creating the virtual machine, you can connect via Bastion and verify the private connectivity. You can find outlined instructions at Verify the private endpoint connection.

Conclusion

The public preview of private endpoints and private connectivity to Azure Front Door for workload profile environments is a long-awaited feature in Azure Container Apps. We encourage you to implement private endpoints for enhanced security and look forward to your feedback on this experience at our GitHub page.
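For readers who prefer infrastructure-as-code, the private endpoint setup described above can be sketched in Bicep roughly as follows. All names are hypothetical, and the groupId value ('managedEnvironments') is an assumption about the Container Apps private link sub-resource - confirm it against the current documentation before relying on it:

```bicep
// Hedged sketch: a private endpoint targeting a Container Apps environment.
// Resource names and the groupId are assumptions for illustration only.
param subnetId string   // resource ID of a subnet in your virtual network
param envId string      // resource ID of the Container Apps environment

resource pe 'Microsoft.Network/privateEndpoints@2023-09-01' = {
  name: 'aca-env-pe'
  location: 'westus3'
  properties: {
    subnet: {
      id: subnetId
    }
    privateLinkServiceConnections: [
      {
        name: 'aca-env-connection'
        properties: {
          privateLinkServiceId: envId
          groupIds: [ 'managedEnvironments' ] // assumed sub-resource name
        }
      }
    ]
  }
}
```

With the endpoint in place, traffic to the environment resolves to a private IP inside the virtual network, which is what the troubleshooting steps above verify from the test virtual machine.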
Additional Resources

To learn more, please visit the following links to official documentation:

Networking in Azure Container Apps environment - Private Endpoints
Use a private endpoint with an Azure Container Apps environment
Create a private link to an Azure Container App with Azure Front Door (preview)
What is a private endpoint?
What is Azure Private Link?

Exciting Updates Coming to Conversational Diagnostics (Public Preview)
Last year, at Ignite 2023, we unveiled Conversational Diagnostics (Preview), a revolutionary tool integrated with AI-powered capabilities to enhance problem-solving for Windows Web Apps. This year, we're thrilled to share what's new and forthcoming for Conversational Diagnostics (Preview). Get ready to experience a broader range of functionalities and expanded support across various Azure products, making your troubleshooting journey even more seamless and intuitive.

Build AI faster and run with confidence
Intelligence is the new baseline for modern apps. We're seeing the energy and excitement from AI coming to life in apps being built and modernized today. AI is now a critical component of nearly every application and business strategy. And most importantly, we've reached an inflection point where organizations are moving from widespread experimentation to full production. In fact, we just completed a global survey of more than 1,500 developers and IT influencers: 60% of respondents have AI apps under development that will go to production over the next year.

What's even more exciting is the evolution of what organizations are looking to achieve. Agents and intelligent assistants continue to shape customer service, make customer experiences more personal, and help employees make faster, more accurate decisions. And agents that unlock organizational knowledge are increasingly vital. But more than ever, AI is reinventing core business processes. From long-established businesses to startups (where our business has increased 200%), this holds incredible promise to drive revenue and growth as well as competitive differentiation.

AI is both an inspiration for how the next generation of apps will be defined and an accelerator for how we will build and deliver on that promise. This is the tension that we see with every business: intense pressure to move fast and take advantage of incredible innovation, while delivering every app securely with the performance and scale that meets the unique needs of AI apps. Developers are on the front lines, responsible for moving quickly from idea to code to cloud while delivering secure, private, and trusted AI services that meet ROI and cost requirements. This combination of building fast and running with confidence is core to Microsoft's strategy. So, as we kick off Microsoft Ignite 2024 today, we're bringing a slate of new and enhanced innovation that makes this transition to production faster and easier for all.
With new innovations across the Copilot and AI stack, new integrations that bring together services across our AI platform and developer tools, and an expanding set of partnerships across the AI toolchain, there's a ton of great innovation at Ignite. Let's look at just a few.

GitHub Copilot

GitHub Copilot is the most widely used AI pair-programming tool in the world, with millions of developers using it daily. We're seeing developers code up to 55% faster through real-time code suggestions and solutions, and elevate quality, generating cleaner, more resilient code that is easier to maintain. This week we're showing how Copilot is evolving to drive efficiency across the entire application lifecycle. Developers in VS Code now have the flexibility to select from an array of industry-leading models, including OpenAI's GPT-4o and Anthropic Claude Sonnet 3.5. And they have the power to use natural language chat to implement complex code changes across multiple files.

GitHub Copilot upgrade assistant for Java

Now the power of Copilot can help with existing apps as well. Keeping Java apps up to date can be a time-consuming task. GitHub Copilot upgrade assistant for Java uses AI to simplify Java application upgrades with autonomous agents. The process is transparent, keeping a human in the loop, while your environment actively learns from your context and adjustments, improving accuracy for future upgrades.

GitHub Copilot for Azure

GitHub Copilot for Azure streamlines the path from code to production on Azure for every developer, even those new to Azure. Through Copilot, you can use natural language to learn about your Azure services and resources, find sample applications and templates, and quickly deploy to Azure while supporting your enterprise standards and guidelines. Once in production, GitHub Copilot for Azure helps you troubleshoot and resolve application issues and stay on top of costs.
Copilot knows the full context of you as a developer and your systems, tailoring every recommendation to your unique needs. Available now in public preview, it does it all from the tools you already use, helping you minimize interruptions and stay focused.

Azure AI Foundry

New at Ignite, Azure AI Foundry brings together an end-to-end AI platform across models, tooling, safety, and monitoring to help you efficiently and cost-effectively design and scale your AI applications. By integrating with popular developer tools like GitHub, Visual Studio, and Copilot Studio, Azure AI Foundry opens up this full portfolio of services for developers, giving them access to the best, most advanced models in the world, along with tools for building agents on Azure and a unified toolchain to access AI services through one interface. Azure AI Foundry is a key offering enabling easy integration of Azure AI capabilities into your applications.

AI Template Gallery

The AI App Template Gallery is a new resource designed to help you build and deploy AI applications in a matter of minutes, with the flexibility to use the programming language, framework, and architecture of your choice. The gallery offers more than 25 curated, ready-to-use application templates, creating a clear path to kickstart your AI projects with confidence and efficiency. Developers can easily discover and access each of them through GitHub Copilot for Azure, further simplifying access through your preferred developer tool.

Azure Native Integrations

Azure Native Integrations gives developers access to a curated set of ISV services available directly in the Azure portal, SDK, and CLI. This means that developers have the flexibility to work with their preferred vendors across the AI toolchain and other common solution areas, with simplified single sign-on and management, while staying in Azure.
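Templates like those in the AI App Template Gallery described above are typically scaffolded and deployed with the Azure Developer CLI (azd). As a hedged example, the azure-search-openai-demo template (one of many; the name is used purely for illustration) can be taken from repository to running Azure resources in two commands:

```shell
# Scaffold the template into the current (empty) directory.
azd init --template azure-search-openai-demo

# Provision the Azure resources and deploy the app in one step;
# azd prompts for a subscription, region, and environment name.
azd up
```

`azd down` later removes everything the template provisioned, which keeps experimentation inexpensive.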
Joining our portfolio of integrated services are Pinecone, Weights & Biases, Arize, and LambdaTest, all now available in private preview. Neon, Pure Storage Cloud for Azure VMware Solution (AVS), and Dell APEX File Storage will also be available soon as part of Azure Native Integrations.

Azure Container Apps with Serverless GPUs

Azure Container Apps now supports serverless GPUs in public preview, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks. Serverless GPUs enable you to seamlessly run your AI workloads on demand, accessing powerful NVIDIA accelerated computing resources, with automatic scaling, optimized cold start, and per-second billing, without the need for dedicated infrastructure management.

Azure Essentials for AI Adoption

We also recognize great technology is only part of your success. Microsoft has published design patterns, baseline reference architectures, application landing zones, and a variety of Azure service guides for Azure OpenAI workloads, along with FinOps guidance for AI. This week, we are excited to announce new AI-specific guidance in the Cloud Adoption Framework and the Azure Well-Architected Framework to help you adopt AI at scale while meeting requirements for reliability, security, operations, and cost.

Unlock Business Growth with Azure Application Platform innovations
AI is reshaping industries, driving transformation in how businesses operate, communicate, and serve customers. In today's fast-evolving generative AI landscape, businesses feel the urgency to transform their customer experiences and business processes with AI applications. In a recent study, IDC found that generative AI usage has jumped significantly, up from 55% in 2023 to 75% in 2024. The return on investment for generative AI is also significant: IDC found that companies on average experience a 3.7x return on their investment.

To deliver business impact with AI applications, businesses are equally looking to modernize existing applications. However, building and modernizing applications to deliver scalable, reliable, and highly performant AI applications can be complex, time consuming, and resource intensive. Microsoft, a Leader in the Gartner Magic Quadrant for Cloud Application Platforms and Container Management, provides a comprehensive application platform designed to address these challenges. Azure's application platform and services provide developers and IT operators with a comprehensive suite of services to build, deploy, and scale modern intelligent applications in a quick, secure, and efficient manner. Join us at Microsoft Ignite 2024 from Nov 18–22, in Chicago and online, to discover the latest Azure Application Platform updates, enhancements, and tools to accelerate your AI app development and app modernization journey.

Accelerate business growth with AI Apps

From container-based services with Azure Kubernetes Service (AKS) and Azure Container Apps (ACA), to Platform as a Service (PaaS) offerings like Azure App Service, to powerful integration capabilities with Azure Integration Services and serverless capabilities with Azure Functions, Azure's application platform provides a complete, end-to-end solution for building, deploying, and managing AI applications, all in one place.
The unified platform enables businesses to go from idea to production faster by leveraging an extensive array of 1,600+ AI models in the Azure AI Studio model catalog, integrated with popular developer tools like GitHub, GitHub Copilot, and Visual Studio, and real-time transactional data in Azure databases. At Microsoft Ignite, we are announcing several new enhancements to our Application Platform to help you build transformational AI applications:

Real-time AI inferencing with Serverless GPUs: Azure Container Apps now supports serverless GPUs in public preview. Serverless GPUs enable you to seamlessly run AI workloads on demand with automatic scaling, optimized cold start, per-second billing with scale down to zero when not in use, and reduced operational overhead, supporting easy real-time custom model inferencing and other machine learning tasks. Learn more.

Azure Container Apps Dynamic Sessions: Dynamic sessions in Azure Container Apps, now generally available, provide fast, sandboxed, ephemeral compute, suitable for running AI-generated, untrusted code at scale in multi-tenancy scenarios. Each session has full compute isolation using Hyper-V. You now have easy access to fast, ephemeral, sandboxed compute on Azure without managing infrastructure. Learn more.

AI capabilities in Azure Logic Apps: AI capabilities are now available in the Azure Logic Apps Consumption SKU in public preview, offering advanced features like the Azure AI Search connector, Azure OpenAI connector, and Form Recognizer. These enhancements enable intelligent document processing, enhanced search, and language capabilities for more intelligent workflows. Additionally, Azure Logic Apps Standard now supports Templates, providing pre-built workflow solutions to streamline integration workflow development. Learn more.

AI Template Gallery: To help developers quickly get started with building AI apps, we've created an AI App Template Gallery that features templates for common AI use cases.
With these templates, you can start building AI apps on Azure in as little as 5 minutes.

Free services for AI apps: Start building AI apps with free Azure application, data, and AI services to minimize upfront costs. Explore which services offer free monthly amounts to estimate the cost for your project, and discover specialized programs for startups and students to develop AI-powered apps on Azure. Learn more.

Learn how customers like H&R Block, Pimco, Novo Nordisk, Harvey, Jato Dynamics, Medigold Health, and C.H. Robinson are delivering transformational business impact with AI applications on Azure.

Modernize your Apps for AI and continuous innovation

To remain competitive, organizations must keep up with modern app development trends and meet evolving customer expectations. This means accelerating application development speed, enhancing scalability and agility, and overcoming the limitations of legacy applications, which can be costly to maintain and challenging to innovate on. At Microsoft Ignite, we are announcing several new features and enhancements to help you accelerate your app modernization and become AI ready faster.

Azure App Service: We are excited to announce the general availability of sidecars in Azure App Service, a versatile pathway for organizations to modernize existing apps and add powerful new capabilities without the significant rewrites that would otherwise be necessary in the main application code. Sidecars enable you to add new capabilities, such as AI, logging, monitoring, and security SDKs, to your primary application without the need to significantly modify and redeploy the app. Learn more.

GitHub Copilot upgrade assistant for Java: Keeping Java apps up to date with the latest versions can be a time-consuming task. GitHub Copilot upgrade assistant for Java, currently in private preview, enables you to use AI to simplify upgrading Java applications with autonomous AI agents, ensuring trust and transparency throughout the upgrade experience. Learn more.
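Returning to the dynamic sessions capability announced above: it is exposed through a REST API for executing code inside a session pool, where each session identifier maps to its own Hyper-V-isolated sandbox. The sketch below builds such a request in Python; the endpoint shape, API version, and payload field names are assumptions for illustration based on the public preview API, not a definitive contract.

```python
import json

def build_code_execution_request(pool_endpoint: str, session_id: str, code: str):
    """Build the URL and JSON body for an inline, synchronous code execution
    against an Azure Container Apps dynamic sessions pool.

    Field names and the api-version are assumptions based on the public
    preview API; check the current docs before relying on them.
    """
    # Reusing the same identifier routes follow-up calls to the same sandbox;
    # a new identifier allocates a fresh, isolated session.
    url = (
        f"{pool_endpoint}/code/execute"
        f"?api-version=2024-02-02-preview&identifier={session_id}"
    )
    body = json.dumps({
        "properties": {
            "codeInputType": "inline",
            "executionType": "synchronous",
            "code": code,
        }
    })
    return url, body

# The caller would POST `body` to `url` with a Microsoft Entra bearer token.
url, body = build_code_execution_request(
    "https://eastus.dynamicsessions.io/subscriptions/<sub-id>"
    "/resourceGroups/<rg>/sessionPools/<pool>",
    session_id="user-42",
    code="print(1 + 1)",
)
```

This pattern is what makes dynamic sessions a natural fit for agent scenarios: the untrusted, AI-generated code travels as data in the request body and only ever runs inside the sandbox.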
Bring Your Own License for JBoss EAP on Azure App Service: We are excited to announce that general availability of Bring Your Own License for JBoss EAP on Azure App Service is coming in December 2024. You can use existing JBoss EAP subscriptions to deploy applications to Azure App Service, making it far easier to move existing applications to the cloud while retaining support from both Red Hat and Microsoft across the application lifecycle. Learn more.

Logic Apps Hybrid Deployment Model: Azure Logic Apps has introduced a new hybrid deployment model, enabling businesses to run integration workflows on their own infrastructure, whether on-premises, in private clouds, or in other public clouds. This model allows greater flexibility for meeting specific regulatory, privacy, and network requirements while still benefiting from the rich library of 1,400+ Logic Apps connectors for seamless integration with enterprise systems. Learn more.

Azure Migrate application assessments: The newly released application-aware assessment capability in Azure Migrate helps with application-level migrations, rather than individual servers or application components. The Azure Migrate app and code assessment tool and its GitHub Copilot Chat integration offer more granular code assessment and AI-assisted suggestions for the changes required to successfully run .NET and Java applications on Azure. Learn more.

Learn how customers like Commercial Bank of Dubai, Scandinavian Airlines, Nexi, Ossur, Sapiens, MSC Mediterranean Shipping Company, and Finastra leverage Azure's application platform to modernize their business-critical applications and deliver enhanced customer experiences.

Scale and secure enterprise-grade production apps

When working with customers on modernizing their applications, we consistently hear about challenges in securing and scaling legacy systems. Outdated infrastructure often leads to operational bottlenecks, slows development, and impacts competitiveness.
With rising cyber threats, older systems can lack the robust security measures necessary to protect data and applications. Microsoft Azure addresses these needs with a globally available, enterprise-grade platform that integrates security at every level. Check out the Azure application platform announcements at Microsoft Ignite that help you scale and secure your enterprise applications:

Scale, secure, and optimize Azure Kubernetes Service (AKS): AKS is designed to simplify Kubernetes adoption for developers and operators of all skill levels. With the latest updates, AKS is now more user-friendly, secure, and cost-efficient, allowing you to focus on building and scaling your applications without worrying about the underlying infrastructure. Read the summaries below and this blog for more details.

Cluster upgrades are now more reliable and efficient, and multiple clusters can be upgraded automatically. Additionally, AKS Automatic (preview) will now dynamically select an appropriate virtual machine (VM) based on the capacity in your Azure subscription.

With AKS, you now have full visibility over runtime and host vulnerabilities in your AKS cluster. The AKS security dashboard (now available in preview as a blade in the Azure portal) offers a simplified and streamlined view of security vulnerabilities and remediation insights for resource owners or cluster administrators. Trusted launch (generally available) enhances the security of certain virtual machines (VMs) by protecting against advanced and persistent attack techniques.

Intelligent workload scheduling in Azure Kubernetes Fleet Manager is now generally available, providing operators more control when deploying workloads to optimize resource utilization and simplify multi-cluster scenarios.

Auto-instrumentation through Application Insights (coming soon) makes it easier to access telemetry like metrics, requests, and dependencies, as well as visualizations like the application dashboard and application map.
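As a rough sketch of the AKS Automatic experience mentioned above, creating a cluster comes down to a single command: node sizing, scaling, and many operational defaults are managed for you. The `--sku automatic` flag reflects the preview CLI at the time of writing and may require the aks-preview extension; resource names are placeholders.

```shell
# Install or update the preview extension (preview flags can move between versions).
az extension add --name aks-preview --upgrade

# Create an AKS Automatic cluster; VM selection and ops defaults are handled
# by the service based on available capacity in your subscription.
az aks create \
  --resource-group my-rg \
  --name my-automatic-cluster \
  --sku automatic

# Fetch credentials and deploy workloads with kubectl as usual.
az aks get-credentials --resource-group my-rg --name my-automatic-cluster
```

From here, day-two concerns such as upgrades and node management follow the automated behavior described in the announcements above.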
Expanded GenAI gateway capabilities in Azure API Management: Azure API Management has expanded support for GPT-4o models, including both text- and image-based capabilities. Additionally, the new generative AI gateway token quota allows flexible token quotas (daily, weekly, or monthly), helping organizations control costs and track usage patterns for effective resource management. Learn more.

Achieve instant fast scaling with Azure Functions: Flex Consumption is a new Azure Functions hosting plan that builds on the consumption pay-per-second serverless billing model, with automatic scale down to zero when not in use for cost efficiency. With the Flex Consumption plan now generally available, you can integrate with your virtual network at no extra cost, ensuring secure and private communication with no considerable impact on your app's scale-out performance. Learn more.

Learn how customers like BMW, ABN AMRO, SPAR, and ClearBank are scaling and operating mission-critical, enterprise-grade production apps on the Azure application platform.

With the Azure Application Platform, you can build new AI applications and modernize your existing applications faster, while ensuring end-to-end security and scalability across the entire app development journey, from development to production. Join us at Ignite this week and learn more about updates for Azure Digital and App Innovation here.
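For readers who want to try the Flex Consumption plan described above, creating a function app on it is a short CLI sequence. The `--flexconsumption-location` flag reflects the Azure CLI at the time of writing and may change; all names are placeholders, so treat this as a sketch rather than the canonical procedure.

```shell
# Supporting resources (names are placeholders; storage account names
# must be globally unique).
az group create --name my-rg --location eastus
az storage account create --name myflexstorage123 --resource-group my-rg

# Create a function app on the Flex Consumption plan; billing is
# pay-per-second with automatic scale down to zero when idle.
az functionapp create \
  --resource-group my-rg \
  --name my-flex-func \
  --storage-account myflexstorage123 \
  --flexconsumption-location eastus \
  --runtime python \
  --runtime-version 3.11
```

Virtual network integration can then be configured on the app without an additional plan charge, per the announcement above.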