cloud native
76 Topics

Azure DevOps - Agent Pool Report and Replace
As usage of an Azure DevOps organisation grows, so does the number of projects, repositories, pipelines and agent pools in use. With new services such as Managed DevOps Pools now available, it can seem a mammoth task for a central IT function to manually trawl through every pipeline, noting down each agent pool being used. Replacing these values is potentially even more complicated: after creating the new agent pools, they must be mapped to the old ones, with potential for human error.

Self Hosted AI Application on AKS in a Day with KAITO and Copilot
In this blog post I document my experience of spending a full day using KAITO and Copilot to accelerate the deployment and development of a self-managed, AI-enabled chatbot running in a managed cluster. The goal is to showcase how quickly, using a mix of AI tooling, we can go from zero to a self-hosted, tuned LLM and chatbot application. At the top of this article I want to share my perspective on the future of projects such as KAITO. At the moment I believe KAITO to be somewhat ahead of its time: as most enterprises begin adopting abstracted artificial intelligence, it is brilliant to see projects like KAITO being developed, ready for the eventual abstraction pendulum to swing back, motivated by the usual factors such as increased skills in the market, cost and governance. Enterprises will undoubtedly look to take centralised control of the AI models they use as GPUs become cheaper, more readily available and more powerful. When this shift happens, open-source projects like KAITO will become commonplace in enterprises. It is also my opinion that Kubernetes lends itself perfectly to being the AI platform of the future, a position shared by the CNCF (albeit both sources here may be somewhat biased). The resiliency, scaling and existence of Kubernetes primitives such as "Jobs" mean that Kubernetes is already the de facto platform for machine learning training and inference. These same reasons also make Kubernetes the best underlying platform for AI development. Companies including DHL, Wayve and even OpenAI already run ML or AI workloads on Kubernetes.
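To ground what KAITO actually does: it reconciles a `Workspace` custom resource into provisioned GPU nodes plus an inference service for a chosen open-source model preset. A minimal Python sketch of creating such a resource is below, assuming the `kubernetes` client package is installed and a KAITO deployment exposes the `kaito.sh/v1alpha1` API; the instance type and `falcon-7b-instruct` preset are illustrative assumptions, not prescriptions:

```python
# Sketch: create a KAITO Workspace custom resource from Python.
# Assumes the KAITO operator is installed in the cluster; the GPU VM
# size and model preset below are illustrative examples.

def build_workspace(name: str, instance_type: str, preset: str) -> dict:
    """Build the Workspace manifest that KAITO's operator reconciles
    into GPU nodes plus an inference endpoint for the model preset."""
    return {
        "apiVersion": "kaito.sh/v1alpha1",
        "kind": "Workspace",
        "metadata": {"name": name},
        "resource": {
            "instanceType": instance_type,  # GPU VM size to provision
            "labelSelector": {"matchLabels": {"apps": name}},
        },
        "inference": {"preset": {"name": preset}},  # open-source model preset
    }

if __name__ == "__main__":
    # Requires: pip install kubernetes, and a valid kubeconfig.
    from kubernetes import client, config
    config.load_kube_config()
    ws = build_workspace("workspace-falcon", "Standard_NC12s_v3",
                         "falcon-7b-instruct")
    client.CustomObjectsApi().create_namespaced_custom_object(
        group="kaito.sh", version="v1alpha1", namespace="default",
        plural="workspaces", body=ws,
    )
```

Once the operator reports the workspace ready, the model is served from inside the cluster, which is exactly the centralised-control story described above.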
That does not mean that data scientists and engineers will suddenly be creating Dockerfiles or exploring admission controllers. Kubernetes, as a platform, will instead be multiple layers of abstraction away (full-scale self-service platform engineering); however, the engineers responsible for running and operating the platform will hail projects like KAITO.

Seamlessly Integrating Azure KeyVault with Jarsigner for Enhanced Security
Dive into the world of enhanced security with our step-by-step guide on integrating Azure KeyVault with Jarsigner. Whether you're a beginner or an experienced developer, this guide will walk you through the process of securely signing your Java applications using Azure's robust security features. Learn how to set up, execute, and verify digital signatures with ease, ensuring your applications are protected in an increasingly digital world. Join us to boost your security setup now!

Learn New Skills in the New Year
New year's resolution: Start writing better code faster in 2025. Kick off the new year by learning new developer skills and elevate your career to the next level. In this post, we explore learning resources and live events that will help you build critical skills and get started with cutting-edge technologies. Learn how to build custom agents, code intelligent apps with familiar tools, discover new possibilities in .NET 9, use Copilot for testing and debugging, and more. Plus, get details about using GitHub Copilot in Visual Studio Code—for free!

New AI for Developers page
Check out the new AI for Developers page. It's packed with free GitHub courses on building apps, machine learning, and mastering GitHub Copilot for paired programming. Learn your way and skill up for what's next in AI.

Use GitHub Copilot in Visual Studio Code for free
Did you hear the news? You can now use GitHub Copilot in Visual Studio Code for free. Get details about the new Copilot Free plan and add Copilot to your developer toolbox.

What is Copilot Studio?
Have questions about Copilot Studio? This article from Microsoft Learn covers all the basics you need to know about Copilot Studio—the low-code tool for easily building agents and extending Microsoft 365 Copilot.

From C# to ChatGPT: Build Generative AI Solutions with Azure
Combine your C# skills with the cutting-edge power of ChatGPT and Azure OpenAI Service. This free learning path introduces you to building GenAI solutions, using REST APIs, SDKs, and Azure tools to create more intelligent applications.

Register for the Powerful Devs Conference + Hackathon
Register for the Powerful Devs Conference + Hackathon (February 12-28, 2025) and get more out of Power Platform. This one-day online conference is followed by a 2-week hackathon focused on building intelligent applications with less effort.

Code the future with Java and AI: RSVP for Microsoft JDConf 2025 today
Get ready for JDConf 2025—Microsoft's annual event for Java developers. Taking place April 9-10, this year's event will have three separate live streams to cover different regions. Join to explore tools and skills for building modern apps in the cloud and integrating AI.

Build custom agents for Microsoft Teams
Learn how to build custom agents for Microsoft Teams. This free learning path will teach you about different copilot stacks, working with Azure OpenAI, and building a custom engine agent. Start building intelligent Microsoft Teams apps using LLMs and AI components.

Microsoft Learn: Debug your app with GitHub Copilot in Visual Studio
Debug more efficiently using GitHub Copilot. This Microsoft Learn article shows you how. Discover how Copilot will answer detailed questions about your code and provide bug fixes.

Make Azure AI Real: Watch Season 2
Elevate your AI game with Make Azure AI Real on demand. Season 2 digs into the latest Azure AI advancements, with practical demos, code samples, and real-world use cases.

GitHub Copilot Bootcamp
Streamline your workflow with GitHub Copilot—craft more effective prompts and automate repetitive tasks like testing. This GitHub Copilot Bootcamp is a 4-part live streaming series that will help you master GitHub Copilot.

10 Days of GenAI – Gift Guide Edition
Start building your own GenAI application. These short videos outline 10 steps for creating your app—choose a model, add functions, fine-tune responses, and more.

Extend Microsoft 365 Copilot with declarative agents using Visual Studio Code
Check out this new learning path from Microsoft Learn to discover how you can extend Microsoft 365 Copilot with declarative agents using VS Code. Learn about declarative agents and how they work.

Developer's guide to building your own agents
Want to build your own agents? Watch this Ignite session on demand for a look at the new agent development tools. Find out how to create agents built on Microsoft 365 Copilot or your custom AI engine.

Master distributed application development with .NET Aspire
Get started with .NET Aspire—an opinionated, cloud-ready stack for building distributed applications with .NET. This series covers everything from setup to deployment. Start your journey toward mastering distributed app development.

Learn: What's new in .NET 9
Discover what's new in .NET 9. Learn about new features for AI, improvements for building cloud-native apps, performance enhancements, updates to C#, and more. Read the overview and get started with .NET 9.

Become a .NET AI engineer using the OpenAI library for .NET
Use your .NET skills to become an AI engineer. With the OpenAI library, .NET developers can quickly master critical AI skills and apply them to real-world apps. Read the blog to learn more about the OpenAI library for .NET.

Test like a pro with Playwright and GitHub Copilot
Supercharge your testing using Playwright and GitHub Copilot. Watch this in-depth demo and discover how you can easily create end-to-end tests using Playwright's powerful built-in code generator.

Other news and resources from around Microsoft
- Microsoft Learn: Why and how to adopt AI in your organization
- Microsoft Learn: Learn to use Copilot in Microsoft Fabric
- AI Toolkit for Visual Studio Code: Update highlights
- Teams Toolkit for Visual Studio Code update
- RAG Deep Dive: Live streams
- Learn Together: SQL database in Fabric
- Become an AI security expert using OpenAI with Azure Managed Identity
- Deploy, monitor, and manage development resources with Microsoft Dev Box
- Microsoft Playwright testing
- Introduction to artificial intelligence and Azure AI services
- Azure AI-900 Fundamentals Training event series
- Leveraging cloud-native infra for your intelligent apps
- Platform engineering with GitHub
- Extend declarative agents for Microsoft 365 Copilot with API plugins using Visual Studio Code
- Introducing the Microsoft 365 Agents SDK
- Azure Live Q&A events
- Get started with multimodal parsing for RAG using GPT-4o, Azure AI Search, and LlamaParse

Unlock New AI and Cloud Potential with .NET 9 & Azure: Faster, Smarter, and Built for the Future
.NET 9, now available to developers, marks a significant milestone in the evolution of the .NET platform, pushing the boundaries of performance, cloud-native development, and AI integration. This release, shaped by contributions from over 9,000 community members worldwide, introduces thousands of improvements that set the stage for the future of application development. With seamless integration with Azure and a focus on cloud-native development and AI capabilities, .NET 9 empowers developers to build scalable, intelligent applications with unprecedented ease.

Expanding Azure PaaS Support for .NET 9
With the release of .NET 9, a comprehensive range of Azure Platform as a Service (PaaS) offerings now fully support the platform's new capabilities, including the latest .NET SDK for any Azure developer. This extensive support allows developers to build, deploy, and scale .NET 9 applications with optimal performance and adaptability on Azure. Additionally, developers can access a wealth of architecture references and sample solutions to guide them in creating high-performance .NET 9 applications on Azure's powerful cloud services:
- Azure App Service: Run, manage, and scale .NET 9 web applications efficiently. Check out this blog to learn more about what's new in Azure App Service.
- Azure Functions: Leverage serverless computing to build event-driven .NET 9 applications with improved runtime capabilities.
- Azure Container Apps: Deploy microservices and containerized .NET 9 workloads with integrated observability.
- Azure Kubernetes Service (AKS): Run .NET 9 applications in a managed Kubernetes environment with expanded ARM64 support.
- Azure AI Services and Azure OpenAI Services: Integrate advanced AI and OpenAI capabilities directly into your .NET 9 applications.
- Azure API Management, Azure Logic Apps, Azure Cognitive Services, and Azure SignalR Service: Ensure seamless integration and scaling for .NET 9 solutions.
These services provide developers with a robust platform to build high-performance, scalable, and cloud-native applications while leveraging Azure's optimized environment for .NET.

Streamlined Cloud-Native Development with .NET Aspire
.NET Aspire is a game-changer for cloud-native applications, enabling developers to build distributed, production-ready solutions efficiently. Available in preview with .NET 9, Aspire streamlines app development, with cloud efficiency and observability at its core. The latest updates in Aspire include secure defaults, Azure Functions support, and enhanced container management. Key capabilities include:
- Optimized Azure Integrations: Aspire works seamlessly with Azure, enabling fast deployments, automated scaling, and consistent management of cloud-native applications.
- Easier Deployments to Azure Container Apps: Designed for containerized environments, .NET Aspire integrates with Azure Container Apps (ACA) to simplify the deployment process. Using the Azure Developer CLI (azd), developers can quickly provision and deploy .NET Aspire projects to ACA, with built-in support for Redis caching, application logging, and scalability.
- Built-In Observability: A real-time dashboard provides insights into logs, distributed traces, and metrics, enabling local and production monitoring with Azure Monitor.
With these capabilities, .NET Aspire allows developers to deploy microservices and containerized applications effortlessly on ACA, streamlining the path from development to production in a fully managed, serverless environment.

Integrating AI into .NET: A Seamless Experience
In our ongoing effort to empower developers, we've made integrating AI into .NET applications simpler than ever. Our strategic partnerships, including collaborations with OpenAI, LlamaIndex, and Qdrant, have enriched the AI ecosystem and strengthened .NET's capabilities.
This year alone, usage of Azure OpenAI services has surged to nearly a billion API calls per month, illustrating the growing impact of AI-powered .NET applications.

Real-World AI Solutions with .NET
.NET has been pivotal in driving AI innovations. From internal teams like Microsoft Copilot creating AI experiences with .NET Aspire to tools like GitHub Copilot, developed with .NET to enhance productivity in Visual Studio and VS Code, the platform showcases AI at its best. KPMG Clara is a prime example, developed to enhance audit quality and efficiency for 95,000 auditors worldwide. By leveraging .NET and scaling securely on Azure, KPMG implemented robust AI features aligned with strict industry standards, underscoring .NET and Azure as the backbone for high-performing, scalable AI solutions.

Performance Enhancements in .NET 9: Raising the Bar for Azure Workloads
.NET 9 introduces substantial performance upgrades with over 7,500 merged pull requests focused on speed and efficiency, ensuring .NET 9 applications run optimally on Azure. These improvements contribute to reduced cloud costs and provide a high-performance experience across Windows, Linux, and macOS. To see how significant these performance gains can be for cloud services, take a look at what past .NET upgrades achieved for Microsoft's high-scale internal services:
- Bing achieved a major reduction in startup times, enhanced efficiency, and decreased latency across its high-performance search workflows.
- Microsoft Teams improved efficiency by 50%, reduced latency by 30–45%, and achieved up to 100% gains in CPU utilization for key services, resulting in faster user interactions.
- Microsoft Copilot and other AI-powered applications benefited from optimized runtime performance, enabling scalable, high-quality experiences for users.
Upgrading to the latest .NET version offers similar benefits for cloud apps, optimizing both performance and cost-efficiency.
For more information on updating your applications, check out the .NET Upgrade Assistant. For additional details on ASP.NET Core, .NET MAUI, NuGet, and more enhancements across the .NET platform, check out the full Announcing .NET 9 blog post.

Conclusion: Your Path to the Future with .NET 9 and Azure
.NET 9 isn't just an upgrade—it's a leap forward, combining cutting-edge AI integration, cloud-native development, and unparalleled performance. Paired with Azure's scalability, these advancements provide a trusted, high-performance foundation for modern applications. Get started by downloading .NET 9 and exploring its features. Leverage .NET Aspire for streamlined cloud-native development, deploy scalable apps with Azure, and embrace new productivity enhancements to build for the future. For additional insights on ASP.NET, .NET MAUI, NuGet, and more, check out the full Announcing .NET 9 blog post. Explore the future of cloud-native and AI development with .NET 9 and Azure—your toolkit for creating the next generation of intelligent applications.

Connect Privately to Azure Front Door with Azure Container Apps
Azure Container Apps is a fully managed serverless container service that enables you to deploy and run containerized applications with per-second billing and autoscaling, without having to manage infrastructure. The service also provides support for a number of enhanced networking capabilities to address security and compliance needs, such as network security groups (NSGs), Azure Firewall, and more. Today, Azure Container Apps is excited to announce the public preview of another key networking capability: private endpoints for workload profile environments. This feature allows customers to connect to their Container Apps environment using a private IP address in their Azure Virtual Network, thereby eliminating exposure to the public internet and securing access to their applications. With the introduction of private endpoints for workload profile environments, you can now also establish a direct connection from Azure Front Door to your Container Apps environment via Private Link. By enabling Private Link for an Azure Container Apps origin, customers benefit from an extra layer of security that further isolates their traffic from the public internet. Currently, you can configure this connectivity through the CLI (portal support coming soon). In this post, we will give a brief overview of private endpoints on Azure Container Apps and the process of privately connecting them to Azure Front Door.

Getting started with private endpoints on Azure Container Apps
Private endpoints can be enabled either during the creation of a new environment or within an existing one. For new environments, you simply navigate to the Networking tab, disable public network access, and enable private endpoints. To manage the creation of private endpoints in an existing environment, you can use the new Networking blade, which is also in public preview. Since private endpoints use a private IP address, the endpoint for a container app is inaccessible through the public internet.
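From a machine inside the virtual network, one quick way to confirm that name resolution now points at the private endpoint is to check whether the app's FQDN resolves to a private address. A minimal sketch, assuming it runs on a VM in the VNet associated with the private endpoint; the FQDN below is a placeholder, not a real endpoint:

```python
# Sketch: verify that a Container Apps FQDN resolves to a private IP,
# i.e. traffic stays on the virtual network rather than going out to
# the public internet. Run from a VM inside the VNet.
import ipaddress
import socket

def resolves_privately(fqdn: str) -> bool:
    """True if the hostname resolves to a private (RFC 1918/loopback) address."""
    ip = socket.gethostbyname(fqdn)
    return ipaddress.ip_address(ip).is_private

if __name__ == "__main__":
    fqdn = "myapp.example-env.azurecontainerapps.io"  # placeholder FQDN
    print(f"{fqdn} resolves privately: {resolves_privately(fqdn)}")
```

From a machine outside the VNet the same FQDN should either fail to resolve or be unreachable, matching the behaviour described above.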
The lack of connectivity when opening the application URL confirms this. If you prefer using the CLI, you can find further guidance on enabling private endpoints at Use a private endpoint with an Azure Container Apps environment (preview).

Adding container apps as a private origin for Azure Front Door
With private endpoints, you can securely connect them to Azure Front Door through Private Link as well. The current process involves CLI commands that guide you in enabling an origin for Private Link and approving the private endpoint connection. Once approved, Azure Front Door assigns a private IP address from a managed regional private network, and you can verify the connectivity between your container app and Azure Front Door. For a detailed tutorial, please navigate to Create a private link to an Azure Container App with Azure Front Door (preview).

Troubleshooting
Having trouble testing the private endpoints? After creating a private endpoint for a container app, you can build and deploy a virtual machine to test the private connection. With no public inbound ports, this virtual machine would be associated with the virtual network defined during creation of the private endpoint. After creating the virtual machine, you can connect via Bastion and verify the private connectivity. You may find outlined instructions at Verify the private endpoint connection.

Conclusion
The public preview of private endpoints and private connectivity to Azure Front Door for workload profile environments is a long-awaited feature in Azure Container Apps. We encourage you to implement private endpoints for enhanced security and look forward to your feedback on this experience at our GitHub page.
Additional Resources
To learn more, please visit the following links to official documentation:
- Networking in Azure Container Apps environment - Private Endpoints
- Use a private endpoint with an Azure Container Apps environment
- Create a private link to an Azure Container App with Azure Front Door (preview)
- What is a private endpoint?
- What is Azure Private Link?

Azure at KubeCon India 2024 | Delhi, India – 11-12 December
Welcome to KubeCon + CloudNativeCon India 2024! We're excited to be part of the inaugural event, where we'll highlight the newest advancements in Azure and Azure Kubernetes Service (AKS) and engage with the vibrant cloud-native community in India. We are pleased to announce several new capabilities in Azure Kubernetes Service focused on AI app development, ease of use, and features to enhance security, scalability, and networking. Here are the key highlights:

Simplifying AI Apps Development
AI is becoming increasingly crucial as it empowers organizations to leverage cutting-edge technologies to drive innovation and improve their products and services. By providing intuitive tools and extensions, we aim to make AI more accessible to developers, enabling them to deploy and manage AI models with ease.
- The AI toolchain operator (KAITO) managed add-on is now available in the AKS Visual Studio Code extension. This add-on simplifies AI inference development with an intuitive and visually engaging UI, allowing customers to deploy open-source AI models to their AKS cluster directly from VS Code.
- AKS plugins in GitHub Copilot for Azure enable various tasks related to AKS directly from the GitHub Copilot Chat view, including creating an AKS cluster, deploying a manifest, and generating kubectl commands.
- Easily specify your GPU driver type for your Windows GPU node pools to ensure workload and driver compatibility and run compute-intensive Kubernetes workloads.

Enhanced Security, Scalability, and Networking
Security, scalability, and networking are critical for ensuring the robustness and reliability of Kubernetes deployments. We provide users with the tools they need to maintain high availability and secure their environments, and are rolling out features that improve disaster recovery, protection, and network management.
- A new managed solution in AKS restricts IMDS endpoint access for customer pods, enhancing overall security.
- Vaulted backups for AKS enable cross-region disaster recovery, long-term retention, and enhanced security with immutable backups through Azure Backup.
- Support for private ingress on cluster creation or through the API grants users more granular control over ingress controller configuration.

Ease of use
AKS is also introducing new capabilities to streamline the user experience and reduce the complexity of managing Kubernetes environments. This includes simplifying notifications, defaulting to parallel image pulls in AKS 1.31, improving the UI for automated deployments, and enhancing logging capabilities to help users save time.
- AKS Communication Manager simplifies notifications for all AKS maintenance tasks, providing timely alerts on event triggers and outcomes.
- Enhanced AKS logs with Kubernetes metadata and log filtering provide richer context and improved visibility into workloads.

We're excited to meet up with you at KubeCon + CloudNativeCon
We hope you're as excited as we are about the first-ever KubeCon + CloudNativeCon India 2024. Azure and Kubernetes have some exciting innovations to offer, and we're eager to share them with you. Be sure to connect with our team on site:

Don't miss the keynote with a Microsoft speaker: On Thursday 12 December 2024 at 10:00 AM IST, Lachlan Evenson will deliver a keynote on how to get started in the open-source and Kubernetes community.

Check out these sessions by Microsoft engineers:
- 11 Dec 2024, 5:40pm - 6:15pm IST: Effortless Clustering: Rethinking ClusterAPI with Systemd-Sysext
- 12 Dec 2024, 11:30am - 12:55pm IST: Flatcar Container Linux Deep Dive: Deploying, Managing and Automating Workloads Securely

Visit the Microsoft booth (G1): Stop by our booth to watch live demos, learn from experts, ask questions, and more.
Demos at Booth G1:

11 Dec 2024
- Running LLMs on Azure Kubernetes Service with KAITO
- Partner demo: Cost optimization for AI on Kubernetes with CAST AI
- Persistent storage options for Kubernetes deployment
- Azure Linux for AKS
- Azure Backup for AKS
- Managed Prometheus and Grafana for AKS
- Application security in Kubernetes
- Enhance the security of your container images with Continuous Patching
- Partner demo: Ultra-fast testing with HyperExecute on AKS

12 Dec 2024
- End-to-end developer experience with AKS Automatic
- Application Gateway for Containers
- Partner demo: Choreo Internal Developer Platform
- Azure Container Networking Services
- Workload Identity at scale with SpinKube on AKS
- Securing AKS deployments with Azure Firewall
- AKS add-ons: KEDA, Dapr, NAP and more

We look forward to connecting with you and hearing your feedback and suggestions. You can also follow us on X for more updates and news. Happy KubeCon + CloudNativeCon!

Introducing Serverless GPUs on Azure Container Apps
We're excited to announce the public preview of Azure Container Apps serverless GPUs, accelerated by NVIDIA. This feature provides customers with NVIDIA A100 GPUs and NVIDIA T4 GPUs in a serverless environment, enabling effortless scaling and flexibility for real-time custom model inferencing and other machine learning tasks. Serverless GPUs accelerate the speed of your AI development team by allowing you to focus on your core AI code and less on managing infrastructure when using NVIDIA accelerated computing. They provide an excellent middle-layer option between Azure AI Model Catalog's serverless APIs and hosting models on managed compute. They also provide full data governance, as your data never leaves the boundaries of your container, while still offering a managed, serverless platform from which to build your applications. Serverless GPUs are designed to meet the growing demands of modern applications by providing powerful NVIDIA accelerated computing resources without the need for dedicated infrastructure management.

"Azure Container Apps' serverless GPU offering is a leap forward for AI workloads. Serverless NVIDIA GPUs are well suited for a wide array of AI workloads from real-time inferencing scenarios with custom models to fine-tuning. NVIDIA is also working with Microsoft to bring NVIDIA NIM microservices to Azure Container Apps to optimize AI inference performance." - Dave Salvator, Director, Accelerated Computing Products, NVIDIA

Key benefits of serverless GPUs
- Scale-to-zero GPUs: Support for serverless scaling of NVIDIA A100 and T4 GPUs.
- Per-second billing: Pay only for the GPU compute you use.
- Built-in data governance: Your data never leaves the container boundary.
- Flexible compute options: Choose between NVIDIA A100 and T4 GPUs.
- Middle layer for AI development: Bring your own model on a managed, serverless compute platform.

Scenarios
Whether you choose to use NVIDIA A100 or T4 GPUs will depend on the types of apps you're creating.
The following are a couple of example scenarios. For each scenario with serverless GPUs, you pay only for the compute you use with per-second billing, and your apps will automatically scale in and out from zero to meet the demand.

NVIDIA T4
- Real-time and batch inferencing: Use custom open-source models with fast startup times, automatic scaling, and a per-second billing model. Serverless GPUs are ideal for dynamic applications that don't already have a serverless API in the model catalog.

NVIDIA A100
- Compute-intensive machine learning scenarios: Significantly speed up applications that implement fine-tuned custom generative AI models, deep learning, or neural networks.
- High-performance computing (HPC) and data analytics: Applications that require complex calculations or simulations, such as scientific computing and financial modeling, as well as accelerated data processing and analysis across massive datasets.

Get started with serverless GPUs
Serverless GPUs are now available for workload profile environments in the West US 3 and Australia East regions, with more regions to come. You will need to have quota enabled on your subscription in order to use serverless GPUs. By default, all Microsoft Enterprise Agreement customers will have one quota. If additional quota is needed, please request it here.

Note: In order to achieve the best performance with serverless GPUs, use an Azure Container Registry (ACR) with artifact streaming enabled for your image tag. Follow the steps here to enable artifact streaming on your ACR.

From the portal, you can select to enable GPUs for your Consumption app in the container tab when creating your Container App or your Container App Job. You can also add a new consumption GPU workload profile to your existing Container App environment through the workload profiles UX in the portal or through the CLI commands for managing workload profiles.
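For the CLI route just mentioned, a small helper can assemble the `az containerapp env workload-profile add` invocation. This is a sketch only: the profile type string below is an assumption about the serverless A100 SKU name, so confirm the valid values for your region with `az containerapp env workload-profile list-supported` before relying on it.

```python
# Sketch: build and run the az CLI call that adds a serverless GPU
# workload profile to an existing Container Apps environment.
# "Consumption-GPU-NC24-A100" is an assumed profile type; verify the
# actual names with: az containerapp env workload-profile list-supported
import subprocess

def gpu_profile_cmd(env: str, rg: str,
                    profile_type: str = "Consumption-GPU-NC24-A100") -> list[str]:
    """Return the argv for adding a consumption GPU workload profile."""
    return [
        "az", "containerapp", "env", "workload-profile", "add",
        "--name", env,
        "--resource-group", rg,
        "--workload-profile-name", "serverless-gpu",
        "--workload-profile-type", profile_type,
    ]

if __name__ == "__main__":
    # Requires the Azure CLI with the containerapp extension installed.
    subprocess.run(gpu_profile_cmd("my-aca-env", "my-rg"), check=True)
```

After the profile is added, apps assigned to it scale from zero and are billed per second of GPU use, as described above.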
Deploy a sample Stable Diffusion app
To try out serverless GPUs, you can use the Stable Diffusion image, which is provided as a quickstart during the container app create experience:
1. In the container tab, select the Use quickstart image box.
2. In the quickstart image dropdown, select GPU hello world container.
If you wish to pull the GPU container image into your own ACR to enable artifact streaming for improved performance, or if you wish to manually enter the image, you can find the image at mcr.microsoft.com/k8se/gpu-quickstart:latest. For full steps on using your own image with serverless GPUs, see the tutorial on using serverless GPUs in Azure Container Apps.

Learn more about serverless GPUs
With serverless GPUs, Azure Container Apps now simplifies the development of your AI applications by providing scale-to-zero compute, pay-as-you-go pricing, reduced infrastructure management, and more. To learn more, visit:
- Using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn
- Tutorial: Generate images using serverless GPUs in Azure Container Apps (preview) | Microsoft Learn