AKS enabled by Azure Arc: Powering AI Applications from Cloud to Edge [Ignite 2025]
A New Era for Hybrid Kubernetes and AI

Microsoft Ignite 2025 continues to accelerate Azure's hybrid vision, extending cloud-native innovation into datacenters, factories, retail sites, and remote, fully disconnected environments. This year's announcements expand the capabilities of AKS enabled by Azure Arc, making it the most versatile and secure platform for deploying modern applications and AI workloads across any environment. AKS Arc now underpins Azure's hybrid and edge strategy, and increasingly its hybrid AI strategy, by delivering consistent operations, strong security, and flexible deployment models for distributed applications.

TL;DR: New AKS Arc offerings and features in 2025

- AKS on Azure Local Disconnected Operations (public preview)
- AKS on Azure Local Small Form Factor Bare-Metal (private preview)
- Improvements to AKS on Azure Local Medium, including lifecycle, portability, additional GPU support, and expanded hardware support
- Improvements to AKS on Windows Server: improved platform reliability, security, and consistency through fixes to image packaging, dependency handling, node/agent synchronization, certificate and key management, error detection, telemetry, and cleanup of stale resources
- 2-Node High Availability for AKS Arc at the edge (private preview)
- AI Foundry Local integration for offline/hybrid AI development
- KAITO on AKS Arc (public preview) for hybrid/edge model deployment
- Edge RAG on Azure Local Medium
- Arc Gateway for AKS Arc (public preview)
- KMS v2 for secrets encryption on AKS on Azure Local Medium
- Expanded GPU support for AKS Arc on Azure Local (RTX 6000 Ada GA, NVIDIA L-series preview)
- AKS Container Apps on Azure Local Medium (public preview)
- AKS Edge Essentials release for improved stability and offline operations
- Arc-enabled Azure Monitor Pipeline, Workload Identity Federation, and Azure Container Storage enhancements
- Azure Linux 3.0 support, Key Vault Secret Store extension

AKS on Azure Local: Evolving the Hybrid Managed Kubernetes Platform

This year, AKS on Azure Local introduces several major enhancements that broaden where and how customers can deploy AKS as their managed Kubernetes platform at the edge.

Disconnected Operations (public preview)
AKS on Azure Local can now operate entirely offline, supporting customers in sovereign, regulated, or isolated environments. Clusters can be deployed, managed, and updated without continuous Azure connectivity, syncing only when connectivity is temporarily restored.

Small Form Factor Bare-Metal (private preview)
The new SFF edition brings AKS to compact industrial PCs and constrained retail or factory environments. It delivers bare-metal performance in a much smaller footprint, including optional GPU support for edge inferencing.

Improvements to Azure Local Medium
Azure Local Medium continues to mature with expanded hardware compatibility, improved lifecycle reliability, and better workload portability across cloud and local deployments, enabling enterprises to standardize on AKS across all tiers of infrastructure.

2-Node High Availability for the Edge
For space- and cost-constrained environments, AKS Arc can support HA clusters with only two nodes, enabling robust production workloads in places where traditional 3-node clusters are not feasible.

Operational Excellence with AKS Arc

Enterprises operating distributed Kubernetes fleets will benefit from new governance and connectivity capabilities.

AKS Arc Gateway (public preview)
Arc Gateway simplifies hybrid connectivity by streamlining cluster onboarding and reducing the number of required firewall rules. This creates a more secure and operationally efficient pattern for managing large fleets of Arc-enabled clusters.

KMS v2 for Kubernetes Secrets Encryption at Rest in etcd
KMS v2 enhances Kubernetes secret encryption for hybrid and on-premises clusters, delivering improved reliability, stronger security boundaries, and consistency with Azure's cloud-native cryptography approach.
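To make the Arc Gateway onboarding pattern concrete, here is a minimal sketch of connecting a cluster through a pre-created gateway resource. Every name here is a hypothetical placeholder, and the `--gateway-resource-id` parameter reflects the gateway preview at the time of writing; consult the current Arc gateway documentation before relying on it:

```shell
# Assumes an Arc gateway resource already exists (preview feature).
# All subscription, group, gateway, and cluster names are placeholders.
GATEWAY_ID="/subscriptions/<sub-id>/resourceGroups/rg-arc-demo/providers/Microsoft.HybridCompute/gateways/arc-gateway-demo"

# Connect the cluster through the gateway, shrinking the set of
# endpoints that must be allowed through the firewall.
az connectedk8s connect \
  --name "aks-arc-cluster" \
  --resource-group "rg-arc-demo" \
  --gateway-resource-id "${GATEWAY_ID}"
```

The design benefit is that outbound traffic from many clusters funnels through one audited gateway endpoint instead of each cluster needing its own broad allow-list.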
AKS as the Hybrid AI Application Platform

AI is the defining theme of Ignite 2025, and AKS enabled by Azure Arc is now the foundation for deploying AI where the data resides. Organizations increasingly need to run AI models in datacenters, factories, field environments, and sovereign locations, and this year's updates establish AKS Arc as Azure's platform for distributed and offline AI workloads.

AI Foundry Local: Build and Fine-Tune AI Models Anywhere
AI Foundry Local brings Azure AI Foundry's core capabilities (the curated model catalog, development tools, templates, and fine-tuning support) into customer environments. It allows developers to run foundation models locally using optimized execution paths for GPUs, NPUs, and CPUs; fine-tune models with LoRA/QLoRA in regulated or offline scenarios; and package model artifacts for deployment on AKS clusters. This enables a complete hybrid AI development loop that works both online and fully disconnected.

KAITO Public Preview on AKS Arc
KAITO automates model serving across cloud, datacenter, and edge. Now available on AKS Arc, it provides one-click packaging, optimization, and deployment of models built in AI Foundry Local. Customers can run ONNX, Hugging Face, or custom models with edge-aware performance optimization across diverse hardware, including CPU-only and GPU-accelerated nodes.

Expanded GPU Capabilities
Hybrid AI workloads benefit from expanded GPU options, including general availability of the NVIDIA RTX 6000 Ada, preview support for NVIDIA L-series GPUs, and new GPU Partitioning (GPU-PV) support for efficient resource utilization. These capabilities make it possible to run high-performance inferencing and training workloads across a wide range of hybrid deployment scenarios.

RAG on Azure Local: Bring Generative AI to On-Premises Data
RAG (Retrieval-Augmented Generation) on Azure Local enables organizations to ground AI in their own on-premises data without moving information to the cloud. Delivered as a first-party Azure Arc extension, it provides an integrated retrieval pipeline for ingesting, indexing, and querying enterprise content stored in datacenters or edge locations. With support for hybrid search, multi-modal data, evaluation tooling, and responsible AI controls, organizations can build RAG applications that remain fully compliant with data sovereignty requirements while reducing latency and improving accuracy. By running the full RAG workflow locally, from retrieval to generation, customers can create intelligent applications that leverage proprietary documents, images, and other unstructured data directly within their secure environments.

Expanding Application Capabilities at the Edge

AKS Container Apps on the Edge
A major milestone this year is the public preview of ACA on the edge, enabling teams to bring the simplicity of Azure Container Apps to Azure Local Medium. Developers can deploy AI-powered microservices, inference endpoints, and event-driven applications at the edge using the same ACA programming model used in Azure.

AKS Edge Essentials
The latest release improves cluster stability, enhances offline lifecycle operations, and strengthens both Linux and Windows support, making it easier to operate AKS at scale in constrained or intermittently connected environments.

Enhanced Storage, Telemetry, and Security for Hybrid AI

Distributed AI workloads require robust identity, storage, and observability patterns, and Ignite brings major updates in all three areas. The Arc-enabled Azure Monitor Pipeline improves telemetry ingestion across disconnected or segmented networks, caching data locally and syncing to Azure when connectivity is available. Workload Identity Federation for Arc enables secure, secret-less identity for workloads running at the edge. And Azure Container Storage enabled by Arc, now expanded for AKS Arc clusters, provides a high-performance persistent storage layer suited for vector stores, embedding caches, and cloud ingest and mirror scenarios.

Conclusion

Ignite 2025 represents a major step forward for AKS enabled by Azure Arc as both a hybrid Kubernetes platform and a hybrid AI application platform. With disconnected operations, edge-native Container Apps, improved GPU acceleration, KAITO for unified model serving, AI Foundry Local for offline model development, and a fully consistent operational model across cloud, datacenter, and edge, AKS Arc now enables organizations to run their most critical cloud-native and AI workloads anywhere they operate. We look forward to continuing to support customers as they build the next generation of hybrid and edge AI applications.

Workload Identity support for Azure Arc-enabled Kubernetes clusters now Generally Available!
We’re excited to announce that Workload Identity support for Azure Arc-enabled Kubernetes is now Generally Available (GA)! This milestone brings a secure way for applications on Arc-connected clusters running outside of Azure to authenticate to Azure services without managing secrets.

Traditionally, workloads outside Azure relied on static credentials or certificates to access Azure resources like Event Hubs, Azure Key Vault, and Azure Storage. Managing these secrets introduces operational overhead and security risks. With Microsoft Entra Workload ID federation, your Kubernetes workloads can now:

- Authenticate securely using OpenID Connect (OIDC) without storing secrets.
- Exchange trusted tokens for Azure access tokens to interact with services securely.

This means no more manual secret rotation and a reduced attack surface, all while maintaining compliance and governance.

How It Works

The integration uses the Kubernetes-native Service Account Token Volume Projection construct and aligns with Kubernetes best practices for identity federation. The process involves a few concise steps:

1. Enable the OIDC issuer and workload identity on your Arc-enabled cluster using the Azure CLI:

   az connectedk8s connect --name "${CLUSTER_NAME}" --resource-group "${RESOURCE_GROUP}" --enable-oidc-issuer --enable-workload-identity

2. Configure a user-assigned managed identity in Azure to trust tokens from your Azure Arc-enabled Kubernetes cluster's OIDC issuer URL. This involves creating a federated identity credential that links the Azure identity with the Kubernetes service account.
3. Applications running in pods, using the annotated Kubernetes service account, can then request Azure tokens via Microsoft Entra ID and access the resources they're authorized for (e.g., Azure Storage, Azure Key Vault).
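As a sketch of the federation step, the credential can be created with the Azure CLI. This assumes a user-assigned managed identity already exists; the resource group, identity, namespace, and service account names below are hypothetical placeholders:

```shell
# Hypothetical names; assumes the managed identity and the Arc-enabled
# cluster (with OIDC issuer enabled) already exist.
RESOURCE_GROUP="rg-arc-demo"
CLUSTER_NAME="aks-arc-cluster"
IDENTITY_NAME="wi-demo-identity"
SA_NAMESPACE="workload-ns"
SA_NAME="workload-sa"

# Read the cluster's OIDC issuer URL exposed by Arc.
OIDC_ISSUER="$(az connectedk8s show \
  --name "${CLUSTER_NAME}" --resource-group "${RESOURCE_GROUP}" \
  --query "oidcIssuerProfile.issuerUrl" -o tsv)"

# Federate the managed identity with the Kubernetes service account.
az identity federated-credential create \
  --name "arc-federated-credential" \
  --identity-name "${IDENTITY_NAME}" \
  --resource-group "${RESOURCE_GROUP}" \
  --issuer "${OIDC_ISSUER}" \
  --subject "system:serviceaccount:${SA_NAMESPACE}:${SA_NAME}" \
  --audiences "api://AzureADTokenExchange"

# Annotate the service account with the identity's client ID so pods
# that use it receive a projected token they can exchange for Azure tokens.
CLIENT_ID="$(az identity show \
  --name "${IDENTITY_NAME}" --resource-group "${RESOURCE_GROUP}" \
  --query clientId -o tsv)"
kubectl annotate serviceaccount "${SA_NAME}" -n "${SA_NAMESPACE}" \
  "azure.workload.identity/client-id=${CLIENT_ID}" --overwrite
```

The key design point is the `--subject` value: it pins the trust to one specific service account in one namespace, so only pods running as that identity can exchange tokens.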
Supported platforms

We support a broad ecosystem of distributions, including:

- Red Hat OpenShift
- Rancher K3s
- AKS-Arc (in preview)
- VMware Tanzu Kubernetes Grid (TKGm)

So, whether you're running clusters in retail stores, manufacturing plants, or remote edge sites, you can connect them to Azure Arc and enable secure identity federation for your workloads to access Azure services.

Ready to get started? Follow our step-by-step guide on Deploying and Configuring Workload Identity Federation in Azure Arc-enabled Kubernetes to secure your edge workloads today!

Public Preview: Multicloud connector support for Google Cloud
We are excited to announce that the Multicloud connector is now in preview for GCP environments. With the Multicloud connector, you can easily connect your GCP projects and AWS accounts to Azure with the following capabilities:

- Inventory: Get an up-to-date, comprehensive view of your cloud assets across different cloud providers. Now supporting GCP services (Compute VM, GKE, Storage, Functions, and more), you can gain insights into your Azure, AWS, and GCP environments in a single pane of glass. The agentless inventory solution periodically scans your GCP environment and projects the discovered GCP resources as Azure resources, including all of the GCP metadata such as GCP labels. You can then easily view, query, and tag these resources from a centralized location.
- Azure Arc onboarding: Automatically Arc-enable your existing and future GCP VMs so you can leverage Azure and Microsoft services, like Azure Monitor and Microsoft Defender for Cloud. Through the Multicloud connector, the Azure Arc agent is automatically installed on machines that meet the prerequisites.

How do I get started?

You can easily set up the Multicloud connector by following our getting started guide, which provides step-by-step instructions on creating the connector and setting up the permissions in GCP, which leverages OIDC federation.

What can I do after my connector is set up?

With the inventory offering, you can see and query all of your GCP and Azure resources via Azure Resource Graph. For Azure Arc onboarding, you can apply Azure management services to your GCP VMs that are Arc-enabled. Learn more here.

We are very excited about the expanded support for Google Cloud. Set up your Multicloud connector now for free! Please let us know if you have any questions by posting on the Azure Arc forum or via Microsoft support. Here is the multicloud capabilities technical documentation.
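Since discovered GCP resources are projected into Azure Resource Graph, they can be queried with the same KQL you use for native Azure resources. A minimal sketch, assuming the `resource-graph` CLI extension is installed; the type filter is illustrative, as the exact projected resource type names depend on the connector:

```shell
# List discovered resources whose type was projected by the
# multicloud connector (filter string is an assumption, adjust to
# the actual projected types in your tenant).
az graph query -q "
  resources
  | where type contains 'gcp'
  | project name, type, location, tags
  | limit 20
" --output table
```

Because the results are ordinary ARG rows, the same query works from the Azure portal's Resource Graph Explorer, and tags applied there show up alongside the imported GCP labels.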
Check out the Ignite session here!

Transforming City Operations: How Villa Park and DataON Deliver Real-Time Decisions with Edge RAG
In today’s connected world, customers expect instant, context-rich interactions, even in environments where cloud connectivity isn’t guaranteed. That’s where Retrieval-Augmented Generation (RAG) at the edge comes in. Edge RAG, enabled by Azure Arc, combines local data retrieval with intelligent reasoning to power conversational experiences that are fast, secure, and deeply personalized. Together with our Edge Infrastructure partners, we’re applying this technology to transform customer engagement, enabling real-time insights, autonomous workflows, and resilient operations across industries.

Edge RAG is a core part of our Adaptive Cloud pillar for Edge AI, ensuring flexibility, resilience, and intelligence wherever customers operate. It uses Foundry language models and, together with Foundry Local, shapes Microsoft’s Foundry Anywhere commitment.

Today we’re excited to announce a public preview refresh of Edge RAG at Ignite 2025, bringing new capabilities to accelerate adoption and unlock even more value at the edge:

- Production-class LazyGraph RAG with industry-leading RAG inferencing quality
- High-fidelity parsing: OCR-enabled support for documents, tables, and images
- SharePoint Server integration (limited access; to register, click here)
- Multimodal search with image retrieval and image-rich outputs
- Chat UI upgrades and performance improvements
- Fully disconnected scenarios enabled by Azure Local for Disconnected Operations

The new features in this release are informed by our engagement with the City of Villa Park, in partnership with DataON, where we’ve applied Edge RAG to improve operational efficiency and deliver smarter, real-time services for urban environments. Together, we piloted a compliance assistant agentic workflow with OCR and LLM integration.
Villa Park: A Blueprint for Smart Cities

The City of Villa Park, California, faced challenges common to many municipalities: complex zoning regulations that slowed approvals, lengthy CEQA compliance processes requiring deep environmental analysis, and backlogs in accessory dwelling unit (ADU) permit reviews. Working with DataON, a Microsoft partner, and Microsoft, Villa Park deployed Edge RAG on Azure Local, creating a resilient, intelligent planning system that operates seamlessly, even offline. Environmental assessments that once required days are now completed in minutes.

The partnership between the City of Villa Park and DataON is a standout example of how municipalities and technology providers can co-innovate to solve real-world challenges. Ray Pascua, Villa Park’s Planning Manager, has led this transformation:

“Having the opportunity to utilize AI to perform research and retrieve large datasets specifically from the California Environmental Quality Act (CEQA) Guidelines (Statutory/Categorical Exemptions), and State law relative to Accessory Dwelling Units (ADUs), has been an overall positive experience. AI algorithm is a revolutionary medium that can streamline and improve workflow efficiencies by automating routine and repetitive planning-related tasks and analysis, and would be of particular value and benefit to local government agencies that have limited personnel and resources. While this cutting-edge technological tool is still evolving and has room to improve accuracy and speed, it certainly has a place in the realm of City Planning, as well as other land use development fields and disciplines.”

Howard Lo, VP of Sales & Marketing at DataON, shares:

“Our collaboration with Microsoft and the City of Villa Park showcases Azure Local's transformative potential for municipal government AI.
As a leading Azure Local partner, DataON has optimized our infrastructure to run Microsoft's Edge RAG solution, enabling Villa Park to address real planning challenges while maintaining data control and security. Working directly with Microsoft's engineering team and a forward-thinking city partner, we've proven that Azure Local delivers practical AI value for government operations. We're excited to help other municipalities achieve similar results on our Azure Local platform.”

Villa Park’s deployment leverages DataON’s Azure Local-certified hardware, Microsoft’s Arc-enabled AI stack, and the expertise of city planners to deliver:

- End-to-end digital workflows for CEQA, zoning, and ADU permitting
- Conversational AI interfaces that empower staff to ask questions and get cited, regulatory-compliant answers instantly
- Operational resilience with full offline support, ensuring continuity even during network outages
- A replicable model for other municipalities seeking to modernize planning and compliance

About DataON

DataON’s edge infrastructure, combined with Azure Local and Edge RAG, forms the core of this transformation. DataON provides robust hardware and delivers deployment, integration, and training services, ensuring a seamless Azure Local experience. Their close support helps organizations quickly adopt and confidently manage edge solutions, resulting in secure, high-performance, and scalable deployments for multi-site environments.

Let’s take a closer look at the features we’re announcing today:

Deep Search for Complex Reasoning with LazyGraph RAG

With the Ignite release, Edge RAG introduces Deep Search powered by LazyGraph RAG, a dynamic graph-based retrieval method that enables advanced, multi-document reasoning. This means Villa Park planners can now ask complex, multi-part questions that span zoning, CEQA, and ADU regulations, and Edge RAG will synthesize answers by connecting information from multiple sources in real time.
Image 1: Deep Search capabilities on Edge RAG

The system incrementally explores only the most relevant document chunks, reducing compute cost and latency while delivering comprehensive, cited responses. For Villa Park, this translates to resolving intricate regulatory scenarios, such as “What are the environmental constraints for ADUs in zones X, Y, and Z?”, with answers that reference and link multiple regulatory documents and historical decisions, all in a single query.

Advanced Document Parsing for Structured Data

Edge RAG’s advanced document parsing, introduced in this release, transforms how Villa Park’s planning documents are utilized. During data ingestion, the system now extracts not only free-form text but also tables, images, headings, and rich metadata. This includes full indexing of multi-page tables, column headers, and section context, with each chunk annotated by page number, section heading, and table index. As a result, planners can search for specific permit statistics, environmental impact scores, or compliance tables and retrieve results directly from structured data within city documents, enabling precise, source-attributed answers that were previously difficult or impossible to obtain.

Image 2: Advanced document parsing on Edge RAG

Enhanced Chat Experience

The new model-only chat mode allows staff to interact directly with the language model, bypassing contextual data for general queries or troubleshooting. This flexibility enables Villa Park staff to quickly switch between knowledge-based chat grounded in city data and model-only chat for training, testing, or handling ambiguous queries, streamlining both day-to-day operations and the onboarding of new team members.

Additional Edge RAG Preview Refresh Updates

We also improved Edge RAG based on customer feedback, adding these features:

- Agentic RAG for autonomous workflows: systems can reason and act at the edge with less manual work.
- Full offline support: operates and accesses data even without a network.
- SharePoint integration (private preview): users will also be able to query Edge RAG directly over SharePoint, enabling enhanced information retrieval and analysis within their workflows.

Image 3: SharePoint as a data source on Edge RAG

- Performance optimizations: query responses for every search type, excluding Deep Search, are now delivered in under 15 seconds on legacy A2 and A16 GPUs, a fivefold speed boost. Additionally, streaming image processing throughput has increased one hundredfold, allowing 600 images to be handled continuously in just 36 seconds.

Since late May, Edge RAG has supported “bring your own model” (BYOM), allowing organizations to deploy their preferred language models, such as OpenAI GPT-4o or other advanced models, directly on their own infrastructure. This capability enables advanced features like Deep Search and hybrid multimodal search, while ensuring that sensitive data remains on-premises. BYOM empowers organizations to tailor Edge RAG’s AI capabilities to their unique compliance, performance, or customization requirements, maintaining full control over both data and model selection.

Security, Compliance, and Sustainability

Edge RAG is built for trust: data sovereignty ensures sensitive data remains on-premises, zero-trust architecture integrates with the Microsoft security stack, and compliance-ready design supports municipal, state, and industry regulations. Sustainability is also a priority, with energy-efficient edge hardware reducing carbon footprint.

Looking Ahead: The Future of Edge Intelligence

Edge RAG enables flexible edge intelligence deployment in various environments. Its adaptable design handles dynamic workloads, supporting frontline teams as operations evolve.
Instead of just speeding up processes or boosting connectivity, Edge RAG fosters innovative applications and smarter decision-making, helping organizations stay agile amid changing technology and business needs.

Resources

Explore these resources to learn more about Edge RAG, deployment best practices, customer stories, and technical documentation:

- Product documentation: Edge RAG Preview, enabled by Azure Arc Documentation | Microsoft Learn
- Get started: Quickstart: Install Edge RAG Preview enabled by Azure Arc
- Release notes: What's New in Edge RAG – Azure Arc Tech Talk
- Distribution list: EdgeRAGTalk@microsoft.com (join the conversation, ask questions, and connect with the Edge RAG team)

Recommended Ignite sessions:

- BRK147: What’s new in Azure Local
- ODSP1467: Unlock your IT potential with Azure Local & DataON Plus Solutions
- BRK199: From cloud to edge: Building and shipping Edge AI apps with Foundry

Azure IoT Operations 2510 Now Generally Available
Introduction

We’re thrilled to announce the general availability of Azure IoT Operations 2510, the latest evolution of the adaptive cloud approach for AI in industrial and large-scale commercial IoT. With this release, organizations can unlock new levels of scalability, security, and interoperability, empowering teams to seamlessly connect, manage, and analyze data from edge to cloud.

What is Azure IoT Operations?

Azure IoT Operations is more than an edge-to-cloud data plane; it’s the foundation for AI in physical environments, enabling intelligent systems to perceive, reason, and act in the real world. Built on Arc-enabled Kubernetes clusters, Azure IoT Operations unifies operational and business data across distributed environments, eliminating silos and delivering repeatability and scalability. By extending familiar Azure management concepts to physical sites, AIO creates an AI-ready infrastructure that supports autonomous, adaptive operations at scale. This approach bridges information technology (IT), operational technology (OT), and data domains, empowering customers to discover, collect, process, and send data using open standards while laying the groundwork for self-optimizing environments where AI agents and human supervisors collaborate seamlessly.

We've put together a quick demo video showcasing the key features of this 2510 release. Watch below to discover how Azure IoT Operations' modular and scalable data services empower IT, OT, and developers.

What’s New in Azure IoT Operations 2510?

- Management actions: powerful management actions put you in control of processes and asset configurations, making operations simpler and smarter.
- WebAssembly (Wasm) data graphs: Wasm-powered data graphs for advanced edge processing, delivering fast, modular analytics and business logic right where your data lives.
- New connectors: expanded connector options now include OPC UA, ONVIF, Media, REST/HTTP, and Server-Sent Events (SSE), opening the door to richer integrations across diverse industrial and IT systems.
- OpenTelemetry (OTel) endpoints: data flows now support sending data directly to OpenTelemetry collectors, integrating device and system telemetry into your existing observability infrastructure.
- Improved observability: real-time health status for assets gives you unmatched visibility and confidence in your IoT ecosystem.
- Reusable connector templates: streamline connector configuration and deployment across clusters.
- Device support in Azure Device Registry: Azure Device Registry (ADR) now treats devices as first-class resources within ADR namespaces, enabling logical isolation and role-based access control at scale.
- Automatic device and asset discovery and onboarding: Akri-powered discovery continuously detects devices and industrial assets on the network, then automatically provisions and onboards them (including creating the right connector instances) so telemetry starts flowing with minimal manual setup.
- MQTT data persistence: data can now be persisted to disk, ensuring durability across broker restarts.
- X.509 auth in the MQTT broker: the broker now supports X.509 authentication backed by Azure Device Registry.
- Flexible RBAC: built-in roles and custom role definitions simplify and secure access management for AIO resources.

Customers and partners

Chevron, through its Facilities and Operations of the Future initiative, deployed Azure IoT Operations with Azure Arc to manage edge-to-cloud workloads across remote oil and gas sites. With a single management plane, the strategy unifies control over thousands of distributed sensors, cameras, robots, and drones. Real-time monitoring and AI-enabled anomaly detection not only enhance operational efficiency but also significantly improve worker safety by reducing routine inspections and enabling remote issue mitigation.
This reuse of a global, AI-ready architecture positions Chevron to deliver more reliable, cleaner energy. [microsoft.com]

Husqvarna implemented Azure IoT Operations across its global manufacturing network as part of a comprehensive strategy. This adaptive cloud approach integrates cloud, on-premises, and edge systems, preserves legacy investments, and enables real-time edge analytics. The result: data operationalization is 98% faster, imaging costs were cut in half, productivity improved, and downtime was reduced. Additionally, AI-driven capabilities like the Factory Companion powered by Azure AI empower technicians with instant, data-informed troubleshooting, shifting maintenance from reactive to predictive across sites. [microsoft.com]

Together, these success stories show how Azure IoT Operations, combined with capabilities like Azure Arc, can empower industrial leaders to advance from siloed operations to unified, intelligent systems that boost efficiency, safety, and innovation. Additionally, this year we are celebrating how our partners are integrating, co-innovating, and scaling real customer outcomes. You can learn more about our partner successes at https://aka.ms/Ignite25/DigitalOperationsBlog.

Learn more at our launch event

Join us at Microsoft Ignite to dive deeper into the latest innovations in Azure IoT Operations 2510. Our sessions will showcase real-world demos plus expert insights on how new capabilities accelerate industrial transformation. Don’t miss the chance to connect with product engineers, explore solution blueprints, and see how Azure IoT Operations lays the foundation for building and scaling physical AI.

Get Started

Ready to experience the new capabilities in Azure IoT Operations 2510?
- Explore the latest documentation and quickstart guides at https://aka.ms/AzureIoTOperations
- Connect with the Azure IoT Tech Community to share feedback and learn from peers.

Announcing General Availability of the Azure Key Vault Secret Store Extension
We are thrilled to announce that the Azure Key Vault Secret Store Extension (SSE) is now generally available for Arc-enabled on-premises Kubernetes, including clusters that you connect yourself and AKS Arc managed clusters. SSE automatically fetches secrets from an Azure Key Vault to the on-premises cluster for offline access. This means you can use Azure Key Vault to store, maintain, and rotate your secrets, even when running your Kubernetes cluster in a semi-disconnected state.

Key Benefits

- Offline access: it’s important that workloads continue even when there are temporary disruptions to connectivity. This includes regular operation as well as the ability to restart pods while there’s a connectivity interruption. With SSE, workloads can access secrets from the local Kubernetes secrets store regardless of connectivity interruptions.
- Standard K8s secret access: secrets can be accessed via volume mounting, environment variables, or the Kubernetes API. Workloads and ingress controllers do not need to be customized to access Azure Key Vault, and developers have options for how to access secrets.
- Security: SSE has limited permissions and leverages the latest Kubernetes security features so that cluster admins do not need to configure and limit permissions themselves. Secrets are critical business assets, so the Secret Store helps to secure them through isolated namespaces, role-based access control (RBAC) policies, federated identities for accessing AKV, and limited permissions for the secrets synchronizer.
- Scalability: SSE helps very large distributed deployments with hundreds or thousands of clusters work with Azure Key Vault by spreading demand over time. By effectively caching secrets in the Kubernetes secrets store, SSE can also help lower overall demand on Azure Key Vault instances.
- Low maintenance: auto-updates can keep your SSE up to date with security and performance improvements as they are released.
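Because SSE materializes Key Vault secrets as ordinary Kubernetes secrets, workloads consume them with stock Kubernetes mechanisms. A minimal sketch, assuming a sync has already created a local secret; the secret name, key, namespace, and image here are hypothetical placeholders:

```shell
# 'sse-synced-secret' stands in for the Kubernetes secret produced by an
# SSE sync; 'db-password' is an assumed key inside it.
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: sse-demo
  namespace: workload-ns
spec:
  containers:
    - name: app
      image: mcr.microsoft.com/azurelinux/base/core:3.0
      command: ["sleep", "infinity"]
      env:
        # The synced secret surfaces as a plain environment variable;
        # the workload needs no Key Vault SDK and no Azure connectivity.
        - name: DB_PASSWORD
          valueFrom:
            secretKeyRef:
              name: sse-synced-secret
              key: db-password
EOF
```

This is what makes the offline-access benefit concrete: even if Azure is unreachable when this pod restarts, the locally cached secret is still served from the cluster's own secrets store.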
Additionally, changes to configured secrets are even easier now with the new simplified configuration experience (in preview). With the simplified configuration style, a single custom resource is all that’s needed, reducing the effort and surface area for misconfigurations.

How to Use the Secret Store Extension

1. Install the Secret Store Extension on an Arc-enabled cluster or AKS managed on-premises cluster, with configuration parameters such as sync intervals.
2. Configure an Azure managed identity that has permission to read secrets from AKV and federate it with a Kubernetes service account.
3. Configure a secret provider class custom resource (CR) in the cluster with connection details for the Key Vault.
4. Configure a secret sync custom resource (CR) in the cluster for each secret to be synchronized. (Steps 3 and 4 are now even easier with the new simplified configuration!)
5. Apply the CRs in the cluster, and secrets will automatically begin syncing to the cluster.
6. Relax knowing that your configured secrets will be kept up to date on the cluster, as frequently as you want.

Try out the Secret Store Extension Today!

- Get started by visiting the SSE documentation
- Get hands-on even faster with the SSE Jumpstart Drop

Operate everywhere with AI-enhanced management and security
Farzana Rahman and Dushyant Gill from Microsoft discuss new AI-enhanced features in Azure that make it simpler to acquire, connect, and operate with Azure's management offerings across multiple clouds, on-premises, and at the edge. Key updates include enhanced management for Windows servers and virtual machines with Windows Software Assurance, Windows Server 2025 hotpatching support in Azure Update Manager, simplified hybrid environment connectivity with the Azure Arc gateway, a multicloud connector for AWS, and Log Analytics Simple Mode. Additionally, Azure Migrate Business Case helps compare total cost of ownership, and new Copilot in Azure capabilities simplify cloud management and provide intelligent recommendations.

Built a Real-Time Azure AI + AKS + DevOps Project – Looking for Feedback
Hi everyone,

I recently completed a real-time project using Microsoft Azure services to build a cloud-native healthcare monitoring system. The key services used include:

- Azure AI (Cognitive Services, OpenAI)
- Azure Kubernetes Service (AKS)
- Azure DevOps and GitHub Actions
- Azure Monitor, Key Vault, API Management, and others

The project focuses on real-time health risk prediction using simulated sensor data. It's built with containerized microservices, infrastructure as code, and end-to-end automation.

GitHub link (with source code and documentation): https://github.com/kavin3021/AI-Driven-Predictive-Healthcare-Ecosystem

I would really appreciate your feedback or suggestions to improve the solution. Thank you!