Ignite 2024: AKS enabled by Azure Arc - New Capabilities and Expanded Workload Support
Microsoft Ignite 2024 has been a showcase of innovation across the Azure ecosystem, bringing forward major advancements in AI, cloud-native applications, and hybrid cloud solutions. This year's event featured key updates, including enhancements to AKS enabled by Azure Arc, which introduced new capabilities and expanded workload support. These updates reinforce the value and versatility that AKS enabled by Azure Arc brings to organizations looking to scale and optimize their operations. With these advancements, AKS Arc continues to support seamless management, increased scalability, and enhanced workload performance across diverse infrastructures.

AKS Enabled by Azure Arc

AKS enabled by Azure Arc brings the power of Azure's managed Kubernetes service to any environment, providing consistent management and security across on-premises, edge, and multi-cloud deployments. It encompasses:

- AKS on Azure Local: A full-featured Kubernetes platform integrated with Azure Local for comprehensive container orchestration in hybrid setups. Notably, AKS on Azure Local has earned recognition as a Leader in the 2024 Gartner Magic Quadrant for Distributed Hybrid Infrastructure, underscoring Microsoft's dedication to delivering comprehensive, enterprise-ready solutions for hybrid cloud deployments.
- AKS Edge Essentials: A lightweight version designed for edge computing, ensuring operational consistency on constrained hardware.
- AKS on Azure Local Disconnected Operations: AKS is now available on Azure Local Disconnected Operations. This latest addition to the AKS enabled by Azure Arc portfolio is support for fully disconnected scenarios, allowing AKS enabled by Azure Arc to operate in air-gapped, isolated environments without the need for continuous Azure connectivity. This is crucial for organizations that require secure, self-sufficient Kubernetes operations in highly controlled or remote locations. With this support, businesses can maintain robust Kubernetes functionality while meeting stringent compliance and security standards.

Key Features and Expanded Workload Support

This year's Ignite announcements unveiled a series of public preview and GA features that enhance the capabilities of AKS enabled by Azure Arc. These advancements reflect our commitment to delivering robust, scalable solutions that meet the evolving needs of our customers. Below are the key highlights that showcase the enhanced capabilities of AKS enabled by Azure Arc:

Edge Workload

- Azure IoT Operations - enabled by Azure Arc: Available on AKS Edge Essentials (AKS-EE) and AKS on Azure Local with public preview support. Azure IoT Operations simplifies the management and scaling of IoT solutions. It provides robust support for deploying and overseeing IoT applications within Kubernetes environments, enhancing operational control and scalability. Organizations can leverage this tool to maintain seamless management of distributed IoT workloads, ensuring consistent performance and simplified scaling across diverse deployment scenarios.
- Azure Container Storage - enabled by Azure Arc: Available on both AKS Edge Essentials (AKS-EE) and AKS on Azure Local, this support enables seamless integration for persistent storage needs in Kubernetes environments. It provides scalable, reliable, and high-performance storage solutions that enhance data management and support stateful applications running in hybrid and edge deployments. This addition ensures that organizations can efficiently manage their containerized workloads with robust storage capabilities (a minimal provisioning sketch appears after this feature list).
- Azure Key Vault Secret Store extension for Kubernetes: Now available as public preview on AKS Edge Essentials and AKS on Azure Local, this extension automatically synchronizes secrets from an Azure Key Vault to an AKS enabled by Azure Arc cluster for offline access, providing essential tools for proactive monitoring and policy enforcement. It offers advanced security and compliance capabilities tailored for robust governance and regulatory adherence, ensuring that organizations can maintain compliance with industry standards and best practices while safeguarding their infrastructure.
- Azure Monitor Pipeline: The Azure Monitor pipeline is a data ingestion solution designed to provide consistent, centralized data collection for Azure Monitor. Once deployed for Azure IoT Operations on an AKS cluster enabled by Azure Arc, it enables at-scale telemetry collection and routing at the edge. The pipeline can cache data locally, syncing with the cloud when connectivity is restored, and supports segmented networks where direct data transfer to the cloud isn't possible. Built on the OpenTelemetry Collector, the pipeline's configuration includes data flows, cache properties, and destination rules defined in the data collection rule (DCR) to ensure seamless data processing and transmission to the cloud.
- Arc Workload Identity Federation: Now available as public preview on AKS Edge Essentials and AKS on Azure Local, providing secure federated identity management to enhance security for customer workloads.
- Arc Gateway: Now available as public preview for AKS Edge Essentials and AKS on Azure Local. Arc Gateway support on AKS enabled by Azure Arc enhances secure connectivity across hybrid environments, reducing the number of required firewall rules and improving security for customer deployments.
- Azure AI Video Indexer - enabled by Azure Arc: Supported on AKS Edge Essentials and AKS on Azure Local. Arc-enabled Video Indexer enables comprehensive AI-powered video analysis, including transcription, facial recognition, and object detection. It allows organizations to deploy sophisticated video processing solutions within hybrid and edge environments, ensuring efficient local data processing with improved security and minimal latency.
- MetalLB - Azure Arc Extension: Now supported on AKS Edge Essentials and AKS on Azure Local, MetalLB provides efficient load balancing capabilities. This addition enhances network resilience and optimizes traffic distribution within Kubernetes environments.
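As a rough illustration of how an application team might request a shared persistent volume of the kind Azure Container Storage enabled by Azure Arc provides, the sketch below creates a ReadWriteMany PersistentVolumeClaim with the Kubernetes Python client. The storage class name, claim name, and size are placeholders rather than values confirmed by this post; check the Azure Container Storage documentation for the class names your installation exposes.

```python
# Minimal sketch: request a shared (ReadWriteMany) volume via the Kubernetes Python client.
# The storage class name below is a placeholder; substitute the class exposed by your
# Azure Container Storage enabled by Azure Arc installation.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside the cluster

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="edge-shared-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],          # shared across pods on the cluster
        storage_class_name="edge-local-shared",  # placeholder storage class name
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```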
Comprehensive AI and Machine Learning Capabilities

- GPUs for AI Workloads: AKS enabled by Azure Arc now supports a range of GPUs tailored for AI and machine learning workloads, with both GPU partitioning and GPU passthrough virtualization support. These options enable robust performance for resource-intensive AI and machine learning workloads, allowing for efficient use of GPU resources to run complex models and data processing tasks (a brief scheduling sketch appears after this list).
- Arc-enabled Azure Machine Learning: Supported on AKS on Azure Local, bringing Azure Machine Learning capabilities for running sophisticated AI models. Businesses can leverage Azure's powerful machine learning tools seamlessly across different environments, enabling them to develop, deploy, and manage machine learning models effectively on-premises and at the edge.
- Arc-enabled Video Indexer: Extends Azure's advanced video analytics capabilities to AKS enabled by Azure Arc. Organizations can now process and analyze video content in real time, harnessing Azure's robust video AI tools to enhance video-based insights and operations. This support provides businesses with greater flexibility to conduct video analysis seamlessly in remote or hybrid environments.
- Kubernetes AI Toolchain Operator (KAITO + LoRA + QLoRA): Fully validated and supported for fine-tuning and optimizing AI models; KAITO, LoRA, and QLoRA are designed for edge deployments such as AKS on Azure Local. This combination enhances the ability to run and refine AI applications effectively in edge environments, ensuring performance and flexibility.
- Flyte Integration: Now supported on AKS on Azure Local, Flyte offers a scalable orchestration platform for managing machine learning workflows. This capability enables teams to build, execute, and manage complex AI pipelines efficiently, enhancing productivity and simplifying the workflow management process.
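To make the GPU support above concrete, here is an illustrative sketch that requests a single GPU for a pod using the standard extended resource name exposed by the NVIDIA device plugin. The pod name and image are placeholders, and the exact resource name and node targeting depend on how your GPU node pool is configured.

```python
# Illustrative sketch: request one GPU for a pod via the nvidia.com/gpu extended resource
# (exposed by the NVIDIA device plugin). Pod name and image are placeholders.
from kubernetes import client, config

config.load_kube_config()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-smoke-test"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="cuda",
                image="nvcr.io/nvidia/cuda:12.4.1-base-ubuntu22.04",  # placeholder CUDA image
                command=["nvidia-smi"],
                resources=client.V1ResourceRequirements(limits={"nvidia.com/gpu": "1"}),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```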
Enhanced Infrastructure and Operations Management

- Infrastructure as Code (IaC) with Terraform: Now supported on AKS on Azure Local for both connected and air-gapped scenarios, providing streamlined deployment capabilities through code. This support enables teams to automate and manage their Kubernetes infrastructure at scale more efficiently with Terraform.
- Anti-affinity, Pod CIDR, Taints/Labels: Available on AKS on Azure Local, these features provide enhanced infrastructure capabilities by allowing refined workload placement and advanced network configuration. Anti-affinity rules help distribute pods across different nodes to avoid single points of failure, while Pod CIDR simplifies network management by allocating IP ranges to pods. Taints and labels offer greater control over node selection, ensuring that specific workloads run on designated nodes and enhancing the overall efficiency and reliability of Kubernetes operations (a minimal spread-scheduling sketch appears just before the conclusion of this post).
- Optimized Windows Node Pool Management: AKS enabled by Azure Arc now includes the capability to enable and disable Windows node pools for clusters. This enhancement helps prevent unnecessary binary downloads, benefiting customers with low-speed or limited internet connections. It optimizes resource usage, reduces bandwidth consumption, and enhances overall deployment efficiency, making it ideal for environments with network constraints.

Kubernetes Development

- AKS-WSL: With AKS-WSL, developers can set up a local environment that mimics the experience of working with AKS. This makes it easier for developers to write, debug, and test Kubernetes applications locally before deploying them to a full AKS cluster.
- AKS-WSL VS Code Extension: The Visual Studio Code extension for AKS-WSL allows developers to write, debug, and deploy Kubernetes applications locally, streamlining development workflows. This setup improves productivity by providing efficient tools and capabilities, making it easier to develop, test, and refine Kubernetes workloads directly from a local machine.
- Arc Jumpstart: Supported on AKS Edge Essentials and AKS on Azure Local, Arc Jumpstart simplifies deployment initiation, providing developers with a streamlined way to set up and start working with Kubernetes environments quickly. It makes it easier for teams to evaluate and experiment with AKS enabled by Azure Arc, offering pre-configured scenarios and comprehensive guidance. By reducing complexity and setup time, Arc Jumpstart enhances the developer experience, facilitating faster prototyping and smoother onboarding for new projects in hybrid and edge settings.
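As referenced in the anti-affinity item above, the sketch below spreads the replicas of a deployment across different nodes so that the loss of a single node does not take out every replica. It is illustrative only; the deployment name, labels, and image are placeholders, not values from this post.

```python
# Illustrative sketch: spread replicas across nodes with pod anti-affinity so one node
# failure does not remove all replicas. Names, labels, and image are placeholders.
from kubernetes import client, config

config.load_kube_config()

labels = {"app": "historian"}

anti_affinity = client.V1Affinity(
    pod_anti_affinity=client.V1PodAntiAffinity(
        preferred_during_scheduling_ignored_during_execution=[
            client.V1WeightedPodAffinityTerm(
                weight=100,
                pod_affinity_term=client.V1PodAffinityTerm(
                    label_selector=client.V1LabelSelector(match_labels=labels),
                    topology_key="kubernetes.io/hostname",  # spread across distinct nodes
                ),
            )
        ]
    )
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="historian"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(
                affinity=anti_affinity,
                containers=[client.V1Container(name="historian", image="nginx:1.27")],  # placeholder image
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```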
Conclusion

Microsoft Ignite 2024 has underscored the continued evolution of AKS enabled by Azure Arc, bringing more comprehensive, scalable, and secure solutions to diverse environments. These advancements support organizations in running cloud-native applications anywhere, enhancing operational efficiency and innovation. We welcome your feedback (aksarcfeedback@microsoft.com) and look forward to ongoing collaboration as we continue to evolve AKS enabled by Azure Arc.

Announcing General Availability: Windows Server Management enabled by Azure Arc
Windows Server Management enabled by Azure Arc offers customers with Windows Server licenses that have active Software Assurance, or Windows Server licenses that are active subscription licenses, the following key benefits:

- Azure Update Manager
- Azure Change Tracking and Inventory
- Azure Machine Configuration
- Windows Admin Center in Azure for Arc
- Remote Support
- Network HUD
- Best Practices Assessment
- Azure Site Recovery (Configuration Only)

Upon attestation, customers receive access to these benefits at no additional cost beyond associated networking, compute, storage, and log ingestion charges. These same capabilities are also available for customers enrolled in Windows Server 2025 Pay-as-you-Go licensing enabled by Azure Arc.

Learn more at Windows Server Management enabled by Azure Arc - Azure Arc | Microsoft Learn or watch Video: Free Azure Services for Non-Azure Windows Servers Covered by SA Powered by Azure Arc!

To get started, connect your servers to Azure Arc, attest for these benefits, and deploy management services as you modernize to Azure's AI-enabled set of server management capabilities across your hybrid, multi-cloud, and edge infrastructure!

Extending Azure's AI Platform with an adaptive cloud approach
Ignite 2024 is here, and nothing is more top of mind for customers than the potential to transform their businesses with AI wherever they operate. Today, we are excited to announce the preview of two new Arc-enabled services that extend the power of Azure's AI platform to on-premises and edge environments. Sign up to join the previews here!

An adaptive cloud approach to AI

The goal of Azure's adaptive cloud approach is to extend just enough Azure to customers' distributed environments. For many of these customers, valuable data is generated and stored locally, outside of the hyperscale cloud, whether due to regulation, latency, business continuity, or simply the large volume of data being generated in real time. AI inferencing can only occur where the data exists. So, while the cloud has become the environment of choice for training models, we see a tremendous need to extend inferencing services beyond the cloud to enable complete cloud-to-edge AI scenarios.

Search on-premises data with generative AI

Over the past couple of years, generative AI has come to the forefront of AI innovation. Language models give any user the ability to interact with large, complex data sets in natural language. Public tools like ChatGPT are great for queries about general knowledge, but they can't answer questions about private enterprise data on which they were not trained. Retrieval Augmented Generation, or "RAG", helps address this need by augmenting language models with private data. Cloud services like Azure AI Search and Azure AI Foundry simplify how customers can use RAG to ground language models in their enterprise data.

Today, we are announcing the preview of a new service that brings generative AI and RAG to your data at the edge. Within minutes, customers can deploy an Arc extension that contains everything needed to start asking questions about their on-premises data, including:

- Popular small and large language models running locally, with support for both CPU and GPU hardware
- A turnkey data ingestion and RAG pipeline that keeps all data completely local, with RBAC controls to prevent unauthorized access
- An out-of-the-box prompt engineering and evaluation tool to find the best settings for a particular dataset
- Azure-consistent APIs to integrate into business applications, as well as a pre-packaged UI to get started quickly

This service is available now in gated private preview for customers running Azure Local infrastructure, and we plan to make it available on other Arc-enabled infrastructure platforms in the near future. Sign up here!

Deploy curated open-source AI models via Azure Arc

Another great thing about Azure's AI platform is that it provides a catalog of curated AI models that are ready to deploy and provide consistent inferencing endpoints that can be integrated directly into customer applications. This not only makes deployment easy, but customers can also be confident that the models are secure and validated. These same needs exist on the edge as well, which is why we are now making a set of curated models deployable directly from the Azure portal. These models have been selected, packaged, and tested specifically for edge deployments, and are currently available on Azure Local infrastructure:
- Phi-3.5 Mini (3.8 billion parameter language model)
- Mistral 7B (7.3 billion parameter language model)
- MMDetection YOLO (object detection)
- OpenAI Whisper Large (speech to text)
- Google T5 Base (translation)

Models can be deployed from a familiar Azure portal wizard to an Arc-enabled AKS cluster running on premises. All available models today can be run on just a CPU; Phi-3.5 and Mistral 7B also have GPU versions available for better performance. Once complete, the deployment can be managed directly in Azure ML Studio, and an inferencing endpoint is available on your local network (a hedged sketch of calling such an endpoint appears at the end of this post).

Wrap up

Sign up now to join either of the previews at the link below, or stop by and visit us in person in the Azure Arc and Azure Local Expert Meet Up station in the Azure Infrastructure neighborhood at Ignite. We're excited to get these new capabilities into our customers' hands and hear from you how it's going. Sign up to join the previews here
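Since this post does not document the endpoint contract, the following is only a hedged sketch of what calling a locally hosted language-model inferencing endpoint might look like from an application on the same network. The host, route, model identifier, and payload shape are all assumptions (an OpenAI-style chat completions schema is common for local model servers, but verify against the actual endpoint your deployment exposes).

```python
# Hedged sketch only: the host, path, and JSON schema below are assumptions, not the
# documented contract of the curated-model endpoints described in this post.
import requests

ENDPOINT = "http://models.edge.local:5000/v1/chat/completions"  # hypothetical local endpoint

payload = {
    "model": "phi-3.5-mini",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize today's maintenance log in two sentences."}
    ],
    "max_tokens": 128,
}

response = requests.post(ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```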
New capabilities to aid Migration and Hybrid Cloud Management

When we support customers looking to modernize their IT estate, our vision is to empower businesses to seamlessly manage and migrate their on-premises and cloud environments with efficiency and insight. To help customers with their migration and hybrid cloud journey, we have been working to bridge the gap between on-premises and cloud environments by providing consistent management experiences for both.

This week at Ignite, we are excited to announce the public preview of two capabilities that bring us closer to this vision: "Business Case for Arc" and "Enable Arc." With these new capabilities in Azure Migrate, customers can now visualize the value of Azure Arc for their on-premises estates throughout their migration journey and make informed decisions with confidence. Customers can check whether an on-premises machine is already Arc-enabled and, if not, see the value and savings of using Arc while the resource remains on-premises, then enable the relevant Arc capabilities accordingly.

Envision the Benefits of Azure Arc and Azure Management with Business Case for Arc

The Azure Migrate business case enables customers to create a detailed comparison of the Total Cost of Ownership (TCO) for their on-premises estate versus the TCO on Azure, along with a year-on-year cash flow analysis as they transition their workloads to Azure. With this new capability, customers can now visualize the added value of Azure Arc for their on-premises estates throughout their migration journey. In addition to getting the TCO of migrating all their resources to Azure, customers can now compare their estimated current on-premises TCO with the estimated TCO of their on-premises estate with Azure Arc; visualize the cost savings and other benefits of using Azure security (Microsoft Defender for Cloud) and management tools (Azure Monitor and Azure Update Manager) via Azure Arc for their on-premises servers; and see the licensing benefits of Extended Security Updates (ESUs) enabled by Azure Arc, as well as SQL pay-as-you-go via Azure Arc-enabled SQL Server. Customers not planning to migrate their entire estate, or those on a long migration journey, can compare their current on-premises TCO with the combined Azure and Azure Arc TCO in the final planned state to help plan better. Customers can edit the assumptions for Azure and on-premises costs and download the business case report using the export option to share with other stakeholders.

Enable Arc: Onboarding to Arc to leverage Azure Management Services

The Azure Migrate inventory now integrates with Azure Arc, allowing customers to identify which of their discovered machines are already Arc-enabled and onboard those that are not, directly from the Azure Migrate portal. This integration provides a seamless experience, offering enhanced control and visibility over the migration process while managing the remaining on-premises inventory.

Next Steps and Actions

Get started here and generate a business case: https://learn.microsoft.com/en-us/azure/migrate/how-to-build-a-business-case

Announcing Public Preview of Workload Identity Federation for Azure Arc enabled Kubernetes clusters
Announcing the public preview of Workload Identity Federation support for Azure Arc-enabled Kubernetes clusters. This feature enhances security by allowing applications in Azure Arc-enabled Kubernetes clusters to securely access Azure resources, such as Azure Key Vault and Azure Blob Storage, without managing secrets. The hedged sketch below illustrates how a workload might take advantage of this from inside a pod.
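As a rough sketch (not taken from the announcement), a federated workload typically relies on a projected service account token and environment variables injected by the workload identity webhook; the azure-identity library's WorkloadIdentityCredential can then exchange that token for Azure access. The vault URL and secret name below are placeholders, and the snippet assumes the azure-identity and azure-keyvault-secrets packages are installed.

```python
# Hedged sketch: exchange the pod's federated service account token for an Azure credential
# and read a Key Vault secret. Assumes the workload identity webhook has injected the
# AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_FEDERATED_TOKEN_FILE environment variables.
# The vault URL and secret name are placeholders for this example.
from azure.identity import WorkloadIdentityCredential
from azure.keyvault.secrets import SecretClient

credential = WorkloadIdentityCredential()  # reads the injected environment variables

secrets = SecretClient(
    vault_url="https://contoso-edge-kv.vault.azure.net",  # placeholder vault
    credential=credential,
)

api_key = secrets.get_secret("payments-api-key")  # placeholder secret name
print("retrieved secret:", api_key.name)
```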
AKS Arc - Optimized for AI Workloads

Overview

Azure is the world's AI supercomputer, providing the most comprehensive AI capabilities, ranging from infrastructure and platform services to frontier models. We've seen emerging needs among Azure customers to use the same Azure-based solution for AI/ML on the edge, with minimized latency, while staying compliant with industry regulations or government requirements. Azure Kubernetes Service enabled by Azure Arc (AKS Arc) is a managed Kubernetes service that empowers customers to deploy and manage containerized workloads whether they are in data centers or at edge locations. We want to ensure AKS Arc provides an optimal experience for AI/ML workloads on the edge, throughout the whole development lifecycle: AI infrastructure, model deployment, inference, fine-tuning, and application.

AI infrastructure

AKS Arc supports NVIDIA A2, A16, and T4 GPUs for compute-intensive workloads such as machine learning, deep learning, and model training. When GPUs are enabled in Azure Local, AKS Arc customers can provision GPU node pools from Azure and host AI/ML workloads in the Kubernetes cluster on the edge. For more details, please visit the instructions from GPU Nodepool in AKS Arc.

Model deployment and fine tuning

Use KAITO for language model deployment, inference and fine tuning

Kubernetes AI Toolchain Operator (KAITO) is an open-source operator that automates and simplifies the management of model deployments on a Kubernetes cluster. With KAITO, you can deploy popular open-source language models such as Phi-3 and Falcon, and host them in the cloud or on the edge. Along with the currently supported models from KAITO, you can also onboard and deploy custom language models following this guidance in just a few steps. AKS Arc has been validated with the latest KAITO operator via helm-based installation, and customers can now use KAITO on the edge to:

- Deploy language models such as Falcon, Phi-3, or their custom models
- Automate and optimize AI/ML model inferencing for cost-effective deployments
- Fine-tune a model directly in a Kubernetes cluster
- Perform parameter-efficient fine tuning using low-rank adaptation (LoRA)
- Perform parameter-efficient fine tuning using quantized adaptation (QLoRA)

You can get started by installing KAITO and deploying a model for inference on your edge GPU nodes with the KAITO Quickstart Guidance (a hedged Workspace sketch appears at the end of this section). You may also refer to the KAITO experience in AKS in the cloud: Deploy an AI model with the AI toolchain operator (Preview).

Use Arc-enabled Machine Learning to train and deploy models in the edge

For customers who are already familiar with Azure Machine Learning (AML), Azure Arc-enabled ML extends AML in Azure and enables customers to target any Arc-enabled Kubernetes cluster for model training, evaluation, and inferencing. With the Arc ML extension running in AKS Arc, customers can meet data-residency requirements by storing data on premises during model training and deploy models in the cloud for global service access. To get started with the Arc ML extension, please view the instructions in the Azure Machine Learning documentation. In addition, the AML extension can now be used for a fully automated deployment of a curated list of pre-validated language and traditional AI models to AKS clusters, performing CPU- and GPU-based inferencing and subsequently managing them via Azure ML Studio. This experience is currently in gated preview; please view another Ignite blog for more details.
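For orientation only, the sketch below creates a KAITO Workspace custom resource with the Kubernetes Python client. The field layout follows the upstream open-source KAITO Workspace API (kaito.sh/v1alpha1) as commonly documented; the workspace name, label selector, and preset are assumptions, so verify the exact schema and supported presets against the KAITO version you install.

```python
# Hedged sketch: create a KAITO Workspace custom resource to deploy a small language model.
# The schema follows the upstream kaito.sh/v1alpha1 Workspace API as commonly documented;
# names, labels, and the preset are placeholders. Verify against your installed KAITO version.
from kubernetes import client, config

config.load_kube_config()

workspace = {
    "apiVersion": "kaito.sh/v1alpha1",
    "kind": "Workspace",
    "metadata": {"name": "workspace-phi-3-mini"},
    "resource": {
        # Match the GPU node pool created for the cluster (this label is an assumption).
        "labelSelector": {"matchLabels": {"apps": "phi-3"}},
    },
    "inference": {"preset": {"name": "phi-3-mini-4k-instruct"}},
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="kaito.sh",
    version="v1alpha1",
    namespace="default",
    plural="workspaces",
    body=workspace,
)
```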
Use Azure AI Services with disconnected containers in the edge

Azure AI services enable customers to rapidly create cutting-edge AI applications with out-of-the-box and customizable APIs and models. They simplify the developer experience, making it possible to use APIs and embed the ability to see, hear, speak, search, understand, and accelerate decision-making into applications. With disconnected Azure AI services containers, customers can now download the containers to an offline environment such as AKS Arc and use the same APIs available from Azure. Containers enable you to run Azure AI services APIs in your own environment and are great for your specific security and data governance requirements. Disconnected containers enable you to use several of these APIs disconnected from the internet. Currently, the following containers can be run in this manner:

- Speech to text
- Custom Speech to text
- Neural Text to speech
- Text Translation (Standard)
- Azure AI Vision - Read
- Document Intelligence
- Azure AI Language
  - Sentiment Analysis
  - Key Phrase Extraction
  - Language Detection
  - Summarization
  - Named Entity Recognition
  - Personally Identifiable Information (PII) detection

To get started with disconnected containers, please view the instructions at Use Docker containers in disconnected environments.

Build and deploy data and machine learning pipelines with Flyte

Flyte is an open-source orchestrator that facilitates building production-grade data and ML pipelines. It is a Kubernetes-native workflow automation tool. Customers can focus on experimentation and providing business value without being experts in infrastructure and resource management. Data scientists and ML engineers can use Flyte to create data pipelines for processing petabyte-scale data, build analytics workflows for business or finance, or leverage it as an ML pipeline for industry applications. AKS Arc has been validated with the latest Flyte operator via helm-based installation, and customers are welcome to use Flyte for building data or ML pipelines (a minimal pipeline sketch appears at the end of this post). For more information, please view the instructions from Introduction to Flyte - Flyte and Build and deploy data and machine learning pipelines with Flyte on Azure Kubernetes Service (AKS).

AI-powered edge applications with cloud-connected control plane

Azure AI Video Indexer, enabled by Azure Arc

Azure AI Video Indexer enabled by Arc enables video and audio analysis and generative AI on edge devices. It runs as an Azure Arc extension on AKS Arc and supports many video formats, including MP4 and other common formats. It also supports several languages in all basic audio-related models. The Phi-3 language model is included and automatically connected with your Video Indexer extension. With Arc-enabled VI, you can bring AI to the content for cases when indexed content can't move to the cloud due to regulation or the data store being too large. Other use cases include using an on-premises workflow to lower indexing latency or pre-indexing before uploading to the cloud. You can find more details in What is Azure AI Video Indexer enabled by Arc (Preview).

Search on-premises data with a language model via Arc extension

Retrieval Augmented Generation (RAG) is emerging as a way to augment language models with private data, and this is especially important for enterprise use cases. Cloud services like Azure AI Search and Azure AI Studio simplify how customers can use RAG to ground language models in their enterprise data in the cloud.
The same experience is coming to the edge, and customers can now deploy an Arc extension and ask questions about on-premises data within a few clicks. Please note this experience is currently in gated preview; please see another Ignite blog for more details.

Conclusion

Developing and running AI workloads at distributed edges brings clear benefits such as using the cloud as a universal control plane, data residency, reduced network bandwidth, and low latency. We hope the products and features described above can benefit customers and enable new scenarios in retail, manufacturing, logistics, energy, and more. As Microsoft-managed Kubernetes on the edge, AKS Arc can not only host critical edge applications but is also optimized for AI workloads from hardware and runtime to application. Please share your valuable feedback with us (aksarcfeedback@microsoft.com); we would love to hear from you regarding your scenarios and business impact.
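As referenced in the Flyte section above, here is a minimal, self-contained pipeline sketch using the open-source flytekit SDK. It is illustrative only: the task logic is a toy example, and nothing here is specific to AKS Arc beyond the fact that the same workflow can be registered to a Flyte installation running on the cluster.

```python
# Minimal illustrative Flyte pipeline using flytekit. The task bodies are toy examples;
# a real pipeline would be registered to the Flyte deployment running on the AKS Arc cluster.
from typing import List

from flytekit import task, workflow


@task
def clean_readings(raw: List[float]) -> List[float]:
    # Drop obviously invalid sensor readings (negative values); an illustrative rule only.
    return [r for r in raw if r >= 0]


@task
def average(values: List[float]) -> float:
    # Compute a simple mean over the cleaned readings.
    return sum(values) / len(values) if values else 0.0


@workflow
def sensor_pipeline(raw: List[float]) -> float:
    # Chain the tasks: clean first, then aggregate.
    return average(values=clean_readings(raw=raw))


if __name__ == "__main__":
    # Workflows can be executed locally for quick iteration before registering them remotely.
    print(sensor_pipeline(raw=[21.5, -1.0, 22.3, 23.1]))
```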
Fault Tolerant, Durable, Edge Kubernetes Storage with Azure Container Storage enabled by Azure Arc

Azure Container Storage enabled by Azure Arc [previously named "Edge Storage Accelerator"] is a groundbreaking addition to our Azure storage solutions, designed to revolutionize data handling at the edge. We invite you to explore the capabilities of Edge Volumes and experience the benefits of advanced edge storage solutions firsthand.

At the edge, customers have many struggles with data: sharing, resiliency, storage capacity, space management, and cloud connection, among others. We are proud to announce Azure Container Storage enabled by Azure Arc (ACSA), a first-party Arc extension designed to solve these customer Kubernetes storage challenges. ACSA offers a fault-tolerant, highly available, persistent ReadWriteMany file system using Kubernetes-native PVCs. Simply write to an ACSA PVC as if it were your local file system.

ACSA offers two main storage configuration options. The Local Shared Edge Volume allows for shared ReadWriteMany storage that remains local to your Kubernetes cluster. This configuration is ideal for persistent application storage, such as databases, data historians, and other data processing scenarios. The Cloud Ingest Edge Volume, ACSA's second configuration option, uploads data written by applications to Azure storage destinations such as Blob, ADLS Gen2, and OneLake (Fabric). Ingest volumes also have user-configurable policies for flexibility of data upload throughput and ordering.

ACSA follows security best practices by leveraging managed identity capabilities and always implementing the latest version of the Blob SDK. Cloud Ingest Edge Volumes are tolerant to losses in network connectivity: ACSA will continue to accept application writes and will automatically upload that data once the connection is reestablished. ACSA accepts all file data you create at the edge, whether it be parquet files, time series data, photos, video, etc. With the option to keep it local to your Kubernetes cluster or send it to a cloud destination, ACSA can handle it for you. ACSA is available as a standard component of the Azure IoT Operations GA release and is suitable for production workloads.

Try Out Edge Volumes Today!

📄 Get started by visiting this documentation. Jumpstart Drops make installation a breeze. Try out both Local Shared and Cloud Ingest.
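To show what "write to an ACSA PVC as if it were your local file system" looks like in practice, the sketch below mounts an existing Edge Volume claim into a pod with the Kubernetes Python client and writes a file to the mount path. The claim name, mount path, pod name, and image are placeholders, not values defined by the ACSA documentation.

```python
# Illustrative sketch: mount an existing Edge Volume PVC into a pod so the application can
# write to it like a local directory. Claim name, mount path, and image are placeholders.
from kubernetes import client, config

config.load_kube_config()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="historian-writer"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="writer",
                image="python:3.12-slim",  # placeholder image
                command=["python", "-c", "open('/data/heartbeat.txt', 'w').write('ok')"],
                volume_mounts=[client.V1VolumeMount(name="edge-volume", mount_path="/data")],
            )
        ],
        volumes=[
            client.V1Volume(
                name="edge-volume",
                persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
                    claim_name="edge-shared-data"  # placeholder PVC name
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```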
Operate everywhere with AI-enhanced management and security

Farzana Rahman and Dushyant Gill from Microsoft discuss new AI-enhanced features in Azure that make it simpler to acquire, connect, and operate with Azure's management offerings across multiple clouds, on-premises, and at the edge. Key updates include enhanced management for Windows servers and virtual machines with Windows Software Assurance, Windows Server 2025 hotpatching support in Azure Update Manager, simplified hybrid environment connectivity with the Azure Arc gateway, a multicloud connector for AWS, and Log Analytics Simple Mode. Additionally, Azure Migrate Business Case helps compare the total cost of ownership, and new Copilot in Azure capabilities simplify cloud management and provide intelligent recommendations.

Announcing the Azure Arc ISV Partner Program at Ignite
Empowering Partners and Enhancing Customer Experience

We are thrilled to introduce the newly launched Azure Arc ISV Partner Program at Ignite! This innovative and growing ecosystem partner program allows partners to publish offers on the Azure Marketplace that can be deployed to Arc-enabled Kubernetes clusters. Customers can now access validated, enterprise-grade applications and tools to enhance their Azure Arc development, while ISVs benefit from deeper integration with Azure Arc services and access to the Arc-enabled customer base. All marketplace images have been validated across the Azure Arc platform with the support of both Microsoft and partner teams. With the solutions each partner has made available on the Azure Marketplace, the integration with Azure Arc offers central governance to build robust applications with consistent security and reliability for any hybrid deployment.

What is Azure Arc?

Azure Arc is a platform that extends Azure to datacenters, on-premises, edge, or even multi-cloud environments. It simplifies governance and management by delivering the consistency of the Azure platform. The ability to create offerings for Azure Arc in the marketplace is a significant benefit to our partners, allowing them to integrate with Azure services and tools and access a large and diverse customer base. Azure Arc also provides validated applications for customers to manage their Kubernetes clusters on our platform. Edge developers leverage the open-source community to build their enterprise applications, and we aim to provide them with a one-stop shop in Azure Marketplace, offering a choice of Kubernetes-based building blocks needed to develop their applications.

Meet our partners

With our Ignite launch, we have built the foundation of an ecosystem that is designed to bring the best capabilities and innovations to our marketplace, focused on leading building-block categories: databases, big data/analytics, and messaging. We are excited to introduce our esteemed partners (CloudCasa, MongoDB, Redis, MinIO, DataStax), who have Arc-enabled their applications and are now available on the Azure Marketplace. Here's a closer look at their offerings:

CloudCasa

CloudCasa is a leading provider of Kubernetes backup and recovery solutions. By Arc-enabling their application, CloudCasa offers robust, secure, and easy-to-use backup services for Kubernetes, ensuring the protection and availability of critical data. With CloudCasa, your Arc-enabled Kubernetes deployments across hybrid environments are fully protected, ensuring that your data is safe and recoverable, no matter the scenario. CloudCasa's integration with Azure Arc offers three key components: handling persistent volumes with or without CSI snapshots, unified management and monitoring across environments, and disaster recovery and migration for AKS hybrid. One way CloudCasa manages persistent storage is by natively integrating with Container Storage Interface snapshots, ensuring that all your persistent volumes can be captured and protected without interrupting your workloads. CloudCasa also provides a powerful disaster recovery and migration solution. For AKS on Azure Stack HCI, this means you can confidently deploy hybrid and edge clusters, knowing that you have a trusted solution to recover from any disaster, or even perform seamless migrations from edge to cloud or vice versa.
To explore CloudCasa's full capabilities for Azure Arc-enabled Kubernetes clusters, visit the CloudCasa Marketplace listing for Azure Arc or find out more at cloudcasa.io. For personalized assistance, feel free to contact casa@cloudcasa.io.

DataStax

DataStax is a leading provider of GenAI solutions for AI developers. With DataStax HCD (Hyper-Converged Database), businesses can harness the power of Apache Cassandra, the highly scalable and resilient NoSQL database, to manage large volumes of structured and vector data with ease. By Arc-enabling their applications, DataStax HCD offers users a "single pane of glass" for streamlined deployment, monitoring, and lifecycle management of their entire infrastructure. Ensuring consistent operations across on-premises, Azure, and multi-cloud environments makes Azure with HCD an ideal choice for mission-critical applications. The combination of Azure Arc central governance and Mission Control, DataStax's operations platform, on HCD allows for provisioning of resources both on-premises and in the cloud. HCD brings to Azure Arc database management and the ability to support workloads and AI systems at scale with no single point of failure. DataStax HCD brings three key benefits to Azure Arc: data replication and distribution, node repair, and vector search capabilities to enhance your enterprise data workloads.

To learn more about the full capabilities of DataStax HCD, please visit DataStax HCD for Azure Arc or find out more on the HCD product page.

MongoDB

MongoDB Enterprise Advanced (EA) empowers customers to securely self-manage their MongoDB deployments on-premises or in hybrid environments, driving operational efficiency, performance, and control to meet specific infrastructure needs. Now with Arc-enablement, MongoDB EA allows developers to build, scale, and innovate faster by providing a robust and dynamic database solution across a multitude of environments. MongoDB's document data model is intuitive and powerful, and it can easily handle a variety of data types and use cases efficiently. MongoDB EA includes advanced automation, reliable backups, monitoring capabilities, updating deployments, and integration with various Kubernetes services. The MongoDB integration with Azure Arc provides three key benefits: support for multi-Kubernetes-cluster deployments, centralized provisioning through the Azure portal, and leveraging the resilience of Kubernetes deployments. As Azure Arc provides centralized management of Kubernetes environments across a multitude of environments, MongoDB EA adds value with databases that can run across multiple Kubernetes clusters.

To explore MongoDB EA on Azure Marketplace for Azure Arc - and to learn more about the full potential of this offering - please visit MongoDB Enterprise Advanced for Azure Arc. For licensing inquiries and to learn more about MongoDB Enterprise Advanced, please visit MongoDB's website.

Redis

Redis Software, an enterprise-grade, real-time data platform, offers an in-memory data structure store used as a cache, vector database, document database, streaming engine, and message broker. With its Arc-enabled application, Redis Software provides ultra-fast data access, real-time analytics, and seamless scalability. This makes Redis Software ideal for applications requiring high performance and low latency. Integrating with Azure Arc allows users to deploy Redis workloads across on-premises, cloud, and hybrid infrastructure.
The benefits Redis Software brings to Azure Arc include support for multi-core deployments, Active-Active geo-distribution, data tiering, high availability with seamless failover, and multiple levels of on-disk persistence. Because they are integrated with Arc, these Redis instances can be located on-premises or in the cloud and managed centrally in the Azure portal.

To explore Redis Software on the Azure Marketplace for Azure Arc, please visit Redis Software for Kubernetes for Azure Arc. You can learn more about licensing inquiries at Redis Software.

MinIO

MinIO AIStor is the standard for building large-scale AI data infrastructure. It is a software-defined, S3-compatible object store that is optimized for the private cloud but will run anywhere - from the public cloud to the edge. Enterprises use AIStor to deliver against artificial intelligence, machine learning, analytics, application, backup, and archival workloads - all from a single platform. It was built for the cloud operating model, so it is native to the technologies and architectures that define the cloud, such as containerization, orchestration with Kubernetes, microservices, and multi-tenancy. By Arc-enabling their application, MinIO ensures that Azure users can experience the unmatched scalability, robust security, and lightning-fast storage performance that has made MinIO the most widely integrated object store in the market today. Users can now run these hybrid or multi-cloud deployments on Azure Arc and manage them in a single pane of glass in the Azure portal.

To deploy and learn more about MinIO AIStor on Azure Arc, please visit MinIO AIStor for Azure Arc here. For further information on MinIO AIStor for Azure Arc, please visit MinIO | AI Storage is Object Storage.

Become an Arc Enabled Partner

These partners have collaborated with Microsoft to join our ISV ecosystem, providing resilient and scalable applications readily accessible to our Azure Arc customers via the Azure Marketplace. Joining forces with Microsoft enables partners to stay ahead of the technological curve, strengthen customer relationships, and contribute to transformative digital changes across industries. We look forward to expanding this program to include more ISVs, enhancing the experience for customers using Arc-enabled Kubernetes clusters. As we continue to expand our Azure Arc ISV Partner Program, stay tuned for more blogs on the new partners being published to the Azure Marketplace. To reach out and learn more about the Azure Arc ISV Partner Program, please feel free to contact us at https://aka.ms/AzureArcISV.