Extend your clusters' capabilities through Kubernetes Apps from Azure Marketplace
Seeking to enhance your cloud-native capabilities? Kubernetes Apps on Azure Marketplace offer a transformative solution for your deployment needs. Designed by esteemed industry partners, these applications are crafted to address the complexities of contemporary cloud environments. Envision having a comprehensive array of partner and open-source Kubernetes solutions readily available. With a single click, you can deploy these applications to your Azure Kubernetes Service (AKS) and Arc-connected clusters. The entire process is streamlined, with integrated Azure billing and efficient lifecycle management, ensuring a smooth and hassle-free experience. The cumbersome procurement processes of the past are now obsolete: through the trusted procurement channels of Azure Marketplace, you can swiftly and securely acquire your Kubernetes solutions. Furthermore, each purchase contributes to your Microsoft Azure Consumption Commitment (MACC), presenting a clear advantage for your organization.

Ready to dive into the world of cloud-native applications? Here's why Kubernetes Apps on Azure Marketplace stand out and why they should be your top choice:

Secure Deployment You Can Trust

When it comes to deploying Kubernetes apps from Azure Marketplace, security and reliability are paramount. Each app undergoes a meticulous certification process and rigorous vulnerability scans before it is made available to you, and solution providers must address any security issues detected, ensuring the app is safe and secure from the get-go. The security doesn't stop there: once an app is published, it continues to be scanned regularly for malware and vulnerabilities. This continuous monitoring ensures that the apps you deploy remain secure and free from known threats, offering robust protection for your applications and data. With these comprehensive security measures, you can deploy Kubernetes apps on your clusters with confidence, knowing they are shielded by multiple layers of protection.

Leveraging ARM for Kubernetes

Kubernetes applications benefit immensely from cluster extensions that provide Azure Resource Manager (ARM)-driven lifecycle management. When deployed as a Kubernetes cluster extension, these applications become ARM resources. This transformation allows you to apply all Azure management capabilities, including role-based access control (RBAC), policies, and monitoring, directly to your Kubernetes apps. ARM and RBAC integration on Kubernetes means end users no longer need to log in to the Kubernetes cluster to install Helm charts, which improves security by reducing potential attack vectors to the cluster. Moreover, the cluster state automatically reconciles to the declared state if any unwanted changes or errors occur, ensuring consistency and reliability. By turning your Kubernetes apps into ARM resources, you unlock the full spectrum of Azure management tools, including centralized control, policy enforcement, and comprehensive monitoring, all underpinned by the robust security measures provided by Azure.

Lifecycle Management of Kubernetes Apps

Keeping your Kubernetes apps up to date is a breeze with the auto-upgrade feature, which ensures you always have the latest features and patches. Scheduled during planned maintenance windows, these updates help maintain seamless operations. Additionally, you'll enjoy version compatibility support along with a comprehensive matrix of supported cluster types, making Kubernetes apps the smart, secure, and reliable choice for your deployment needs. You can also benefit from CI/CD automation through ARM-based APIs: by using ARM templates, you can define and deploy your Kubernetes apps as a cohesive unit, streamlining your deployment and configuration processes. This not only simplifies management but also enhances the overall efficiency and effectiveness of your application lifecycle.
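To make that ARM-based automation concrete, here is a minimal sketch of driving such a deployment from a pipeline with the Azure CLI. The template file name and parameter names are placeholders assumed purely for illustration; the template itself would typically declare the app as a Microsoft.KubernetesConfiguration/extensions resource targeting your AKS or Arc-enabled cluster, as described above.

```
# Hedged sketch: deploy a Kubernetes app defined in an ARM template from a CI/CD pipeline.
# kubernetes-app.json and its parameters are hypothetical placeholders.
az deployment group create --resource-group <resourceGroupName> --template-file kubernetes-app.json --parameters clusterName=<clusterName> appName=<appName>

# Re-running the same command applies the same declared state, which keeps deployments repeatable.
```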
Programmatic Deployments of Apps

In today's dynamic cloud environment, flexibility and ease of deployment are key. That's why we're excited to support the deployment of Kubernetes apps through multiple programmatic methods. Whether you prefer Terraform, the Azure CLI, ARM templates, or the Azure portal, we have you covered. These tools offer seamless integration and streamline the process, making sure your applications are up and running with minimal effort. We understand that your workloads may be diverse and spread across various environments. Our hybrid deployment capabilities ensure that Kubernetes apps can be enabled not only on Azure Kubernetes Service (AKS) clusters but also on Arc-enabled clusters, whether on-premises or at the edge. This flexibility means you can manage and deploy your Kubernetes apps through the Azure portal or CLI, no matter where your workloads reside. Embrace the power and convenience of hybrid deployments and take your Kubernetes management to the next level.

Unlocking Benefits with Azure Consumption Commitment

If your organization has a Microsoft Azure Consumption Commitment (MACC) agreement with Microsoft, you're in for a treat! Kubernetes Apps on Azure Marketplace are MACC eligible, meaning these purchases count toward your commitment. This not only helps in optimizing costs but also ensures you get the most out of your Azure investments.

Flexible Billing Options for Kubernetes Apps

One size doesn't fit all, especially when it comes to billing for Kubernetes Apps. That's why we offer a range of flexible billing models to suit your needs. Whether you prefer being billed based on usage (per core, per node, etc.), a flat rate, or custom dimensions, we've got you covered. Plus, we support upfront billing through private offers, giving you even more control and predictability over your expenditures. Choose the billing option that works best for you and focus on what matters most: running and scaling your applications with ease.

How to Get Started with Kubernetes Apps

Deploying Kubernetes apps has never been easier, thanks to a variety of methods at your disposal:
- Programmatically deploy using Terraform: Utilize the power of Terraform to automate and manage your Kubernetes applications.
- Deploy programmatically with Azure CLI: Leverage the Azure CLI for straightforward, command-line based deployments.
- Use ARM templates for programmatic deployment: Define and deploy your Kubernetes applications efficiently with ARM templates.
- Deploy via AKS in the Azure portal: Take advantage of the user-friendly Azure portal for a seamless deployment experience.

Choose the method that best fits your workflow and get your Kubernetes applications up and running with ease. We hope this guide has been helpful and has made the process of deploying Kubernetes apps on Azure a bit clearer. Stay tuned for more tips and tricks, and happy deploying! 😄

Deploy a Kubernetes Application Programmatically Using Terraform and CLI
Today we will cover deploying a Kubernetes application programmatically using Terraform and the Azure CLI. These deployment methods can streamline your workflow and automate repetitive tasks.

Deploying your Kubernetes Application using Terraform

This walkthrough assumes you have previous knowledge of Terraform. For additional information and guidance on using Terraform to provision a cluster, please refer here.

Prerequisites

Before we begin, ensure you have the following:
- Terraform
- Azure CLI

Sample Location

You can find the Terraform sample we will be using at this location: Terraform Sample

Prepare the Environment

First, initialize Terraform in the current directory where you have copied the k8s-extension-install sample by running the following command:

terraform init

In the directory, you will find two example tfvars files. These files can be used to deploy the application with different configurations:
- azure-vote-without-config.tfvars - Deploy the application with the default configuration for azure-vote.
- azure-vote-with-config.tfvars - Deploy/update the application with a custom configuration for azure-vote.

Before you run the sample tfvars files, update the following values in both files:
- cluster_name - The name of the AKS cluster.
- resource_group_name - The name of the resource group where the AKS cluster is located.
- subscription_id - The subscription ID where the AKS cluster is located.

Deploy the Application

To deploy the application with the default configuration for azure-vote, run:

terraform apply -var-file="azure-vote-without-config.tfvars"

To deploy or update the application with a custom configuration for azure-vote, use:

terraform apply -var-file="azure-vote-with-config.tfvars"

Conclusion

And that's it! You've successfully deployed your Kubernetes application programmatically using Terraform. This process can drastically reduce the time and effort involved in managing and scaling your applications. By using Terraform, you can ensure that your deployment is consistent and repeatable, making it easier to maintain your infrastructure as code.

Deploying a Kubernetes Application from Azure CLI

Deploying a Kubernetes application using the Azure CLI can seem daunting, but we're here to make it simple and accessible. Follow these steps, and you'll have your azure-vote application up and running in no time!

Prerequisites

Before we get started, ensure you have the following:
- Azure CLI installed on your machine

Deploying the Sample Azure-Vote Application from the Marketplace

Step 1: Log in to Azure

Open your terminal and log in to your Azure account by running:

az login

Step 2: Set Your Subscription

Specify the subscription you want to use with:

az account set --subscription <subscriptionId>

Step 3: Deploy the Azure-Vote Application

Now, deploy the azure-vote application to your Kubernetes cluster with the following command:

az k8s-extension create --name azure-vote --scope cluster `
  --cluster-name <clusterName> --resource-group <resourceGroupName> --cluster-type managedClusters `
  --extension-type commercialMarketplaceServices.AzureVote `
  --plan-name azure-vote-paid `
  --plan-product azure-vote-final-1 `
  --plan-publisher microsoft_commercial_marketplace_services `
  --configuration-settings title=VoteAnimal value1=Cats value2=Dogs
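Before changing any settings, it can help to confirm that the extension actually reached a healthy state. The following is a minimal sketch using the same az k8s-extension command group and the placeholders from the create command above; the --query expression is just one convenient way to pull out the fields of interest.

```
# Hedged sketch: confirm the azure-vote extension finished provisioning on the cluster.
az k8s-extension show --name azure-vote --cluster-name <clusterName> --resource-group <resourceGroupName> --cluster-type managedClusters --query "{name:name, state:provisioningState, version:version}" --output table
```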
Updating Configuration Settings

If you want to update the configuration settings of the azure-vote application, you can do so easily. Use the following command to change the configuration settings:

az k8s-extension update --name azure-vote `
  --cluster-name <clusterName> --resource-group <resourceGroupName> --cluster-type managedClusters `
  --configuration-settings value1=Elephant value2=Horse

And there you have it! By following these steps, you can deploy and update the azure-vote application on your Kubernetes cluster using the Azure CLI.

Conclusion

Deploying Kubernetes applications using the Azure CLI is a powerful way to manage and scale your applications. The process described above helps ensure your deployments are consistent and repeatable, simplifying the maintenance of your infrastructure as code. We hope this guide has been helpful. Stay tuned for more tips and demos. Happy deploying! 😄

Evolving Stretch Clustering for Azure Local
Stretched clusters in Azure Local, version 22H2 (formerly Azure Stack HCI, version 22H2) entail a specific technical implementation of storage replication that spans a cluster across two sites. Azure Local, version 23H2 has evolved from a cloud-connected operating system to an Arc-enabled solution with Arc Resource Bridge, Arc VMs, and AKS enabled by Azure Arc. Azure Local, version 23H2 expands the requirements for multi-site scenarios beyond the OS layer, and Stretched clusters do not encompass this entire solution stack. Based on customer feedback, the new Azure Local release will replace the Stretched clusters defined in version 22H2 with new high availability and disaster recovery options.

For Short Distance

Rack Aware Cluster is a new cluster option that spans two separate racks or rooms within the same Layer-2 network at a single location, such as a manufacturing plant or a campus. Each rack functions as a local availability zone across layers from the OS to Arc management, including Arc VMs and AKS enabled by Azure Arc, providing fault isolation and workload placement within the cluster. The solution is configured with one storage pool to avoid additional storage replication and improve storage efficiency. It delivers the same Azure deployment and management experience as a standard cluster. This setup is suitable for edge locations and can scale up to 8 nodes, with 4 nodes in each rack. Rack Aware Cluster is currently in private preview and is slated for public preview and general availability in 2025.

For Long Distance

Azure Site Recovery can be used to replicate on-premises Azure Local virtual machines into Azure and protect business-critical workloads. This allows the Azure cloud to serve as a disaster recovery site, enabling critical VMs to be failed over to Azure in case of a local cluster disaster and then failed back to the on-premises cluster when it becomes operational again. If you cannot fail over certain workloads to the cloud and require long-distance disaster recovery, for example between two different cities, you can leverage Hyper-V Replica to replicate Arc VMs to the secondary site. Those VMs run as Hyper-V VMs on the secondary site and become Arc VMs again once they fail back to the original cluster on the first site.

Additional Options beyond Azure Local

If the above solutions in Azure Local do not cover your needs, you can fully customize your solution with Windows Server 2025, which introduces several advanced hybrid cloud capabilities designed to enhance operational flexibility and connectivity across various environments. Additionally, it offers replication technologies such as Hyper-V Replica, Storage Replica, and external SAN replication that enable the development of tailored datacenter disaster recovery solutions. Learn more from the Microsoft Windows Server blog post: Windows Server 2025 now generally available, with advanced security, improved performance, and cloud agility.

What to do with existing Stretched clusters on version 22H2

Stretched clusters and Storage Replica are not supported in Azure Local, version 23H2 and beyond. However, version 22H2 stretched clusters can stay in a supported state by performing the first step of the upgrade, the operating system upgrade to the 23H2 OS. The second step, the solution upgrade to Azure Local, is not applicable to stretched clusters. This provides extra time to assess the most suitable future solution for your needs.
Please refer to About Azure Local upgrade to version 23H2 - Azure Local | Microsoft Learn for more information on the 23H2 upgrade, and to the blog post Upgrade from Azure Stack HCI, version 22H2 to Azure Local | Microsoft Community Hub.

Conclusion

We are excited to be bringing Rack Aware Clusters and Azure Site Recovery to Azure Local. These high availability and disaster recovery options allow customers to address various scenarios with a modern cloud experience and simplified management.

Microsoft Ignite 2024: Celebrating the Success of Our First Show Floor Interview Series
🔥 Microsoft Ignite 2024 has wrapped, and what an event it was! This year, we tried something new - the Ignite Show Floor Interview Series - and it is safe to say it was a huge success. Alongside liorkamrat and thomasmaurer, I had the privilege of conducting interviews with some of the most innovative minds in our Adaptive Cloud ecosystem. We spoke with partners, customers, ISVs from the Azure Arc ISV program (Announcing the Azure Arc ISV Partner Program at Ignite), and Microsoft MVPs, diving into their unique stories, their takeaways from Ignite, and how they're leveraging Microsoft technologies to drive innovation.

17 Interviews, Countless Stories

In total, we produced 17 videos during Ignite, each offering a fresh perspective on cloud innovation. Here is a snapshot of everyone we talked to:

Why This Matters

This was our first year running the Show Floor Interview Series, and it exceeded our expectations. Here's why we're excited:
- Showcasing Innovation: These interviews highlighted the incredible work being done across the Adaptive Cloud ecosystem, from large enterprises to individual experts.
- Building Community: The series wasn't just about interviews; it was about connecting with people, hearing their challenges, and celebrating their successes.
- Expanding Reach: By sharing these conversations on our Jumpstart YouTube channel and LinkedIn, we helped bring these stories to a wider audience - even those who were not able to attend Ignite this year.

What's Next?

While Ignite 2024 is over, this is just the beginning. We are already thinking about how to expand and improve this series for future events. Expect more interviews, more insights, and more opportunities to engage with the Microsoft community next year. And we didn't just stick to business - we even had a little fun with attendees, asking them what their favorite Microsoft Ignite swag was! 🚀

Haven't seen the interviews yet? Check out our full playlist on the Arc Jumpstart YouTube channel here.

Thank You!

A huge thank you to everyone who participated and to everyone who tuned in to our series! To our partners, customers, ISVs, and MVPs - thank you for sharing your time and insights. You made this series what it is, and we are excited to continue building on this momentum! Here's to the power of innovation, collaboration, and community. Let's keep the conversation going!

Ignite 2024: AKS enabled by Azure Arc - New Capabilities and Expanded Workload Support
Microsoft Ignite 2024 has been a showcase of innovation across the Azure ecosystem, bringing forward major advancements in AI, cloud-native applications, and hybrid cloud solutions. This year's event featured key updates, including enhancements to AKS enabled by Azure Arc, which introduced new capabilities and expanded workload support. These updates reinforce the value and versatility that AKS enabled by Azure Arc brings to organizations looking to scale and optimize their operations. With these advancements, AKS Arc continues to support seamless management, increased scalability, and enhanced workload performance across diverse infrastructures.

AKS Enabled by Azure Arc

AKS enabled by Azure Arc brings the power of Azure's managed Kubernetes service to any environment, providing consistent management and security across on-premises, edge, and multi-cloud deployments. It encompasses:
- AKS on Azure Local: A full-featured Kubernetes platform integrated with Azure Local for comprehensive container orchestration in hybrid setups. Notably, AKS on Azure Local has earned recognition as a Leader in the 2024 Gartner Magic Quadrant for Distributed Hybrid Infrastructure, underscoring Microsoft's dedication to delivering comprehensive, enterprise-ready solutions for hybrid cloud deployments.
- AKS Edge Essentials: A lightweight version designed for edge computing, ensuring operational consistency on constrained hardware.
- AKS on Azure Local Disconnected Operations: The latest addition to the AKS enabled by Azure Arc portfolio is support for fully disconnected scenarios, now available on Azure Local Disconnected Operations. It allows AKS enabled by Azure Arc to operate in air-gapped, isolated environments without the need for continuous Azure connectivity, which is crucial for organizations that require secure, self-sufficient Kubernetes operations in highly controlled or remote locations. With this support, businesses can maintain robust Kubernetes functionality while meeting stringent compliance and security standards.

Key Features and Expanded Workload Support

This year's Ignite announcements unveiled a series of public preview and GA features that enhance the capabilities of AKS enabled by Azure Arc. These advancements reflect our commitment to delivering robust, scalable solutions that meet the evolving needs of our customers. Below are the key highlights that showcase the enhanced capabilities of AKS enabled by Azure Arc:

Edge Workload
- Azure IoT Operations - enabled by Azure Arc: Available on AKS Edge Essentials (AKS-EE) and AKS on Azure Local with public preview support. Azure IoT Operations helps with the management and scaling of IoT solutions. It provides robust support for deploying and overseeing IoT applications within Kubernetes environments, enhancing operational control and scalability. Organizations can leverage this tool to maintain seamless management of distributed IoT workloads, ensuring consistent performance and simplified scaling across diverse deployment scenarios.
- Azure Container Storage - enabled by Azure Arc: Available on both AKS Edge Essentials (AKS-EE) and AKS on Azure Local, this support enables seamless integration for persistent storage needs in Kubernetes environments. It provides scalable, reliable, and high-performance storage solutions that enhance data management and support stateful applications running in hybrid and edge deployments. This addition ensures that organizations can efficiently manage their containerized workloads with robust storage capabilities.
- Azure Key Vault Secret Store extension for Kubernetes: Now available as a public preview on AKS Edge Essentials and AKS on Azure Local, this extension automatically synchronizes secrets from an Azure Key Vault to an AKS enabled by Azure Arc cluster for offline access, providing essential tools for proactive monitoring and policy enforcement. It offers advanced security and compliance capabilities tailored for robust governance and regulatory adherence, ensuring that organizations can maintain compliance with industry standards and best practices while safeguarding their infrastructure.
- Azure Monitor Pipeline: The Azure Monitor pipeline is a data ingestion solution designed to provide consistent, centralized data collection for Azure Monitor. Once deployed for Azure IoT Operations on an AKS cluster enabled by Azure Arc, it enables at-scale telemetry data collection and routing at the edge. The pipeline can cache data locally, syncing with the cloud when connectivity is restored, and supports segmented networks where direct data transfer to the cloud isn't possible. Built on the OpenTelemetry Collector, the pipeline's configuration includes data flows, cache properties, and destination rules defined in the data collection rule (DCR) to ensure seamless data processing and transmission to the cloud.
- Arc Workload Identity Federation: Now available as a public preview on AKS Edge Essentials and AKS on Azure Local, providing secure federated identity management to enhance security for customer workloads.
- Arc Gateway: Now available as a public preview for AKS Edge Essentials and AKS on Azure Local. Arc Gateway support on AKS enabled by Azure Arc enhances secure connectivity across hybrid environments, reducing the required firewall rules and improving security for customer deployments.
- Azure AI Video Indexer - enabled by Azure Arc: Supported on AKS Edge Essentials and AKS on Azure Local. Arc-enabled Video Indexer enables comprehensive AI-powered video analysis, including transcription, facial recognition, and object detection. It allows organizations to deploy sophisticated video processing solutions within hybrid and edge environments, ensuring efficient local data processing with improved security and minimal latency.
- MetalLB - Azure Arc Extension: Now supported on AKS Edge Essentials and AKS on Azure Local, MetalLB ensures efficient load balancing capabilities. This addition enhances network resilience and optimizes traffic distribution within Kubernetes environments.

Comprehensive AI and Machine Learning Capabilities
- GPUs for AI Workloads: AKS enabled by Azure Arc now supports a range of GPUs tailored for AI and machine learning workloads, with GPU partitioning and GPU passthrough virtualization support. These options enable robust performance for resource-intensive AI and machine learning workloads, allowing for efficient use of GPU resources to run complex models and data processing tasks.
- Arc-enabled Azure Machine Learning: Supported on AKS on Azure Local, bringing Azure Machine Learning (AML) capabilities for running sophisticated AI models. Businesses can leverage Azure's powerful machine learning tools seamlessly across different environments, enabling them to develop, deploy, and manage machine learning models effectively on-premises and at the edge.
- Arc-enabled Video Indexer: Extends Azure's advanced video analytics capabilities to AKS enabled by Azure Arc. Organizations can now process and analyze video content in real time, harnessing Azure's robust video AI tools to enhance video-based insights and operations. This support provides businesses with greater flexibility to conduct video analysis seamlessly in remote or hybrid environments.
- Kubernetes AI Toolchain Orchestrator (Kaito + LoRA + QLoRA): Fully validated and supported for fine-tuning and optimizing AI models, Kaito, LoRA, and QLoRA are designed for edge deployments such as AKS on Azure Local. This combination enhances the ability to run and refine AI applications effectively in edge environments, ensuring performance and flexibility.
- Flyte Integration: Now supported on AKS on Azure Local, Flyte offers a scalable orchestration platform for managing machine learning workflows. This capability enables teams to build, execute, and manage complex AI pipelines efficiently, enhancing productivity and simplifying workflow management.

Enhanced Infrastructure and Operations Management
- Infrastructure as Code (IaC) with Terraform: Now supported on AKS on Azure Local for both connected and air-gapped scenarios, providing streamlined deployment capabilities through code. This support enables teams to automate and manage their Kubernetes infrastructure at scale more efficiently with Terraform.
- Anti-affinity, Pod CIDR, Taints/Labels: Available on AKS on Azure Local, these features provide enhanced infrastructure capabilities by allowing refined workload placement and advanced network configuration. Anti-affinity rules help distribute pods across different nodes to avoid single points of failure, while Pod CIDR simplifies network management by allocating IP ranges to pods. Taints and labels offer greater control over node selection, ensuring that specific workloads run on designated nodes and enhancing the overall efficiency and reliability of Kubernetes operations.
- Optimized Windows Node Pool Management: AKS enabled by Azure Arc now includes the capability to enable and disable Windows node pools for clusters. This enhancement helps prevent unnecessary binary downloads, benefiting customers with low-speed or limited internet connections. It optimizes resource usage, reduces bandwidth consumption, and enhances overall deployment efficiency, making it ideal for environments with network constraints.

Kubernetes Development
- AKS-WSL: With AKS-WSL, developers can set up a local environment that mimics the experience of working with AKS. This makes it easier for developers to write, debug, and test Kubernetes applications locally before deploying them to a full AKS cluster.
- AKS-WSL VSCode Extension: The Visual Studio Code extension for AKS-WSL allows developers to write, debug, and deploy Kubernetes applications locally, streamlining development workflows. This setup improves productivity by providing efficient tools and capabilities, making it easier to develop, test, and refine Kubernetes workloads directly from a local machine.
- Arc Jumpstart: Supports AKS Edge Essentials and AKS on Azure Local. Arc Jumpstart simplifies deployment initiation, providing developers with a streamlined way to set up and start working with Kubernetes environments quickly. It makes it easier for teams to evaluate and experiment with AKS enabled by Azure Arc, offering pre-configured scenarios and comprehensive guidance. By reducing complexity and setup time, Arc Jumpstart enhances the developer experience, facilitating faster prototyping and smoother onboarding for new projects in hybrid and edge settings.
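To ground the day-two experience described above, here is a minimal, hedged sketch of reaching one of these Arc-connected clusters from a developer workstation and listing the Arc extensions installed on it. The cluster and resource group names are placeholders, and the sketch assumes the cluster connect feature is enabled on the cluster; the az connectedk8s and az k8s-extension command groups used here are existing Azure CLI extensions.

```
# Hedged sketch: open a cluster connect proxy session to an Arc-connected cluster (placeholder names).
az connectedk8s proxy --name <clusterName> --resource-group <resourceGroupName>

# In a second terminal, the proxied kubeconfig lets standard tooling work as usual.
kubectl get nodes

# List the Arc cluster extensions (for example, the features described above) installed on the cluster.
az k8s-extension list --cluster-name <clusterName> --resource-group <resourceGroupName> --cluster-type connectedClusters --output table
```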
Conclusion

Microsoft Ignite 2024 has underscored the continued evolution of AKS enabled by Azure Arc, bringing more comprehensive, scalable, and secure solutions to diverse environments. These advancements support organizations in running cloud-native applications anywhere, enhancing operational efficiency and innovation. We welcome your feedback (aksarcfeedback@microsoft.com) and look forward to ongoing collaboration as we continue to evolve AKS enabled by Azure Arc.

Announcing General Availability: Windows Server Management enabled by Azure Arc
Windows Server Management enabled by Azure Arc offers customers whose Windows Server licenses have active Software Assurance, or whose Windows Server licenses are active subscription licenses, the following key benefits:
- Azure Update Manager
- Azure Change Tracking and Inventory
- Azure Machine Configuration
- Windows Admin Center in Azure for Arc
- Remote Support
- Network HUD
- Best Practices Assessment
- Azure Site Recovery (Configuration Only)

Upon attestation, customers receive access to these capabilities at no additional cost beyond associated networking, compute, storage, and log ingestion charges. The same capabilities are also available for customers enrolled in Windows Server 2025 pay-as-you-go licensing enabled by Azure Arc. Learn more at Windows Server Management enabled by Azure Arc - Azure Arc | Microsoft Learn or watch the video: Free Azure Services for Non-Azure Windows Servers Covered by SA Powered by Azure Arc!

To get started, connect your servers to Azure Arc, attest for these benefits, and deploy management services as you modernize to Azure's AI-enabled set of server management capabilities across your hybrid, multi-cloud, and edge infrastructure!
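As a quick illustration of that first step, here is a minimal sketch of connecting a server with the Connected Machine agent's azcmagent CLI. It assumes the agent is already installed on the server and that an interactive sign-in is possible; all values are placeholders, and the benefits attestation itself is completed separately (for example, in the Azure portal).

```
# Hedged sketch: connect a server that already has the Connected Machine agent installed (placeholder values).
azcmagent connect --resource-group "<resourceGroupName>" --tenant-id "<tenantId>" --location "<azureRegion>" --subscription-id "<subscriptionId>"

# Confirm the machine shows as connected before enabling management services such as Azure Update Manager.
azcmagent show
```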
Extending Azure's AI Platform with an adaptive cloud approach

Ignite 2024 is here, and nothing is more top of mind for customers than the potential to transform their businesses with AI wherever they operate. Today, we are excited to announce the preview of two new Arc-enabled services that extend the power of Azure's AI platform to on-premises and edge environments. Sign up to join the previews here!

An adaptive cloud approach to AI

The goal of Azure's adaptive cloud approach is to extend just enough Azure to customers' distributed environments. For many of these customers, valuable data is generated and stored locally, outside of the hyperscale cloud, whether due to regulation, latency, business continuity, or simply the large volume of data being generated in real time. AI inferencing can only occur where the data exists. So, while the cloud has become the environment of choice for training models, we see a tremendous need to extend inferencing services beyond the cloud to enable complete cloud-to-edge AI scenarios.

Search on-premises data with generative AI

Over the past couple of years, generative AI has come to the forefront of AI innovation. Language models give any user the ability to interact with large, complex data sets in natural language. Public tools like ChatGPT are great for queries about general knowledge, but they can't answer questions about private enterprise data on which they were not trained. Retrieval Augmented Generation, or "RAG", helps address this need by augmenting language models with private data. Cloud services like Azure AI Search and Azure AI Foundry simplify how customers can use RAG to ground language models in their enterprise data. Today, we are announcing the preview of a new service that brings generative AI and RAG to your data at the edge. Within minutes, customers can deploy an Arc extension that contains everything needed to start asking questions about their on-premises data, including:
- Popular small and large language models running locally, with support for both CPU and GPU hardware
- A turnkey data ingestion and RAG pipeline that keeps all data completely local, with RBAC controls to prevent unauthorized access
- An out-of-the-box prompt engineering and evaluation tool to find the best settings for a particular dataset
- Azure-consistent APIs to integrate into business applications, as well as a pre-packaged UI to get started quickly

This service is available now in gated private preview for customers running Azure Local infrastructure, and we plan to make it available on other Arc-enabled infrastructure platforms in the near future. Sign up here!

Deploy curated open-source AI models via Azure Arc

Another great thing about Azure's AI platform is that it provides a catalog of curated AI models that are ready to deploy and provide consistent inferencing endpoints that can be integrated directly into customer applications. This not only makes deployment easy, but customers can also be confident that the models are secure and validated. These same needs exist on the edge as well, which is why we are now making a set of curated models deployable directly from the Azure portal. These models have been selected, packaged, and tested specifically for edge deployments, and are currently available on Azure Local infrastructure:
- Phi-3.5 Mini (3.8 billion parameter language model)
- Mistral 7B (7.3 billion parameter language model)
- MMDetection YOLO (object detection)
- OpenAI Whisper Large (speech to text)
- Google T5 Base (translation)

Models can be deployed from a familiar Azure portal wizard to an Arc AKS cluster running on premises. All available models today can be run on just a CPU; Phi-3.5 and Mistral 7B also have GPU versions available for better performance. Once complete, the deployment can be managed directly in Azure ML Studio, and an inferencing endpoint is available on your local network.

Wrap up

Sign up now to join either of the previews at the link below or stop by and visit us in person in the Azure Arc and Azure Local Expert Meet Up station in the Azure Infrastructure neighborhood at Ignite. We're excited to get these new capabilities into our customers' hands and hear from you how it's going. Sign up to join the previews here!

New capabilities to aid Migration and Hybrid Cloud Management
When we support customers looking to modernize their IT estate, our vision is to empower businesses to seamlessly manage and migrate their on-premises and cloud environments with efficiency and insight. To help customers with their migration and hybrid cloud journey, we have been working to bridge the gap between on-premises and cloud environments by providing consistent management experiences for both. This week at Ignite, we are excited to announce the public preview of two capabilities that bring us closer to this vision: "Business Case for Arc" and "Enabling Arc". With these new capabilities in Azure Migrate, customers can now visualize the value of Azure Arc for their on-premises estates throughout their migration journey, making informed decisions with confidence. Customers can now check whether an on-premises machine is already Arc-enabled and, if not, see the value and savings of using Arc while the resource remains on-premises and take advantage of Arc capabilities accordingly.

Envision the Benefits of Azure Arc and Azure Management with Business Case for Arc

The Azure Migrate business case enables customers to create a detailed comparison of the total cost of ownership (TCO) for their on-premises estate versus the TCO on Azure, along with a year-on-year cash flow analysis as they transition their workloads to Azure. With this new capability, customers can now visualize the added value of Azure Arc for their on-premises estates throughout their migration journey. In addition to getting the TCO of migrating all their resources to Azure, customers can now compare their estimated current on-premises TCO with the estimated TCO of their on-premises estate with Azure Arc; visualize the cost savings and other benefits of using Azure security (Microsoft Defender for Cloud) and management tools (Azure Monitor and Azure Update Manager) via Azure Arc for their on-premises servers; and see the licensing benefits of Extended Security Updates (ESUs) enabled by Azure Arc, as well as SQL pay-as-you-go via Azure Arc-enabled SQL Server. Those not planning to migrate their entire estate, or on a long migration journey, can compare their current on-premises TCO with the combined Azure and Azure Arc TCO in the final planned state to help plan better. Customers can edit the assumptions for Azure and on-premises costs and download the business case report using the export option to share with other stakeholders.

Enable Arc: Onboarding to Arc to Leverage Azure Management Services

The Azure Migrate inventory now integrates with Azure Arc, allowing customers to identify which of their discovered machines are already Arc-enabled and onboard those that are not, directly from the Azure Migrate portal. This integration provides a seamless experience, offering enhanced control and visibility over the migration process while managing the remaining on-premises inventory.

Next Steps and Actions

Get started here and generate a business case: https://learn.microsoft.com/en-us/azure/migrate/how-to-build-a-business-case