AI Azure Landing Zone: Shared Capabilities and Models to Enable AI as a Platform
This architecture diagram illustrates a Microsoft Azure AI Landing Zone Pattern — a scalable, secure, and well-governed framework for deploying AI workloads across multiple subscriptions in an enterprise environment. Let's walk through it end-to-end, breaking down each section, the flow, and the key Azure services involved.

🧭 Overview: The architecture is split into four major landing zones:
- Connectivity Subscription
- AI Apps Landing Zone Subscription
- AI Hub Landing Zone Subscription
- AI Services Landing Zone Subscription

🔁 Step-by-Step Breakdown

🔹 1. Users → Application Gateway (WAF)
Users (e.g., enterprise employees or external users) access the system via the Application Gateway with Web Application Firewall (WAF). This is part of the Connectivity Subscription and provides:
- Centralized ingress control
- Zone redundancy
- Protection against common exploits

🔹 2. Route to AI Apps Landing Zone Subscription
Traffic is routed to the AI Apps Landing Zone Subscription via the Application Gateway. This subscription hosts applications that use AI services, typically in a containerized or App Service-based architecture.

🔹 3. AI Apps Workload Components
This section includes:
- App Hosting: Azure App Services, Container Apps (with Container Registry)
- Networking: Private Endpoints, Subnets, Network Security Groups
- Monitoring: Log Analytics Workspaces, Diagnostic Settings
- App Agents: Represent container/app service instances (Agents 1, 2, 3)

🔹 4. Integration with AI Services & Secrets Management
These apps securely connect to:
- Azure Key Vault (secrets, credentials)
- Azure AI Search
- Azure Cosmos DB
- Azure Storage
- Azure OpenAI
App Insights is used for application performance monitoring. Logic Apps & Functions handle knowledge management processing and LLM integration workflows.

🔹 5 & 6. Connectivity to Centralized Services
Virtual Network Peering connects the AI Apps Landing Zone with the Connectivity Subscription and the Hub Virtual Network in the Platform Landing Zone Subscription. These provide access to shared infrastructure:
- Azure Firewall
- Azure Bastion
- VPN Gateway / ExpressRoute
- Azure DNS / Private Resolver
- Azure DDoS Protection

🔹 7. AI Hub Landing Zone Subscription
This acts as a centralized workload processing zone with components such as Event Hubs, Azure Key Vault, App Insights, Power BI, Cosmos DB, and API Management (OpenAI endpoints). It is used for observability, usage processing, and API integration.

🔹 8 & 9. FTU Usage Processing & Reporting
Function Apps & Logic Apps process usage data (e.g., for chargebacks and monitoring). FTU = "Fair Tenant Usage". Reporting is done using Power BI, and usage data is stored in Cosmos DB.

🔹 10 & 11. Network Peering to Platform Zone
The AI Hub connects back to the Platform Landing Zone via Virtual Network Peering, providing access to shared DNS zones and network services.

🔹 12. AI Services Landing Zone Subscription
This is where core AI capabilities live, such as:
- Azure OpenAI
- Azure AI Services: Speech, Vision, Language, Machine Learning
- Foundry Project: OpenAI Agents, Agent Service Dependencies
- Models hosted in Azure (e.g., GPT)
This zone is accessed securely via Private Endpoints, Azure Key Vault, and network rules.

📦 Subscription Vending (All Zones)
Each subscription includes a Subscription Vending Framework for:
- Spoke VNet placement
- Route configurations
- Policy/role assignments
- Defender for Cloud & cost management
This ensures a consistent and compliant environment across the enterprise.
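The FTU usage-processing step (8 & 9) is essentially an aggregation job: raw per-call usage events are rolled up per tenant so chargeback reports can be built in Power BI. A minimal sketch of that roll-up logic in Python — the event fields and the rate are illustrative assumptions, not the actual pipeline schema:

```python
from collections import defaultdict

# Hypothetical usage events as they might arrive from Event Hubs:
# each record notes the tenant, the model called, and tokens consumed.
events = [
    {"tenant": "contoso-hr", "model": "gpt-4o", "tokens": 1200},
    {"tenant": "contoso-hr", "model": "gpt-4o", "tokens": 800},
    {"tenant": "contoso-sales", "model": "gpt-4o", "tokens": 5000},
]

RATE_PER_1K_TOKENS = 0.01  # illustrative chargeback rate, not a real price

def summarize_usage(events):
    """Roll up token usage per tenant and compute a chargeback amount."""
    totals = defaultdict(int)
    for e in events:
        totals[e["tenant"]] += e["tokens"]
    return {
        tenant: {"tokens": t, "charge": round(t / 1000 * RATE_PER_1K_TOKENS, 4)}
        for tenant, t in totals.items()
    }

report = summarize_usage(events)
for tenant, row in sorted(report.items()):
    print(tenant, row["tokens"], row["charge"])
```

In the architecture above, the equivalent logic would run in a Function App or Logic App, reading from Event Hubs and writing the summarized rows to Cosmos DB for Power BI to query.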
📌 Key Architectural Benefits

| Feature | Purpose |
|---|---|
| 🔐 Zero Trust Network | Controlled access via WAF, private endpoints |
| 📡 Scalable AI Apps | Container Apps & App Services |
| 🧠 Central AI Services | Managed in isolated subscriptions |
| 🔍 Monitoring | Deep insights via App Insights, Log Analytics |
| 🧾 Governance | Role-based access, policy enforcement |
| 🔌 Secure Integration | VNet Peering, Azure Key Vault, API Management |

🔚 End-to-End Data Flow Summary
1. Users access the app through the Application Gateway (WAF)
2. Apps in the AI Apps Landing Zone process input
3. Apps call AI services (OpenAI, Cognitive) via private endpoints
4. Data usage and insights flow to the AI Hub for logging and analysis
5. FTU and usage metrics are processed and stored
6. Platform services support routing, DNS, and security

🎯 Goal of the User Journey
The user interacts with an AI-powered application (e.g., chatbot, document summarizer, recommendation engine) deployed on Azure. The app is secure, scalable, and integrated with advanced Azure AI services (like OpenAI).

👣 User Journey: Step-by-Step Breakdown

✅ 1. User Access (Public Entry Point)
The user (browser or mobile app) sends a request (e.g., opens an AI web app or sends a prompt to a chatbot). The request hits the Azure Application Gateway with Web Application Firewall (WAF).
✅ Filters and protects against malicious traffic.
✅ Ensures high availability with zone redundancy.
🧠 Think of it as the front door to the AI platform.

✅ 2. Routing to AI Application
The Application Gateway securely routes the request to the AI Apps Landing Zone Subscription. The user request reaches the App Service or Container App hosting the AI-based application logic. Example: a user submits a product question via a chatbot UI hosted here.

✅ 3. Processing the Request (App Logic)
The app receives the input and begins processing:
- The app uses App Insights for performance telemetry.
- Secrets or config (API keys, connection strings) are securely pulled from Azure Key Vault.
- Based on the business logic, the app needs to call an AI model (e.g., OpenAI).

✅ 4. Calling AI Services (via Private Endpoints)
The app securely connects (using private endpoints) to the AI Services Landing Zone to:
🔹 Call Azure OpenAI (e.g., ChatGPT, DALL·E, embeddings)
🔹 Use Azure Cognitive Services (e.g., speech, vision, search)
These services are isolated in their own subscription for security, scalability, and cost governance.
🧠 Here's where the "AI magic" happens.

✅ 5. Retrieval-Augmented Generation (Optional)
If the AI needs additional knowledge (RAG pattern), the app can:
- Query Azure AI Search for documents.
- Pull knowledge from Azure Cosmos DB or Azure Storage.
AI results are processed via Logic Apps / Functions (e.g., post-processing, formatting).

✅ 6. Return the Response to the User
The application receives the AI-generated output, formats the result (e.g., chatbot message, visual, PDF), and returns it to the user via the original secure path.

✅ 7. Observability & Usage Logging
App, AI service usage, and telemetry are logged in:
- Log Analytics / App Insights
- Event Hub → streamed to the AI Hub Landing Zone
This enables centralized monitoring and analytics (Power BI dashboards, anomaly detection, etc.).

✅ 8. Usage Reporting & Governance
Function Apps & Logic Apps in the AI Hub Landing Zone process usage logs. Usage is stored in Azure Cosmos DB. FTU (Fair Tenant Usage) policies are enforced and reported via Power BI dashboards.

✅ 9.
Admin/Platform Layer
All resources and subscriptions are governed via the Platform Landing Zone:
- Shared services like DNS, security policies, firewalls
- Cost controls, Defender for Cloud, DDoS protection
- Subscription vending and network segmentation

🗺️ Visual Recap: User Journey Flow
User → App Gateway (WAF) → App in AI Apps Landing Zone → Call to Azure OpenAI / AI Services → (Optional: knowledge retrieval) → AI Response → Returned to User → Usage logged & monitored → Usage reporting in AI Hub

User Workflow

🔐 Security Throughout the Journey

| Step | Security Feature |
|---|---|
| App Gateway | Web Application Firewall |
| App Hosting | Private Endpoints, Managed Identity |
| Secrets | Azure Key Vault |
| Network | Virtual Network Peering, NSGs |
| Governance | Role-based access, Policy Assignments |

🧠 Example: Real-World Use Case
Scenario: A doctor uses a medical AI assistant to analyze patient notes.
1. Logs in via secure portal (WAF gateway)
2. Submits patient notes (App Service)
3. App calls OpenAI with prompt: "Summarize this diagnosis."
4. App also queries internal document store (RAG)
5. OpenAI returns result → displayed in UI
6. Usage tracked for audit and reporting

🧭 User Journey Flow
Users: End users initiate a request (e.g., accessing an AI-powered app).
Application Gateway + WAF (Connectivity Subscription): The request is routed through the Application Gateway with Web Application Firewall for security and traffic filtering.
AI Apps Landing Zone Subscription: The request enters the AI Apps subscription. Workloads run on App Services or Container Apps (Agents 1, 2, 3).
Secure Access: Application services authenticate and securely retrieve data from Azure Key Vault, Cosmos DB, Azure Storage, and Azure AI Search.
Knowledge Management Processing: Logic Apps / Function Apps process the request, enabling workflows, integrations, and knowledge enrichment.
AI Hub Gateway Application: Requests requiring AI services are routed to the AI Hub for centralized management.
API Management (OpenAI Endpoints): APIs handle communication with downstream AI services.
Event Hub + App Insights: Telemetry and logs are captured for monitoring and troubleshooting.
Power BI + Cosmos DB: Usage data is aggregated and analyzed for reporting (FTU usage tracking).
AI Services Subscription: API calls are directed to the AI Services subscription.
Azure AI Models Execution: Requests hit Azure OpenAI, Azure AI Foundry, and Cognitive Services (Speech, Vision, Search, etc.). Foundry/Agent services provide additional AI processing.
Response back to User: The processed AI output is routed back through the pipeline → API → Hub → Apps → Application Gateway → returned to the user.

High Level Architecture Diagram
Security & Governance Overview
AI Landing Zone Lifecycle Workflow

Reference Architectures: Baseline Azure AI Foundry Chat Reference Architecture in an Azure Landing Zone - Azure Architecture Center | Microsoft Learn

Azure OpenAI Landing Zone reference architecture
This article delves into the synergy of Azure Landing Zones and Azure OpenAI Service, building a secure and scalable AI environment. It unpacks the Azure OpenAI Landing Zone architecture, which integrates numerous Azure services for optimal AI workloads, and explores robust security measures and the significance of monitoring for operational success. This journey of deploying Azure OpenAI evolves alongside Azure's continual innovation.

Azure Course Blueprints
Overview
The Course Blueprint is a comprehensive visual guide to the Azure ecosystem, integrating all the resources, tools, structures, and connections covered in the course into one inclusive diagram. It enables students to map out and understand the elements they've studied, providing a clear picture of their place within the larger Azure ecosystem. It serves as a 1:1 representation of all the topics officially covered in the instructor-led training. Formats available include PDF, Visio, Excel, and Video.

- Links: Each icon in the blueprint has a hyperlink to the pertinent document in the learning path on Learn.
- Layers: You can filter layers to concentrate on segments of the course.
- Integration: The Visio Template+ for expert courses like SC-100 and AZ-305 includes an additional layer that enables you to compare SC-100, AZ-500, and SC-300 within the same diagram. Similarly, you can compare any combination of AZ-305, AZ-700, AZ-204, and AZ-104 to identify differences and study gaps. Since SC-300 and AZ-500 are potential prerequisites for the expert certification associated with SC-100, and AZ-204 or AZ-104 for the expert certification associated with AZ-305, this comparison is particularly useful for understanding the extra knowledge or skills required to advance to the next level.

Advantages for Students
- Defined Goals: The blueprint presents learners with a clear vision of what they are expected to master and achieve by the course's end.
- Focused Learning: By spotlighting the course content and learning targets, it steers learners' efforts towards essential areas, leading to more productive learning.
- Progress Tracking: The blueprint allows learners to track their advancement and assess their command of the course material.
- Topic List: A comprehensive list of topics for each slide deck is now available in a downloadable .xlsx file. Each entry includes a link to Learn and its dependencies.
Download links

Associate Level (PDF, Visio, Contents, Video Overview)
- AZ-104 Azure Administrator Associate (R: 12/14/2023, U: 04/16/2025): Blueprint, Visio, Excel, Mod 01
- AZ-204 Azure Developer Associate (R: 11/05/2024, U: 11/11/2024): Blueprint, Visio, Excel
- AZ-500 Azure Security Engineer Associate (R: 01/09/2024, U: 10/10/2024): Blueprint, Visio+, Excel
- AZ-700 Azure Network Engineer Associate (R: 01/25/2024, U: 11/04/2024): Blueprint, Visio, Excel
- SC-200 Security Operations Analyst Associate (R: 04/03/2025, U: 04/09/2025): Blueprint, Visio, Excel
- SC-300 Identity and Access Administrator Associate (R: 10/10/2024): Blueprint, Excel

Specialty (PDF, Visio)
- AZ-140 Azure Virtual Desktop Specialty (R: 01/03/2024, U: 02/27/2025): Blueprint, Visio, Excel

Expert Level (PDF, Visio)
- AZ-305 Designing Microsoft Azure Infrastructure Solutions (R: 05/07/2024, U: 02/05/2025): Blueprint, Visio+ (AZ-104, AZ-204, AZ-700, AZ-140), Excel
- SC-100 Microsoft Cybersecurity Architect (R: 10/10/2024, U: 04/09/2025): Blueprint, Visio+ (AZ-500, SC-300, SC-200), Excel

Skill-based Credentialing (PDF)
- AZ-1002 Configure secure access to your workloads using Azure virtual networking (R: 05/27/2024): Blueprint, Visio, Excel
- AZ-1003 Secure storage for Azure Files and Azure Blob Storage (R: 02/07/2024, U: 02/05/2024): Blueprint, Excel

Subscribe if you want to get notified of any update, like new releases or updates.

Author: Ilan Nyska, Microsoft Technical Trainer
Email: ilan.nyska@microsoft.com
LinkedIn: https://www.linkedin.com/in/ilan-nyska/

I've received so many kind messages, thank-you notes, and reshares — and I'm truly grateful. But here's the reality:
💬 The only thing I can use internally to justify continuing this project is your engagement, through this survey: https://lnkd.in/gnZ8v4i8
⏳ Unless I receive enough support via this short survey, the project will be sunset.
Thank you for your support!
___
Benefits for Trainers: Trainers can follow this plan to design a tailored diagram for their course, filled with notes.
They can construct this comprehensive diagram during class on a whiteboard and continuously add to it in each session. This evolving visual aid can be shared with students to enhance their grasp of the subject matter.

Explore Azure Course Blueprints! | Microsoft Community Hub
Visio stencils: Azure icons - Azure Architecture Center | Microsoft Learn
___
Are you curious how grounding Copilot in Azure Course Blueprints transforms your study journey into a smarter, more visual experience?
🧭 Clickable guides that transform modules into intuitive roadmaps
🌐 Dynamic visual maps revealing how Azure services connect
⚖️ Side-by-side comparisons that clarify roles, services, and security models
Whether you're a trainer, a student, or just certification-curious, Copilot becomes your shortcut to clarity, confidence, and mastery.
Navigating Azure Certifications with Copilot and Azure Course Blueprints | Microsoft Community Hub

Boosting Productivity with Ansys RedHawk-SC and Azure NetApp Files Intelligent Data Infrastructure
Discover how integrating Ansys Access with Azure NetApp Files (ANF) is revolutionizing cloud-based engineering simulations. This article reveals how organizations can harness enterprise-grade storage performance, seamless scalability, and simplified deployment to supercharge Ansys RedHawk-SC workloads on Microsoft Azure. Unlock faster simulations, robust data management, and cost-effective cloud strategies, empowering engineering teams to innovate without hardware limitations. Dive in to learn how intelligent data infrastructure is transforming simulation productivity in the cloud!

Building an Enterprise RAG Pipeline in Azure with NVIDIA AI Blueprint for RAG and Azure NetApp Files
Transform your enterprise-grade RAG pipeline with NVIDIA AI and Azure NetApp Files. This post highlights the challenges of scaling RAG solutions and introduces NVIDIA's AI Blueprint adapted for Azure. Discover how Azure NetApp Files boosts performance and handles dynamic demands, enabling robust and efficient RAG workloads.

Granting Azure Resources Access to SharePoint Online Sites Using Managed Identity
When integrating Azure resources like Logic Apps, Function Apps, or Azure VMs with SharePoint Online, you often need secure and granular access control. Rather than handling credentials manually, Managed Identity is the recommended approach to securely authenticate to Microsoft Graph and access SharePoint resources.

High-level steps:
- Step 1: Enable Managed Identity (or App Registration)
- Step 2: Grant Sites.Selected Permission in Microsoft Entra ID
- Step 3: Assign SharePoint Site-Level Permission

Step 1: Enable Managed Identity (or App Registration)
For your Azure resource (e.g., Logic App):
1. Navigate to the Azure portal.
2. Go to the resource (e.g., Logic App).
3. Under Identity, enable System-assigned Managed Identity.
4. Note the Object ID and Client ID (you'll need the Client ID later).
Alternatively, use an App Registration if you prefer a multi-tenant or reusable identity. See: How to register an app in Microsoft Entra ID - Microsoft identity platform | Microsoft Learn

Step 2: Grant Sites.Selected Permission in Microsoft Entra ID
1. Open Microsoft Entra ID > App registrations.
2. Select your Logic App's managed identity or app registration.
3. Under API permissions, click Add a permission > Microsoft Graph.
4. Select Application permissions and add: Sites.Selected
5. Click Grant admin consent.
Note: Sites.Selected ensures least-privilege access — you must explicitly allow site-level access later.

Step 3: Assign SharePoint Site-Level Permission
SharePoint Online requires site-level consent for apps with Sites.Selected. Use the script below to assign access.
Note: You must be a SharePoint Administrator and have the Sites.FullControl.All permission when running this.
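For reference, the site-level grant performed by the PowerShell script corresponds to a Microsoft Graph `POST /sites/{site-id}/permissions` request. Here is a minimal Python sketch that only constructs the composite site identifier and the request body — the tenant, site, and app values are placeholders, and no request is actually sent:

```python
import json

def graph_site_id(tenant_host: str, site_name: str) -> str:
    """Build the composite site ID Microsoft Graph expects,
    e.g. 'contoso.sharepoint.com:/sites/ProjectX:'."""
    return f"{tenant_host}:/sites/{site_name}:"

def site_permission_payload(client_id: str, display_name: str, role: str) -> dict:
    """Request body for POST /sites/{site-id}/permissions."""
    return {
        "roles": [role],  # "read" or "write"
        "grantedToIdentities": [
            {"application": {"id": client_id, "displayName": display_name}}
        ],
    }

site_id = graph_site_id("contoso.sharepoint.com", "ProjectX")
payload = site_permission_payload(
    "00000000-0000-0000-0000-000000000000",  # placeholder client ID
    "my-logic-app",
    "write",
)
print(site_id)
print(json.dumps(payload, indent=2))
```

The PowerShell script below does exactly this via the Microsoft Graph PowerShell SDK, so you do not need to hand-build the request yourself.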
PowerShell Script:

# Replace with your values
$application = @{
    id          = "{ApplicationID}"   # Client ID of the Managed Identity
    displayName = "{DisplayName}"     # Display name (optional but recommended)
}
$appRole   = "write"                  # Can be "read" or "write"
$spoTenant = "contoso.sharepoint.com" # SharePoint site host
$spoSite   = "{Sitename}"             # SharePoint site name

# Site ID format for Graph API
$spoSiteId = $spoTenant + ":/sites/" + $spoSite + ":"

# Load Microsoft Graph module
Import-Module Microsoft.Graph.Sites

# Connect with appropriate permissions
Connect-MgGraph -Scope Sites.FullControl.All

# Grant site-level permission (GrantedToIdentities expects an array)
New-MgSitePermission -SiteId $spoSiteId -Roles @($appRole) -GrantedToIdentities @(
    @{ Application = $application }
)

That's it! Your Logic App or Azure resource can now call Microsoft Graph APIs to interact with that specific SharePoint site (e.g., list files, upload documents). You maintain centralized control and least-privilege access, complying with enterprise security standards. By following this approach, you ensure secure, auditable, and scalable access from Azure services to SharePoint Online — no secrets, no user credentials, just managed identity done right.

Azure NetApp Files solutions for three EDA Cloud-Compute scenarios
Table of Contents
- Abstract
- Introduction
- EDA Cloud-Compute scenarios
  - Scenario 1: Burst to Azure from on-premises Data Center
  - Scenario 2: "24x7 Single Set Workload"
  - Scenario 3: "Data Center Supplement"
- Summary

Abstract
Azure NetApp Files (ANF) is transforming Electronic Design Automation (EDA) workflows in the cloud by delivering unparalleled performance, scalability, and efficiency. This blog explores how ANF addresses critical challenges in three cloud compute scenarios: Cloud Bursting, 24x7 All-in-Cloud, and Cloud-based Data Center Supplement. These solutions are tailored to optimize EDA processes, which rely on high-performance NFS file systems to design advanced semiconductor products. With the ability to support clusters exceeding 50,000 cores, ANF enhances productivity, shortens design cycles, and eliminates infrastructure concerns, making it the default choice for EDA workloads in Azure. Additionally, innovations such as increased L3 cache and the transition to DDR5 memory enable performance boosts of up to 60%, further accelerating the pace of chip design and innovation.

Co-authors: Andy Chan, Principal Product Manager, Azure NetApp Files; Arnt de Gier, Technical Marketing Engineer, Azure NetApp Files

Introduction
Azure NetApp Files (ANF) solutions support three major cloud compute scenarios for running Electronic Design Automation (EDA) in Azure:
1. Cloud Bursting
2. 24x7 All-in-Cloud
3. Cloud-based Data Center Supplement
ANF solutions can address the key challenges associated with each scenario. By providing an optimized solution stack for EDA engineers, ANF increases productivity and shortens design cycles, making ANF the de facto standard file system for running EDA workloads in Azure.

Electronic Design Automation (EDA) processes comprise a suite of software tools and workflows used to design semiconductor products such as advanced computer processors (chips), all of which need high-performance NFS file system solutions.
The increasing demand for chips with superior performance, reduced size, and lower power consumption (PPA) is driven by today's rapid pace of innovation to power workloads such as AI. To meet this growing demand, EDA tools require numerous nodes and multiple CPUs (cores) in a cluster. This is where Azure NetApp Files (ANF) comes into play with its high-performance, scalable file system. ANF ensures that data is efficiently delivered to these compute nodes. This means a single cluster—sometimes encompassing more than 50,000 cores—can function as a unified entity, providing both the scale-out performance and consistency essential for designing advanced semiconductor products. ANF is the most performance-optimized NFS storage in Azure, making it the de facto solution for EDA workloads.

According to Philip Steinke, AMD's Fellow of CAD Infrastructure and Physical Design, the main priority is to maximize the productivity of chip designers by eliminating the infrastructure concerns related to compute and file system expansion typically experienced with on-premises deployments, which require long planning cycles and significant capital expenditure.

In register-transfer level (RTL) simulations, Microsoft Azure showed that moving to a CPU with a greater amount of L3 cache can give EDA users a performance boost of up to 60% for their workloads. This improvement is attributed to increased L3 cache, higher clock speeds and instructions per cycle, and the transition from DDR4 to DDR5 memory. Azure's commitment to providing high-performing, on-demand HPC (High-Performance Computing) infrastructure is a well-known advantage and has become the primary reason EDA companies are increasingly adopting Azure for their chip design needs.

In this paper, three different scenarios of Azure for EDA are explored, namely "Cloud Bursting", "24x7 Single Set Workload", and "Data Center Supplement", as a reference framework to help guide engineers' Azure for EDA journey.
EDA Cloud-Compute scenarios
The following sections delve into three key scenarios that address the computational needs of EDA workflows: "Cloud Bursting," "24x7 Single Set Workload," and "Data Center Supplement." Each scenario highlights how Azure's robust infrastructure, combined with high-performance solutions like Azure NetApp Files, enables engineering teams to overcome traditional limitations, streamline chip design processes, and significantly enhance productivity.

Scenario 1: Burst to Azure from on-premises Data Center
An EDA workload is made up of a series of workflows in which certain steps are bursty, which can lead to periods in semiconductor project cycles where compute demand exceeds the on-premises HPC server cluster capacity. Many EDA customers have been bursting to Azure to speed up their engineering projects. In one example, a total of 120,000 cores were deployed across many clusters, all well supported by the high-performance capabilities of ANF.

As design projects approach completion, the design is continuously and incrementally modified to fix bugs, address synthesis and timing issues, optimize area, timing, and power, and resolve issues associated with manufacturing design rule checks. When design changes are made, many if not all of the design steps must be re-run to ensure the change did not break the design. As a result, "design spins" or "large regression" jobs put a large compute demand on the HPC server cluster. This leads to long job scheduler queues (IBM LSF and Univa Grid Engine are two common schedulers for EDA) where jobs wait to be dispatched to an available compute node. Competing project schedules are another reason HPC server cluster demands can exceed on-premises fixed capacity. Most engineering divisions within a company share infrastructure resources across teams and projects, which inevitably leads to oversubscription of compute capacity and long job queues, resulting in production delays.
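The burst decision described above comes down to simple queue arithmetic: when pending jobs exceed what the fixed on-premises cluster can absorb, the overflow is dispatched to cloud nodes. A toy sketch of that sizing logic — the core counts and job sizes are made-up numbers, not a real scheduler policy:

```python
def plan_burst(pending_job_cores, onprem_free_cores):
    """Split a queue of jobs (each needing some cores) between
    on-prem capacity and a cloud burst, first-come first-served."""
    onprem, cloud = [], []
    for cores in pending_job_cores:
        if cores <= onprem_free_cores:
            onprem.append(cores)
            onprem_free_cores -= cores
        else:
            cloud.append(cores)  # overflow bursts to Azure
    return onprem, cloud, sum(cloud)

# Example: a regression run queued against 10,000 free on-prem cores.
jobs = [4000, 3000, 5000, 2000, 6000]
onprem, cloud, burst_cores = plan_burst(jobs, 10_000)
print("on-prem jobs:", onprem)
print("burst jobs:", cloud, "-> cloud cores needed:", burst_cores)
```

Real schedulers such as LSF or Grid Engine apply far richer policies (priorities, licenses, affinity), but the capacity-overflow idea is the same.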
Bursting EDA jobs into Azure, with its available compute capacity, is a way to alleviate these delays. For example, Azure's latest CPU offering can deliver up to 47% shorter turnaround times for RTL simulation than on-premises. Engineering management tries to increase productivity through effective use of their EDA tool licensing. Utilizing Azure's on-demand compute resources and high-performance storage solutions like Azure NetApp Files enables engineering teams to accelerate design cycles and reduce Non-recurring Engineering (NRE) costs, enhancing productivity significantly.

For "burst to Azure" scenarios that allow engineers quick access to compute resources to finish a job without worrying about the underlying NFS infrastructure and traditional complex management overhead, ANF delivers:
- High Performance: up to 826,000 IOPS per large volume, serving the data for the most demanding simulations with ease to reduce turnaround time.
- Scalability: As EDA projects advance, the data generated can grow exponentially. ANF provides large-capacity single namespaces with volumes up to 2 PiB, enabling your storage solution to scale seamlessly while supporting compute clusters with more than 50,000 cores.
- Ease of Use: ANF is designed for simplicity, with a SaaS-like user experience, allowing deployment and management with a few clicks or API automation. Since storage can be deployed rapidly, engineering teams get access to their EDA HPC hardware quickly for their jobs.
- Cost-Effectiveness: ANF offers cool access, which transparently moves 'cold' data blocks to lower-cost Azure Storage. Additionally, Reserved Capacity (RC) can provide significant cost savings compared to pay-as-you-go pricing, further reducing the high upfront CapEx costs and long procurement cycles associated with on-premises storage solutions. Use the ANF effective pricing estimator to estimate your savings.
- Reliability and Security: ANF provides enterprise-grade data management and security features, with key management and encryption built in, ensuring that your critical EDA data is protected and available when you need it.

Scenario 2: "24x7 Single Set Workload"
As Azure for EDA has matured over time and the value of providing engineers with available, faster HPC infrastructure has become more widely shared, more users are now moving entire sets of workloads into Azure that run 24x7. In addition to SPICE or RTL simulations, one such set of workloads is "digital signoff", with the same goal of increasing productivity. Scenario 1 concerns cloud bursting, which involves batch processes with high performance and rapid deployment, whereas Scenario 2 involves operating a set of workloads with additional ANF capabilities for data security and user control needs.
- QoS support: ANF's QoS function fine-tunes storage utilization by establishing a direct correlation between volume size (quota) and performance, which sets the storage performance limit an EDA tool or workload may have access to.
- Snapshot data protection: As more users use Azure resources, data protection is crucial. ANF snapshots protect primary data often and efficiently for fast recovery from corruption or loss, by restoring a volume to a snapshot in seconds or by restoring individual files from a snapshot. Enabling snapshots is recommended for user home directories and group shares for this reason as well.
- Large volume support: A set of workloads generates greater output than a single workload, and as such ANF's large volume support is a feature that is being widely adopted by EDA users in this scenario. ANF now supports single volumes up to 2 PiB in size, allowing more fine-tuned management of a user's storage footprint.
- Cool access: Cool access is an ANF feature that enables better cost control, because only data that is being worked on at any given time remains in the hot tier.
This functionality enables inactive data blocks from the volume and volume snapshots to be transferred from the hot tier to an Azure storage account (the cool tier), saving cost. Because EDA workloads are known to be metadata-heavy, ANF does not relocate metadata to the cool tier, ensuring that metadata operations perform as expected.
- Dynamic capacity pool resizing: Cloud compute resources can be dynamically allocated. To support this deployment model, Azure NetApp Files (ANF) also offers dynamic pool resizing, which further enhances Azure-for-EDA's value proposition. If the size of the pool remains constant but performance requirements fluctuate, dynamic provisioning and deprovisioning of capacity pools of different types provides just-in-time performance. This approach lowers costs during periods when high performance is not needed.
- Reserved Capacity: Azure allows compute resources to be reserved as a way to guarantee access to that capacity while receiving significant cost savings compared to the standard "pay-as-you-go" pricing model. This offering is also available for ANF: a reservation in 100-TiB and 1-PiB units per month, for a one- or three-year term, for a particular service level within a region.

Scenario 3: "Data Center Supplement"
This scenario builds on Scenarios 1 and 2, with Scenario 3 involving EDA users expanding their workflow into Azure as their data center. In this scenario, a mixed EDA flow is hosted, with tools from several EDA ISVs spanning frontend, backend, and analog mixed-signal being deployed. EDA companies such as d-Matrix were able to design an entire AI chip in Azure, an example of Scenario 3. In this data center supplement scenario, data mobility and additional data life cycle management solutions are essential.
Once again, Azure NetApp Files (ANF) rises to the challenge by offering additional features within its solution stack:
- Backup support: ANF has a policy-based backup feature that uses AES-256-bit encryption during the encoding of the received backup data. Backup frequency is defined by a policy.
- Cross-region replication: ANF data can be replicated asynchronously between Azure NetApp Files volumes (source and destination) with cross-region replication. The source and destination volumes must be deployed in different Azure regions. The service level for the destination capacity pool may be the same or different, allowing customers to fine-tune their data protection demands as efficiently as possible.
- Cross-zone replication: Similar to the Azure NetApp Files cross-region replication feature, the cross-zone replication (CZR) capability provides data protection between volumes in different availability zones. You can asynchronously replicate data from an Azure NetApp Files volume (source) in one availability zone to another Azure NetApp Files volume (destination) in another availability zone. This capability enables you to fail over your critical application if a zone-wide outage or disaster happens.
- BC/DR: Users can construct their own solution based on their own goals by using a variety of BC/DR templates that include snapshots, various replication types, failover capabilities, backup, and support for REST API, Azure CLI, and Terraform.

Summary
The integration of ANF into the EDA workflow addresses the limitations of traditional on-premises infrastructure. By leveraging the latest CPU generations and Azure's on-demand HPC infrastructure, EDA users can achieve significant performance gains and improve productivity, all while being connected by the most optimized, performant file system that is simple to deploy and support.
The three Azure for EDA scenarios—Cloud Bursting, 24x7 Single Set Workload, and Data Center Supplement—showcase Azure's adaptability and effectiveness in fulfilling the changing needs of the semiconductor industry. As a result, ANF has become the default NFS solution for EDA in Azure, allowing businesses to innovate even faster.
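To make Scenario 2's QoS sizing concrete: under auto QoS, a volume's throughput scales directly with its quota at the pool's service level. The sketch below only illustrates that proportional arithmetic; the per-TiB figures (Standard 16 MiB/s, Premium 64 MiB/s, Ultra 128 MiB/s per TiB) reflect the documented ANF auto-QoS baselines, and the pool and volume sizes are made-up examples:

```python
# Documented ANF auto-QoS throughput per provisioned TiB, by service level.
SERVICE_LEVEL_MIBPS_PER_TIB = {"Standard": 16, "Premium": 64, "Ultra": 128}

def pool_throughput(service_level: str, pool_tib: float) -> float:
    """Total throughput (MiB/s) a capacity pool provides at a service level."""
    return SERVICE_LEVEL_MIBPS_PER_TIB[service_level] * pool_tib

def auto_qos_volume_throughput(service_level: str, volume_quota_tib: float) -> float:
    """Under auto QoS, a volume's throughput follows its quota directly."""
    return SERVICE_LEVEL_MIBPS_PER_TIB[service_level] * volume_quota_tib

# Example: a 10 TiB Premium pool, with a 4 TiB volume for scratch data.
print(pool_throughput("Premium", 10))            # pool-wide MiB/s
print(auto_qos_volume_throughput("Premium", 4))  # volume MiB/s under auto QoS
```

With manual QoS, the same pool-level total can instead be distributed across volumes independently of their quotas, which is how the "storage performance limit" per EDA tool described in Scenario 2 is enforced.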