Cost Management
Understand New Sentinel Pricing Model with Sentinel Data Lake Tier
Introduction to Sentinel and its New Pricing Model

Microsoft Sentinel is a cloud-native Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) platform that collects, analyzes, and correlates security data from across your environment to detect threats and automate response. Traditionally, Sentinel stored all ingested data in the Analytics tier (Log Analytics workspace), which is powerful but expensive for high-volume logs. To reduce cost and enable customers to retain all security data without compromise, Microsoft introduced a new dual-tier pricing model consisting of the Analytics tier and the Data Lake tier. The Analytics tier continues to support fast, real-time querying and analytics for core security scenarios, while the new Data Lake tier provides very low-cost storage for long-term retention and high-volume datasets. Customers can now choose where each data type lands: the Analytics tier for high-value detections and investigations, and the Data Lake tier for large or archival types. This allows organizations to significantly lower cost while still retaining all their security data for analytics, compliance, and hunting.

The flow diagram below depicts the new Sentinel pricing model. Let's understand the model with the following scenarios:

- Scenario 1A (Pay-As-You-Go)
- Scenario 1B (Usage Commitment)
- Scenario 2 (Data Lake Tier Only)

Scenario 1A (Pay-As-You-Go)

Requirement
Suppose you need to ingest 10 GB of data per day and must retain that data for 2 years, but you will only frequently use, query, and analyze the data for the first 6 months.

Solution
To optimize cost, ingest the data into the Analytics tier and retain it there for the first 6 months, where active querying and investigation happen. After that period, the remaining 18 months of retention can be shifted to the Data Lake tier, which provides low-cost storage for compliance and auditing needs.
However, you will be charged separately for Data Lake tier querying and analytics, which is depicted as Compute (D) in the pricing flow diagram.

Pricing Flow / Notes

- The first 10 GB/day ingested into the Analytics tier is free for 31 days under the Analytics logs plan.
- All data ingested into the Analytics tier is automatically mirrored to the Data Lake tier at no additional ingestion or retention cost.
- For the first 6 months, you pay only for Analytics tier ingestion and retention, excluding any free capacity.
- For the next 18 months, you pay only for Data Lake tier retention, which is significantly cheaper.

Azure Pricing Calculator Equivalent

Assuming no data is queried or analyzed during the 18-month Data Lake tier retention period: although the Analytics tier retention is set to 6 months, the first 3 months fall under the free retention limit, so retention charges apply only to the remaining 3 months of the analytics retention window. The Azure pricing calculator adjusts accordingly.

Scenario 1B (Usage Commitment)

Now suppose you are ingesting 100 GB per day. Under the same pay-as-you-go pricing model described above, your estimated cost would be approximately $15,204 per month. You can reduce this cost by choosing a Commitment Tier, where Analytics tier ingestion is billed at a discounted rate. Note that the discount applies only to Analytics tier ingestion; it does not apply to Analytics tier retention costs or to any Data Lake tier charges. Please refer to the pricing flow and the equivalent pricing calculator results shown below.

Monthly cost savings: $15,204 – $11,184 = $4,020 per month

Now the question is: what happens if your usage reaches 150 GB per day? Will the additional 50 GB be billed at the pay-as-you-go rate? No. The entire 150 GB/day will still be billed at the discounted rate associated with the 100 GB/day commitment tier.
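The Scenario 1B figures can be sanity-checked with a few lines of code. The monthly totals ($15,204 pay-as-you-go versus $11,184 on the 100 GB/day commitment tier) come from the article; the helper below is only an illustrative sketch that reproduces the subtraction and the implied discount, not Azure's actual rating logic.

```python
# Sanity check of the Scenario 1B figures quoted above. The dollar amounts
# are the article's; this helper only reproduces the subtraction and the
# implied discount percentage, not Azure's billing engine.
PAYG_MONTHLY = 15_204        # 100 GB/day at pay-as-you-go rates
COMMITMENT_MONTHLY = 11_184  # 100 GB/day commitment tier

def monthly_savings(payg: float, committed: float) -> float:
    """Monthly saving from moving pay-as-you-go spend to a commitment tier."""
    return payg - committed

savings = monthly_savings(PAYG_MONTHLY, COMMITMENT_MONTHLY)
discount_pct = round(100 * savings / PAYG_MONTHLY, 1)
print(savings, discount_pct)  # 4020 26.4
```

The same subtraction applies to any pair of pay-as-you-go and commitment-tier estimates taken from the Azure pricing calculator.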
Azure Pricing Calculator Equivalent (100 GB/day)

Azure Pricing Calculator Equivalent (150 GB/day)

Scenario 2 (Data Lake Tier Only)

Requirement
Suppose you need to store certain audit or compliance logs amounting to 10 GB per day. These logs are not used for querying, analytics, or investigations on a regular basis, but must be retained for 2 years per your organization's compliance or forensic policies.

Solution
Since these logs are not actively analyzed, avoid ingesting them into the Analytics tier, which is more expensive and optimized for active querying. Instead, send them directly to the Data Lake tier, where they can be retained cost-effectively for future audit, compliance, or forensic needs.

Pricing Flow
Because the data is ingested directly into the Data Lake tier, you pay both ingestion and retention costs there for the entire 2-year period. If, at any point in the future, you need to perform advanced analytics, querying, or search, you will incur additional compute charges based on actual usage. Even with occasional compute charges, the cost remains significantly lower than storing the same data in the Analytics tier.

Realized Savings

Scenario                                             Cost per Month
Scenario 1: 10 GB/day in Analytics tier              $1,520.40
Scenario 2: 10 GB/day directly into Data Lake tier   $202.20 (without compute) / $257.20 (with sample compute price)

Savings with no compute activity: $1,520.40 – $202.20 = $1,318.20 per month
Savings with some compute activity (sample value): $1,520.40 – $257.20 = $1,263.20 per month

Azure calculator equivalent without compute

Azure calculator equivalent with sample compute

Conclusion

The combination of the Analytics tier and the Data Lake tier in Microsoft Sentinel enables organizations to optimize cost based on how their security data is used.
High-value logs that require frequent querying, real-time analytics, and investigation can be stored in the Analytics tier, which provides powerful search performance and built-in detection capabilities. At the same time, large-volume or infrequently accessed logs, such as audit, compliance, or long-term retention data, can be directed to the Data Lake tier, which offers dramatically lower storage and ingestion costs. Because all Analytics tier data is automatically mirrored to the Data Lake tier at no extra cost, customers can use the Analytics tier only for the period they actively query data and rely on the Data Lake tier for the remaining retention. This tiered model allows different scenarios (active investigation, archival storage, compliance retention, or large-scale telemetry ingestion) to be handled at the most cost-effective layer, ultimately delivering substantial savings without sacrificing visibility, retention, or future analytical capabilities.

Streamline Analytics Spend on Microsoft Fabric with Azure Reservations
Introduction

As organizations accelerate their cloud adoption, optimizing spend while maintaining performance and scalability is a top priority. Microsoft Fabric, our all-in-one, software-as-a-service data platform, is designed to help teams accomplish any data project. But how can you ensure your investment delivers maximum value? In this post, we'll explore how Azure reservations for Microsoft Fabric can help you optimize costs, simplify purchasing, and streamline your data analytics spend.

What Is Microsoft Fabric?

Microsoft Fabric is an end-to-end data platform that brings together data orchestration, transformation, machine learning, real-time event processing, reporting, and databases in a single SaaS experience. It's designed for data engineers, scientists, analysts, and business users, offering role-specific workloads and integrated AI experiences, all backed by a unified data lake, OneLake. Unlike traditional services that require managing different pricing and capacities for each workload, Fabric simplifies this with a single capacity model. You purchase Fabric Capacity Units (CUs), which power all workloads, and jobs consume CUs at different rates depending on your compute needs.

How Azure Reservations Work with Fabric

An Azure reservation is a pricing model that helps you save when you commit to a specific resource, region, and term, making it ideal for stable, predictable workloads. Reservations don't affect capacity or runtime; they simply provide discount benefits. Azure analyzes your usage and recommends reservation options, and tools like Azure Advisor guide you toward the right purchase. You may be familiar with how reservations work with virtual machines, but Azure reservations can also apply to other services such as Microsoft Fabric. When purchasing Fabric, you can choose a pay-as-you-go model for flexible usage or save substantially with Azure reservations for consistent workloads.
For example, if you deploy an F64 SKU in Fabric for ongoing reporting and analytics, buying an Azure reservation for this SKU ensures you pay the discounted rate for your consistent usage.

How to Purchase a Reservation for Microsoft Fabric

Let's look at how you can purchase a reservation for Microsoft Fabric. Follow the steps below or watch this video:

1. Start in the Microsoft Marketplace or Azure Portal: Visit the Microsoft Marketplace and look up Microsoft Fabric. Click "Get it now" and it will take you to the Azure Portal.
2. Create Fabric Capacity:
   - Select Configuration: In the Azure Portal, on the Create Fabric Capacity page, select your subscription and resource group, name your capacity, and select your region.
   - Select Size: Use the Fabric SKU estimator to determine the right capacity for your workloads, or start with a free trial and monitor usage with the capacity metrics app. You can always upgrade or create multiple capacities as needed.
   - Organize with Tags: Use Azure tags to track costs and automate management across environments.
3. Buy Fabric Reservations: In the Azure portal, search for and select "Reservations." On the Reservations page, click "Add" and select Microsoft Fabric. Choose your scope and subscription, select Fabric Capacity with upfront or monthly payments, and adjust the quantity as needed.

Best Practices for Maximizing Savings

To get the most value from your reservations, follow these best practices:

- Estimate Carefully: Avoid over-committing (which leads to wasted resources) or under-committing (which leads to higher costs). Understand your resource needs and usage, and use Azure Advisor recommendations to make informed decisions.
- Enable Auto-Renew: Ensure your reservations automatically renew so you don't lose the discount, but adjust your reservation needs as workloads change.
- Monitor Usage: Use Azure Cost Management to continuously track reservation usage.
- Choose the Right Scope: Align reservation benefits with your organizational structure to maximize savings.

Conclusion

Microsoft Fabric and Azure reservations empower organizations to streamline analytics spend, simplify purchasing, and unlock significant savings without sacrificing performance or scalability. Following best practices and leveraging the right tools ensures your cloud investments deliver maximum value. Get started today by visiting the Microsoft Marketplace and Azure Portal to purchase Fabric Capacity units and reservations, or read Save costs with Microsoft Fabric Capacity reservations - Microsoft Cost Management | Microsoft Learn to learn more.

Need to create monitoring queries to track the health status of data connectors
I'm working with Microsoft Sentinel and need to create monitoring queries to track the health status of data connectors. Specifically, I want to:

- Identify unhealthy or disconnected data connectors
- Determine when a data connector last lost connection
- Get historical connection status information

What I'm looking for:

- A KQL query that can be run in the Sentinel workspace to check connector status, OR a PowerShell script/command that can retrieve this information
- Ideally, something that can be automated for regular monitoring

I've been looking at the SentinelHealth table, but I'm unsure about the exact schema, connectors covered, etc. I've also been checking whether there are specific tables that track connector status changes, and looking at Azure Resource Graph and the management APIs. I've tried multiple approaches (KQL, PowerShell, Resource Graph), but I somehow cannot get the information I'm looking for. For example, I see this Microsoft docs page: https://learn.microsoft.com/en-us/azure/sentinel/monitor-data-connector-health#supported-data-connectors. However, I would like my query to surface data such as:

- Last ingestion time per table?
- How much data has been ingested by specific tables and connectors?
- Which connectors are currently connected?
- The health of my connectors?

Please help
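One possible starting point, sketched below: build the KQL as strings and submit them with the azure-monitor-query SDK. The SentinelHealth column names (SentinelResourceName, SentinelResourceType, Status, TimeGenerated) follow the documented schema, and the Usage table covers per-table ingestion volume, but verify both against your own workspace before automating anything; treat this as a sketch, not a validated answer.

```python
# Last reported health status per data connector, from SentinelHealth.
# Column names follow the documented schema; verify against your workspace.
CONNECTOR_HEALTH_KQL = """
SentinelHealth
| where TimeGenerated > ago(7d)
| where SentinelResourceType == "Data connector"
| summarize LastReported = arg_max(TimeGenerated, Status) by SentinelResourceName
"""

# Last ingestion time and volume per table, from the Usage table
# (Quantity is reported in MB).
TABLE_INGESTION_KQL = """
Usage
| where TimeGenerated > ago(7d)
| summarize LastIngestion = max(TimeGenerated), IngestedGB = sum(Quantity) / 1024 by DataType
| order by LastIngestion desc
"""

# To run these from a script (requires azure-identity and azure-monitor-query):
# from datetime import timedelta
# from azure.identity import DefaultAzureCredential
# from azure.monitor.query import LogsQueryClient
#
# client = LogsQueryClient(DefaultAzureCredential())
# result = client.query_workspace("<workspace-id>", CONNECTOR_HEALTH_KQL,
#                                 timespan=timedelta(days=7))
```

Note that SentinelHealth is only populated for connectors that support health monitoring (see the "supported data connectors" page you linked), so tables that receive data from unsupported connectors will only show up in the Usage query.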
Smarter Cloud, Smarter Spend: How Azure Powers Cost-Efficient Innovation

In today's dynamic business environment, organizations are under pressure to innovate rapidly while managing costs effectively. The cloud, especially with AI at the forefront, is driving transformation across industries. But with rising cloud spend and economic uncertainty, cost efficiency has become a top priority. To help customers navigate this challenge, Microsoft commissioned Forrester Consulting to conduct a Total Economic Impact™ (TEI) study. The study reveals how organizations can unlock significant financial and operational benefits by leveraging Microsoft Azure's tools and strategic pricing offers, especially when migrating to the cloud and adopting AI.

Cloud Migration: Accelerating Modernization with Azure

Migrating to the cloud is a foundational step for digital transformation, but many organizations initially struggled with cost overruns, lack of visibility, and inefficient resource usage. The TEI study shows how Azure's native tools and strategic pricing offers helped overcome these challenges.

Key Enablers of Cost-Efficient Migration:

- Azure Hybrid Benefit: Allows organizations to optimize savings in their migration journey by offering a discount on server licenses and subscriptions and granting hosting and outsourcing benefits.
- Azure reservations: A commitment model that enables customers to save when they commit to a specific resource, in a region, for a specific term.
- Azure savings plan for compute: A flexible pricing model that enables customers to save when they commit to spend a fixed hourly amount for a specific term.
- Microsoft Cost Management & Azure Advisor: Provide real-time insights and optimization recommendations.
- Azure Pricing Calculator: Accurately estimate migration costs and forecast budgets.

"Azure reservations and Azure Hybrid Benefit facilitate moving to the cloud. With these offers and better cost management, we are saving 30% to 35%.
Managing our costs without these tools would be an unpredictable nightmare."
— Senior IT Director, Manufacturing

Quantified Impact*:

- 25% reduction in cloud spending in Year 1 using Microsoft tools.
- $4.9M in direct savings over three years from tool-based optimization.
- $3.8M in additional savings from strategic pricing offers.

AI Adoption: Driving Innovation with Cost-Efficient Azure Solutions

Once migrated, organizations are increasingly turning to AI to drive innovation and competitive advantage. Azure doesn't just support AI workloads; it makes them more cost-effective and scalable.

Key Enablers of AI Adoption:

- Azure AI Foundry Provisioned Throughput reservations: Get cost savings compared to provisioned throughput (PTU) hourly pricing.
- Microsoft Cost Management & Azure Advisor: Help forecast and optimize AI-related cloud spend.
- Strategic Pricing Offers: Enable predictable budgeting and reinvestment into AI initiatives.

"Leveraging Microsoft tools and strategic pricing offers is not only increasing profitability but also putting those savings into more interesting projects like AI with fraud prevention work and the customer experience."
— VP of Analytics Engineering, Financial Services

"Azure reservations for AI projects using PTUs has helped our AI initiatives. It has helped better predict costs."
— CIO, Healthcare

Unquantified Benefits:

- Increased cloud insights and visibility.
- Improved governance and accountability.
- Enhanced productivity (24% by Year 3).

The Bottom Line: Unified Transformation with Azure

The TEI study concludes that Azure delivers a unified approach to cloud computing, enabling organizations to:

- Migrate to the cloud efficiently with cost control
- Adopt AI rapidly by reinvesting savings into innovation
- Achieve a net present value (NPV) of $8.7 million over three years
- Realize a 35% reduction in cloud spending in Year 1

"With Microsoft tools, we can focus on higher value projects.
Instead of reviewing reports and conducting cost audits, we can build new tools and build with genAI. We can innovate faster."
— Vice President of Analytics Engineering, Financial Services

Ready to Dive Deeper?

This blog is just the beginning. The full Forrester TEI study is packed with insights, customer stories, and financial modeling to help you build your business case for Azure. Read the full study here.

*To understand benefits, costs, and risks, Forrester interviewed eight decision-makers with experience using the evaluated Azure solutions. For the purposes of this study, Forrester aggregated the results from these decision-makers into a single composite organization.

Unlock Savings with Copilot Credit Pre-Purchase Plan
Introduction

As organizations scale their use of Microsoft Copilot Studio, from building custom agents to integrating with Dynamics 365, cost predictability and optimization become critical. To help you plan confidently and save more, we're introducing the Copilot Credit Pre-Purchase Plan (P3)*, a simple, one-year plan that delivers volume discounts and automatic billing, so your team can focus on outcomes, not invoices.

What is the Copilot Credit Pre-Purchase Plan?

P3 is a one-year, pay-up-front option for Copilot Credits. You purchase Copilot Credit Commit Units (CCCUs) and your usage automatically draws from this pool. Higher tiers unlock progressive discounts, enabling more growth. See pricing here.

How it works

- You pre-purchase a pool of Copilot Credit Commit Units (CCCUs) for one year.
- Every time you use Copilot Credits for Microsoft Copilot Studio, Dynamics 365 first-party agents, or Copilot Chat*, the CCCUs are automatically drawn down from your P3 balance.
- If you use up your balance before the year ends, you can add another P3 plan or switch to pay-as-you-go.
- If you don't use all your credits by the end of the year, the remaining balance expires.

Example: A retail company runs 15 custom Copilot Studio agents to handle inventory checks, store operations, and customer service. They expect seasonal spikes (holiday campaigns, back-to-school, and clearance events) but want predictable costs.

Without P3: On the pay-as-you-go model, their total cost can vary and spike as usage goes up and down. During peak months, usage surges, and so does the bill, making budgeting tough.

With P3: Based on forecasting, they expect to use 1,500,000 Copilot Credits over 12 months. Because they know how many Copilot Credits they plan to consume, the retail company buys P3:

- Purchase P3 Tier 2: 15,000 CCCUs (this covers 1,500,000 Copilot Credits)
- P3 upfront cost: $14,100
- P3 discount: 6% vs PAYG

Now, every time their agents run, CCCUs are automatically deducted from the P3 balance.
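As a quick sanity check on the retail example, the pay-as-you-go equivalent of the Tier 2 purchase can be backed out from the quoted 6% discount. The figures are the article's; the calculation is ours and assumes the discount applies uniformly to the whole purchase.

```python
# Back out the PAYG-equivalent cost of the Tier 2 example: $14,100 up front
# at a 6% discount versus pay-as-you-go (assuming a uniform discount).
UPFRONT_COST = 14_100   # P3 Tier 2, 15,000 CCCUs
P3_DISCOUNT = 0.06      # 6% vs PAYG

payg_equivalent = UPFRONT_COST / (1 - P3_DISCOUNT)
annual_savings = payg_equivalent - UPFRONT_COST
print(round(payg_equivalent), round(annual_savings))  # 15000 900
```

So the same 1,500,000 credits would cost roughly $15,000 at pay-as-you-go rates, a saving of about $900 for the year, provided the balance is actually consumed before it expires.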
P3 Tiers: Pricing as of October 2025, subject to change. See pricing here.

Key Benefits

- Cost savings: Up to 20% discount at the highest tier**.
- Budget predictability: One-year term with upfront payment and no surprise bills.
- Flexibility: Add more P3 anytime; works alongside capacity packs and PAYG.
- No redeployment: Applies automatically to eligible usage.
- Defined scope: Decide where the P3 applies based on your business needs. It can be applied at the resource group, subscription, management group, or shared level.

How to Purchase the Copilot Credit Pre-Purchase Plan

1. Sign in to the Azure portal → Reservations → + Add → Copilot Credit Pre-Purchase Plan.
2. Select your subscription and scope.
3. Choose your tier and complete payment.
4. Link your subscription to Copilot Studio in the Power Platform Admin Center.

Best Practices

- Estimate accurately: Use historical usage or the Copilot consumption estimator.
- Deploy first: Ensure environments are PAYG-enabled before buying.
- Monitor utilization: Set alerts to avoid unexpected PAYG charges.
- Plan for renewal: Auto-renew is on by default; adjust as needed.

Conclusion

If you're scaling fast or managing multiple environments, P3 gives you predictable costs and meaningful savings without sacrificing flexibility. It's ideal for customers with variable or growing usage who want to optimize spend and simplify billing. The Copilot Credit Pre-Purchase Plan makes it easier to innovate without worrying about unpredictable costs. By committing upfront, you unlock discounts, streamline billing, and gain confidence in your budget. Ready to start? Visit the Azure portal to purchase your plan today or read the Copilot Credit Pre-Purchase Plan documentation to learn more.

Resources:

- Read how to confidently scale AI agent deployments with Copilot Studio
- Learn more about Microsoft Copilot Studio
- See the Microsoft Copilot Studio Licensing Guide

*Copilot Credit covers Microsoft Copilot Studio, Dynamics 365 first-party agents, and Copilot Chat.
Microsoft reserves the right to update the products eligible for Copilot Credits.

**The actual realized cost per Copilot Credit for P3 and the MCS license will depend on the utilization rate. As such, customers should take their expected usage pattern into account, as P3 Commit Units expire annually while Copilot Credit capacity packs expire monthly.

Azure VMware Solution (AVS) Cost Optimization Using the Azure Migrate Tool
What is AVS?

Azure VMware Solution provides private clouds that contain VMware vSphere clusters built from dedicated bare-metal Azure infrastructure. Azure VMware Solution is available in Azure Commercial and Azure Government. The minimum initial deployment is three hosts, with the option to add more hosts, up to a maximum of 16 hosts per cluster. All provisioned private clouds have VMware vCenter Server, VMware vSAN, VMware vSphere, and VMware NSX. As a result, you can migrate workloads from your on-premises environments, deploy new virtual machines (VMs), and consume Azure services from your private clouds.

Learn more: https://learn.microsoft.com/en-us/azure/azure-vmware/introduction

What is the Azure Migrate Tool?

Azure Migrate is a comprehensive service designed to help you plan and execute your migration to Azure. It provides a unified platform to discover, assess, and migrate your on-premises resources, including servers, databases, web apps, and virtual desktops, to Azure. The tool offers features like dependency analysis, cost estimation, and readiness assessments to ensure a smooth and efficient migration process.

Learn more: https://learn.microsoft.com/en-us/azure/migrate/migrate-services-overview

How can Azure Migrate discover and assess AVS?

Azure Migrate enables the discovery and assessment of Azure VMware Solution (AVS) environments by collecting inventory and performance data from on-premises VMware environments, either through direct integration with vCenter (via an appliance) or by importing data from tools like RVTools. Using Azure Migrate, organizations can analyze the compatibility of their VMware workloads for migration to AVS, assess costs, and evaluate performance requirements. The process involves creating an Azure Migrate project, discovering VMware VMs, and generating assessments that provide insights into resource utilization, right-sizing recommendations, and estimated costs in AVS.
This streamlined approach helps plan and execute migrations effectively while ensuring workloads are optimized for the target AVS environment.

Note: This article walks through the RVTools import method.

What Is RVTools?

RVTools is a lightweight, free utility designed for VMware administrators to collect, analyze, and export detailed inventory and performance data from VMware vSphere environments. Developed by Rob de Veij, RVTools connects to vCenter or ESXi hosts using VMware's vSphere Management SDK to retrieve comprehensive information about the virtual infrastructure.

Key Features of RVTools:

- Inventory management: Provides detailed information about virtual machines (VMs), hosts, clusters, datastores, networks, and snapshots, including VM names, operating systems, IP addresses, resource allocations (CPU, memory, storage), and more.
- Performance insights: Offers visibility into resource utilization, including CPU and memory usage, disk space, and VM states (e.g., powered on/off).
- Snapshot analysis: Identifies unused or orphaned snapshots, helping to optimize storage and reduce overhead.
- Export to Excel: Allows users to export all collected data into an Excel spreadsheet (.xlsx) for analysis, reporting, and integration with tools like Azure Migrate.
- Health checks: Identifies configuration issues, such as disconnected hosts, orphaned VMs, or outdated VMware Tools versions.
- User-friendly interface: Displays information in tabular form across multiple tabs, making it easy to navigate and analyze specific components of the VMware environment.

Hands-on Lab

Disclaimer: The data used for this lab has no relationship with real-world scenarios. This sample data was created by the author purely for understanding the concept.
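To give a feel for the kind of data an RVTools export contains, here is a small sketch that aggregates a few vInfo-style rows. The "Powerstate", "CPUs", and "Memory" column names mirror the RVTools vInfo tab, but the records themselves are invented for this sketch; in practice you would read the .xlsx export (for example with openpyxl or pandas) rather than hard-code rows.

```python
# Illustrative aggregation over RVTools-style vInfo rows. The records are
# made up; real analysis would load them from the exported .xlsx file.
vinfo = [
    {"VM": "app01", "Powerstate": "poweredOn",  "CPUs": 4, "Memory": 16_384},
    {"VM": "db01",  "Powerstate": "poweredOn",  "CPUs": 8, "Memory": 32_768},
    {"VM": "old01", "Powerstate": "poweredOff", "CPUs": 2, "Memory": 4_096},
]

# Only powered-on VMs drive the AVS sizing; Memory is in MB in RVTools.
powered_on = [vm for vm in vinfo if vm["Powerstate"] == "poweredOn"]
total_vcpu = sum(vm["CPUs"] for vm in powered_on)
total_mem_gb = sum(vm["Memory"] for vm in powered_on) / 1024

print(len(powered_on), total_vcpu, total_mem_gb)  # 2 12 48.0
```

This is essentially the same roll-up Azure Migrate performs when it ingests the RVTools file, which is why the tab and column names must not be modified before import.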
To discover and assess your Azure VMware Solution (AVS) environment using an RVTools extract report in the Azure Migrate tool, follow these steps:

Prerequisites

- RVTools setup: Download and install RVTools from the RVTools download page. Ensure connectivity to your vCenter server, then run RVTools and save the output as an Excel (.xlsx) file.
- Permissions: You need at least the Contributor role on the Azure Migrate project. Ensure that you have appropriate permissions in your vCenter environment to collect inventory and performance data.
- File requirements: The RVTools file must be saved in .xlsx format without renaming or modifying the tabs or column headers.

Note: Sample sheet: Please check the attachment included with this article. Note that this is not the complete format; some tabs and columns have been removed for simplicity. During the actual discovery and assessment process, please do not modify the tabs or columns.

Procedure

Step 1: Export Data from RVTools

Follow the steps provided on the official website to get the RVTools extract.

Step 2: Discover

1. Log in to the Azure portal.
2. Navigate to Azure Migrate and select your project or create a new project.
3. Under Migration goals, select Servers, databases and web apps.
4. On the Azure Migrate | Servers, databases and web apps page, under Assessment tools, select Discover and then select Using import.
5. On the Discover page, in File type, select VMware inventory (RVTools XLSX).
6. In the Step 1: Import the file section, select the RVTools XLSX file and then select Import.
7. Wait for the import to complete, then check for any error messages, rectify them, and re-upload if needed; otherwise, wait 10-15 minutes for the imported VMs to appear in the discovery.
Post-discovery

Reference: https://learn.microsoft.com/en-us/azure/migrate/vmware/tutorial-import-vmware-using-rvtools-xlsx?context=%2Fazure%2Fmigrate%2Fcontext%2Fvmware-context

Step 3: Assess

1. After the upload is complete, navigate to the Servers tab.
2. Click Assess --> Azure VMware Solution to assess the discovered machines.
3. Edit the assessment settings based on your requirements and save:
   - Target region: Select the Azure region for the migration.
   - Node type: Specify the Azure VMware Solution series (e.g., AV36, AV36P).
   - Pricing model: Select pay-as-you-go or reserved instance pricing.
   - Discount: Specify any available discounts.
   Note: All of these parameters are explained in the Optimize step. For now, just review them and leave the defaults as they are.
4. In Assess Servers, select Next.
5. In Select servers to assess > Assessment name, specify a name for the assessment.
6. In Select or create a group, select Create New and specify a group name. Select the appliance and the servers you want to add to the group, then select Next.
7. In Review + create assessment, review the assessment details and select Create Assessment to create the group and run the assessment.

Step 4: Review the Assessment

1. In Windows, Linux and SQL Server > Azure Migrate: Discovery and assessment, select the number next to Azure VMware Solution.
2. In Assessments, select an assessment to open it. (The estimations and costs shown here are examples only.)
3. Review the assessment summary. You can select Sizing assumptions to understand the assumptions that went into node sizing and resource utilization calculations. You can also edit the assessment properties or recalculate the assessment.

Step 5: Optimize

So far we have received a report without any optimization.
Now we can follow the steps below to optimize the cost and node count even further.

High-level steps:

1. Find the limiting factor.
2. Find which component in the settings maps to the limiting factor.
3. Adjust the mapped component according to your scenario and comfort.

Find the limiting factor

First, understand which component (CPU, memory, or storage) is deciding your ESXi node count. This is highlighted in the report. The limiting factor shown in assessments could be CPU, memory, or storage, based on the utilization on nodes. It is the resource that limits or determines the number of hosts/nodes required to accommodate the workloads. For example, if an assessment finds that after migrating 8 VMware VMs to Azure VMware Solution, 50% of CPU resources, 14% of memory, and 18% of storage will be utilized on the 3 AV36 nodes, then CPU is the limiting factor.

Find which setting can be used to optimize

This depends on the limiting factor. For example, if the limiting factor is CPU, you have a high CPU requirement, and CPU oversubscription can be used to reduce the ESXi node count. Likewise, if storage is the limiting factor, editing FTT or RAID, or introducing external storage like Azure NetApp Files (ANF), will help you reduce the node count. Even reducing the node count by one creates a huge impact in dollar value.

Let's understand how overcommitment (oversubscription) works with a simple example. Suppose I have two VMs with the following specifications:

Name   CPU      Memory   Storage
VM1    9 vCPU   200 GB   500 GB
VM2    4 vCPU   200 GB   500 GB
Total  13 vCPU  400 GB   1000 GB

And an ESXi node with the following capacity:

vCPU     10
Memory   500 GB
Storage  1024 GB

Without optimization, I need two ESXi nodes to accommodate the total requirement of 13 vCPUs. But suppose VM1 and VM2 don't consume their entire allocation all the time, and total usage at any given moment never exceeds 10 vCPUs.
Then I can accommodate both VMs on the same ESXi node, reducing my node count and cost. In other words, it is possible to share resources between the two VMs.

Without optimization

With optimization

Parameters affecting sizing and pricing

- CPU oversubscription: Specifies the ratio of virtual cores tied to one physical core in the Azure VMware Solution node. The default value in the calculations is 4 vCPU : 1 physical core. API users can set this value as an integer. Note that vCPU oversubscription greater than 4:1 may impact workloads depending on their CPU usage.
- Memory overcommit factor: Specifies the ratio of memory overcommit on the cluster. A value of 1 represents 100% memory use, 0.5 is 50%, and 2 would be using 200% of available memory. You can only add values from 0.5 to 10, up to one decimal place.
- Deduplication and compression factor: Specifies the anticipated deduplication and compression factor for your workloads. The actual value can be obtained from on-premises vSAN or storage configurations; it varies by workload. A value of 3 means 3x, so a 300 GB disk would use only 100 GB of storage. A value of 1 means no deduplication or compression. You can only add values from 1 to 10, up to one decimal place.
- FTT (Failures to Tolerate): How many device failures can be tolerated for a VM.
- RAID (Redundant Array of Independent Disks): Describes how data is stored for redundancy.
  - Mirroring: Data is duplicated as-is to another disk. For example, to protect a 100 GB VM object using RAID-1 (mirroring) with an FTT of 1, you consume 200 GB.
  - Erasure coding: Divides data into chunks and calculates parity information (redundant data) across multiple storage devices. This allows data reconstruction even if some chunks are lost, similar to RAID but typically more space-efficient. For example, to protect a 100 GB VM object using RAID-5 (erasure coding) with an FTT of 1, you consume 133.33 GB.
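The FTT/RAID examples reduce to simple multipliers, and the CPU oversubscription ratio similarly scales a node's effective vCPU capacity. The sketch below encodes both; the 2x and 4/3x factors and the 100 GB examples come from the text, while the helper names and the 36-core node used for illustration are our own.

```python
# Raw vSAN capacity implied by the FTT/RAID examples above, plus the
# effective vCPU capacity a node gains from oversubscription.
RAID_FACTOR = {
    ("RAID-1", 1): 2.0,    # mirroring with FTT=1: one full extra copy
    ("RAID-5", 1): 4 / 3,  # erasure coding with FTT=1: 3 data + 1 parity
}

def raw_storage_gb(usable_gb: float, raid: str, ftt: int) -> float:
    """Raw capacity consumed to protect `usable_gb` of VM data."""
    return round(usable_gb * RAID_FACTOR[(raid, ftt)], 2)

def effective_vcpus(physical_cores: int, oversub_ratio: int) -> int:
    """vCPUs a node can host at a given vCPU:pCPU oversubscription ratio."""
    return physical_cores * oversub_ratio

print(raw_storage_gb(100, "RAID-1", 1))  # 200.0
print(raw_storage_gb(100, "RAID-5", 1))  # 133.33
# A 36-core node (e.g. AV36) at the default 4:1 ratio versus 8:1:
print(effective_vcpus(36, 4), effective_vcpus(36, 8))  # 144 288
```

Doubling the oversubscription ratio doubles the vCPUs each node can absorb, which is exactly why raising it from 4:1 to 8:1 can drop whole nodes from an assessment when CPU is the limiting factor.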
Comfort factor:
Azure Migrate considers a buffer (comfort factor) during assessment. This buffer is applied on top of the server utilization data for VMs (CPU, memory, and disk). The comfort factor accounts for issues such as seasonal usage, short performance history, and likely increases in future usage. For example, a 10-core VM with 20% utilization normally results in a 2-core VM; with a comfort factor of 2.0x, the result is a 4-core VM instead.

AVS SKU Sizes

Optimization result:
In this example, we found that CPU is the limiting factor, so I adjusted the CPU oversubscription value from 4:1 to 8:1. This:
- Reduced the node count from 6 (3 AV36P + 3 AV64) to 5 AV36P
- Reduced cost by 31%

Note: Over-provisioning or overcommitting can put your VMs at risk. However, in Azure you can create alerts to warn you of unexpected demand increases and add new ESXi nodes on demand. This is the beauty of the cloud: if your resources are under-provisioned, you can scale up or down at any time. Running your resources in an optimized environment not only saves your budget but also allows you to allocate funds for more innovative ideas.

Provider-Managed Azure Subscriptions: Cost Control and Commitment Clarity
As a Microsoft Cloud Solution Architect supporting enterprise customers, I occasionally encounter a specific scenario where customers with an Enterprise Agreement (EA) or Microsoft Customer Agreement (MCA-E) allow a service provider (SP) to manage one or more of their Azure subscriptions via the SP's tenant. This setup has notable implications for cost and commitment management, which I'll explore in this article.

Recommended prerequisite reading: Microsoft Cost Management: Billing & Trust Relationships Explained

Scenario overview
A customer signs a contract with a service provider to outsource the management of certain resources. The customer retains full control over resource pricing and expects the usage of these resources to contribute towards their Microsoft Azure Consumption Commitment (MACC). To achieve this, the customer associates one or more Azure subscriptions with a Microsoft Entra ID tenant owned and managed by the SP; in our example, this is "Subscription B." The SP gains full RBAC access to the subscription and its resources, while the billing relationship remains tied to the customer's billing account (EA) or billing profile (MCA-E).

Let's look at the implications from both the customer's and the service provider's perspective.

Customer's perspective

Cost & Pricing
- All costs in Subscription B that result from resource usage are tied, and therefore billed, to the customer's billing account (EA) or billing profile (MCA-E).
- The prices applied to that usage are based on the negotiated customer price list associated with the billing account (EA) or billing profile (MCA-E).
- The Azure resource consumption of Subscription B, plus any eligible Marketplace offer consumption within the subscription, contributes to the customer's MACC.
- The customer has full cost visibility of Subscription B via Azure Cost Analysis at the billing account/billing profile level.
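The MACC rule in the bullets above can be expressed as a tiny calculation. This is a hedged illustration only: the `remaining_macc` helper and the figures are made up, and real MACC tracking is done by Microsoft Cost Management.

```python
def remaining_macc(commitment_usd, azure_usage_usd, eligible_marketplace_usd):
    """Both Azure resource usage and eligible Marketplace usage in the
    provider-managed subscription decrement the customer's MACC
    (illustrative model, not an official calculation)."""
    decrement = azure_usage_usd + eligible_marketplace_usd
    # A commitment cannot go below zero; excess usage is simply billed.
    return max(commitment_usd - decrement, 0)

# Example: a $1,000,000 commitment after $120,000 of Azure usage and
# $30,000 of eligible Marketplace purchases in Subscription B:
print(remaining_macc(1_000_000, 120_000, 30_000))  # 850000
```

The key point the sketch captures is that the decrement lands on the customer's commitment even though the subscription is operated from the service provider's tenant.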
Commitments (Reservations / Savings Plans)
- Shared commitments at the billing account/billing profile level are utilized by matching resources in Subscription B.
- Commitments scoped to Subscription B or lower can only be purchased by the customer if the customer has RBAC rights on the subscription and the global billing policy allows purchases by subscription owners / reservation purchasers.

Service provider's perspective

Cost & Pricing
- The service provider is responsible for managing Subscription B's resources and the associated costs.
- The service provider's view of Subscription B's actual and amortized costs is limited, as they only have access at the subscription level.
- The service provider has no direct access to the customer's prices (Price Sheet) or invoice information.

Commitments (Reservations / Savings Plans)
- The service provider can purchase commitments scoped to Subscription B or lower (resource group) if the customer's global billing policy allows purchases by subscription owners / reservation purchasers. The associated costs of the commitment are attributed to the customer's billing account/profile.
- Shared or management-group-scoped commitments purchased by the service provider against their own billing account/billing profile do not apply to Subscription B.

Key takeaways
- Decoupled ownership: Customers can separate subscription management from billing ownership, enabling flexible operational models.
- Cost control: Customers retain full visibility and control over pricing, cost allocation, and commitment utilization, even when subscriptions are managed by a service provider.
- Governance and policy alignment: Successful implementation depends on clear billing policies and RBAC configurations that align with both customer and provider responsibilities.

What’s new in FinOps toolkit 12 – July 2025
This month, you’ll find support for FOCUS 1.2, autostart in FinOps hubs (which can reduce your hub costs), a new page in the Cost summary Power BI report, and various small fixes, improvements, and documentation updates across the board. Read on for details.

Understanding the Total Cost of Ownership
Whether you're just beginning your journey in Azure or are already managing workloads in the cloud, it's essential to ground your strategy in proven guidance. The Microsoft Cloud Adoption Framework for Azure offers a comprehensive set of best practices, documentation, and tools to help you align your cloud adoption efforts with business goals. One of the foundational steps in this journey is understanding the financial implications of cloud migration.

When evaluating the migration of workloads to Azure, calculating the Total Cost of Ownership (TCO) is a crucial step. TCO is a comprehensive metric that includes all cost components over the life of the resource. A well-constructed TCO analysis can provide valuable insights that aid decision-making and drive financial efficiencies. By understanding the comprehensive costs associated with moving to Azure, you can make informed choices that align with your business goals and budget. Here is a breakdown of the main elements you need to build your own TCO:

1. Current infrastructure configuration:
- Servers: details about your existing servers, including the number of servers, their specifications (CPU, memory, storage), and operating systems.
- Databases: information about your current databases, such as the type, size, and any associated licensing costs.
- Storage: the type and amount of storage you are currently using, including any redundancy or backup solutions.
- Network traffic: account for outbound network traffic and any associated costs.

2. Azure environment configuration:
- Virtual machines (VMs): the Azure VMs that match your current server specifications, based on CPU, memory, storage, and region.
- Storage options: the type of storage (e.g., Standard HDD, Premium SSD), access tiers, and redundancy options that align with your needs.
- Networking: networking components, including virtual networks, load balancers, and bandwidth requirements.

3. Operational costs:
- Power and cooling: estimate the costs associated with power and cooling for your on-premises infrastructure.
- IT labor: include the costs of the IT labor required to manage and maintain your current infrastructure.
- Software licensing: account for any software licensing costs that will be incurred in both the current and Azure environments.

Once you have more clarity on these inputs, you can complement your analysis with other tools depending on your needs. The Azure Pricing Calculator is well suited to providing granular cost estimation for different Azure services and products. However, if the intent is to estimate cost and savings during migrations, the Azure Migrate business case feature should be the preferred approach, as it allows you to perform a detailed financial analysis (TCO/ROI) for the best path forward and assess readiness to move workloads to Azure with confidence.

Understand your Azure costs
The Azure Pricing Calculator is a free cost management tool that allows users to understand and estimate the costs of Azure services and products. It is the only unauthenticated experience that lets you configure and budget the expected cost of deploying solutions in Azure. The calculator is key to properly adopting Azure: whether you are in a discovery phase trying to figure out what to use and which offers to apply, or in a post-purchase phase trying to optimize your environment and see your negotiated prices, it fulfills both new users' and existing customers' needs. It allows organizations to plan and forecast cloud expenses, evaluate different configurations and pricing models, and make informed decisions about service selection and deployment options.

Decide, plan, and execute your migration to Azure
Azure Migrate is Microsoft's free platform for migrating to and modernizing in Azure.
It provides capabilities for discovery, business case (TCO/ROI) analysis, assessments, planning, and migration in a workload-agnostic manner. Customers must have an Azure account and create a migration project within the Azure portal to get started. Azure Migrate supports various migration scenarios, including VMware and Hyper-V virtual machines (VMs), physical servers, databases, and web apps. The service offers accurate appliance-based and manual discovery options to cater to customer needs.

The Azure Migrate process consists of three main phases: Decide, Plan, and Execute.
- In the Decide phase, organizations discover their IT estate through several supported methods and can get a dependency map for their applications to help collocate all resources belonging to an application. Using the discovered data, they can also estimate costs and savings through the business case (TCO/ROI) feature.
- In the Plan phase, customers can assess readiness to migrate, get right-sized recommendations for targets in Azure, and identify the tools to use for their migration strategy (IaaS/PaaS). Users can also create a migration plan consisting of iterative "waves," where each wave contains all the dependent workloads of the applications to be moved during a maintenance window.
- Finally, the Execute phase focuses on the actual migration of workloads to Azure, starting with a test environment and proceeding in a phased manner to ensure a non-disruptive and efficient transition.

A crucial step in the Azure Migrate process is building a business case prior to the move, which helps organizations understand the value Azure can bring to their business. The business case capability highlights the total cost of ownership (TCO) with discounts and compares cost and savings between on-premises and Azure, including end-of-support (EOS) Windows OS and SQL versions.
It provides year-on-year cash flow analysis with resource utilization insights and identifies quick wins for migration and modernization, with an emphasis on the long-term cost savings of transitioning from a capital expenditure model to an operating expenditure model, paying only for what is used.

Understanding the Total Cost of Ownership (TCO) is essential for making informed decisions when migrating workloads to Azure. By thoroughly evaluating all cost components, including infrastructure, operational, facilities, licensing, and migration costs, organizations can optimize their cloud strategy and achieve financial efficiencies. Use tools like the Azure Pricing Calculator and Azure Migrate to gain comprehensive insights and ensure a smooth transition to the cloud.

News and updates from FinOps X 2024: How Microsoft is empowering organizations
Last year, I shared a broad set of updates that showcased how Microsoft is embracing FinOps practitioners through education, product improvements, and innovative solutions that help organizations achieve more with AI-powered experiences like Copilot and Microsoft Fabric. Whether you’re an engineer working in the Azure portal or part of a business or finance team collaborating in Microsoft 365 or analyzing data in Power BI, Microsoft Cloud has the tools you need to accelerate business value for your cloud investments.