FinOps framework
FinOps toolkit - missing clusterUri
I installed the FinOps toolkit but did not get any value in the clusterUri output from my hub. Any thoughts on how to fix it? Thanks.

The instructions I followed, "Copy the Data Explorer cluster URI":
1. Select the resource group where your FinOps hub instance was deployed.
2. Select Settings > Deployments > hub > Outputs.
3. Copy the clusterUri output value.
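If you prefer to check this with a script instead of the portal, the following is a minimal sketch using the Azure SDK for Python. It assumes the azure-identity and azure-mgmt-resource packages are installed and that the deployment is named hub, as in the steps above; the subscription ID and resource group values are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Placeholders: replace with your own values.
subscription_id = "<subscription-id>"
resource_group = "<finops-hub-resource-group>"
deployment_name = "hub"  # the deployment listed under Settings > Deployments

client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)
deployment = client.deployments.get(resource_group, deployment_name)

# Outputs is a dict keyed by output name, e.g. {"clusterUri": {"type": "String", "value": "..."}}.
outputs = deployment.properties.outputs or {}
cluster_uri = outputs.get("clusterUri", {}).get("value")
print(f"clusterUri: {cluster_uri}")
# An empty or missing value may simply mean the hub was deployed without the
# optional Data Explorer cluster, in which case there is no clusterUri to copy.
```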
Microsoft Cost Management: Billing & trust relationships explained

As a Microsoft Cloud Solution Architect supporting our global enterprise customers on FinOps and Microsoft Cost Management topics, I am often involved in conversations explaining the different relationships that exist in Microsoft Cost Management. In this article I will shed some light on key relationships for commercial enterprise customers to provide clarity. This can be especially helpful for customers currently planning to transition from a Microsoft Enterprise Agreement contract to a Microsoft Customer Agreement.

Building blocks for Cost Management relationships

Before we look at the individual relationships, we need to understand their building blocks:
- Billing contracts – Enterprise Agreement (EA) and Microsoft Customer Agreement (MCA), providing legal and commercial terms as well as certain technical capabilities like billing roles.
- Microsoft Azure Consumption Commitment (MACC) – a contractual agreement in which an organization commits to spending a certain amount on Azure services over a specified period and might get a certain discount on its usage.
- Billing roles – roles provided by each contract, like Enterprise Administrator or Billing Account Owner, used to manage the contract.
- Tenants – based on Microsoft Entra ID, giving identities access to billing roles and Azure resources.
- Azure subscriptions – containers to deploy and manage Azure resources at a specific cost.
- Azure customer price sheet (CPS) – determines the current customer prices for specific Azure resources.

Key Cost Management relationships in EA and MCA contracts

Customer to contract
Customer organizations can sign multiple contracts with Microsoft. Even though it is generally advised to have a 1:1 relationship between a customer and a Microsoft contract, customers can theoretically use multiple contracts at the same time.

Examples: The best example is a transition period from EA to MCA, where the customer has an active EA and an active MCA contract. During M&A activities a customer organization might also end up with multiple EA or MCA contracts.

Contract to price sheet
Every contract is associated with a customer-specific price sheet determining the individual customer prices in the agreed billing currency. In MCA, the default associated price sheet basically equals the Azure retail price list in USD. The price sheet can be accessed in the Azure portal.

Contract to Microsoft Azure Consumption Commitment (MACC)
Customers can sign a MACC. There is usually a 1:1 relationship between a contract and a MACC. As a benefit of this commitment, customers might get discounted pricing on Azure resources (usage). This potential discount will be reflected in the price sheet associated with the contract.

Contract to billing roles
Both EA and MCA provide customers with roles that can manage certain aspects of the contract. These roles are called billing roles. Billing roles differ from EA to MCA and are described in detail here for EA and here for MCA. A key difference is that customers can associate any valid work, school, or Microsoft account with EA billing roles, but only work accounts from an approved tenant with MCA billing roles.

Contract to tenant
To manage billing roles, customers must associate exactly one Entra ID tenant with the contract. This happens at contract setup. Identities within this tenant can be assigned to billing roles within the contract.
Example: User Dirk from the contoso.com tenant can be assigned to the EA admin role in Contoso's EA contract.

In MCA, this tenant is called the primary billing tenant. Only users from this tenant can be assigned to billing roles within the MCA contract.

CAUTION: The tenant Global Administrator role sits above the billing account administrator. Global Administrators in a Microsoft Entra ID tenant can add or remove themselves as billing account administrators on the Microsoft Customer Agreement at any time.

If you want to assign identities from tenants other than the primary billing tenant, you can add associated billing tenants.

NOTE: Even though customers should strive for a single tenant, there is no restriction on how many tenants a customer can associate with a contract.

Contract to subscription
An Azure subscription is a logical container used to provision and manage Azure resources. Resource access is managed by a trust relationship to an Entra ID tenant, and billing is managed by a billing relationship to a Microsoft contract (EA or MCA). Every subscription can have only one billing relationship to one contract. This billing relationship can be moved to a different contract under certain conditions (e.g. in EA/MCA transition scenarios or M&A scenarios). Every contract can manage 5,000 subscriptions by default.

NOTE: The billing relationship determines the prices for the resources consumed within the subscription. If a subscription is associated with a contract that uses Azure retail prices, you pay the retail price. If the associated contract has customer-specific prices (e.g. through a MACC with applicable discounts), the resources within that subscription are charged at those prices.

Subscription to tenant
Every subscription has a 1:1 trust relationship to an Entra ID tenant. This trust relationship determines which identities can manage the resources within the subscription.

Tenant to subscription
Every tenant can manage trust relationships with a virtually unlimited number of subscriptions. These subscriptions can have billing relationships to multiple contracts, which might lead to different prices for resources deployed to them.

Example: Contoso is using a single Entra ID tenant, has just signed a new MACC agreement, and is migrating to MCA. At signing of the MCA contract, the MACC is associated with the MCA account and not with the EA contract. During the EA to MCA transition period, some subscriptions still have a billing relationship to the old EA contract while others have already moved to the MCA contract. In this situation the same VM in the same region might be charged differently, depending on the subscription it is deployed to. If the subscription is still using the EA billing relationship, the price might differ (e.g. due to a lack of applied discounts).
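To make the trust side of this concrete, here is a minimal sketch, assuming the azure-identity and azure-mgmt-resource Python packages (property names may vary slightly between SDK versions), that lists the subscriptions a signed-in identity can see and the tenant each one trusts. Billing relationships are deliberately absent from this view; they live on the contract (EA or MCA) side, which is the point of the takeaways that follow.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import SubscriptionClient

# Enumerate subscriptions visible to the signed-in identity and the tenant
# each subscription trusts. Prices are NOT determined by this relationship;
# they come from the billing relationship to an EA or MCA contract.
client = SubscriptionClient(DefaultAzureCredential())

for sub in client.subscriptions.list():
    print(f"{sub.display_name} ({sub.subscription_id}) trusts tenant {sub.tenant_id}")
```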
Summary & key takeaways

Relationships on a page
Let's summarize the described relationships. Fully understanding them can be very helpful in a variety of scenarios, especially when you are discussing the transition from an Enterprise Agreement contract to a Microsoft Customer Agreement.

Key takeaways
1. Tenants do not determine the price of an Azure resource! Only the billing relationship between a subscription and a contract determines the price of a resource.
2. Customers can use multiple tenants to manage their Azure resources.
3. Only for managing billing roles is there usually a 1:1 relationship between the contract and an Entra ID tenant.

Next steps
If the provided information was helpful for you:
- Share this article within your FinOps, Finance, and Cloud Competence Center teams.
- If you need in-depth support for planning the proper relationships and billing structures for your organization, please reach out to your local Microsoft representative.
- If you have a unified support contract, you can ask your Customer Success Account Manager (CSAM) to reach out to our global team of FinOps experts from the Culture & Cloud Experience (CCX) practice.

This article lays the foundation for an upcoming post about how to manage the transition from EA to MCA from a FinOps perspective.
Managing the cost of AI: Leveraging the FinOps Framework

Artificial intelligence is well and truly here, though the rate of adoption varies across consumers and businesses. When we think of AI in relation to FinOps, we look at it from two different angles: how we use FinOps to optimize our AI costs, and how we use AI to optimize all of our costs. The second angle includes the intersection of our cost management tools and AI capabilities, for example Azure Copilot answering cost and forecasting questions. The first angle is only just starting to be whispered about in corporate hallways.

Many organizations are still curious about AI and are exploring its capabilities without too much concern about cost or return on investment. They want to see if they can apply this new component to new or existing applications or to business processes, to prove whether it's beneficial. But early adopters, and savvy organizations with a mature FinOps practice, are looking at how their cloud cost management practices wrap around this new type of cloud workload. Let's explore some of the foundational FinOps elements and how they apply to AI.

The FinOps Framework

During its establishment and subsequent updates, the FinOps Foundation's FinOps Framework was always intended to expand beyond just consumption-based cloud workloads. We're seeing this with the addition of Scopes to the Framework, including Software-as-a-Service and Data Centers. And while AI isn't a new scope, because it fits into Public Cloud (or Data Center if you host your own models locally), it does benefit from that original forward-thinking approach. It provides an acid test of whether the Principles, Domains, and other elements are applicable to this new kind of workload, both for AI capabilities that are purely consumption based (e.g. Azure OpenAI language models) and those that are SKU based with a combination of size limitations plus scale unit hours (e.g. Azure AI Search).

Principles, Core Personas, Allied Personas, Phases, and Maturity all ring true when looked at from the perspective of your AI workloads. You may add a few specialized personas when you go down to a greater level of detail under Engineers – like data and prompt engineers. Learn more in the FinOps Foundation's FinOps for AI Overview. AI does not negate the need for teams to collaborate, for everyone to take ownership of their cloud usage, and for business value to drive decision-making.

The Domains and Capabilities are where we drill down into the next level of detail and examine whether our existing tools and processes are up to the task of managing AI workloads. Microsoft takes a holistic approach to adopting new workloads, reflected in our Cloud Adoption Framework and Well-Architected Framework. These have been updated to include AI adoption and AI Workload Documentation. For AI adoption, our Azure Essentials guidance introduces the stages of Readiness and foundation, Design & govern, and Manage and optimize – combining detailed best practices and tools relevant to your AI adoption journey.

FinOps capabilities for adopting AI on Azure

Combine Microsoft's guidance and the FinOps Framework, and you can drill down into many of the FinOps capabilities during the stages of adopting AI on Azure.

Readiness & foundation

Planning & estimating – Planning for and estimating the cost of an AI workload requires an understanding of the entire application architecture, driven by the business requirements. Like every other application you manage, your organization will have requirements regarding security, resiliency and redundancy, and recovery times and recovery points. These may be less rigorous if your application is stateless and not holding long-term data, but the architecture may need to be resilient if the application will become mission-critical or customer facing. A typical chatbot architecture, for example, might include a browser frontend, PostgreSQL and Redis, an event bus, and OpenAI.

Next, it's important to understand how AI services are priced. Most text-based AI services using large language models (LLMs), including the Azure OpenAI Service, are priced per 1,000 tokens (where a token is a common sequence of characters found in text). OpenAI's Tokenizer website demonstrates how text is broken into tokens, with efficiency improvements already visible from GPT-3 to GPT-4o. Calculating the number of tokens in a sample conversation (inputs) and multiplying that by the number of website visitors times the percentage that may interact with a chatbot gives an indication of token usage and predicted cost; a rough sketch of this kind of estimate follows.
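The Python below is purely illustrative: it multiplies assumed traffic, interaction rate, and tokens per conversation by placeholder per-1,000-token rates. Every value in it is an assumption; substitute your own traffic data and the current Azure OpenAI prices.

```python
# Back-of-the-envelope token cost estimate for a chatbot.
# Every figure below is an illustrative assumption, not a published price;
# check the Azure OpenAI pricing page for current per-1,000-token rates.

monthly_visitors = 100_000              # website visitors per month (assumption)
chat_interaction_rate = 0.05            # share of visitors who use the chatbot (assumption)
input_tokens_per_conversation = 1_500   # prompt tokens per conversation (assumption)
output_tokens_per_conversation = 500    # completion tokens per conversation (assumption)

price_per_1k_input_tokens = 0.0025      # USD per 1,000 input tokens (placeholder)
price_per_1k_output_tokens = 0.0100     # USD per 1,000 output tokens (placeholder)

conversations = monthly_visitors * chat_interaction_rate
input_tokens = conversations * input_tokens_per_conversation
output_tokens = conversations * output_tokens_per_conversation

estimated_monthly_cost = (
    (input_tokens / 1_000) * price_per_1k_input_tokens
    + (output_tokens / 1_000) * price_per_1k_output_tokens
)

print(f"Estimated conversations per month: {conversations:,.0f}")
print(f"Estimated monthly token cost:      ${estimated_monthly_cost:,.2f}")
```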
Workload optimization – As well as ensuring that all the components of the application are right-sized, consider whether you are using the right AI models: pre-built models or more efficient newer models. Also investigate ways your application design can optimize token usage, for example by caching responses (including semantic caching) and sending summaries instead of the entire conversation history.

Budgeting & forecasting – Good budgeting and forecasting is built on understanding current usage and adjusting for future expectations. In a future blog post, we'll dive into how to measure and analyze AI workload usage. Once you understand the specifics of AI in your cost management tooling, the regular habit of budget reviews and forecasting adjustments becomes part of your overall FinOps processes.

Empower teams & foster collaboration – Just as you're learning how AI services generate cost and how to analyze that data, ensure your teams understand this at the right level for them. Engineers might use these insights to make application design changes, while Finance teams might care less about tokens but could equate them to customer usage and, hopefully, a parallel increase in sales.

Design & govern

Establish AI policy & governance – It's easy to think of this one in terms of privacy and security, but what other aspects of AI do you need to put rules around, especially in relation to cost? Are all engineers allowed to deploy all AI workloads? Is AI restricted to development subscriptions until token usage is proven with a proof of concept? Are there any relevant built-in or custom Azure Policies that you need to implement? Are self-hosted open-source LLMs allowed in your organization?

Integrate intersecting disciplines like LLMOps – While we think of Engineers as the traditional personas who develop and maintain an application, AI introduces specialized disciplines like LLMOps. Machine learning operations for large language models (LLMOps) includes the automation of repetitive tasks such as model building, testing, deployment, and monitoring, which improves efficiency. Though LLMs are pre-trained, MLOps can be leveraged to tune them and to operationalize and monitor them effectively in production. These improvements can lead to cost reduction.

Take advantage of the variable cost model of the cloud – A key part of this is understanding how your AI service is charged and whether run times will impact cost. Fine-tuned models have an hourly cost in addition to a token usage rate, charged from deployment onwards, so be mindful of when you deploy them and delete them when they are no longer needed; the sketch below illustrates why.
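As a purely illustrative sketch of why deployment lifetime matters, the comparison below uses a placeholder hourly hosting rate (an assumption, not a published price) and ignores token charges, which would be the same in both cases.

```python
# Hypothetical hourly hosting rate for a fine-tuned model deployment.
hosting_rate_per_hour = 1.70    # USD per hour, placeholder assumption only

hours_always_on = 24 * 30       # deployment left running for a 30-day month
hours_business_only = 10 * 22   # ~10 hours/day across ~22 working days

always_on_cost = hosting_rate_per_hour * hours_always_on
business_only_cost = hosting_rate_per_hour * hours_business_only

print(f"Always-on hosting:      ${always_on_cost:,.2f} per month")
print(f"Business-hours hosting: ${business_only_cost:,.2f} per month")
print(f"Difference:             ${always_on_cost - business_only_cost:,.2f} per month")
```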
Implement the "crawl, walk, run" approach for continuous improvement – This one is self-explanatory and even more applicable as your organization adopts AI workloads. You will start out with a basic understanding and rudimentary decisions, which should be reviewed as your AI maturity increases.

Manage and optimize

Leverage reporting and analytics – Identify how AI usage and costs are surfaced in your existing cost management tools. We'll cover this from a Microsoft Azure perspective in a future blog post.

Anomaly management – As for any application, understand your process or tools for detecting anomalies in usage, and what steps should be taken next.

Drive accountability – Emphasize to technical and engineering teams that their design decisions have cost implications, and that they also have the power to help identify and implement cost efficiencies.

Rate optimization – This responsibility should lie with your FinOps team and includes both the pricing rates for your organization (for example, via an Enterprise Agreement) and rate discounts using Azure Reservations, which also apply to AI services using Provisioned Throughput Units.

Sustainability – An important consideration for corporate responsibility, especially if your organization has Environmental, Social & Governance (ESG) goals or reporting requirements. Leverage the Azure Carbon Optimization reports for emissions data and more.

Unit economics – Unit economics comes down to understanding the true cost of an application and correlating it with the business metric of the desired outcome (e.g. revenue per visit). From an application cost perspective, consider capabilities like Azure API Management's Azure OpenAI Token Metric policy, which collects token usage data and facilitates accurate cross-charging based on token consumption. Then explore the outcome of your AI solution to identify business value indicators. These may be easier to measure if you've integrated AI with your e-commerce site, but a little trickier if it's an internal chatbot that is improving business productivity.

Conclusion

If you have an established FinOps practice, you'll find that many of its processes and capabilities apply to adopting and managing AI workloads on Azure. If you haven't explored FinOps in depth yet, new AI workloads may be the catalyst for establishing a FinOps practice, and these foundational principles will also benefit your entire cloud, SaaS, and data center estate. Learn how to navigate the financial landscape for successful AI adoption, including securing funding, establishing organizational readiness, and managing your AI investments.
Step by Step to Create a Tag Filter in FinOps Hub (Portuguese)

Now live on the Azure InfraGurus regional blog is guidance on filtering by tags using FinOps hubs. This local-language blog is published in Portuguese: https://techcommunity.microsoft.com/t5/azure-infragurus/passo-a-passo-para-criar-um-filtro-de-tag-no-finops-hub/ba-p/4249215

If you're looking for more infrastructure guidance in Portuguese, check out the rest of the blog: https://techcommunity.microsoft.com/t5/azure-infragurus/bg-p/AzureInfraGurus
News and updates from FinOps X 2024: How Microsoft is empowering organizations

Last year, I shared a broad set of updates that showcased how Microsoft is embracing FinOps practitioners through education, product improvements, and innovative solutions that help organizations achieve more with AI-powered experiences like Copilot and Microsoft Fabric. Whether you're an engineer working in the Azure portal or part of a business or finance team collaborating in Microsoft 365 or analyzing data in Power BI, Microsoft Cloud has the tools you need to accelerate business value for your cloud investments.