[Guest perspective: The following article is a guest post by Roland Krummenacher at Alescent, a Microsoft Partner. The views and opinions expressed in this article are those of the author. We thank Roland for sharing his perspective.]
The FinOps toolkit provides a set of standard Power BI reports that address key scenarios within the FinOps Framework, such as cost transparency and rate optimization. However, many organizations require more than just predefined reports. They often need the flexibility to build custom analyses, develop advanced reporting, and enrich FinOps data with internal data sources.
This post shows you how to:
- Build your own cost analyses using Kusto Query Language (KQL).
- Connect and enrich your FinOps data with additional sources in Azure Data Explorer (ADX) or Microsoft Fabric Real-Time Intelligence (RTI).
Common use cases
Organizations typically extend their FinOps reporting and analysis capabilities to:
- Allocate and charge back cloud costs: Design customized showback and chargeback models, refine cost allocation across business units, cost centers, and regions, and support internal billing and negotiation processes.
- Improve cost visibility and efficiency: Identify cost-saving opportunities, detect idle or over-provisioned resources, and analyze the financial impact of architectural changes such as serverless adoption or reserved instance utilization.
- Integrate cloud costs with business metrics: Combine cost data with key performance indicators (KPIs) such as cost per customer, cost per transaction, or revenue margins.
- Enable advanced forecasting and budgeting: Perform trend analysis, forecast future costs, and estimate budgets based on historical usage patterns.
- Support multi-cloud governance: Normalize and compare costs across multiple cloud providers for unified reporting and benchmarking.
- Implement proactive cost management: Set up custom anomaly detection mechanisms to monitor unexpected spending patterns.
- Create executive-level reporting: Build tailored KPIs and dashboards for C-level stakeholders, focusing on business-relevant insights.
Enabling advanced analysis with Azure Data Explorer and Microsoft Fabric
Since version 0.7 (released in November 2024), FinOps hubs, one of the tools in the FinOps toolkit, includes Azure Data Explorer (ADX) as its central data platform. In version 0.10 (released on May 4, 2025), FinOps hubs also added support for Microsoft Fabric Real-Time Intelligence (RTI), which is a version of ADX fully integrated into Microsoft Fabric. ADX and RTI are high-performance, fully managed analytics services optimized for large-scale data exploration. They enable organizations to query, join, and analyze cloud cost and usage data at scale using the powerful Kusto Query Language (KQL).
By leveraging ADX and RTI, FinOps hubs not only support standard reporting needs but also empower users to build advanced custom analyses, enrich datasets with internal or external sources, and integrate seamlessly with a variety of visualization and analytics tools.
Prerequisites
To follow the examples in this article, ensure you have:
- A deployed FinOps hub instance with Azure Data Explorer or Microsoft Fabric. See FinOps hubs.
- An Entra ID account with read access to both the Hub and Ingestion databases (Viewer role). For details, see Security roles.
Writing your first KQL query
Kusto Query Language (KQL) is the primary query language for Azure Data Explorer and Fabric Real-Time Intelligence. While KQL is conceptually similar to SQL or T-SQL, it is optimized for fast, flexible analysis across large datasets.
The simplest way to execute a KQL query is through the Azure Data Explorer or Microsoft Fabric web UI. For an overview, see Azure Data Explorer web UI query overview or Query data in a Microsoft Fabric KQL queryset.
Steps to run a query
Whether your data is in Azure Data Explorer or Microsoft Fabric, you can use either tool to connect and run queries.
To use Azure Data Explorer:
- Open the Azure Data Explorer web UI.
- Add a connection to your Azure Data Explorer cluster. See add a cluster connection.
- In the query editor, select the Hub database.
To use Fabric Real-Time Intelligence:
- Open the desired workspace from Microsoft Fabric.
- In the list of workspace resources, select Hub_queryset for your eventhouse.
Once you've connected to the Hub database, run the following query:
Costs
| summarize Cost = sum(EffectiveCost) by BillingPeriodStart
| render columnchart
This query summarizes effective costs by billing period and renders the results as a column chart.
FOCUS compatibility
The Costs view in the Hub database is compatible with the FinOps Open Cost and Usage Specification (FOCUS), allowing you to align analyses with standardized FinOps use cases.
While KQL is the primary language, Azure Data Explorer also supports a subset of T-SQL commands, enabling users familiar with SQL to write basic queries. For more information, see T-SQL – Azure Data Explorer.
The following example shows the different metered costs for a particular SKU. The SQL statement, taken from the FOCUS use case library, can be executed against the Costs view:
SELECT
ProviderName,
ChargePeriodStart,
ChargePeriodEnd,
SkuId,
SkuPriceId,
PricingUnit,
ListUnitPrice,
SUM(PricingQuantity) AS TotalPricingQuantity,
SUM(ListCost) AS TotalListCost,
SUM(EffectiveCost) AS TotalEffectiveCost
FROM Costs
WHERE ChargePeriodStart >= DATEADD(day, -30, GETDATE())
AND ChargePeriodEnd < GETDATE()
GROUP BY
ProviderName,
ChargePeriodStart,
ChargePeriodEnd,
SkuId,
SkuPriceId,
PricingUnit,
ListUnitPrice
ORDER BY ChargePeriodStart ASC
While Azure Data Explorer supports a subset of T-SQL commands for basic querying, KQL is the recommended language for advanced analysis. It offers broader capabilities, is optimized for large-scale data exploration, and benefits from widespread adoption across Azure services such as Microsoft Defender, Azure Resource Graph, Azure Monitor, and Microsoft Fabric, meaning many cloud and security teams are already familiar with it.
Writing more advanced KQL queries
Basic KQL queries can be extended to address complex business requirements. For example:
Costs
| where BillingPeriodStart >= datetime(2024-01-01)
and BillingPeriodEnd < datetime(2024-04-01)
| summarize
TotalEffectiveCost = sum(EffectiveCost)
by
ProviderName,
BillingCurrency,
BillingPeriodStart,
ServiceCategory
| order by TotalEffectiveCost desc
Beyond the Costs view, FinOps hubs also include other views such as Prices, which can be queried individually or joined with other datasets using the join operator. Example queries are available in the FinOps toolkit best practice library.
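As an illustration, the Prices view can be joined to Costs on a shared key to compare what was paid against list and contracted rates. The sketch below assumes both views expose a SkuPriceId column and that Prices carries ListUnitPrice and ContractedUnitPrice; verify these names against your hub's schema before relying on them.

```kusto
Costs
| where ChargePeriodStart >= ago(30d)
| summarize TotalEffectiveCost = sum(EffectiveCost) by SkuPriceId
| join kind=leftouter (
    Prices
    // Collapse to one price row per SKU price ID for the join.
    | summarize ListUnitPrice = max(ListUnitPrice),
                ContractedUnitPrice = max(ContractedUnitPrice)
      by SkuPriceId
) on SkuPriceId
| order by TotalEffectiveCost desc
```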
Learn more about KQL
Expand your KQL skills with the Kusto Query Language documentation on Microsoft Learn.
Visualizing FinOps data with KQL
Writing powerful queries is only the first step. Visualizing insights effectively is critical to drive action across the organization. Azure Data Explorer (ADX) and Microsoft Fabric Real-Time Intelligence offer a rich set of visualization options to address diverse FinOps needs — from portfolio managers to cloud engineers.
Microsoft Excel and Power Query
Microsoft Excel, enhanced with Power Query, is ideal for portfolio managers and procurement teams who want to integrate live FinOps data into their existing spreadsheets. Excel and Power Query enable users to refresh FinOps data seamlessly, perform custom reporting within familiar Excel environments, and support processes like budget planning, contract renewals, and forecast validation.
Learn more about connecting Microsoft Excel to Azure Data Explorer.
Power BI
For FinOps teams, finance managers, and executive stakeholders, Power BI is a natural choice, especially when Power BI is already used across the organization. Power BI allows users to build FinOps KPIs, create management reports, and integrate cloud financials into existing corporate dashboards with ease, powered directly by Azure Data Explorer queries.
Learn more about connecting Power BI to Azure Data Explorer.
Azure Data Explorer dashboards and KQL render visualizations
FinOps practitioners and cloud analysts often need lightweight, real-time visualizations to monitor specific cloud spend patterns or anomalies. Azure Data Explorer dashboards and the render operator allow users to quickly create ad-hoc dashboards, monitor commitment-based discounts like reservations and savings plans, or track the costs of specific services such as Azure OpenAI, Sentinel, or Azure Monitor — all without leaving Azure Data Explorer.
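For example, a lightweight time chart of daily effective cost per service takes only a few lines with the render operator. This sketch uses the FOCUS ServiceName column; swap in ServiceCategory or any other dimension that suits your monitoring needs.

```kusto
Costs
| where ChargePeriodStart >= ago(90d)
// Aggregate effective cost per day and per service.
| summarize DailyCost = sum(EffectiveCost) by bin(ChargePeriodStart, 1d), ServiceName
| render timechart
```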
Learn more about Azure Data Explorer dashboards.
Learn more about the KQL render operator.
Azure Workbooks
Azure Workbooks are ideal for cloud engineers and operations teams who work primarily within the Azure Portal. Workbooks provide a flexible way to create new FinOps visualizations by combining cost data from Azure Data Explorer with operational telemetry and monitoring insights. Organizations can easily add their own custom Workbooks to complement the existing FinOps workbooks.
This approach helps teams extend their cost governance capabilities while maintaining consistency with the core FinOps Framework.
Learn more about connecting Azure Workbooks to Azure Data Explorer.
Microsoft Fabric
Microsoft Fabric extends the FinOps visualization landscape by bringing together real-time analytics and traditional reporting within a unified platform. FinOps teams can use Fabric to build Power BI reports for advanced financial insights and Real-Time Dashboards for live cost monitoring, whether data is in a Fabric eventhouse or a separate Azure Data Explorer cluster.
Fabric also enables organizations to package multiple FinOps assets, such as reports and dashboards, into organizational apps. This allows teams to deliver comprehensive, role-specific FinOps experiences across the business, ensuring that portfolio managers, engineers, and executives can access the right insights in a single, integrated environment.
Learn more about Microsoft Fabric Org Apps.
Learn more about Microsoft Fabric Real-Time Dashboards.
Automating FinOps insights with KQL
Beyond visualization, organizations can automate FinOps workflows and insights by executing KQL queries programmatically. Azure Data Explorer and Fabric Real-Time Intelligence support integration with automation tools across low-code platforms, DevOps pipelines, and data orchestration services.
Low-code automation: Power Automate and Azure Logic Apps
Using built-in connectors for Azure Data Explorer, FinOps teams can create automated workflows that:
- Execute KQL queries on a schedule or in response to events.
- Send notifications or alerts when cost anomalies are detected.
- Populate ticketing systems with cost optimization tasks.
- Trigger approval workflows for unusual spending patterns.
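A workflow like this typically wraps a query such as the following sketch, which uses the built-in series_decompose_anomalies function to flag unusual daily spend. The 2.5 sensitivity threshold is an arbitrary starting point to tune for your environment.

```kusto
Costs
// Build a daily cost time series over the last 60 days.
| make-series DailyCost = sum(EffectiveCost) on ChargePeriodStart
    from ago(60d) to now() step 1d
// Flag points that deviate from the decomposed baseline.
| extend (Anomalies, Score, Baseline) = series_decompose_anomalies(DailyCost, 2.5)
| mv-expand ChargePeriodStart to typeof(datetime),
            DailyCost to typeof(double),
            Anomalies to typeof(int)
| where Anomalies != 0
```

A Power Automate or Logic Apps flow can run this on a schedule and raise an alert whenever the result set is non-empty.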
Learn more about Azure Data Explorer connectors.
DevOps integration: Azure DevOps Pipelines
KQL queries can also be integrated into Azure DevOps Pipelines using dedicated tasks.
This enables:
- Running cost validation checks as part of CI/CD pipelines.
- Automatically generating FinOps reports during deployments.
- Enforcing cost governance as part of the delivery process.
Learn more about Azure DevOps integration with Azure Data Explorer.
Data orchestration: Azure Data Factory and Microsoft Fabric pipelines
For more complex data processing and orchestration scenarios, Azure Data Factory and Microsoft Fabric pipelines allow you to execute KQL queries and commands natively.
Typical use cases include:
- Periodically enriching FinOps datasets with external reference data.
- Automating ingestion of additional telemetry or metadata into FinOps hubs.
- Triggering FinOps analyses as part of broader data workflows.
Learn more about executing KQL commands in Azure Data Factory.
Learn more about Pipelines in Microsoft Fabric.
Integrating FinOps data across systems
Before moving into full-scale analysis and visualization, many organizations find value in enriching FinOps data by combining it with additional operational, telemetry, or security sources. Azure Data Explorer and Fabric Real-Time Intelligence provide powerful capabilities to integrate external datasets seamlessly, enabling FinOps teams to generate deeper insights, automate governance processes, and support data-driven decision-making across the business.
This section outlines the main integration options and best practices for working with FinOps data in broader cloud and enterprise environments.
External tables
For FinOps analysts and data engineering teams, external tables offer a flexible way to reference additional datasets without the need for ingestion. External tables allow querying external data sources directly from Azure Data Explorer or Fabric Real-Time Intelligence, helping to avoid data duplication and minimizing ingestion costs.
Typical use cases include:
- Joining FinOps cost data with organizational metadata, such as business unit mappings or account hierarchies.
- Referencing FinOps metadata, budget and forecast data, or other external business and technical datasets in real time.
- Combining FinOps data with datasets stored in data lakes or relational databases like SQL Server, MySQL, PostgreSQL, or CosmosDB.
A practical example is the integration of blob inventory report data with storage cost data:
By enabling blob inventory reports on a specific Azure Storage account, detailed metadata about stored blobs is automatically exported as CSV or Parquet files into a storage container.
These files can then be referenced directly in Azure Data Explorer as an external table, allowing detailed cost analysis and storage optimization. Learn how to enable Azure Storage blob inventory reports.
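As a sketch, an external table over such inventory files might be declared as follows. The storage account, container, and schema here are hypothetical; match them to the columns your inventory rule actually exports and to your preferred authentication method.

```kusto
// Hypothetical external table over blob inventory CSV exports.
.create external table BlobInventory (
    Name: string,
    ['Content-Length']: long,
    LastModified: datetime,
    AccessTier: string
)
kind=storage
dataformat=csv
(
    h@'https://mystorageaccount.blob.core.windows.net/inventory;impersonate'
)
```

Once created, the table can be queried with the external_table('BlobInventory') function and joined against the Costs view.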
Learn more about external tables in Azure Data Explorer.
Create an external table using the Azure Data Explorer web UI wizard
Azure Monitor and Log Analytics integration
For cloud engineers, SRE teams, and FinOps practitioners, integrating Azure Monitor and Log Analytics unlocks new possibilities:
- Correlating cost data with operational telemetry such as CPU utilization, memory usage, request volumes, or storage IOPS.
- Attributing shared infrastructure costs based on actual resource usage patterns.
- Monitoring operational and financial efficiency side by side.
This bi-directional integration allows querying Azure Monitor data within Azure Data Explorer, and vice versa, enabling unified cost and performance analysis.
A practical example: When virtual machines and managed disks are configured to send performance metrics (such as CPU utilization, memory usage, throughput, latency, and IOPS) to a Log Analytics workspace (specifically into the Perf table), this telemetry can be directly joined with cost data from the Costs view. This enables not only identifying downsizing opportunities but also calculating the potential savings associated with specific optimization actions.
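The pattern above can be sketched with a cross-service query. The subscription, resource group, and workspace segments below are placeholders, and matching cost rows to Perf rows by ResourceName and Computer is a simplification; in practice you may need to normalize names or join on resource IDs instead.

```kusto
// Hypothetical sketch: correlate VM cost with average CPU from Log Analytics.
let VmCpu =
    cluster('https://ade.loganalytics.io/subscriptions/<sub-id>/resourcegroups/<rg>/providers/microsoft.operationalinsights/workspaces/<workspace>')
    .database('<workspace>')
    .Perf
    | where TimeGenerated >= ago(30d)
    | where ObjectName == 'Processor' and CounterName == '% Processor Time'
    | summarize AvgCpu = avg(CounterValue) by Computer;
Costs
| where ChargePeriodStart >= ago(30d)
| summarize TotalCost = sum(EffectiveCost) by ResourceName
| join kind=inner VmCpu on $left.ResourceName == $right.Computer
// Surface likely downsizing candidates: low utilization, sorted by spend.
| where AvgCpu < 10
| order by TotalCost desc
```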
Correlate data in Azure Data Explorer and Azure Resource Graph with data in a Log Analytics workspace
Query data in Azure Monitor with Azure Data Explorer
Sentinel and security data integration
For security teams and cloud governance groups, integrating FinOps data with security telemetry can provide important financial insights.
Since Microsoft Sentinel is built on top of a Log Analytics workspace, it is possible to directly correlate Sentinel cost data with Sentinel data ingestion metrics. This enables building showback models for shared security costs, based on actual usage patterns and ingestion volumes — ensuring transparency and fair allocation of security-related cloud spend.
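One way to sketch such a showback model is to spread a known monthly Sentinel cost across tables in proportion to billable ingestion volume. This assumes the query runs in the Log Analytics workspace (or reaches it via a cross-service query) and that the cost figure is supplied from the Costs view; the value below is hypothetical.

```kusto
let MonthlySentinelCost = 12000.0;  // hypothetical figure, taken from the Costs view
// Total billable ingestion this month, in GB (Usage.Quantity is in MB).
let TotalGB = toscalar(
    Usage
    | where TimeGenerated >= startofmonth(now()) and IsBillable
    | summarize sum(Quantity) / 1024.0);
Usage
| where TimeGenerated >= startofmonth(now()) and IsBillable
| summarize IngestedGB = sum(Quantity) / 1024.0 by DataType
// Allocate cost proportionally to each table's share of ingestion.
| extend AllocatedCost = round(MonthlySentinelCost * IngestedGB / TotalGB, 2)
| order by AllocatedCost desc
```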
Best practices for FinOps data integration
Choosing the right integration method depends on your use case, data dynamics, and performance requirements. General recommendations:
- Use external tables when:
- The data is relatively static or updated periodically (e.g., metadata mappings, budget forecasts, inventory reports).
- Real-time query performance is not critical.
- Ingest data into Azure Data Explorer when:
- You need high-performance queries over frequently changing datasets.
- Large-scale enrichment with operational, security, or business data is required.
- Real-time cost monitoring, anomaly detection, or performance correlation is needed.
In practice, many organizations adopt a hybrid approach — using external tables for static reference data while ingesting dynamic datasets for optimized querying and analytics.
Additionally, Azure Data Explorer makes it easy to copy data from an external table into a native table. This provides a simple yet powerful pattern for implementing ETL (Extract, Transform, Load) processes within ADX, allowing you to materialize reference data or periodically snapshot external datasets for performance optimization.
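Materializing an external table into a native one takes a single command. This sketch reuses the hypothetical BlobInventory external table name; the target table is created automatically if it does not yet exist.

```kusto
// Snapshot the external data into a native table, stamping the load time.
.set-or-append BlobInventorySnapshot <|
    external_table('BlobInventory')
    | extend SnapshotDate = now()
```

Scheduling this command from Azure Data Factory or a Fabric pipeline turns it into a simple, repeatable ETL step.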
How we put this into practice at Alescent
At Alescent, we have been helping organizations unlock the full potential of KQL for FinOps and advanced cloud cost optimization — from early adoption on Azure Data Explorer to fully integrated solutions on Microsoft Fabric. We have successfully implemented end-to-end FinOps capabilities, covering everything from cost transparency and allocation to forecasting, anomaly detection, and executive reporting.
If you are interested in learning more today, feel free to connect with me on LinkedIn for a direct conversation.
Conclusion
With FinOps hubs leveraging Azure Data Explorer and Microsoft Fabric, organizations can move beyond basic cost reporting to build a fully integrated, data-driven FinOps practice. By combining flexible querying, advanced visualization, and seamless data integration, FinOps teams can deliver deeper insights, optimize cloud investments, and empower every stakeholder — from engineers to executives — to make smarter, financially informed decisions.
Next steps
Ready to take your FinOps practice to the next level?
- Explore FinOps hubs and learn about deployment options and capabilities.
- Deepen your KQL skills with the Kusto Query Language documentation.
- Start building your own custom cost analyses and dashboards — and unlock deeper insights into your cloud environment.