Azure Data Factory (ADF) now provides an easy way to view the estimated consumption of your pipelines. Use it to estimate the number of units consumed by each activity, both while debugging a pipeline and after a triggered run completes.
While designing and debugging your pipelines, click the "View debug run consumption" link in the monitoring output pane at the bottom of the designer.
ADF then displays a report showing the number of units consumed by each activity in that pipeline. You can click the Pricing Calculator link and plug the consumed units into the Azure pricing calculator to estimate the cost of that run.
While in design or debug mode, note the data sizes and volumes you are using for testing. You can then extrapolate the units consumed to the production workload sizes that your scheduled pipelines will process. You can do this from your Git branch without publishing the factory.
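The extrapolation described above can be sketched as a simple linear scaling. A minimal sketch, assuming consumption grows roughly linearly with data volume; the function names, row counts, and the per-unit price are illustrative placeholders, not values from ADF or the pricing calculator:

```python
def extrapolate_units(debug_units: float, debug_rows: int, production_rows: int) -> float:
    """Scale the units consumed by a debug run to the production data volume.

    Assumes roughly linear scaling with row count - a rough planning
    estimate, not a billing guarantee.
    """
    return debug_units * (production_rows / debug_rows)


def estimated_monthly_cost(units_per_run: float, runs_per_month: int,
                           price_per_unit: float) -> float:
    """Multiply out the per-run units into a monthly cost estimate."""
    return units_per_run * runs_per_month * price_per_unit


# Example: a debug run consumed 0.5 units over 1M test rows; production
# processes 20M rows, scheduled 30 times per month, at a hypothetical
# $0.25 per unit.
units = extrapolate_units(0.5, 1_000_000, 20_000_000)   # 10.0 units per run
cost = estimated_monthly_cost(units, 30, 0.25)          # $75.00 per month
```

In practice you would plug the extrapolated unit count into the Azure pricing calculator rather than hard-coding a price, since unit prices vary by region and activity type.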
The ADF consumption report surfaces only ADF-related units. Other services that your pipeline uses or accesses, such as Azure SQL Database, Azure Synapse Analytics, Azure Cosmos DB, and ADLS, may bill additional units that are not accounted for here.
Once you have finished debugging, you can view the estimated consumption of a triggered run. Before putting the pipeline on a schedule, publish it and test it with larger datasets: after publishing, click Trigger Now. The results of the pipeline execution appear in the ADF monitoring pane, where you can click the consumption report icon to view the units consumed by that triggered execution.
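If you prefer to trigger the published pipeline programmatically rather than clicking Trigger Now, the same operation is exposed as the ARM "Create Run" REST endpoint. A minimal sketch that only builds the request URL; the subscription, resource group, factory, and pipeline names are placeholders, and issuing the actual POST would additionally require an Azure AD bearer token:

```python
def adf_create_run_url(subscription_id: str, resource_group: str,
                       factory: str, pipeline: str,
                       api_version: str = "2018-06-01") -> str:
    """Build the ARM REST URL for the Data Factory Create Run operation.

    POSTing to this URL (with an Authorization header) starts a pipeline
    run, equivalent to clicking Trigger Now in the designer.
    """
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}"
        f"/pipelines/{pipeline}/createRun"
        f"?api-version={api_version}"
    )


# Placeholder names for illustration only:
url = adf_create_run_url("my-sub", "my-rg", "my-factory", "my-pipeline")
```

The response to the POST contains a run ID, which is what the ADF monitoring pane keys on when you open the consumption report for that execution.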
Once you are satisfied with the pipeline and apply an operational schedule to trigger it, you can view the consumption report for each scheduled run from the same monitoring view.