Power BI Import Datasets are simple to build and highly performant. Power BI imports and compresses an optimized cache of reporting data, and most architects refresh their Import Datasets on a fixed schedule. Sometimes, however, the availability of new data is not predictable, and they will want to ensure that reports have current data by triggering a refresh whenever the source data changes. The Power BI Data Refresh API can accomplish this task, but implementing it is not a code-free experience.
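For readers curious what the API route looks like, here is a minimal Python sketch of calling the Power BI REST API's refresh endpoint. The endpoint shape is the documented "Refresh Dataset In Group" call; the workspace (group) ID, dataset ID, and Azure AD access token are placeholders you would supply yourself (for example, via MSAL), so treat this as an illustration rather than production code.

```python
# Sketch: triggering a dataset refresh via the Power BI REST API.
# GROUP_ID, DATASET_ID, and the access token are placeholders (assumptions);
# acquiring the Azure AD token is out of scope here.
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"


def build_refresh_url(group_id: str, dataset_id: str) -> str:
    """Return the refresh endpoint for a dataset in a workspace (group)."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"


def trigger_refresh(group_id: str, dataset_id: str, access_token: str) -> int:
    """POST a refresh request; Power BI answers 202 Accepted on success."""
    req = urllib.request.Request(
        build_refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Even this small sketch shows why the API path is not code-free: you still need an app registration, token acquisition, and error handling around the call.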
Self-Service Business Intelligence architects seeking a low-code option for triggering refreshes of their Power BI Import Datasets have another choice. Power Automate, which, like Power BI, is part of the Power Platform, has an easy-to-use action that refreshes a Power BI Dataset. The video at the top of this blog article walks through setting up a Flow in Power Automate that triggers a refresh; towards the end of the video, a methodology for scaling out this capability is reviewed. Regardless of the source of data, most ETL/ELT tools finish by adding or updating data in a reporting source, and those updates or additions then need to be refreshed into the Power BI Import Dataset cache.
In a large enterprise environment, self-service Business Intelligence users will often pull data into Power BI from many different sources: EMR data, ERP data, public Open Data repositories of Population Health data, CRM data from Dynamics, and many others.
Triggering a Power BI refresh using Power Automate is easy, but with a disciplined approach it can be even easier for both users and administrators. First, consider adding an Azure SQL Database (DB) that is read-only for all Business Intelligence users who might need to trigger a Power BI refresh. Other Connections can serve as the trigger, but Azure SQL DB is a simple option because it can live in the same Azure tenant as Power BI and does not require Gateway administration. A separate table can be added for every source of data that a user might need as a trigger. If each ETL/ELT process in the organization adds a single row to its table upon completion, users have a complete set of trigger tables available in a single Azure SQL DB. Consider the diagram below:
The trigger tables in Azure SQL DB do not need to contain any data from the actual source, so data security should not be an issue. Each trigger table needs an Identity column, and Date, Time, and possibly Datetime columns are helpful as well. Consider the following example:
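To make the shape of such a trigger table concrete, here is a small Python sketch that models one locally with SQLite. This is purely illustrative: in Azure SQL DB the key would be an `IDENTITY(1,1)` column with a T-SQL default such as `SYSDATETIME()`, and the table and column names below are assumptions, not part of the article's design.

```python
# Sketch of a trigger table's shape, modeled locally in SQLite.
# In Azure SQL DB the id would be an IDENTITY(1,1) column; names here
# (etl_trigger_sales, load_*) are illustrative placeholders.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE etl_trigger_sales (
        id            INTEGER PRIMARY KEY AUTOINCREMENT,  -- increasing trigger key
        load_date     TEXT NOT NULL,   -- date the ETL run completed
        load_time     TEXT NOT NULL,   -- time the ETL run completed
        load_datetime TEXT NOT NULL    -- full timestamp, handy for filtering
    )
    """
)

# An ETL/ELT job inserts exactly one row when it finishes; that new row is
# what a Power Automate Flow watching the table reacts to, kicking off the
# "Refresh a dataset" action. No source data ever lands in this table.
now = datetime.now(timezone.utc)
conn.execute(
    "INSERT INTO etl_trigger_sales (load_date, load_time, load_datetime) "
    "VALUES (?, ?, ?)",
    (now.date().isoformat(), now.time().isoformat("seconds"), now.isoformat()),
)
conn.commit()

row_count = conn.execute("SELECT COUNT(*) FROM etl_trigger_sales").fetchone()[0]
```

Because the row carries only an identity value and timestamps, granting read access to every self-service user exposes nothing sensitive, which is the point of the design.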
In short, self-service Power BI users who need to trigger Dataset refreshes from unpredictable data sources can be granted access to a centrally managed and carefully curated Azure SQL Database containing simple trigger tables for the data sources in an organization.
Here are a few notes and links: