Data Factory or Logic Apps?


I currently use Power Automate to get daily logs from a third-party service using the HTTP connector. The flow returns around 20 separate JSON files per day; when combined, the files provide the complete log for the day. The log has to be split across files to keep each GET request within the third-party service's limit. The files are saved in SharePoint and then combined in Power BI via a dataflow that only combines the latest 15 days of data.
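To make the shape of the flow concrete, here is a minimal Python sketch of the daily fetch. The endpoint, auth header, and "date"/"offset"/"limit" paging parameters are all hypothetical stand-ins for whatever the real service expects; the only point it illustrates is that one day's log has to be pulled as ~20 separate pages to stay under the GET limit.

```python
import datetime

import requests

BASE_URL = "https://api.example.com/logs"  # hypothetical endpoint
API_KEY = "..."                            # issued by the third-party service
PAGE_SIZE = 1000                           # tuned so each GET stays under the service's limit


def fetch_daily_log(day: datetime.date) -> list[dict]:
    """Fetch one day's log as separate pages and return the combined records."""
    records: list[dict] = []
    offset = 0
    while True:
        resp = requests.get(
            BASE_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            params={"date": day.isoformat(), "offset": offset, "limit": PAGE_SIZE},
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        if not page:  # hypothetical convention: an empty page means no more data
            break
        records.extend(page)
        offset += PAGE_SIZE
    return records
```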


This approach has worked well for the use case, but I have other similar use cases and I know they will put a strain on my Power Automate account and on SharePoint. From time to time the writing of the files (by Power Automate) or the subsequent reading (by the dataflow) is impacted by SharePoint throttling. I feel I need to industrialise the approach and move away from using SharePoint as the repository.


I've looked through the Microsoft documentation for Power BI and Data Factory and cannot pick a clear path. In particular, it is not clear to me whether Data Factory can get the data from the third-party service. My current thinking is that I:


  • Convert my existing Power Automate flow to an Azure Logic App
  • Write the data to Azure Blob Storage (a sketch of this step follows the list)
  • Modify the combining Power BI dataflow to access Blob Storage.
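As a sketch of the storage side of steps 2 and 3, each day's files could be written under date-partitioned blob names so the dataflow can filter on date later. This assumes the azure-storage-blob Python SDK purely for illustration (a Logic App would do the equivalent with its built-in Azure Blob Storage connector), and the container name and connection string are placeholders:

```python
import datetime
import json

from azure.storage.blob import BlobServiceClient

CONN_STR = "..."          # storage account connection string (placeholder)
CONTAINER = "daily-logs"  # placeholder container name


def save_snapshot(day: datetime.date, part: int, records: list[dict]) -> None:
    """Write one page of a day's log as a blob, e.g. 2024/05/01/log-part-03.json."""
    service = BlobServiceClient.from_connection_string(CONN_STR)
    blob_name = f"{day:%Y/%m/%d}/log-part-{part:02d}.json"
    blob = service.get_blob_client(container=CONTAINER, blob=blob_name)
    blob.upload_blob(json.dumps(records), overwrite=True)
```

The date-partitioned naming is the design choice that matters here: it makes the 15-day filter a cheap prefix match rather than a scan of the whole container.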


I could create a new Logic App for each additional use case.


The alternative seems to be Data Factory feeding Power BI, but I am not clear whether I would need a Logic App to stage the files first for Data Factory to consume.


I do not need extras like Synapse Analytics; Power BI is more than adequate. It is important that I save the daily snapshots, as the third-party service only holds data for 7 days.
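Once the snapshots are in Blob Storage, the container itself becomes the archive and the 15-day combine reduces to walking the last 15 date prefixes. The dataflow would do this in Power Query via the Azure Blob Storage connector; the sketch below just spells out the same logic in Python, with the same placeholder names as above:

```python
import datetime
import json

from azure.storage.blob import BlobServiceClient

CONN_STR = "..."          # placeholder
CONTAINER = "daily-logs"  # placeholder


def load_last_15_days(today: datetime.date) -> list[dict]:
    """Merge every page saved under the last 15 date-partitioned prefixes."""
    container = BlobServiceClient.from_connection_string(CONN_STR) \
        .get_container_client(CONTAINER)
    records: list[dict] = []
    for back in range(15):
        day = today - datetime.timedelta(days=back)
        prefix = f"{day:%Y/%m/%d}/"
        for blob in container.list_blobs(name_starts_with=prefix):
            records.extend(json.loads(container.download_blob(blob.name).readall()))
    return records
```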


What does this community advise?


[I've also asked the Power BI community to get their perspective.]

3 Replies
Hi Simon,
What purpose are you pursuing with an alternative solution?

In part because I lose several hours per day to SharePoint throttling, and in part because the Power Automate calls are not 100% reliable. But there is also an element of product development maturity: I started out as a citizen data cruncher and I recognise the need to industrialise and professionalise my work.

I migrated data from SharePoint to ADLS Gen2, and Power BI dataflows and Power Query started to work much faster. One caveat: our SharePoint was not deployed properly, so that may be part of the reason.

Talking about your points:
1. Convert my existing Power Automate flow to an Azure Logic App
2. Write the data to Azure Blob storage
3. Modify the combining Power BI dataflow to access the Blob Storage.

#1 - I cannot really say, as I don't have much experience with Power Automate.
#2 and #3 - these are good points despite the cost. In Azure you will spend more money, depending on the size and number of your files.