Data sink from Dataverse to Azure SQL Database using Azure Data Factory is taking a really long time

Copper Contributor

Hi Everyone,

 

I'm new to Azure Data Factory, and we are facing two problems while syncing Dataverse data to an Azure SQL database for reporting purposes.

 

Problem 1:

We have created an Azure Synapse Link to push the data to Azure Data Lake, and from there we push the data to Azure SQL DB using Azure Data Factory pipelines and data flows, with the Dataverse data in the data lake as the source and a dataset on the Azure SQL Database as the sink. However, this process is slow: it takes more than 9 minutes to sync 32 tables, where each table contains fewer than 2,000 rows on average and one table contains 250k rows. Is there a way to reduce the time this process takes?
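One common way to cut down run time for many small tables is to replace per-table data flows (each of which pays Spark cluster start-up cost) with Copy activities run in parallel inside a ForEach activity with `isSequential` set to false. The sketch below is a hand-written, minimal pipeline definition under assumptions, not your exact setup: the dataset names `LakeTable` and `SqlTable` and the `tableNames` parameter are placeholders for parameterized datasets you would create yourself.

```json
{
  "name": "SyncDataverseToSql",
  "properties": {
    "parameters": {
      "tableNames": { "type": "Array" }
    },
    "activities": [
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "typeProperties": {
          "items": { "value": "@pipeline().parameters.tableNames", "type": "Expression" },
          "isSequential": false,
          "batchCount": 16,
          "activities": [
            {
              "name": "CopyOneTable",
              "type": "Copy",
              "inputs": [
                { "referenceName": "LakeTable", "type": "DatasetReference",
                  "parameters": { "tableName": "@item()" } }
              ],
              "outputs": [
                { "referenceName": "SqlTable", "type": "DatasetReference",
                  "parameters": { "tableName": "@item()" } }
              ],
              "typeProperties": {
                "source": { "type": "DelimitedTextSource" },
                "sink": { "type": "AzureSqlSink", "writeBehavior": "insert" }
              }
            }
          ]
        }
      }
    ]
  }
}
```

With `batchCount` at 16, up to 16 tables copy concurrently, so 31 small tables finish in roughly two waves while the one 250k-row table runs alongside them rather than after them.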

 

Problem 2: 

We also have an issue with triggering the pipelines. Is there any way to trigger on a data change?

If any data is changed in Dataverse, the pipeline should trigger and update the respective record or table in the Azure SQL DB.

 

Please help me with this. Thanks in advance.

1 Reply
For problem 1, are you using Synapse Link or pipelines? You seem to indicate both.

For problem 2, use the "Enable Change Data Capture" checkbox on the CDM source.
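As a complementary option: since Synapse Link writes changed Dataverse data as files into the data lake container, a storage event trigger can start the pipeline whenever a new blob lands, instead of running on a fixed schedule. This is a sketch, not a finished trigger: the container path, the subscription/resource-group/storage-account names in `scope`, and the referenced pipeline name are all placeholders you would replace with your own.

```json
{
  "name": "OnDataverseFileChange",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "scope": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
      "blobPathBeginsWith": "/dataverse-container/blobs/",
      "ignoreEmptyBlobs": true,
      "events": [ "Microsoft.Storage.BlobCreated" ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "SyncDataverseToSql",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

Note that storage event triggers require the Event Grid resource provider to be registered on the subscription.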