Azure design for large data volume


Urgently need help: how can I read 120 GB (3.4 billion rows from a table) at lightning speed from an Azure SQL Database into Azure Data Lake?

I tried two options:
Copy activity with parallelism and the highest DIU setting - this gives a timeout error after running for hours
Data flow - this takes 11 hours to read the data

Please suggest an approach.
1 Reply




Do you have detailed logs about that? I would suggest having a look at the data flow performance and tuning guide:

Mapping data flow performance and tuning guide - Azure Data Factory | Microsoft Docs
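Before switching approaches, it may also be worth enabling source partitioning on the Copy activity itself, so the read is split into parallel range queries instead of one long-running scan that times out. A sketch of the copy source settings, assuming an integer key column (the column name and bounds here are illustrative, not from your schema):

```json
"source": {
    "type": "AzureSqlSource",
    "partitionOption": "DynamicRange",
    "partitionSettings": {
        "partitionColumnName": "Id",
        "partitionLowerBound": "1",
        "partitionUpperBound": "3400000000"
    }
}
```

If the table is physically partitioned, "PhysicalPartitionsOfTable" is usually the simpler choice.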


You can also consider orchestrating the copy with Azure Batch and Data Factory for large volumes of data:

Process large-scale datasets by using Data Factory and Batch - Azure Data Factory | Microsoft Docs
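If you orchestrate the copy yourself with Batch, a common pattern is to split the table's key space into contiguous, non-overlapping ranges and run one copy task per range. A minimal sketch of the range computation (the 3.4 billion upper bound is taken from your row count; treat it as illustrative):

```python
def partition_ranges(lower, upper, partitions):
    """Split the inclusive key range [lower, upper] into `partitions`
    contiguous, non-overlapping sub-ranges, one per parallel copy task."""
    if partitions < 1 or upper < lower:
        raise ValueError("invalid bounds or partition count")
    total = upper - lower + 1
    step, remainder = divmod(total, partitions)
    ranges = []
    start = lower
    for i in range(partitions):
        # Spread the remainder over the first few ranges so sizes differ by at most 1.
        end = start + step - 1 + (1 if i < remainder else 0)
        ranges.append((start, end))
        start = end + 1
    return ranges

# Each (lo, hi) pair becomes a task's WHERE clause, e.g.
# SELECT ... FROM dbo.BigTable WHERE Id BETWEEN lo AND hi
tasks = partition_ranges(1, 3_400_000_000, 64)
```

Each task can then write its slice to the lake independently (e.g. as Parquet), so a single failure only retries one range rather than the whole 120 GB.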