Azure design for large data volume

Hi,

Urgently need help - how can I read 120 GB (3.4 billion rows from a single table) at lightning speed from an Azure SQL Database into Azure Data Lake?

I tried two options:
Copy activity with parallelism and the highest DIU setting - this gives a timeout error after running for many hours
Data flow - this takes 11 hours to read the data

Please suggest an approach.
1 Reply

@Sonalk 

 

Hi 

Do you have detailed logs of the failure? I would suggest having a look at the data flow performance and tuning guide:

Mapping data flow performance and tuning guide - Azure Data Factory | Microsoft Docs

 

For large volumes of data, you can also consider orchestrating the processing with Azure Batch and Data Factory:

Process large-scale datasets by using Data Factory and Batch - Azure Data Factory | Microsoft Docs
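
Another thing worth checking before changing tools: whether the Copy activity source is actually reading in parallel. For Azure SQL sources, the connector supports a `partitionOption` so each parallel copy reads a slice of the table instead of one long scan (which is a common cause of timeouts on 100+ GB tables). Below is a minimal sketch of the relevant Copy activity `typeProperties`; the activity name, dataset bindings, and the parallelism numbers are illustrative assumptions, not values from your pipeline:

```json
{
  "name": "CopyLargeTable",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "AzureSqlSource",
      "partitionOption": "PhysicalPartitionsOfTable"
    },
    "sink": {
      "type": "ParquetSink"
    },
    "parallelCopies": 32,
    "dataIntegrationUnits": 256
  }
}
```

If the table has no physical partitions, `partitionOption` can instead be set to `"DynamicRange"` with a `partitionSettings` block naming a numeric partition column (e.g. the identity key), so ADF splits the read into range queries. Writing the sink as Parquet rather than CSV also tends to reduce the data-lake write time. Treat the exact DIU and parallel-copy figures as starting points to tune, not guaranteed settings.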