Blob to SQL


I am trying to copy my three CSV files from Blob Storage to my SQL database using the Get Metadata and ForEach activities.


I have my SQL tables created with the proper schema, and I want to build the pipeline so that each CSV loads into its corresponding table.


Problem: when I run the pipeline, it creates new tables instead of loading into the existing, pre-made ones.


Can anyone let me know what I should do?



2 Replies

Hello @NImai1185 


If you want to copy your three files to their respective tables, the simplest option is a Data Flow: create three datasets for your CSV files and three for the tables.

If not, store the table names in an array variable, derive each table name from the corresponding file name, and pass it through the ForEach activity into the sink dataset parameter.
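A minimal sketch of that parameterized setup, assuming a Get Metadata activity named GetFileList, a source dataset with a FileName parameter, and a sink dataset with a TableName parameter (all of these names are hypothetical, not from the original post):

```json
{
  "name": "ForEachFile",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('GetFileList').output.childItems",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyCsvToTable",
        "type": "Copy",
        "inputs": [
          {
            "referenceName": "BlobCsvDataset",
            "type": "DatasetReference",
            "parameters": { "FileName": "@item().name" }
          }
        ],
        "outputs": [
          {
            "referenceName": "SqlTableDataset",
            "type": "DatasetReference",
            "parameters": { "TableName": "@replace(item().name, '.csv', '')" }
          }
        ]
      }
    ]
  }
}
```

The ForEach iterates over the childItems returned by Get Metadata, and the Copy activity inside it forwards each file name to both the source path and the sink table name, so one Copy activity serves all three tables.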

That way, each file is loaded into its relevant table.

You just have to cut the .csv off in the expression above; you could do that with a replace function. Don't use Mapping Data Flows in that case, that would be total overkill. Also, you might want to look at the standard metadata-driven copy pipeline in the template gallery.
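To illustrate, assuming the ForEach iterates over the childItems from Get Metadata so each file name arrives as item().name, the sink's table-name expression could strip the extension like this (passing the raw name, .csv included, as the table name is likely what makes the sink auto-create a new table instead of matching the existing one):

```
@replace(item().name, '.csv', '')
```

For example, a file named Sales.csv would resolve to the table name Sales, matching a pre-created table of that name.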