ADF - data connect from blob to Azure SQL


Hi All, 

 

I have a scenario: I have multiple Excel files (4 files) in blob storage and need to load them into SQL into 4 different tables (I have 4 staging tables and 4 master tables). I have created stored procedures in SQL for those 4 files.

Can anyone help me with the ADF process to load them automatically on a regular basis?

 

Thanks

 

 

5 Replies

@Shruthi96 I'm assuming the files contain records and you would like to insert those records into SQL, not the files themselves.

 

In this case, a simple Copy activity will do the work, and you can create a trigger that fires for each new file created in blob storage.
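A trigger like this can be defined as a storage event trigger in ADF. The sketch below assumes a container, file suffix, and pipeline name for illustration; the subscription/resource-group placeholders must be filled in with your own values:

```json
{
  "name": "BlobCreatedTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/input-container/blobs/",
      "blobPathEndsWith": ".xlsx",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "ignoreEmptyBlobs": true,
      "scope": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
    },
    "pipelines": [
      { "pipelineReference": { "referenceName": "CopyExcelToSql", "type": "PipelineReference" } }
    ]
  }
}
```

Each blob matching the path filter raises a `BlobCreated` event that starts the referenced pipeline, so no fixed schedule is needed.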

 

 

@DennesTorres, yes, I am looking at parameterizing the data. I don't want to create a dataset for each file for source and sink.

@Shruthi96 You can create parameters in a dataset. In this case the parameter could be the file name/table name.
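A parameterized source dataset might look roughly like this sketch (the dataset, linked service, and container names are illustrative); the `fileName` parameter is referenced with the `@dataset().fileName` expression so one dataset serves all four files:

```json
{
  "name": "ExcelSourceDataset",
  "properties": {
    "type": "Excel",
    "linkedServiceName": { "referenceName": "BlobStorageLS", "type": "LinkedServiceReference" },
    "parameters": {
      "fileName": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "input-container",
        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
      },
      "firstRowAsHeader": true
    }
  }
}
```

The sink dataset can be parameterized the same way, with a table-name parameter in place of the file name.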

 

You can create a table in SQL to hold the file name/table name mapping. Then you use a Lookup activity to read this data and a ForEach activity to repeat the Copy activity for each file, filling in the dataset parameters.
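Put together, the Lookup + ForEach + Copy pattern might be sketched as the pipeline below. The mapping table, column names, and dataset references are assumptions for illustration; `@item().FileName` and `@item().TableName` pick up the columns returned by the Lookup:

```json
{
  "name": "CopyExcelToSql",
  "properties": {
    "activities": [
      {
        "name": "LookupFileTableMap",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT FileName, TableName FROM dbo.FileTableMap"
          },
          "dataset": { "referenceName": "SqlConfigDataset", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "dependsOn": [ { "activity": "LookupFileTableMap", "dependencyConditions": [ "Succeeded" ] } ],
        "typeProperties": {
          "items": { "value": "@activity('LookupFileTableMap').output.value", "type": "Expression" },
          "activities": [
            {
              "name": "CopyOneFile",
              "type": "Copy",
              "inputs": [
                { "referenceName": "ExcelSourceDataset", "type": "DatasetReference",
                  "parameters": { "fileName": "@item().FileName" } }
              ],
              "outputs": [
                { "referenceName": "SqlSinkDataset", "type": "DatasetReference",
                  "parameters": { "tableName": "@item().TableName" } }
              ],
              "typeProperties": {
                "source": { "type": "ExcelSource" },
                "sink": { "type": "AzureSqlSink" }
              }
            }
          ]
        }
      }
    ]
  }
}
```

Adding a fifth file then only requires a new row in `dbo.FileTableMap`, not a new dataset or activity.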

This requires your files to follow some pattern. If they are completely different from each other and from the tables, this may not work. However, there are only 4 files, right? It's not that difficult to create 8 datasets.

 

 

@DennesTorres Thank you, I think this solution will work. I will try it and reply. Thanks.

 

@Shruthi96 

Hello,
Apart from parameterizing the dataset, you can load all the files with one pipeline: first read the metadata of the folder that contains the files, then pass the file names through a ForEach that has a Copy Data activity inside.
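The Get Metadata + ForEach approach can be sketched roughly as the activities below (the dataset and activity names are illustrative). `fieldList: ["childItems"]` returns the files in the folder, and `@item().name` feeds each file name into the parameterized Copy:

```json
{
  "activities": [
    {
      "name": "GetFileList",
      "type": "GetMetadata",
      "typeProperties": {
        "dataset": { "referenceName": "BlobFolderDataset", "type": "DatasetReference" },
        "fieldList": [ "childItems" ]
      }
    },
    {
      "name": "ForEachFile",
      "type": "ForEach",
      "dependsOn": [ { "activity": "GetFileList", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
        "activities": [
          {
            "name": "CopyData",
            "type": "Copy",
            "inputs": [
              { "referenceName": "ExcelSourceDataset", "type": "DatasetReference",
                "parameters": { "fileName": "@item().name" } }
            ],
            "outputs": [
              { "referenceName": "SqlSinkDataset", "type": "DatasetReference" }
            ],
            "typeProperties": {
              "source": { "type": "ExcelSource" },
              "sink": { "type": "AzureSqlSink" }
            }
          }
        ]
      }
    }
  ]
}
```

This variant avoids the SQL mapping table entirely, at the cost of needing some convention to derive the target table from each file name.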

[screenshot: iryna_YF_0-1614800336210.png]

 

[screenshot: iryna_YF_1-1614800558999.png]

[screenshot: iryna_YF_2-1614800729366.png]

Regards!