Dear All,


I have a very simple CSV file to import into SQL Server. Some properties:

1) The CSV file is around 1 GB.

2) It needs to be imported into a table and fully replaced on a daily basis through a day-end job.

3) We need to migrate this to Azure.


My Questions:

1) If we use SSIS, we will have to migrate the package to Azure Data Factory when we move to the cloud. Won't that be expensive compared to running OPENROWSET or the BULK INSERT command directly from a SQL Server batch job?

2) Should we use OPENROWSET in this scenario?
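For reference, a minimal sketch of what the BULK INSERT approach could look like — the table name, file path, and data source name below are placeholders, not anything from this thread. The same command works on Azure SQL by pointing at Blob Storage via an external data source instead of a local path:

```sql
-- Placeholder staging table and path; adjust to your schema.
-- Loading into a staging table and swapping afterwards means the
-- daily replace never leaves the target table half-loaded.
TRUNCATE TABLE dbo.DailySales_Staging;

BULK INSERT dbo.DailySales_Staging
FROM 'C:\data\daily_sales.csv'
WITH (
    FIRSTROW = 2,              -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    BATCHSIZE = 50000,         -- keeps log growth manageable for ~1 GB
    TABLOCK                    -- allows a minimally logged bulk load
);

-- On Azure SQL Database, the same statement can read from Blob Storage
-- through a previously created external data source (name assumed):
-- BULK INSERT dbo.DailySales_Staging
-- FROM 'daily_sales.csv'
-- WITH (DATA_SOURCE = 'MyBlobStorage', FIRSTROW = 2, FIELDTERMINATOR = ',');
```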




1 Reply


I had a similar scenario and used a PowerShell bcp bulk insert from the computer that stored the CSV file locally. When it became a nightly job, I added a Power Automate flow to move the file from the sales team's SharePoint folder to a OneDrive folder synced to a folder on the local drive of the computer that ran the PowerShell bcp job. Not an elegant solution, but it only took a few minutes to set up, and I get an email with the job results every morning (the various failures or a successful import).
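A minimal sketch of that bcp step — the server, database, table, and path here are placeholders, not my actual names. It is shown as a dry run that just prints the command it would execute, so you can see the flags:

```shell
#!/bin/sh
# Placeholder names -- substitute your own server, database,
# table, and synced CSV path.
SERVER="sql01.example.com"
DB="SalesDB"
TABLE="dbo.DailySales"
CSV="/sync/onedrive/daily_sales.csv"

# -T trusted connection, -c character mode, -t, comma delimiter,
# -F2 skip the header row, -b batch size for a ~1 GB file.
CMD="bcp $TABLE in $CSV -S $SERVER -d $DB -T -c -t, -F2 -b50000"

# Dry run: print the command instead of executing it.
echo "$CMD"
```

In the real job, the last line would execute `$CMD` instead of echoing it, and the script's exit code drives the success/failure email.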


My solution is probably not the best one, but maybe some part of it helps.