How to import a large dataset into a SharePoint list using Azure WebJobs

I need to import 200,000 rows into a SharePoint Online list. The rows currently exist in an Access database. This process needs to be run several times per year by an end user (uploading a large list to a different site). When I try to export from Excel to SharePoint, the export runs for hours and then typically fails with an ‘out of resources’ exception. Also, one of the columns in the Access database is an email address, which needs to be converted to a ‘person or group’ column in SharePoint, so the Access export won’t really give me what I need anyway.

So I exported the data from Access to a CSV and created an SPFx/PnP-JS-Core application that parses the CSV and uploads the data to the SharePoint list while converting the email address to a ‘person or group’ column. This works, but it takes many hours to upload 200,000 rows.
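
For reference, this is roughly what that per-row approach looks like with PnP JS (sketched here against the @pnp/sp v2 API; the "ImportRows" list, the "Contact" person column, and the parsed `rows` array are placeholders, not names from my actual solution):

```typescript
// Per-row upload sketch using @pnp/sp (v2 style).
// Assumes sp.setup({ spfxContext: ... }) was already done in the web part's onInit.
import { sp } from "@pnp/sp/presets/all";

interface CsvRow {
  title: string;
  email: string;
}

async function importRows(rows: CsvRow[]): Promise<void> {
  const list = sp.web.lists.getByTitle("ImportRows");

  for (const row of rows) {
    // ensureUser resolves the email address to a site user id,
    // which is what a 'person or group' column expects.
    const user = await sp.web.ensureUser(row.email);

    // Single-value person fields are set via the "<InternalName>Id" property.
    await list.items.add({
      Title: row.title,
      ContactId: user.data.Id,
    });
  }
}
```

Two awaited round trips per row (the ensureUser call plus the item add) is a big part of why 200,000 rows take hours; batching the adds helps, but SharePoint Online throttling still limits how fast a single client can push items.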

It was suggested that I upload the CSV files to SharePoint, and then have an Azure WebJob parse the CSV and save the rows in SharePoint. I created the WebJob that reads the 200,000 lines from the CSV file and saves the rows in SharePoint. The WebJob is run based on a message being added to a queue. I found that 5 minutes after the WebJob has started, a second instance of the WebJob starts, and it too reads the lines from the CSV and starts adding rows to SharePoint.

So my questions are: how do I stop that second instance of the WebJob from starting? I have set isSingleton to true, but that did not help. There must be some flag that says how long a WebJob should be allowed to run for.

Secondly, is this the right approach? Or should the first WebJob just parse the CSV and write the rows out to a queue, and then have a second WebJob that listens on that queue and does the insert into SharePoint? That seems more in keeping with the architecture of Azure WebJobs, but it will likely be very slow, as the second WebJob will need to connect to SharePoint separately for each of the 200,000 rows to be inserted. Are WebJobs even the right tool for this? (Would Logic Apps, Azure Functions, or Azure Batch be better?)
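
For comparison, here is a rough sketch of what that second, queue-driven approach might look like as a queue-triggered Azure Function in TypeScript (Node.js programming model). The message shape, the list name, and the `addRowToList` helper are all hypothetical, and the SharePoint call itself is stubbed out because a real implementation would need app-only authentication:

```typescript
import { AzureFunction, Context } from "@azure/functions";

// Hypothetical message shape: one chunk of already-parsed CSV rows per
// queue message, rather than one message per row or per whole file.
interface RowChunk {
  rows: { title: string; email: string }[];
}

// Placeholder for the actual SharePoint call (REST, CSOM, or PnP JS with
// app-only credentials) -- intentionally a no-op here.
async function addRowToList(row: { title: string; email: string }): Promise<void> {
  // e.g. POST to /_api/web/lists/getbytitle('ImportRows')/items
  console.log(`TODO: insert row "${row.title}"`);
}

// Queue-triggered function: the runtime deserializes the JSON queue message
// into the second parameter.
const queueTrigger: AzureFunction = async function (
  context: Context,
  chunk: RowChunk
): Promise<void> {
  context.log(`Inserting ${chunk.rows.length} rows from this chunk`);

  for (const row of chunk.rows) {
    await addRowToList(row);
  }
};

export default queueTrigger;
```

Sending chunks of rows per message rather than one row per message would at least avoid 200,000 separate queue messages and invocations.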

I am new to Azure. This seems like it must be a common use case. Is there any guidance available on how to do it?

3 Replies
I would tackle this with Microsoft Flow.

Built-in Excel and CSV connectors.
Parallel for-each construct (up to 50 concurrent branches, up to 5,000 iterations per for-each, so we'll have to process the file in chunks).
Pause, resume, and error retry.
Email field conversion.
Schedule trigger, file upload trigger, or message queue trigger, with the trigger concurrency restricted to 1.

Though I'm more likely to have the first part of the Flow split the CSV into 5,000-row chunks and place them into a separate queue.
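
If the splitting itself ends up being done in code rather than inside the Flow, it's a simple slice loop. A minimal TypeScript sketch, where `enqueueChunk` stands in for whichever queue or library the chunks get dropped into:

```typescript
// Split parsed CSV lines into 5,000-row chunks and hand each chunk to a queue.
// `enqueueChunk` is a placeholder for your queue client (Azure Storage queue,
// Service Bus, or a SharePoint library the second Flow watches).
async function splitIntoChunks(
  csvLines: string[],
  enqueueChunk: (chunk: string[]) => Promise<void>,
  chunkSize = 5000
): Promise<void> {
  for (let i = 0; i < csvLines.length; i += chunkSize) {
    await enqueueChunk(csvLines.slice(i, i + chunkSize));
  }
}
```

With 200,000 rows that works out to 40 chunks, which sits comfortably inside the parallel for-each limits mentioned above.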

Then have a second Flow watching the queue fire off and insert the rows in parallel, one run per file.

Make sure you create indexes before you reach 5,000 items; once you get to 20,000 items you won't be able to add new indexes at all.
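
If you'd rather set those indexes from code than through the list settings page, something along these lines should work with PnP JS (the list and column names here are just placeholders):

```typescript
import { sp } from "@pnp/sp/presets/all";

// Mark an existing column as indexed before the list grows past the
// 5,000-item threshold. "ImportRows" and "Contact" are placeholder names.
async function indexColumn(): Promise<void> {
  await sp.web.lists
    .getByTitle("ImportRows")
    .fields.getByInternalNameOrTitle("Contact")
    .update({ Indexed: true });
}
```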