Forum Discussion
Russell Gove
Feb 04, 2018 · Iron Contributor
How to import a large dataset into a SharePoint list using azure webjobs
I need to import 200,000 rows into a SharePoint Online list. The rows currently exist in an Access database. This process needs to be run several times per year by an end user (uploading a large list...
John Liu
Feb 15, 2018 · MVP
I would tackle this with Microsoft Flow.
Built-in Excel and CSV connectors.
Parallel for-each construct (up to 50 concurrent branches, and up to 5,000 items per for-each, so the file would have to be processed in chunks).
Pause, resume, and error retry.
Email field conversion.
Schedule trigger, file-upload trigger, or message-queue trigger, with trigger concurrency restricted to 1.
Though, I'm more likely to have the first part of the Flow split the CSV into 5,000-row chunks and place them into a separate queue, then have a second Flow watching that queue fire off and insert the rows in parallel, per file.
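If you did go the Azure WebJobs route from the original question instead of Flow, the same chunk-and-queue pattern could look roughly like the sketch below. This is a minimal illustration only, assuming the Access data has been exported to a CSV file; the file names, the chunk folder, and the "enqueue" step at the end are placeholder assumptions, not part of any particular product's API.

```python
# Sketch: split one large CSV export into 5,000-row chunk files that a second
# process (a Flow or WebJob watching a queue/library) could pick up and insert.
import csv
from itertools import islice
from pathlib import Path

CHUNK_SIZE = 5_000                            # matches the per-for-each limit mentioned above
SOURCE_CSV = Path("export_from_access.csv")   # hypothetical export file name
OUTPUT_DIR = Path("chunks")                   # hypothetical drop folder for chunk files

def split_csv(source: Path, out_dir: Path, chunk_size: int = CHUNK_SIZE) -> list[Path]:
    """Split one large CSV into chunk files of at most `chunk_size` data rows each."""
    out_dir.mkdir(exist_ok=True)
    chunk_files: list[Path] = []
    with source.open(newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)                 # repeat the header row in every chunk
        for i, rows in enumerate(iter(lambda: list(islice(reader, chunk_size)), [])):
            chunk_path = out_dir / f"{source.stem}_chunk_{i:04d}.csv"
            with chunk_path.open("w", newline="", encoding="utf-8") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)
            chunk_files.append(chunk_path)
    return chunk_files

if __name__ == "__main__":
    for path in split_csv(SOURCE_CSV, OUTPUT_DIR):
        # Here you would hand each chunk off to whatever the second stage is
        # watching, e.g. upload it and post a reference message to a queue.
        print(f"queued chunk: {path}")
```

The point of the split is simply to keep each unit of work under the 5,000-item per-for-each limit and small enough to retry independently if one batch fails.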
powerappsRocks
Feb 04, 2021 · Brass Contributor
John Liu,
How many "units" (https://docs.microsoft.com/en-us/power-automate/limits-and-config#action-request-limits) do you think a Flow like this would cost?