Cosmos DB bulk inserts (Python SDK)

Hi,

 

I've been struggling to find good documentation explaining how bulk inserts work in the Azure Python SDK. I can't get it to work no matter what approach I take, and it seems that since v4 you are supposed to use the Bulk Executor, but again I find very little documentation on the subject.

The closest I've come to succeeding is using the code below together with the server-side example mentioned here, but I've only gotten it to work with single objects, where I can point to the object's partition key directly. As soon as I pass a JSON string with multiple objects, I either get a server-side error or a complaint that I haven't specified the partition key correctly.

 

result = container.scripts.execute_stored_procedure(
    created_sproc,
    partition_key=None,
    params=json_string,
    enable_script_logging=True,
)
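For what it's worth, a sketch of how I'd expect the payload to be shaped. This is an assumption based on the SDK docs, not a verified fix: `params` is a list of arguments for the stored procedure, so a batch of documents should go in as a Python list (one JSON array argument) rather than a pre-serialized JSON string, and every document in one call has to share the partition key value you pass as `partition_key`. The documents, the `pk` field name, and the values are all placeholders:

```python
# Hypothetical documents; "pk" stands in for whatever the container's
# partition key path actually is (e.g. /pk).
docs = [
    {"id": "1", "pk": "tenantA", "value": 10},
    {"id": "2", "pk": "tenantA", "value": 20},
]

# A stored procedure executes inside a single logical partition, so all
# documents in one call must share the partition key value passed to the SDK.
partition_key = docs[0]["pk"]
assert all(d["pk"] == partition_key for d in docs)

# `params` is the list of arguments the sproc receives. Passing the list of
# dicts as the first argument hands the sproc a real JSON array, instead of a
# string it would have to JSON.parse on the server side.
params = [docs]

# With a real container and sproc this would then be (untested sketch):
# result = container.scripts.execute_stored_procedure(
#     created_sproc,
#     partition_key=partition_key,
#     params=params,
#     enable_script_logging=True,
# )
```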

 

Sorry for the lack of detail in this issue, but I've gotten pretty much nowhere on this.

2 Replies

@petersaverman Hey

 

I'm having the same issue. As I understand it, the requirement to add a partition key to the request rules out loading the data into Cosmos in one bulk call, and loading documents one by one increases both cost and time. I'm currently investigating doing the same thing with Azure Data Factory, but so far no success.
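One workaround I've seen suggested (not a definitive answer): since each stored-procedure call is scoped to a single partition key, you can group the documents by their partition key value in Python and make one call per group instead of one per document. A minimal sketch, assuming a `pk` field as the partition key (the field name and the per-group call are placeholders):

```python
from collections import defaultdict

def group_by_partition_key(docs, pk_field="pk"):
    """Bucket documents by their partition key value, one batch per bucket."""
    groups = defaultdict(list)
    for doc in docs:
        groups[doc[pk_field]].append(doc)
    return dict(groups)

docs = [
    {"id": "1", "pk": "A"},
    {"id": "2", "pk": "B"},
    {"id": "3", "pk": "A"},
]

groups = group_by_partition_key(docs)

# Each batch would then be sent with one sproc call per partition key,
# something like (untested sketch):
# for pk, batch in groups.items():
#     container.scripts.execute_stored_procedure(
#         created_sproc, partition_key=pk, params=[batch])
```

That at least amortizes the per-request overhead across all documents that share a partition key value, instead of paying it once per document.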

@lakime Sorry, I don't really remember what the outcome of this was. Hope you find a good solution!