How to Parse Fields on Data Ingest for CSV?


I have a 1.2 GB CSV file (updated every month) that I want to ingest into a table, but I need to parse the fields at ingest time rather than at query time. Custom fields don't seem to work for delimited data, unless I'm missing something.

What is the best way to do this?

Two thoughts (you have probably already considered these):
1. Could you use the externaldata operator to read the CSV file straight from a storage account? It works well for ad hoc lookups and avoids the time, effort, and cost of maintaining a table (see the sketch after this list).
2. Large watchlists, if you can chunk the file into 3 parts? https://techcommunity.microsoft.com/t5/microsoft-sentinel-blog/large-watchlist-using-sas-key-is-in-p...
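A minimal sketch of option 1, assuming the file sits in a blob container you can generate a SAS URL for; the storage URL and the three columns are placeholders for the real file's schema:

// Read the CSV straight from blob storage and parse it into typed columns.
// Replace the SAS URL and the column list with the real file's layout.
externaldata (Network:string, Country:string, ISP:string)
[
    h@"https://yourstorage.blob.core.windows.net/lookups/geo.csv?<SAS token>"
]
with (format="csv", ignoreFirstRecord=true)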
1. The externaldata operator does work, although it doesn't seem very performant with such a large lookup. That's one of the reasons I wanted to get it into a table. Plus the KQL becomes much simpler, which makes it easier for less experienced team members to plug into their queries.

2. Watchlists unfortunately won't work for my particular use case. The data has to live in one table.

I'm building a geolocation enrichment so that logs containing an IP address can be enriched with fields for the associated country and ISP. The lookup itself constrains my options, but once the data is in a table it works great.
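For context, the enrichment ends up looking roughly like this once the geolocation data is queryable; a rough sketch assuming a lookup named GeoLookup with Network (CIDR range), Country, and ISP columns, and an events table with a SourceIP column (all of these names are placeholders for my actual schema):

// Enrich events that carry an IP address with country and ISP by matching
// the address against the CIDR ranges in the lookup.
MyEvents_CL
| where isnotempty(SourceIP)
| evaluate ipv4_lookup(GeoLookup, SourceIP, Network)
| project TimeGenerated, SourceIP, Country, ISP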

I'm currently looking into the Upload-AzMonitorLog.ps1 method of uploading the CSV.
You may also be able to upload it with a Logic App (Playbook) using a Scheduled trigger?
Not sure why you wouldn't be able to separate the data on a delimiter with the split() function; something like the sketch below. Is it possible to paste some of the cleaned-up data?
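For example, assuming the rows land in a custom table with each whole CSV line in a single RawData column (the table and column names here are just guesses at your setup):

// Split each raw CSV line on commas and pull out typed columns at query time.
GeoRaw_CL
| extend Fields = split(RawData, ",")
| extend Network = tostring(Fields[0]), Country = tostring(Fields[1]), ISP = tostring(Fields[2])
| project-away Fields, RawData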
Is there a way to use the split() function to parse the data into custom fields on ingest? I can't do it at query time due to the way I need to use the data.

It seems very strange to me that this parsing works perfectly through externaldata, yet I can't easily upload the same data into a permanent table.