Forum Discussion
TheDilly
Apr 07, 2022 · Copper Contributor
How to Parse Fields on Data Ingest for CSV?
I have a 1.2 GB CSV file (updated every month) that I want to ingest into a table, but I need to parse fields on data ingest rather than at query time. Custom fields don't seem to work for delimited...
TheDilly
Apr 07, 2022 · Copper Contributor
1. The externaldata operator does work, although it doesn't seem very performant with such a large lookup (roughly what I'm running is sketched below this list). That's one of the reasons I wanted to get the data into a table. Plus the KQL becomes much simpler, which makes it easier for less experienced team members to plug it into their queries.
2. Watchlists unfortunately won't work for my particular use case. The data has to live in one table.
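For context, here's roughly what the externaldata lookup looks like today. The storage URL, SAS token, and column names below are placeholders, not my real values:

// Every query pulls the full 1.2 GB blob, which is where the performance pain comes from
externaldata (Network:string, Country:string, ISP:string)
[
    h@"https://<storageaccount>.blob.core.windows.net/lookups/GeoLookup.csv?<SAS-token>"
]
with (format="csv", ignoreFirstRecord=true)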
I'm building a geolocation enrichment so that logs containing an IP address can be extended with fields for the associated country and ISP. The lookup itself constrains my options, but once I can get the data into a table it works great.
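To illustrate, once the CSV lands in a custom table the enrichment becomes a one-liner with the ipv4_lookup plugin, which matches an IP against CIDR ranges. The table and column names here (MyLogs_CL, GeoLookup_CL, SrcIp_s, Network_s) are placeholders for my real schema:

// Hypothetical names: GeoLookup_CL holds the CSV, Network_s is the CIDR column
MyLogs_CL
| evaluate ipv4_lookup(GeoLookup_CL, SrcIp_s, Network_s)
| project TimeGenerated, SrcIp_s, Country_s, ISP_s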
I'm currently looking into the Upload-AzMonitorLog.ps1 method of uploading the CSV.
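In case it helps anyone else, the invocation I'm testing looks something like this. The workspace values and table name are placeholders, and since the script posts through the HTTP Data Collector API (which caps each post at roughly 30 MB), a file this size would need to be fed in chunks:

# Placeholder values throughout; Log Analytics appends _CL to the table name
Import-Csv .\GeoLookup.csv |
    .\Upload-AzMonitorLog.ps1 `
        -WorkspaceId '<workspace-guid>' `
        -WorkspaceKey $workspaceKey `
        -LogTypeName 'GeoLookup'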
Clive_Watson
Apr 07, 2022 · Bronze Contributor
You may also be able to upload it with a Logic App (Playbook) using a Recurrence (scheduled) trigger?