Forum Discussion

TheDilly
Copper Contributor
Apr 07, 2022

How to Parse Fields on Data Ingest for CSV?

I have a 1.2GB CSV file (it updates every month) that I want to ingest into a table, but I need to parse the fields on ingest rather than at query time. Custom fields don't seem to work for delimited data, unless I'm missing something.

What is the best way to do this?

    • TheDilly
      Copper Contributor
      1. The externaldata operator does work, although it doesn't seem very performant with such a large lookup; that's one of the reasons I wanted to get the data into a table. The KQL also becomes much simpler, which makes it easier for less experienced team members to plug into their queries. (See the sketch after this list for roughly what I'm running now.)

      2. Watchlists unfortunately won't work for my particular use case. The data has to live in one table.
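
      For reference, here's roughly the query-time version I'm trying to move away from. A minimal sketch; the storage URL (plus any SAS token) and the schema are placeholders, not my real data:

          // externaldata re-reads the whole CSV from storage on every query run,
          // which is why it struggles at 1.2GB. URL and schema are placeholders.
          let GeoLookup = externaldata (Network:string, Country:string, ISP:string)
              [ "https://mystorage.blob.core.windows.net/lookups/geo.csv" ]
              with (format="csv", ignoreFirstRecord=true);
          GeoLookup
          | take 10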

      I'm building a geolocation enrichment so that logs containing an IP address can gain fields for the associated country and ISP. The lookup itself constrains my options, but once the data is in a table it works great (see the join sketch below).
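
      Once the CSV is in a table, the join itself can be done with the ipv4_lookup plugin, since the lookup data is keyed on CIDR ranges. A minimal sketch, assuming the data landed in a custom table I'm calling GeoLookup_CL here (the table and column names are placeholders):

          // Match each sign-in IP against the CIDR ranges in the lookup table
          // and carry the country and ISP columns into the results.
          SigninLogs
          | evaluate ipv4_lookup(GeoLookup_CL, IPAddress, Network_s)
          | project TimeGenerated, UserPrincipalName, IPAddress, Country_s, ISP_s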

      I'm currently looking into the Upload-AzMonitorLog.ps1 method of uploading the CSV.
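
      In case it helps anyone else, the invocation I'm testing looks roughly like the following. The parameter names are my assumption from the script's README, so verify against the script itself before running:

          # Pipe the parsed CSV rows into the script, which posts them to the
          # workspace through the HTTP Data Collector API as a custom table
          # (it would surface as GeoLookup_CL). Parameter names are assumed.
          Import-Csv .\geo.csv |
              .\Upload-AzMonitorLog.ps1 `
                  -WorkspaceId '<workspace id>' `
                  -WorkspaceKey '<workspace primary key>' `
                  -LogTypeName 'GeoLookup'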
      • Clive_Watson
        Bronze Contributor
        You may also be able to upload it with a Logic App (Playbook) using a Scheduled trigger?
  • GaryBushey
    Bronze Contributor
    Not sure why you would not be able to separate the data on a delimiter using the split() function (see the sketch below). Is it possible to paste some of the cleaned-up data?
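
    For example, a minimal sketch of splitting one delimited line (the sample row is made up):

        // split() returns a dynamic array; index into it and cast each part
        print line = "8.8.8.0/24,United States,Google LLC"
        | extend parts = split(line, ",")
        | extend Network = tostring(parts[0]),
                 Country = tostring(parts[1]),
                 ISP = tostring(parts[2])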
