I have a problem when trying to import logs (from a JSON file) into a custom table: not all the logs present in the file end up in the custom table.
If the file contains 24,000 logs, a query on the table reports only 14,000 of them.
I'm using a bash script to copy the log file into the (Linux) folder from which the custom table gets the JSON logs.
In this script I first create the file (using the touch command) with a size of 0 bytes; then, after 2 minutes, I decompress the export and write the entire file into the destination folder to be processed. The Log Analytics agent then starts transferring the logs inside the file to the custom table, but not all the logs are transferred.
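The flow above can be sketched roughly as follows (all paths and filenames here are placeholders, not my real ones, and the sample source file is generated inline just to make the sketch self-contained):

```shell
#!/bin/sh
# Sketch of the import flow: pre-create an empty file, wait,
# then decompress the export into the folder the agent watches.

SRC=/tmp/logs.json.gz         # compressed log export (placeholder)
DEST_DIR=/tmp/custom_logs     # folder watched by the Log Analytics agent (placeholder)
DEST="$DEST_DIR/app_logs.json"

mkdir -p "$DEST_DIR"

# Create a small sample export so this sketch runs on its own.
printf '{"msg":"a"}\n{"msg":"b"}\n' | gzip -c > "$SRC"

# 1. Pre-create the destination file with size 0 bytes.
touch "$DEST"

# 2. Wait before writing the real content (2 minutes in the real script,
#    shortened here).
sleep 2

# 3. Decompress the export and write it into the watched folder,
#    overwriting the empty file.
gunzip -c "$SRC" > "$DEST"
```

Note that step 3 overwrites the file in place while the agent may already be tailing it, which is where I suspect something could go wrong.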
Where did I go wrong?
Is there a log file (on the agent) that would help me understand the reason for this behavior?
Is some special configuration needed on the agent to ingest the JSON logs without discarding some of them?
Suggestions are really appreciated; I'm blocked. I have tested syslog mode too, but the logs get truncated in the wrong places.