Nov 03 2020 08:13 AM
Hi all, I need to send the custom logs of a CSV file to Azure Sentinel. New files are written daily on a collector device by several applications running on other devices. I was able to transfer the files to the RedHat 7 collector device through SFTP, but the logs are not sent to the Log Analytics Workspace of Sentinel by the Log Analytics Agent.
The Agent is correctly configured (custom log path, etc.) because I was able to receive a log entry when I manually updated one of the log files. The problem here is that the new file is never flushed with new data after it is created. From the documentation (https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs) I see this statement:
"Custom log collection requires that the application writing the log file flushes the log content to the disk periodically. This is because the custom log collection relies on filesystem change notifications for the log file being tracked."
I simply want the Log Analytics Agent for Linux to pick up the data from the newly created file (the file is never updated afterwards) and push all the logs to the Workspace. Is that possible with the Log Analytics Agent for Linux?
Thank you very much in advance
Nov 03 2020 09:48 AM
@simonepatonico Does the new file follow the required naming standard of having the date and time as part of the filename?
Nov 03 2020 10:00 AM
@Gary Bushey For now I did some tests with files named Norma01.csv, Norma02.csv, etc.
I did the configuration required on the Log Analytics Workspace as you can see from the attached figure.
Nov 03 2020 12:20 PM
@simonepatonico Looking at the document you linked, it states the following. Are you following this naming convention? It did not look like it from your image, unless the file shown contained only one entry.
The log must either have a single entry per line or use a timestamp matching one of the following formats at the start of each entry.
M/D/YYYY HH:MM:SS AM/PM
Mon DD, YYYY HH:MM:SS
MMM d hh:mm:ss
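If it helps, here is a quick sketch of how an entry prefixed with the third format (`MMM d hh:mm:ss`) could be written from a shell script; the file path and field names are just examples, not from your setup:

```shell
# Prepend a "MMM d hh:mm:ss" timestamp (e.g. "Nov 3 12:20:05") to each
# CSV record so the agent can delimit entries by timestamp.
# Note: %-d (no zero padding) is a GNU date extension, available on RHEL 7.
ts="$(date '+%b %-d %H:%M:%S')"
echo "$ts,field1,field2" >> /var/log/custom/Norma01.csv
```

With a single entry per line, as you have, the timestamp prefix is optional, but it also lets the agent assign each record the time it was generated rather than the time it was collected.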
Nov 03 2020 12:28 PM
@Gary Bushey The log file has a single entry per line.
Also, all the other prerequisites are satisfied:
- The log file must not use circular logging or log rotation (in my case the file is never changed)
- The log file must use ASCII or UTF-8 encoding (in my case the log file uses UTF-8 encoding)
Nov 04 2020 04:38 AM
@simonepatonico And just to confirm my understanding, when you get a new file added to the folder, you never see its data being uploaded. Is that correct?
Nov 04 2020 04:53 AM
@Gary Bushey Yes, the data is never uploaded to the Workspace because the file is never changed. I noticed that the agent uses Fluentd's in_tail plugin, which uploads data only when new logs are appended to the file.
So my question is: how can I upload logs from a file that never changes with the OMS Agent?
Nov 04 2020 05:00 AM
@simonepatonico The OMS agent will not upload the data because, as far as it is concerned, the file has never changed.
I am guessing there is no way to tell the program that sends the data to write to a different file each time. You would probably need to write a bash script that detects when the file's last-modified date has changed and renames the file using the date naming format discussed earlier, so that the agent detects it as a new file. I am not sure how often the file gets written to, so it may be tricky to ensure you do not lock the file while it is being written.
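Something along these lines might work as a starting point, run from cron. This is only a sketch of the idea: the paths, the function name, and the timestamp format are all assumptions you would adapt to your environment, and you should verify the agent actually picks up the stamped copies:

```shell
#!/bin/bash
# Copy each incoming CSV into the agent-watched folder under a name that
# includes the file's last-modified timestamp, so the agent sees every
# delivery as a brand-new log file.
stamp_new_csvs() {
    local src=$1 dst=$2 f stamp base target
    for f in "$src"/*.csv; do
        [ -e "$f" ] || continue                  # glob matched nothing
        stamp=$(date -r "$f" '+%Y%m%d-%H%M%S')   # file's mtime (GNU date)
        base=$(basename "$f" .csv)
        target="$dst/${base}-${stamp}.csv"
        # Copy rather than move, so the SFTP side can keep overwriting the
        # original; skip if this version was already handed to the agent.
        [ -e "$target" ] || cp "$f" "$target"
    done
}

# Example invocation (paths are assumptions):
# stamp_new_csvs /data/incoming /var/log/custom
```

Because the timestamp comes from the file's own mtime, re-running the script does not duplicate a file that has not changed, which makes it safe to schedule every few minutes.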