
Ingestion of Custom Logs from Files (Never Updated) in Azure Sentinel


Hi all, I need to send custom logs from CSV files to Azure Sentinel. New files are written daily by several applications running on other devices. I was able to transfer the files to the RedHat 7 collector device through SFTP, but the logs are not sent to the Log Analytics Workspace of Sentinel by the Log Analytics Agent.

 

The Agent is correctly configured (custom log path, etc.), because I was able to receive a log when I manually updated one of the log files. The problem is that the new files are never appended to after they are created. From the documentation (https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs) I see this statement:

 

"Custom log collection requires that the application writing the log file flushes the log content to the disk periodically. This is because the custom log collection relies on filesystem change notifications for the log file being tracked."

 

I simply want the Log Analytics Agent for Linux to pick up the data from the newly created file (the file is never updated afterwards) and push all the logs to the Workspace. Is this possible with the Log Analytics Agent for Linux?

Thank you very much in advance

Simone

11 Replies

@simonepatonico Does the new file follow the required naming standard of having the date and time as part of the filename?

@Gary Bushey For now I have done some tests with files named Norma01.csv, Norma02.csv, etc.

I did the configuration required on the Log Analytics Workspace as you can see from the attached figure.

 

@simonepatonico Looking at the document link you posted, it states the following. Are you following this entry format? It did not look like it from your image, unless there is only one entry in the file shown in the image.

 

The log must either have a single entry per line or use a timestamp matching one of the following formats at the start of each entry.

YYYY-MM-DD HH:MM:SS
M/D/YYYY HH:MM:SS AM/PM
Mon DD, YYYY HH:MM:SS
yyMMdd HH:mm:ss
ddMMyy HH:mm:ss
MMM d hh:mm:ss
dd/MMM/yyyy:HH:mm:ss zzz
yyyy-MM-ddTHH:mm:ssK
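
For example (these lines are purely illustrative, not taken from your file), a CSV whose entries start with the first format above would look something like this:

2020-05-18 09:15:02,Norma01,host-a,OK
2020-05-18 09:16:45,Norma01,host-b,ERROR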

@Gary Bushey The log file has a single entry per line.

Also, all the other prerequisites are satisfied:

- The log file must not allow circular logging or log rotation (in my case the file is never changed).

- The log file must use ASCII or UTF-8 encoding (in my case the log file uses UTF-8 encoding).

@simonepatonico And just to confirm my understanding, when you get a new file added to the folder, you never see its data being uploaded. Is that correct?

@Gary Bushey Yes, the data is never uploaded to the Workspace because the file is never changed. I noticed that the agent uses the in_tail plugin of fluentd to upload data when new logs are appended to the file.

 

So my question is: how can I upload logs with the OMS Agent from a file that is never changed?

@simonepatonico The OMS agent will not upload the data because, as far as it is concerned, the file has never changed. 

 

I am guessing there is no way to tell the program that sends the data to write to a different file each time. I think you would need to write a bash script that detects when the file's last-modified date has changed and then renames (or copies) it using the date naming format discussed earlier, so that the agent detects it as a new file. I am not sure how often the file gets written to, so it may be tricky to make sure you do not lock the file while it is being written.
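
A rough sketch of what I mean is below. All paths, file names, and the sleep are assumptions you would need to adapt; this is not a tested implementation:

#!/bin/bash
# Sketch of the idea above: when the source file's modification time changes,
# copy it into the folder watched by the agent under a new, timestamped name
# so the agent treats it as a brand-new file. All paths here are assumptions.

SRC="/data/incoming/Norma01.csv"        # file the application writes (assumed)
WATCHED_DIR="/var/log/customlogs"       # folder configured as the custom log path (assumed)
STATE="/var/tmp/norma01.last_mtime"     # remembers the last modification time we saw

current_mtime=$(stat -c %Y "$SRC" 2>/dev/null) || exit 0
last_mtime=$(cat "$STATE" 2>/dev/null || echo 0)

if [ "$current_mtime" != "$last_mtime" ]; then
    # Give the writer a moment to finish before copying.
    sleep 10
    cp "$SRC" "$WATCHED_DIR/norma_$(date +%Y%m%d_%H%M%S).csv"
    echo "$current_mtime" > "$STATE"
fi

You would schedule something like this from cron every few minutes.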

Hi all,

How was this resolved?
I have the same issue. I am trying to ingest a log file from Sybase: it is just one file whose name never changes, but the log is written to it every minute. The XXX.log has one entry per line and meets all the other requirements. I created the sybase_CL custom log and it is not working. Is there a workaround?
@GaryBushey @simonepatonico How were you able to resolve this?
Hello makniy,

I solved the problem by making a bash script (run as a cron job) that creates an empty file in the path monitored by the OMS agent when a new file is received from the application. After waiting 2 minutes (to be sure the OMS agent has found the empty file), it copies the content of the received file into the empty file. This way it works, because the file monitored by the OMS agent changes.
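
A minimal sketch of the kind of script I mean is below; the directory names and the *.csv pattern are placeholders, not my real paths:

#!/bin/bash
# Minimal sketch of the cron-job workaround described above.
# Directory names and the *.csv pattern are placeholders, not my real paths.

INCOMING_DIR="/data/sftp/incoming"    # where the application's files arrive via SFTP (assumed)
WATCHED_DIR="/var/log/customlogs"     # path configured as the custom log source for the agent (assumed)

for src in "$INCOMING_DIR"/*.csv; do
    [ -e "$src" ] || continue
    dest="$WATCHED_DIR/$(basename "$src")"

    # Skip files that were already handed over to the agent.
    [ -e "$dest" ] && continue

    # 1. Create an empty file so the agent starts tracking it.
    touch "$dest"

    # 2. Wait long enough for the agent to notice the new (still empty) file.
    sleep 120

    # 3. Copy the real content in; the agent now sees the file change and uploads it.
    cat "$src" >> "$dest"
done

The script is scheduled with cron; the 2-minute wait is simply what worked for me to give the agent time to pick up the empty file.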
Thanks, @simonepatonico.
Would you be able to share your full script, either as a template or with sensitive details removed?
Also, I am now thinking that my log does not get a new file but a new update every 2 to 4 minutes, so it retains every update in the same log file (PX1.log or QX1.BS.log).