Feb 04 2020 11:34 PM
I created a txt file and referenced it in a Custom Log. The txt file contains only 5 IPs, separated by new lines.
I am not getting those 5 IPs in RawData for that Custom Log.
Do we need to do anything else to enable reading files via Custom Logs? My requirement is to make a list out of those IPs.
Feb 05 2020 03:54 AM
@MiteshAgrawal Another option you can try is to use PowerShell to read the file and add the entries into the Log Analytics workspace. These links will help get you started.
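The PowerShell links aren't included above; as an illustration only, here is a minimal Python sketch of pushing entries into a workspace via the Azure Monitor HTTP Data Collector API. The workspace ID, key, and `IPList` log type are placeholders, not values from this thread.

```python
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

# Placeholder values -- replace with your own workspace ID and primary key.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"
SHARED_KEY = base64.b64encode(b"not-a-real-key").decode()

def build_signature(workspace_id, shared_key, date, content_length):
    """Build the SharedKey Authorization header for the Data Collector API."""
    string_to_hash = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date}\n/api/logs")
    hashed = hmac.new(base64.b64decode(shared_key),
                      string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(hashed).decode()}"

def post_ips(ips, log_type="IPList"):
    """POST a list of IPs; they land in a custom table named <log_type>_CL."""
    body = json.dumps([{"IPAddress": ip} for ip in ips]).encode("utf-8")
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    req = urllib.request.Request(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01",
        data=body, method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": build_signature(WORKSPACE_ID, SHARED_KEY,
                                             date, len(body)),
            "Log-Type": log_type,
            "x-ms-date": date,
        })
    with urllib.request.urlopen(req) as resp:
        return resp.status  # the API returns 200 on success
```

The official docs publish equivalent PowerShell and Python samples; the key detail is that the HMAC-SHA256 signature is computed over the method, content length, content type, date header, and resource path.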
Feb 05 2020 05:07 AM
Hi @Gary Bushey ,
Thanks for your quick reply.
Please find the attached screenshots, which clearly show what I am trying to do.
I am following the below-mentioned post to read data from txt files, and I am looking for those entries in RawData under the newly created Custom Log.
I have the txt file on my local system with access to Azure. Custom Log creation was successful.
Feb 05 2020 05:08 AM
Feb 05 2020 05:18 AM (Solution)
1. First check: does the machine that contains the log file have a working MMA (Microsoft Monitoring Agent) on it? Do you have data from that computer in the Heartbeat table?
2. Does the table show up in the Schema pane in Log Analytics? I suspect the answer is no?
<name you used>_CL
custom_CL | limit 10
If the MMA isn't talking to Azure then it's likely there is a network issue (often a proxy). Instructions to resolve this will vary by product and setup. Can you put the file on a server that is working?
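As a sketch, the first check above can be run in Log Analytics like this (the computer name is a placeholder):

```kusto
// Is the machine hosting the log file heartbeating?
Heartbeat
| where Computer == "MYSERVER"          // placeholder name
| summarize LastHeartbeat = max(TimeGenerated)
```

If no rows come back, the agent isn't reporting and the custom log will never arrive.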
Feb 06 2020 12:18 PM
Thanks a ton for your help.
I was having an issue with the MMA; I resolved it and, as you mentioned, I started receiving heartbeats from my system.
After some time I got the Custom Logs as well.
Without your help this might not have been possible. I have been using Azure only for the past couple of days and was able to do this only because of your help. Thank you and the community.
I have three more requirements. I have received the IPs in RawData under the Custom Log table name.
1. I want to save those IPs permanently in some list or table which I can reference in rules. How can I achieve this?
2. I used Blob storage, uploaded a txt file with the IPs there, and referenced them in a query using the externaldata operator. This worked. But again, I want to use these IPs in rules and so want to save them in some list. And if I delete the blob object without creating a list, we can no longer reference the data via the externaldata operator, right?
3. How can I change the collection interval for the Custom Log file read? For example, I want to read the logs from the file every 1 hour.
Please find the necessary screenshots attached in order to understand my requirement correctly.
Feb 07 2020 03:05 AM
1. https://techcommunity.microsoft.com/t5/azure-sentinel/implementing-lookups-in-azure-sentinel-part-1-... covers the externaldata approach that you've used.
I like this, as the file on Blob storage is a single thing to maintain centrally that can be used by many queries.
2. If you delete the file, externaldata will fail. An alternative is a datatable in each query: https://docs.microsoft.com/en-us/azure/kusto/query/datatableoperator?pivots=azuremonitor - depending on how much data you want to compare and how often it changes; note that it is unique to each query.
3. Custom log collection works that way (collection happens when the file changes/flushes); alternatives are listed here: https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs
In the cases where your data can't be collected with custom logs, consider the following alternate strategies:
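As a sketch of the two lookup approaches above (the storage URL, SAS token, sample IPs, and table/column names are all placeholders, not values from this thread):

```kusto
// Option 1: central list kept in Blob storage, shared by many queries
let BlobIPs = externaldata (IPAddress: string)
    [@"https://<storageaccount>.blob.core.windows.net/<container>/ips.txt?<SAS-token>"]
    with (format="txt");
// Option 2: list embedded in the query itself (unique to each query)
let InlineIPs = datatable (IPAddress: string) [
    "10.0.0.1",
    "10.0.0.2"
];
// Match either list against the custom log's raw data
MyIPs_CL
| where RawData in (InlineIPs)
```

The trade-off is as described: externaldata gives one central file but breaks if the blob is deleted, while datatable survives on its own but has to be edited inside every query that uses it.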
Feb 10 2020 05:19 AM
Hi @Clive Watson,
How can I append to blobs using scripts? I mean, if I get some new IOCs, I do not want to upload the blob manually. I want a mechanism which can append data to an existing blob.
Also, is it possible to get data from an external website, do certain operations, and create a blob?
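Both questions above can be sketched together in Python with the `azure-storage-blob` v12 SDK: fetch a feed from an external URL, keep only entries not already in the blob, and append them. The feed URL, container, and blob names are hypothetical; the blob must have been created as an append blob for `append_block` to work.

```python
import urllib.request

def fetch_iocs(url):
    """Download a newline-separated IOC list from an external feed (URL is a placeholder)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def new_entries(existing_text, feed_text):
    """Return feed lines not already present, as one newline-terminated block."""
    existing = set(existing_text.splitlines())
    fresh = [line for line in feed_text.splitlines()
             if line.strip() and line not in existing]
    return "\n".join(fresh) + "\n" if fresh else ""

def append_iocs(conn_str, container, blob_name, feed_url):
    """Append only the new IOCs from the feed to an append blob."""
    # Imported here so the helpers above work without the SDK installed.
    from azure.storage.blob import BlobClient  # pip install azure-storage-blob
    blob = BlobClient.from_connection_string(conn_str, container, blob_name)
    if not blob.exists():
        blob.create_append_blob()  # must be an *append* blob, not a block blob
        existing = ""
    else:
        existing = blob.download_blob().readall().decode("utf-8")
    data = new_entries(existing, fetch_iocs(feed_url))
    if data:
        blob.append_block(data.encode("utf-8"))
```

Run on a schedule (e.g. an Azure Automation runbook or a cron job), this keeps the blob current so the externaldata query always sees the latest IOCs without any manual uploads.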