Forum Discussion
Managing lists
- Oct 13, 2019
Hi omrip
I'm struggling to understand what you are asking here, so sorry for asking again.
Are you trying to read from a file? If so, see https://cloudblogs.microsoft.com/industry-blog/en-gb/cross-industry/2019/08/13/azure-log-analytics-how-to-read-a-file/. If you are trying to create a file from Log Analytics, you can't do that; only reading from a file is possible, using the externaldata operator as per my example. You can also build lists on the fly / at run time with a datatable, as shown.
If it's a file you need to upload, perhaps on a schedule, you might need to use Logic Apps to control that workflow/process, then read from it with externaldata and parse the JSON (if it's JSON). There's a rough sketch of both patterns below.
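Something like this, just as a sketch (the URL, column name, and the SigninLogs / IPAddress usage are placeholders, swap in your own file and tables):

// Read a reference list from a file you host (placeholder URL and schema)
let fileList = externaldata (Indicator: string)
    [@"https://yourstorage.blob.core.windows.net/lists/iocs.csv"]
    with (format="csv", ignoreFirstRecord=true);
// Or build the same kind of list on the fly at run time
let inlineList = datatable (Indicator: string)
    ["1.2.3.4", "bad.example.com"];
// Example use: flag sign-ins coming from anything on either list
SigninLogs
| where TimeGenerated >= ago(1d)
| where IPAddress in (fileList) or IPAddress in (inlineList)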
thx
1. I do not think you understood my intention.
I have hundreds of endpoints and would like to create a large table/file of my known assets and check against that.
Besides managing it locally and using the mentioned _json, is there a way to upload the file to Azure and run on top of that?
2. What do I do if I am managing a large list of IOCs? If I run a search and find an IOC that is not in the list, I would like to ingest the new IOC into the list.
Again, in neither case do I wish to manage them locally, but rather in Azure.
- omrip, Feb 25, 2020, Copper Contributor
I got it using:
externaldata (Type: string, Indicator: string, Campaign: string) [@"https://xxxxxxxx.csv"]
1. How do I search for a hit of the IOCs on all of the tables in Sentinel?
2. How do I do that on specific tables?
- CliveWatson, Feb 25, 2020, Former Employee
The above has examples like this (adapt the whitelist line to your own file):
let timeRange = 1d;
let whitelist = externaldata (UserPrincipalName: string) [h"https://..."] with (ignoreFirstRecord=true);
SigninLogs
| where TimeGenerated >= ago(timeRange)
| where UserPrincipalName !in~ (whitelist)
Using your data across all tables would need a union or join, e.g. (just replace the fake whitelist with your own):
let whitelist = dynamic(["fake IOC","another fakeIOC"]);
union withsource=TableName *
| where Indicator in (whitelist)
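To put that together with your externaldata file, and to cover the specific-tables question, a rough sketch (the format settings and the table names SigninLogs / SecurityEvent are just examples, and it assumes matches are against a column called Indicator):

// IOC list read from your CSV (same placeholder URL as above)
let iocs = externaldata (Type: string, Indicator: string, Campaign: string)
    [@"https://xxxxxxxx.csv"]
    with (format="csv", ignoreFirstRecord=true)
    | project Indicator;
// Search every table; for specific tables, replace * with a list of names,
// e.g. union withsource=TableName SigninLogs, SecurityEvent
union withsource=TableName *
| where Indicator in (iocs)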
- omrip, Feb 19, 2020, Copper Contributor
Hi CliveWatson,
Can you please elaborate on the process you mentioned:
1. "If it's a file you need to upload, perhaps on a schedule, you might need to use Logic Apps to control that workflow/process, then read from it with externaldata and parse the JSON (if it's JSON)."
2. Also, what is the process of using Blob storage?
3. Am I bound to using only Blob storage?
4. Does the external file have to be in JSON format?
- MiteshAgrawal, Feb 03, 2020, Brass Contributor
Hi CliveWatson,
I have thousands of IOCs to be used in rules to check for a match. If using Blob storage isn't an option (we want to read data from a file stored locally on the system), then what should we do?
Regards,
Mitesh Agrawal
- CliveWatson, Feb 03, 2020, Former Employee
There are more guidance articles, such as https://techcommunity.microsoft.com/t5/azure-sentinel/implementing-lookups-in-azure-sentinel-part-1-reference-files/ba-p/1091306, with more to follow. Also, have you considered a custom log?
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs
Or reading data from a file using a Logic App?
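Once a custom log is ingested it shows up as a *_CL table that you can use as a lookup in KQL. A minimal sketch, assuming a hypothetical table MyIOCs_CL with a custom string field Indicator_s (the actual names depend on how you define the custom log):

// Distinct indicators from the hypothetical custom log table
let iocs = MyIOCs_CL
    | distinct Indicator_s;
// Example lookup against sign-in data (adjust table/column to your scenario)
SigninLogs
| where TimeGenerated >= ago(1d)
| where IPAddress in (iocs)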
- MiteshAgrawal, Feb 03, 2020, Brass Contributor
Hi Clive,
Thanks for the links. The first one is related to Blob storage, which we aren't using as of now.
I found the second one interesting and will definitely try creating a custom log source to read files.
Regards,
Mitesh Agrawal