10-10-2019 11:54 AM
How can I manage a list in Sentinel?
For instance, I have a list of known assets holding hundreds of entries, and when a search runs I would like to check whether there is a hit in the list.
Obviously, a solution similar to the one above is not practical at that scale:
let List = datatable(Account:string, Domain:string)
[
    "john", "johnsdomain.com",
    "greg", "gregsdomain.net",
    "larry", "Domain"
];
The same goes for IOCs I have found in my environment and would like to search for a hit on.
10-11-2019 01:49 AM
Option 1 - you can use in or !in to include or exclude:
let List = datatable(Account:string, Domain:string)
[
    "john", "johnsdomain.com",
    "Demo", "gregsdomain.net",
    "larry", "Domain"
];
SigninLogs
| where Identity in ((List | project Account)) // in() needs a single-column tabular expression
Option 2 - you can use a join as well:
let masterList = dynamic(['GB', 'US']); // set up a master list of country codes
SigninLogs
| where TimeGenerated >= ago(1d)
| summarize perIdentityAuthCount = count()
    by Identity,
       locationString = strcat(tostring(LocationDetails["countryOrRegion"]), "/",
                               tostring(LocationDetails["state"]), "/",
                               tostring(LocationDetails["city"]), ";",
                               tostring(LocationDetails["geoCoordinates"])),
       countryString = strcat(tostring(LocationDetails["countryOrRegion"]))
// filter on masterList of country codes, exclude those on the list
| where countryString !in (masterList)
| summarize distinctAccountCount = count(), identityList = makeset(Identity),
            t = tostring(masterList)
    by locationString
| extend identityList = iff(distinctAccountCount < 10, identityList, "multiple (>10)")
| join kind=anti (
    SigninLogs
    | where TimeGenerated < ago(1d)
    | project locationString = strcat(tostring(LocationDetails["countryOrRegion"]), "/",
                                      tostring(LocationDetails["state"]), "/",
                                      tostring(LocationDetails["city"]), ";",
                                      tostring(LocationDetails["geoCoordinates"]))
    | summarize priorCount = count() by locationString
) on locationString
| where distinctAccountCount >= 1 // select threshold above which #new accounts from a new location is deemed suspicious
Option 3 - create a group/list with a query and compare it to another table:
// First create a list of Linux machines whose names start with "aks"
let myLinuxGrp = toscalar(
    Heartbeat
    | where OSType == "Linux" and Computer startswith "aks"
    | summarize make_set(Computer));
Syslog
| where TimeGenerated > ago(60m)
| where myLinuxGrp contains Computer
| project myLinuxGrp, Computer, SyslogMessage
10-12-2019 02:04 AM
1. I do not think you understood my intention.
I have hundreds of endpoints and would like to create a large table/file from my known assets and run checks on top of that.
Besides managing it locally and using the mentioned _json approach, is there a way to upload the file to Azure and query on top of that?
2. What do I do when I am managing a large list of IOCs? If I run a search and find a new IOC that is not already in the list, I would like to ingest it into the list.
Again, in neither case do I wish to manage these locally; I want to manage them in Azure.
10-13-2019 10:43 AM (Solution)
I am struggling to understand what you are asking here, so sorry for asking again.
Are you trying to read from a file? If so, see https://cloudblogs.microsoft.com/industry-blog/en-gb/cross-industry/2019/08/13/azure-log-analytics-h... If you are trying to create a file from Log Analytics, you can't do that; only reading from a file is possible, using the externaldata operator as per my example. You can also build lists on the fly / at run time with a datatable, as shown.
If it's a file you need to upload, perhaps on a schedule, you might need to use Logic Apps to control that workflow/process, then read from it with externaldata and parse the JSON (if it's JSON).
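To make that concrete, here is a sketch of reading a larger whitelist file with externaldata. The storage URL, SAS token, and column names below are placeholders I've made up, not from this thread:

```kusto
// Sketch only: read a CSV asset list hosted in Azure blob storage.
// Replace the URL/SAS token and the columns with your own.
let knownAssets = externaldata (Computer: string, Owner: string)
    [@"https://<storageaccount>.blob.core.windows.net/lists/assets.csv?<SAS-token>"]
    with (format="csv", ignoreFirstRecord=true);
Heartbeat
| where TimeGenerated > ago(1d)
| where Computer !in ((knownAssets | project Computer)) // machines not on the known-asset list
```

Note that externaldata only reads; updating the file itself (e.g. appending newly found IOCs) would be the Logic App's job.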
02-03-2020 02:58 AM
There are more guidance articles https://techcommunity.microsoft.com/t5/azure-sentinel/implementing-lookups-in-azure-sentinel-part-1-... and more to follow. Also have you considered a custom log?
or reading data from a file using a Logic App?
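On the custom log option: one common pattern (an assumption on my part, not something stated in this thread) is to push newly found IOCs into a custom `_CL` table with the Azure Monitor HTTP Data Collector API, which keeps the list queryable and updatable entirely in Azure. The workspace ID, shared key, and table name below are placeholders:

```python
import base64
import datetime
import hashlib
import hmac
import urllib.request

def build_signature(workspace_id, shared_key, date, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    """Build the SharedKey Authorization header for the Data Collector API."""
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    decoded_key = base64.b64decode(shared_key)  # workspace primary key is base64
    digest = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_ioc(workspace_id, shared_key, ioc_json, log_type="CustomIOC"):
    """POST a JSON payload of IOCs; records land in the <log_type>_CL table."""
    date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    body = ioc_json.encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(workspace_id, shared_key, date, len(body)),
        "Log-Type": log_type,
        "x-ms-date": date,
    }
    url = (f"https://{workspace_id}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(url, data=body, headers=headers, method="POST")
    return urllib.request.urlopen(req)  # HTTP 200 means the data was accepted
```

A Logic App or Azure Function triggered by the analytics rule could call this whenever a hit is not already in the list, addressing the "ingest the new IOC" part of the question.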
02-19-2020 07:05 AM
Can you please elaborate on the process you mentioned:
1. "If it's a file you need to upload, perhaps on a schedule, you might need to use Logic Apps to control that workflow/process. Then read from it with externaldata and parse the JSON (if it's JSON)"
2. Also, what is the process for using blob storage?
3. Am I bound to using only blob storage?
4. Must the external file be in JSON format?
02-25-2020 10:26 AM
The article above has examples like this (adapt the whitelist line to point at your own file):
let timeRange = 1d;
let whitelist = externaldata (UserPrincipalName: string)
    [h"https://..."]
    with (ignoreFirstRecord=true);
SigninLogs
| where TimeGenerated >= ago(timeRange)
| where UserPrincipalName !in~ (whitelist)
Searching your data across all tables would need a union or join, e.g. (just replace the fake whitelist with your own):
let whitelist = dynamic(["fake IOC", "another fakeIOC"]);
union withsource=TableName *
| where Indicator in (whitelist)