Forum Discussion

omrip
Copper Contributor
Oct 10, 2019

Managing lists

How can I manage a list in Sentinel?

For instance, I have a list of known assets that holds hundreds of entries, and when a search runs I would like to check whether there is a hit against that list.

Obviously, a solution similar to the one below is not practical:

let List = datatable(Account:string, Domain:string)
["john", "johnsdomain.com",
 "greg", "gregsdomain.net",
 "larry", "Domain"];

The same goes for IOCs I have found in my environment that I would like to search for hits on.

  • MiteshAgrawal
    Brass Contributor
    Hi, did you get an answer to your query? I also have thousands of IOCs to be used in rules to check for a match. And if using Blob storage isn't an option (we want to read data from a file stored locally on the system), what should we do?

    Regards,
    Mitesh Agrawal
  • omrip 

     

    Option 1 - you can use in or !in to include or exclude

    let List = datatable(Account:string, Domain:string)
    ["john", "johnsdomain.com",
     "Demo", "gregsdomain.net",
     "larry", "Domain"];
    SigninLogs
    // match Identity against the Account column of the list
    | where Identity in ((List | project Account))
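
    For the exclusion case, !in works the same way. A minimal sketch, assuming you want to drop any sign-in whose Identity appears in the list's Account column:

    let excludeList = datatable(Account:string, Domain:string)
    ["john", "johnsdomain.com",
     "Demo", "gregsdomain.net"];
    SigninLogs
    // keep only sign-ins that are NOT on the exclusion list
    | where Identity !in ((excludeList | project Account))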
    

    Option 2 - you can use a JOIN as well 

    let masterList = dynamic (['GB', 'US']);  // setup a master list of country codes
    SigninLogs
    | where TimeGenerated >= ago(1d)
    | summarize perIdentityAuthCount=count() by Identity,  
                locationString= strcat(tostring(LocationDetails["countryOrRegion"]),
                 "/", tostring(LocationDetails["state"]), "/",
                 tostring(LocationDetails["city"]), ";" ,
                 tostring(LocationDetails["geoCoordinates"])),
                 countryString= strcat(tostring(LocationDetails["countryOrRegion"]))
    // filter on masterList of country codes, exclude those on the list
    | where countryString !in (masterList)
    | summarize distinctAccountCount = count(), identityList=makeset(Identity), t = tostring(masterList)  by locationString
    | extend identityList = iff(distinctAccountCount<10, identityList, "multiple (>10)")
    | join kind= anti (
        SigninLogs
        | where TimeGenerated < ago(1d)
        | project   locationString= strcat(tostring(LocationDetails["countryOrRegion"]), "/", tostring(LocationDetails["state"]), "/", 
            tostring(LocationDetails["city"]), ";" , tostring(LocationDetails["geoCoordinates"]))
        | summarize priorCount = count() by locationString) on locationString
    | where distinctAccountCount >= 1 // select threshold above which #new accounts from a new location is deemed suspicious
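
    Note that the join in Option 2 is an anti-join against prior sign-ins rather than against the list itself. If you want to join directly against your own asset list, a minimal sketch (the list contents, column names and inner-join choice are illustrative assumptions):

    let assetList = datatable(Account:string, Domain:string)
    ["john", "johnsdomain.com",
     "Demo", "gregsdomain.net"];
    SigninLogs
    | where TimeGenerated >= ago(1d)
    // keep only sign-ins whose Identity appears on the asset list
    | join kind=inner (assetList) on $left.Identity == $right.Account
    | project TimeGenerated, Identity, Domain, ResultType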

     

    Option 3 - create a group/list with a query and compare it to another table 

     

    // First create a list of Linux machines whose names start with "aks" (e.g. aksnnnnnnn)
    let myLinuxGrp = toscalar(Heartbeat 
    | where OSType == "Linux" and Computer startswith "aks" 
    | summarize make_set(Computer));   
    Syslog
    | where TimeGenerated > ago(60m) 
    | where myLinuxGrp contains Computer 
    | project myLinuxGrp, Computer , SyslogMessage 
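
    One caveat with Option 3: contains does a substring match against the stringified set, so a Computer name that is a substring of another entry could match unintentionally. A variant of the same query using the in operator for exact membership would look like this:

    // Same "aks" Linux list, but exact membership with the in operator
    let myLinuxGrp = toscalar(Heartbeat
    | where OSType == "Linux" and Computer startswith "aks"
    | summarize make_set(Computer));
    Syslog
    | where TimeGenerated > ago(60m)
    | where Computer in (myLinuxGrp)
    | project Computer, SyslogMessage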

      

    • omrip
      Copper Contributor

      CliveWatson 

      Thanks.

      1. I do not think you understood my intention.

      I have hundreds of endpoints and would like to build a large table/file of my known assets and run checks against it.

      Besides managing it locally and using the _json approach you mentioned, is there a way to upload the file to Azure and query on top of that?

      2. What do I do when I am managing a large list of IOCs? If I run a search and find an IOC that is not yet in the list, I would like to ingest that new IOC into the list.

      Again, in neither case do I want to manage these locally; I want to manage them in Azure.

       

      • CliveWatson
        Microsoft

        Hi omrip 

         

        I'm struggling to understand what you are asking here, so sorry for asking again.

        Are you trying to read from a file? If so, see https://cloudblogs.microsoft.com/industry-blog/en-gb/cross-industry/2019/08/13/azure-log-analytics-how-to-read-a-file/ for an example. If you are trying to create a file from Log Analytics, you can't do that; only reading from a file is possible, using the externaldata operator as per my example. You can also build lists on the fly / at run time with a datatable, as shown.

        If it's a file you need to upload, perhaps on a schedule, you might need to use Logic Apps to control that workflow/process, and then read from it with externaldata and parse the JSON (if it's JSON).
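
        A minimal externaldata sketch, assuming the asset list is published as a CSV at a URL the workspace can reach (the URL and column names below are placeholders):

        // Read a two-column CSV of known assets from external storage (placeholder URL)
        let KnownAssets = externaldata (Account:string, Domain:string)
        [ "https://example.blob.core.windows.net/lists/known-assets.csv" ]
        with (format="csv", ignoreFirstRecord=true);
        SigninLogs
        | where Identity in ((KnownAssets | project Account))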
