Enriching Azure Sentinel with Azure AD information
Published Apr 10 2020 03:00 PM

Howdy Everyone!

 

As organizations migrate to Azure Sentinel as their primary SIEM solution, they are looking for ways to enrich their data, for example by associating Azure Activity logs or Office 365 data with an organizational unit derived from Azure AD. Such enrichment makes it possible to filter the information by organizational unit.

 

In this walk-through, I will demonstrate how to import Azure AD information, which includes the subsidiary association of users as an extension property, into Azure Sentinel. I will then use a join query to enrich event data and use the enrichment information to split up events by subsidiary. This will enable me to send the data to a separate workspace representing the subsidiary-owned data store.

 

Special Thanks!

Yaniv Shasha - Thank you for the assistance with creating the automation scripts with Azure Functions.

https://www.linkedin.com/in/yanivshasha/ 

Jim Britt - Thank you for co-working this problem with me and for your PS module.

https://www.linkedin.com/in/jimbritt/

 

Requirements:

Workspace and Log Permissions:

You'll need API permissions for the Log Analytics workspace (more details here); I'll personally be using the Log Analytics Contributor role. For my example I'm also using AAD permissions, so please make sure you have the appropriate permissions when pulling your data; see the AAD roles here if you're pulling data from Azure AD.
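If you still need to grant that role on the workspace, a minimal sketch with the Az PowerShell module looks like the following (the sign-in name and scope are placeholders for your own values; assumes you've already run Connect-AzAccount):

# Grant the Log Analytics Contributor role at the workspace scope (placeholder values)
New-AzRoleAssignment -SignInName "analyst@contoso.com" `
    -RoleDefinitionName "Log Analytics Contributor" `
    -Scope "/subscriptions/<SUBSCRIPTION ID>/resourceGroups/<RESOURCE GROUP>/providers/Microsoft.OperationalInsights/workspaces/<WORKSPACE NAME>"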

 

PS modules I'm using with PowerShell:

  • AzureAD (for Get-AzureADUser)
  • OMSIngestionAPI (for Send-OMSAPIIngestionFile)
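If you don't already have these modules, they can be installed from the PowerShell Gallery; a quick sketch:

# Install the modules used in this walk-through
Install-Module -Name AzureAD -Scope CurrentUser
Install-Module -Name OMSIngestionAPI -Scope CurrentUser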

 

Walking through the test AAD Ingestion:

Based on the goal above, I'm looking at getting all the tenant's AAD attributes into Azure Sentinel to enrich my current dataset. Since there is no native "data connector" for AAD attributes, we'll have to build our own connector.

 

We have a few options for ingesting data into Log Analytics, all centered on the API that Log Analytics offers. Examples are a data connector with Logic Apps, a PS module, or even the raw API with scripting. We'll start with PowerShell and work on automation after. Playbooks can be a simple option as well; we could even automate pulling this information with a playbook, although Azure Automation or scripting can be cheaper if you're expecting to query, format, and push lots of data into a Log Analytics workspace. I'm planning on submitting unformatted information into Log Analytics from AAD.

 

Starting with PowerShell:

This section walks through the process of manually running PowerShell to ingest data. The scripts mentioned in this walk-through will be added to the Azure Sentinel GitHub community: Link to Community.

 

3 min video explaining the process through PowerShell:

To get started, we'll need to connect to Azure Active Directory via the AzureAD PowerShell module.

 

 

 

Connect-AzureAD -AccountId <UPN> -TenantId <Tenant GUID>

 

 

 

Connect.png

I'm going to collect all the Azure AD users and store them in a variable, $UsersCollected:

 

 

$UsersCollected = Get-AzureADUser -All $True

 

 

Feel free to look at the variable; we need this data to be in JSON format to ingest it cleanly, so I'm going to make another variable to simplify my later scripting.

 

 

$JsonUserFormat = ConvertTo-Json $UsersCollected
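As a quick sanity check, you can also peek at a single user's ExtensionProperty bag to confirm the custom attribute came back with the collection (a sketch; the GUID-named key will differ per tenant):

# Inspect the extension attributes on the first collected user
$UsersCollected | Select-Object -First 1 -ExpandProperty ExtensionProperty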

 

 

Now that all the information has been collected from the Azure AD tenant, we need to test the ingestion into the Log Analytics workspace Azure Sentinel is using. You'll need the following information:

  • $CustomerID : <Log Analytics WORKSPACE ID>
  • $SharedKey : <Log Analytics Primary Key>
  • $JsonUserFormat
  • $logType = "AADLogTest"
    • Important: this is the custom table the logs will be ingested into.

(Optional)

  • $TimeStampfield = Get-Date
    • Will show the time of the ingestion

Copy and paste the below and update it with your workspace information:

 

 

$CustomerID = '<WORKSPACE ID>'
$SharedKey = '<WORKSPACE KEY>'
$logType = "<WORKSPACE CUSTOM TABLE NAME>"
$TimeStampfield = Get-Date

 

 

Variables.png

Now, using the OMSIngestionAPI module, send the data to the Log Analytics workspace.

 

 

Send-OMSAPIIngestionFile -customerId $customerId -sharedKey $sharedKey -body $JsonUserFormat -logType $logType -TimeStampField $Timestampfield

 

 

OMSAPI.png

Accepting the data will be very quick; the very first time, it can take 20-30 minutes for the table to be built in the Log Analytics workspace.
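If you'd like to poll for the new custom table from PowerShell rather than the portal, a minimal sketch (assuming the Az.OperationalInsights module and an authenticated Connect-AzAccount session):

# Check whether the custom table has started receiving rows
$check = Invoke-AzOperationalInsightsQuery -WorkspaceId '<WORKSPACE ID>' -Query 'AADLogTest_CL | take 5'
$check.Results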

Looking at the data within Azure Sentinel's Log Analytics workspace:

 

AADLogTest.png

You'll see above that the custom logs table has been created, and I was able to query the information.

 

An example user in my domain is Adele Vance. This user has been preconfigured with an ExtensionProperty that tells us her home office location, and we're interested in enriching our Azure Sentinel alerts with this property.

Looking up her name, we find her Azure Active Directory attribute information, then filter the query down to the fields we're currently interested in:

 

 

 

AADLogTest_CL
| where UserPrincipalName_s contains "adelev"
| project UserPN = UserPrincipalName_s, ObjectId = ObjectId_g, UserType = UserType_s, CompanyLocation = ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s

 

 

AdelevTest.png

Now that we can see the information we want, let's join the "CompanyLocation" property, via the UPN, with other data sets. We're going to use SigninLogs, since there have been complaints about multiple failed sign-ins to her account with the wrong password.

 

Here is an example of joining the data together with SigninLogs:

 

 

SigninLogs
| where UserPrincipalName contains "adelev"
| project UserPrincipalName_s = UserPrincipalName, Status, Location, IPAddress, Identity, ResultDescription
| join kind= inner (
   AADLogTest_CL
) on UserPrincipalName_s
| project Username_S = UserPrincipalName_s, Status, Location, IPAddress, Identity, ResultDescription, ObjectId = ObjectId_g, UserType = UserType_s, CompanyLocation = ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s

 

 

AdelevTestwSigninLogs.png

Going one step further, we could look at multiple data sets and query out the information based on the UPN's location:

 

An example would be looking at SigninLogs, OfficeActivity, and AuditLogs.

 

 

let UserLocation = "US - Texas";
let T1SigninLogs =
SigninLogs
| extend UserPrincipalName_s = UserPrincipalName
| join kind= inner (
   AADLogTest_CL
   | where TimeGenerated >= ago(7d)
   | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation  
) on UserPrincipalName_s;
let T2AuditLogs =
AuditLogs
| extend UserPrincipalName_s = tostring(parse_json(tostring(InitiatedBy.user)).userPrincipalName)
| join kind= inner (
   AADLogTest_CL
   | where TimeGenerated >= ago(7d)
   | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation  
) on UserPrincipalName_s;
let T3OfficeActivity =
OfficeActivity 
| extend UserPrincipalName_s = UserId
| join kind= inner (
   AADLogTest_CL
   | where TimeGenerated >= ago(7d)
   | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == UserLocation  
) on UserPrincipalName_s; 
union T1SigninLogs, T2AuditLogs, T3OfficeActivity

 

 

 

Azure Sentinel Playbook:

4 Min video walking through Playbook design:

 

Now that we're able to see the data, let's build an Azure Sentinel playbook to send the data into the US - Texas location's Log Analytics workspace. This could also be accomplished via Azure Automation or PowerShell with the APIs, but an easier way to visually see what's going on is a playbook (Logic Apps) with Azure Sentinel.

Go into Azure Sentinel Playbooks, create a new playbook, and decide how you want to start it. Popular choices are "Recurrence", "HTTP request", and "Alert triggered with Azure Sentinel".

For this example, let's use Recurrence, every 30 minutes.

Recurrence.png

 

In this example, we're not going to dynamically fill the Location value, although this could be accomplished through some type of pull, an alert, or an HTTP request. Skipping that part, let's set up a "Location Value" with the Variables connector. Select "Initialize Variable".

Configure the following:

  • Name
  • Type : String
  • Value
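For this walk-through, the values could look like the following (the variable name "Location" is assumed by the query used later):

  • Name: Location
  • Type: String
  • Value: US - Texas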
 

Now, to collect the information, we could use the Log Analytics API, although I'm going to use the Log Analytics connector to query the information and simplify the process. Find "Azure Monitor Logs" and select "Run query and list results".

Configure the following (we're using the workspace the AAD information has been sent to):

  • Subscription: Workspace Sub
  • ResourceGroup: Workspace Resource Group
  • ResourceType : Log Analytics Workspace
  • ResourceName: Workspace Name
  • Query Example:

 

 

SigninLogs
| where TimeGenerated >= ago(30m)
| extend UserPrincipalName_s = UserPrincipalName
| join kind= inner (
   AADLogTest_CL
   | where TimeGenerated >= ago(7d)
   | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == "@{variables('Location')}"
) on UserPrincipalName_s

 

 

  • Time Range: Set in query

LogAQuery (2).png

In the query, we've configured it to look only at SigninLogs to test for sign-in failures; as I showed earlier, this could be expanded to include multiple tables when pulling information. We don't care which user it is, only whether that user's ExtensionProperty_UserLocation matches "US - Texas". I'm planning on updating my AAD table only once a week, which is why the join looks back 7 days, while I only want the past 30 minutes of sign-in information to limit my window. The window could be bigger or smaller, although matching it to the recurrence interval helps prevent pulling the same information over repeatedly.

 

Now, compose the data to be ingested into Log Analytics again; the information I want is the "value" array collected from the query. It will look something like this:

 

Compose.png

Now we send the data over to another workspace (or the same workspace). We're going to build a new custom table for the enrichment information called "USTexasData" and send the outputs we've filtered into it.

Configure the following (using the workspace you want to send the filtered information to):

  • JSON Request Body: Outputs from Compose
  • Custom Log Name: whatever custom log name you want to use
  • Connect to: Confirm the connection you want to use; make sure the account you're using has the correct access to write to this workspace when building that connection.

Example:

SendData.png
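If you'd rather script this step instead of using the connector, a rough sketch of the same flow in PowerShell might look like the following (assuming the Az.OperationalInsights and OMSIngestionAPI modules, an authenticated Connect-AzAccount session, and placeholder workspace values; the query mirrors the one above):

# Query the source workspace for the filtered sign-ins, then push them to the target workspace
$query = @'
SigninLogs
| where TimeGenerated >= ago(30m)
| extend UserPrincipalName_s = UserPrincipalName
| join kind= inner (
   AADLogTest_CL
   | where TimeGenerated >= ago(7d)
   | where ExtensionProperty_extension_1fdbfe7a3f344194b08d177c8125ef04_UserLocation_s == "US - Texas"
) on UserPrincipalName_s
'@
$results = (Invoke-AzOperationalInsightsQuery -WorkspaceId '<SOURCE WORKSPACE ID>' -Query $query).Results
$body = ConvertTo-Json @($results)
Send-OMSAPIIngestionFile -customerId '<TARGET WORKSPACE ID>' -sharedKey '<TARGET WORKSPACE KEY>' -body $body -logType 'USTexasData' -TimeStampField (Get-Date)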

To create some noise, attempt a few unsuccessful sign-ins to the account we're monitoring (Adele's account in this example). After the 30-minute window, we should see a new collection of data automatically land in the table configured in the logic app (USTexasData_CL):

 

USTexasData.png

 

We've now successfully filtered out the users that are at the US - Texas location and sent these logs over to a new custom table.

 

If anyone has any other requests for topics to blog about, please feel free to reply to this blog post.

 

Thanks!

 

Chris Boehm

Customer Experience Engineering - Azure Sentinel Team.

Linkedin Profile 

15 Comments
Copper Contributor

Good long source. thanks

Silver Contributor

Seeing some examples with data from the 3rd party connectors would be interesting. 

Microsoft

@Dean Gross - Specifically with AAD Enrichment with a 3rd party connector? Would be more than happy to show an example, is there a specific one you're thinking of? 

Silver Contributor

@Chris Boehm  not necessarily AAD Enrichment, but if there is something that could be done with Okta, Ping, or Duo that could be interesting. I was actually thinking about examples with sources like AWS, F5, Barracuda, Checkpoint etc. Maybe something from each of the major categories of connector types. 

Copper Contributor

@Chris Boehm  Hey Chris, have you added the AAD ingestion script to GitHub? I would appreciate it if you could add it there, please.

Copper Contributor

Hi Chris,

 

Thanks for sharing the command with details. I have tried it for sample data and bulk data.

For sample data it works fine, but for bulk data, for example more than 2,500 entries, it does not work and throws errors.

Command used:

Install-Module -Name OMSIngestionAPI
$MyCredential = Get-Credential
$TenantID = 'TenantID'
Connect-AzureAD -TenantId $TenantID -Credential $MyCredential
$UserCollected = Get-AzureADUser -All $True
$JsonUserFormat = ConvertTo-Json $UserCollected
$customerId = 'workspace ID of sentinel'
$sharedKey = 'shared key of sentinel'
$logType = 'TableName'
$TimeStampfield = Get-date
Send-OMSAPIIngestionFile -customerId $customerId -sharedKey $sharedKey -body $JsonUserFormat -logType $logType -TimeStampField $Timestampfield
 
Is there any way to split the data in the $UserCollected variable and use it in chunks of 2,500 users per execution?
Microsoft

@milanthumar ,

 

I ran into this myself when testing 50,000 users. I had to break it down with a count -> between 0-49, 50-99, etc.

 

$UsersCollected = Get-AzureADUser -All $True
$TotalCount = $UsersCollected.count

 

# Example of ingesting data 10 at a time into Log Analytics
For ($number = 0; $number -le $TotalCount; $number+=10)
{
$TimeStampfield = Get-date
$TopNumber = $number+10
$JSONUser += ConvertTo-Json $UsersCollected[$number..$TopNumber] -Compress -ErrorAction Stop
Send-OMSAPIIngestionFile -customerId $customerId -sharedKey $sharedKey -body $JSONUser -logType $logType -TimeStampField $Timestampfield -Verbose
}

 

Copper Contributor

Great, thanks @Chris Boehm 

#Given script worked

#Finally, I could ingest data in bulk with the script after a small correction: $JSONUser = instead of $JSONUser +=

 

$UsersCollected = Get-AzureADUser -All $True
$TotalCount = $UsersCollected.count

$I = <Constant Number>

For($number = 0; $number -le $TotalCount; $number+=$I)

{

$TopNumber = $number+$I

$TimeStampfield = Get-date

$JSONUser = ConvertTo-Json $UsersCollected[$number..$TopNumber] -Compress -ErrorAction Stop

Send-OMSAPIIngestionFile -customerId $customerId -sharedKey $sharedKey -body $JsonUser -logType $logType -TimeStampField $Timestampfield -Verbose

}

Copper Contributor

Hi All,

Thank you for nice article and discussion.

Is there any way to schedule the AD enrichment using an Azure Function, Logic App, or PowerShell script so that we don't need to interactively key in credentials, and run it once every day?

Copper Contributor

Thank you @Chris Boehm, I tested this with a scheduled Azure Automation PowerShell runbook with the workspace key stored in Azure Key Vault. It works like a charm!

Copper Contributor

Are there any methods that work to purge the custom logs prior to uploading? Our thought is that we will periodically run the script to send up a fresh set of logs (i.e., new users); however, duplicate entries will be found in the table on successive runs.

 

 

Copper Contributor

Thanks, @Hds. I’ll look into this.

Copper Contributor

After you enable UEBA for your Azure Sentinel workspace, data from your Azure Active Directory is synchronized to the IdentityInfo table in Log Analytics for use in Azure Sentinel. You can embed user data synchronized from your Azure AD in your analytics rules to enhance your analytics to fit your use cases and reduce false positives.

https://docs.microsoft.com/en-us/azure/sentinel/ueba-enrichments#identityinfo-table-public-preview 

Copper Contributor

Very useful article, thanks!

 

@Hds did you use the AzureAD module, or the Az.Accounts module to gather the data?

If the former, given its upcoming deprecation, are you switching to Microsoft.Graph.Users, or another method?

 

I'm about to create a similar runbook, but of course if I can re-use someone's already-functioning effort it will save some time! :) Will be taking the extra advice on data purging onboard too.
