Microsoft Defender for Cloud export to Azure Data Explorer

Are you interested in sending your Microsoft Defender for Cloud logs to Azure Data Explorer? The usual pattern would be to use the continuous export functionality in Azure Log Analytics, but the Defender logs aren’t currently supported there, so let’s take a look at another option. Did you know that Microsoft Defender for Cloud has Continuous Export built into the product?

 

Continuously export Microsoft Defender for Cloud data

 

Microsoft Defender for Cloud Continuous Export

So let’s walk through the steps to use this feature to export the data to Event Hub and then ingest it into Azure Data Explorer.

 

  1. In the Azure Portal open the Microsoft Defender for Cloud blade.
  2. Under “Management”, select “Environment settings”.

bwatts670_0-1652807796185.png

  3. Find the subscription that you want to enable this for and select it.
  4. Select the “Continuous export” tab and configure your export to an existing Event Hub. I’m exporting “Security recommendations”, “Secure score”, “Security alerts”, and “Regulatory compliance”.
  5. Click “Save” and your export will be set up.

 

Azure Data Explorer Configuration

Now we need to set up the ingestion from Event Hub to Azure Data Explorer. At a very high level, the process is to:

  • Utilize the OneClick UI to set up the ingestion of data from Event Hub into a raw table.
  • Create tables for the different log types (“Security recommendations”, “Secure score”, “Security alerts”, and “Regulatory compliance”)
  • Use update policies to transform the data from the raw table to the appropriate destination table
  • Modify the retention period on the raw table to a few hours

 

OneClick to Configure Ingestion from Event Hub

If you haven’t used OneClick Ingestion, I would suggest watching the following video.

 

Azure Data Explorer One Click Ingestion for Azure Event Hub - YouTube

 

  1. Go to https://dataexplorer.azure.com and connect to your ADX cluster.
  2. Right-click the database where you wish to ingest the Microsoft Defender for Cloud logs and select “Ingest new data”.

bwatts670_2-1652807796197.png

 

 

  3. In the “Destination” tab we’ll create a new table. I called mine defenderraw, but you can use whatever name you wish.

bwatts670_3-1652807796202.png

 

  4. In the “Source” tab, select “Event Hub” for the type and then choose the Event Hub you’re exporting the logs to.

 

bwatts670_0-1652822427970.png

 

 

  5. Once it pulls some messages from the Event Hub, modify your data type from string to dynamic.

bwatts670_5-1652807796218.png

 

  6. Click “Next: Start Ingesting” to create the table, ingestion mapping, and the data connection to Event Hub!
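Before moving on, it can help to confirm that events are actually arriving in the raw table. A quick check, assuming you named the table defenderraw and kept a single dynamic column called data as in the steps above:

// Confirm that raw events are landing and see which event types are present
defenderraw
| summarize Events = count() by Type = tostring(data.Type)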

 

Update Policies

Now that you have the raw data landing in ADX you probably want to fork that out to individual tables via update policies based on the type of event. Below we’ll walk through the steps for one of those event types, subAssessments.

 

Here are the high-level steps we’ll take:

  1. Create the KQL query to filter and parse the event type
  2. Create the destination table based on this query
  3. Create a function to use during the update policy
  4. Create the update policy on our newly created table

 

KQL Query

So let’s start with the KQL query. The first step is to filter down to only the event types that we want to see. We can utilize the data.Type property within the payload to do this:

 

 

defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'

 

 

Now that we have the events filtered, we want to turn the properties in the data column (dynamic JSON) into individual columns. You could create individual columns using data.<field>, but there is an easier way: KQL has the bag_unpack plugin that will do all the hard work for us.
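(For reference, the manual data.<field> approach would look something like this minimal sketch; the field names are simply the ones projected later in this article.)

// Manual alternative to bag_unpack: extract each field from the dynamic payload yourself
defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| extend Id = tostring(data.Id), Name = tostring(data.Name), Properties = data.Properties
| project-away data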

 

 

defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)

 

 

This results in the JSON being flattened one level without you having to reference each field individually.

bwatts670_6-1652807796220.png

Before creating a new table based on the query, let’s check the data types of the columns and determine which columns we want.

 

 

defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| getschema

 

 

 

bwatts670_7-1652807796225.png

 

The final step is to select only the columns that we want:

 

 

defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId

 

 

Create Table

Once you have the query that will be used in your update policy, there is a really simple method to create the table: take 0 rows from the query (so it just returns the schema) and use that in a .set command.

 

 

.set subAssessments <|
defenderraw
| where data.Type == 'Microsoft.Security/assessments/subAssessments'
| evaluate bag_unpack(data)
| project Type, Id, Name, Properties, SecurityEventDataEnrichment, SubAssessmentEventDataEnrichment, TenantId
| take 0

 

 

Running this will create a table called subAssessments with the correct schema.
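If you want to double-check the result, you can inspect the schema of the new, empty table (a quick sanity check; any of the usual schema commands will do):

// Inspect the schema of the newly created (empty) table
subAssessments
| getschema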

 

Create the Function

Now we’ll take this query, minus the take 0, and create a function.

 

 

.create function updateSubAssessments() {
    defenderraw
    | where data.Type == 'Microsoft.Security/assessments/subAssessments'
    | evaluate bag_unpack(data)
    | project Type, Id, Name, Properties, SecurityEventDataEnrichment, 
    SubAssessmentEventDataEnrichment, TenantId
}

 

 

Now we have a function called “updateSubAssessments” that can be used in our update policy.
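You can test the function on its own before wiring it into the update policy, for example:

// Run the function directly to confirm it returns the expected columns
updateSubAssessments()
| take 10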

 

Create the Update Policy

The last step is to create an update policy that will monitor the raw table, grab any subAssessments events, run the function, and place the results in our subAssessments table.

 

 

.alter table subAssessments policy update @'[{"IsEnabled": true, "Source": "defenderraw", "Query": "updateSubAssessments()", "IsTransactional": true, "PropagateIngestionProperties": false}]'

 

 

Once the update policy is in place, any new “subAssessments” event type that lands in “defenderraw” will automatically be parsed and placed in the “subAssessments” table.
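You can confirm the policy was applied with:

// Show the update policy attached to the destination table
.show table subAssessments policy update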

Follow these steps for any additional event types that you wish to parse from the “defenderraw” table.
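As an illustration of repeating the pattern, here is what the same set of commands might look like for the security recommendations event type. This is a sketch only: the data.Type string and the projected columns below are assumptions, so verify the exact type value in your own raw data (for example with defenderraw | distinct tostring(data.Type)) and use getschema to pick the columns you want. Run each command separately.

// 1. Create the destination table from the query schema (0 rows)
.set assessments <|
defenderraw
| where data.Type == 'Microsoft.Security/assessments'   // assumed type string; verify against your raw data
| evaluate bag_unpack(data)
| project Type, Id, Name, Properties, TenantId          // assumed column list; adjust based on getschema
| take 0

// 2. Wrap the same query (minus take 0) in a function
.create function updateAssessments() {
    defenderraw
    | where data.Type == 'Microsoft.Security/assessments'
    | evaluate bag_unpack(data)
    | project Type, Id, Name, Properties, TenantId
}

// 3. Attach the update policy to the new table
.alter table assessments policy update @'[{"IsEnabled": true, "Source": "defenderraw", "Query": "updateAssessments()", "IsTransactional": true, "PropagateIngestionProperties": false}]'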

 

Verify Results

You need to wait until at least one event of type “subAssessments” lands in the “defenderraw” table. Once events have landed in the raw table you should be able to query the “subAssessments” table to verify that parsed events are landing correctly.
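For example, a quick look at a sample of the parsed events:

// Look at a sample of the parsed subAssessments events
subAssessments
| take 10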

 

bwatts670_8-1652807796231.png

 

Modify Retention Settings

Once you have the data forked to individual tables, you’ll want to adjust the retention setting on the raw table. You can go all the way down to 0 retention if you wish and the update policies will still work, but my suggestion would be to keep a few hours so that you can troubleshoot or look at the raw data coming in if needed. The command below sets the raw table to 2 hours of retention:

.alter table defenderraw policy retention "{\"SoftDeletePeriod\": \"00.02:00:00\", \"Recoverability\": \"Enabled\"}"
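To confirm the change took effect, you can view the policy afterwards:

// Show the retention policy now applied to the raw table
.show table defenderraw policy retention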

Summary

We’ve walked through the process of sending Microsoft Defender for Cloud logs to Azure Data Explorer using continuous export to Azure Event Hub. At a high level, you need to configure:

  1. Continuous Export of data from Microsoft Defender for Cloud to Event Hub
  2. Data ingestion from Azure Event Hub to Azure Data Explorer (https://docs.microsoft.com/en-us/azure/data-explorer/ingest-data-event-hub)
  3. Azure Data Explorer Update Policy to fork the data into appropriate tables (https://docs.microsoft.com/en-us/azure/data-explorer/kusto/management/updatepolicy)

For the purposes of this article we walked through the manual steps, but all of this can be automated using scripts or a Bicep template.

 

https://docs.microsoft.com/en-us/azure/templates/microsoft.kusto/clusters?tabs=bicep

 

Even the tables, functions, update policies, etc. can be created in the template using configuration KQL scripts.

 

https://docs.microsoft.com/en-us/azure/data-explorer/database-script

 
