Azure Sentinel To-Go (Part 1): A Lab w/ Prerecorded Data 😈 & a Custom Logs Pipe via ARM Templates 🚀
Published Mar 27 2020

 

Recently, I started working with Azure Sentinel, and as with any other technology I want to learn more about, I decided to explore a few ways to deploy it. I got a grasp of the basic architecture and became more familiar with it. As a researcher, I also like to simplify deployments in my lab environment and usually look for ways to implement the infrastructure I work with as code. Therefore, I started to wonder if I could automate the deployment of an Azure Sentinel solution via a template or a few scripts. Even though it made sense to expedite the deployment of the solution, I realized I still did not have data or other resources to play with. Then, I wondered if I could integrate the deployment of an Azure Sentinel instance and other resources through the same scripts or templates, covering different scenarios.

 

In the end, this approach allows me to also share the process with others in the community in a more practical way.

 

This post is part of a four-part series where I will show you how to deploy your own Azure Sentinel solution in a lab environment via Azure Resource Manager (ARM) templates, along with a custom logs ingestion pipeline to consume pre-recorded datasets and other resources, such as network environments, for research purposes.

 

In this post, I show you how to use ARM templates to deploy an Azure Sentinel solution and ingest pre-recorded datasets via a Python script, Azure Event Hubs, and a Logstash pipeline.

 

The other parts of this series can be found in the following links:

 

What is Azure Sentinel?

Microsoft Azure Sentinel is a scalable, cloud-native, security information and event management (SIEM) and security orchestration automated response (SOAR) solution. It is an Azure service that empowers organizations to bring together disparate data sources from resources hosted both on-premises and in multiple clouds, and to detect, investigate, and respond to threats.

 

If you want to learn more about Azure Sentinel, I recommend exploring this Microsoft Azure document page. Also, if you want to know what you can do with it, make sure you read the articles available in the Microsoft Tech Community Sentinel blog and take a look at these awesome webinars.

Deploying Azure Sentinel

Technically, all we need to do to deploy an Azure Sentinel solution is:

  • Create a Log Analytics workspace: Azure Sentinel leverages the Azure Monitor Log Analytics workspace to store the data it collects.
  • Enable Azure Sentinel: This is enabled on top of the workspace.

That basic setup allows you to explore all the main features of Azure Sentinel as well as preloaded out-of-the-box resources such as queries, visualizations, response playbooks, and notebooks. You could also upload other resources and even enable data connectors in Azure Sentinel via code. Javier Soriano blogged about it in this post, and it is a great reference for production deployments.

One of the things I wanted to do differently for this post was to execute the Azure Sentinel onboarding steps in a declarative way with Azure Resource Manager (ARM) templates, without having to run PowerShell commands.

 

Azure Resource Manager (ARM) Templates?

 

To implement infrastructure as code for your Azure solutions, use Azure Resource Manager templates. The template is a JavaScript Object Notation (JSON) file that defines the infrastructure and configuration for your project. The template uses declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it.
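Structurally, a template is just a JSON file with a handful of well-known top-level sections. An empty skeleton looks like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {},
  "variables": {},
  "resources": [],
  "outputs": {}
}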

The Azure Resource Manager is the deployment and management service for Azure, and below you can see some of the ways you can interact with it.

 

https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/overview

 

A few things that I like about ARM templates are the orchestration capabilities to deploy resources in parallel, which is faster than serial deployments, and the ability to track deployments via the Azure portal.

 

https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview#why-choose-resource-manager-templates

 

Additional Reading

 

On-boarding Azure Sentinel with ARM Templates

 

Now that we know a little bit more about Azure Resource Manager, we are ready to deploy Azure Sentinel. One document that I recommend getting familiar with, to learn how Azure resources map to ARM template resource types, is this one. In this section, we are going to deploy a Log Analytics workspace and enable Azure Sentinel. Remember that I provide the template for you so that you can follow along.

1. Deploying a Log Analytics Workspace ARM Template

 

A Log Analytics workspace can be found under the Microsoft.OperationalInsights resource types as Microsoft.OperationalInsights/workspaces:

 

{
  "name": "string",
  "type": "Microsoft.OperationalInsights/workspaces",
  "apiVersion": "2015-11-01-preview",
  "location": "string",
  "tags": {},
  "properties": {
    "sku": {
      "name": "string"
    },
    "retentionInDays": "integer"
  },
  "resources": []
}

 

I created an initial template with some parameters to make it modular for anyone to use.
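A minimal sketch of that initial template (the full version is in the repo; the parameter and variable names here are illustrative, not necessarily the exact ones the project uses) could look like this. Note the uniqueString() suffix appended to the workspace name:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "workspaceName": {
      "type": "string",
      "metadata": { "description": "Name for the Log Analytics workspace" }
    },
    "pricingTier": {
      "type": "string",
      "defaultValue": "PerGB2018"
    },
    "dataRetention": {
      "type": "int",
      "defaultValue": 30
    }
  },
  "variables": {
    "workspace": "[concat(parameters('workspaceName'), uniqueString(resourceGroup().id))]"
  },
  "resources": [
    {
      "name": "[variables('workspace')]",
      "type": "Microsoft.OperationalInsights/workspaces",
      "apiVersion": "2015-11-01-preview",
      "location": "[resourceGroup().location]",
      "properties": {
        "sku": { "name": "[parameters('pricingTier')]" },
        "retentionInDays": "[parameters('dataRetention')]"
      }
    }
  ]
}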

 

 

2. Enabling Azure Sentinel ARM Template

 

Next, I needed to define the Azure Sentinel solution and enable it on top of the Log Analytics workspace. You can do it with a resource type found under the Microsoft.OperationsManagement resource types as Microsoft.OperationsManagement/solutions:

 

{
  "name": "string",
  "type": "Microsoft.OperationsManagement/solutions",
  "apiVersion": "2015-11-01-preview",
  "location": "string",
  "tags": {},
  "plan": {
    "name": "string",
    "publisher": "string",
    "promotionCode": "string",
    "product": "string"
  },
  "properties": {
    "workspaceResourceId": "string",
    "containedResources": [
      "string"
    ],
    "referencedResources": [
      "string"
    ]
  }
}

 

I added that resource to our initial ARM template to produce the final result.
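A sketch of the added Azure Sentinel solution resource (again illustrative; the repo has the exact final template), wired to the workspace via dependsOn, could look like this:

{
  "name": "[concat('SecurityInsights(', variables('workspace'), ')')]",
  "type": "Microsoft.OperationsManagement/solutions",
  "apiVersion": "2015-11-01-preview",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.OperationalInsights/workspaces', variables('workspace'))]"
  ],
  "plan": {
    "name": "[concat('SecurityInsights(', variables('workspace'), ')')]",
    "publisher": "Microsoft",
    "product": "OMSGallery/SecurityInsights",
    "promotionCode": ""
  },
  "properties": {
    "workspaceResourceId": "[resourceId('Microsoft.OperationalInsights/workspaces', variables('workspace'))]"
  }
}

The plan product OMSGallery/SecurityInsights is what identifies the Azure Sentinel solution being enabled on the workspace.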

 

 

That’s it! You can download it and use it for the next steps.

Executing ARM Templates

 

There are a few ways to execute ARM templates, and it all depends on how comfortable you are with the Azure portal and Azure toolkits (e.g., the Azure CLI).

Prerequisites

  • An active Azure subscription: If you don’t have one, create a free account. You might be eligible for some free credits for the first 30 days.
  • A resource group: A container that holds related resources for an Azure solution. You can use an existing one, but if this is your first time playing with Azure resources, you can create one following these instructions. You can also do it while deploying an ARM template via the Azure portal.

Option 1: Using Azure CLI

 

If you want to use one command to deploy an ARM template, then this option is for you. The Azure command-line interface (CLI) is Microsoft’s cross-platform command-line experience for managing Azure resources. It can be installed on Windows, macOS, and Linux environments. In addition, there is a PowerShell version of it, and also an interactive, authenticated, browser-accessible option known as the Azure Cloud Shell.

 

We can start by using the Azure CLI to create a resource group, if you have not done so yet. Run the following command to create one in a specific location:

 

az group create --location eastus --resource-group AzSentinelDemo

 

Next, you can run the following command to execute the ARM template:

 

az deployment group create --name AzSentinelDeploy --resource-group AzSentinelDemo --template-file <ARM Template name>.json --parameters workspaceName=AzSentinelWS
  • az deployment group create: Start a deployment.
  • --name: Name of your deployment.
  • --resource-group: Name of the Azure resource group.
  • --template-file: Template that I put together for this deployment.
  • --parameters: Deployment parameter values (key=value). Provide a name for your Log Analytics workspace. The name must be globally unique across all Azure subscriptions. I take care of that for you in the template by adding a unique string after the name you provide.

 

Track your deployment: Azure Portal > Resource Group Name > Deployments

 

[Screenshots: deployment status and deployed resources in the Azure portal]

 

That’s it! Once your deployment completes, you will be able to access the main Azure Sentinel interface. Before we do that, let me show you another way to execute our ARM template.

Option 2: Using Azure Portal


It takes a few more clicks to do it via the Azure portal, but it is easy to follow:

  • Go to https://portal.azure.com/, click on the “Create a resource” option at the top left of your screen, and search for “Template deployment”.

 

[Screenshot: searching for “Template deployment” in the Azure portal]

 

  • Choose “Build your own template in the editor”.

 

[Screenshot: the “Build your own template in the editor” option]

 

  • Upload the template we put together.

 

[Screenshot: loading the template file into the editor]

 

  • Once the template is uploaded, you will see the parameters and resources sections get populated. Click Save.

 

[Screenshot: the template loaded in the editor]

 

  • Next, you need to set your subscription and resource group names. As you can see in the image below, you can create an Azure resource group directly if you don’t have one yet. Click Review + create to validate the template, and finally Create to deploy your resources.

 

[Screenshots: custom deployment parameters and template validation]

 

  • Then, you can track the deployment of your Azure Sentinel resources by going to Azure Portal > Resource Group Name > Deployments

 

[Screenshot: deployment progress under the resource group]

 

That’s it! Once your deployment completes, you will be able to access the main Azure Sentinel interface.

Accessing Azure Sentinel

 

  • Search for “Azure Sentinel”.

 

[Screenshot: searching for Azure Sentinel in the portal]

 

  • Select the Azure Sentinel workspace that you just created.

 

[Screenshot: selecting the new Azure Sentinel workspace]

 

You will be taken to the main Azure Sentinel interface. That was easy, right?

 

[Screenshot: the main Azure Sentinel interface]

 

Wait, what?


Why do I have to do all that with ARM templates when I can just follow these instructions and deploy one with a few clicks too?

 

Deploying the solution while working in a lab environment is not enough. You need other resources and data to start exploring and learning about all the capabilities Azure Sentinel provides, and that will take more than just a few clicks. What if we could take the ARM template that we just used and run other nested templates in parallel to deploy other resources, and even ingest pre-recorded data for additional research?
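In ARM terms, that is what nested/linked templates are for: Microsoft.Resources/deployments resources that point at other template files and, when they have no dependencies on each other, run in parallel. A sketch of one such linked deployment (the name, URI path, and parameter are illustrative placeholders, not the project's exact files):

{
  "name": "deployCustomLogsPipeline",
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2019-10-01",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "https://raw.githubusercontent.com/OTRF/Azure-Sentinel2Go/master/<nested-template>.json",
      "contentVersion": "1.0.0.0"
    },
    "parameters": {
      "workspaceName": { "value": "[parameters('workspaceName')]" }
    }
  }
}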

Enter Azure Sentinel To-Go!

 

[Azure Sentinel To-Go logo]

 

Azure Sentinel2Go is an open source project developed to expedite the deployment of an Azure Sentinel lab along with other Azure resources and a data ingestion pipeline to consume pre-recorded datasets for research purposes.

 

Azure Sentinel2Go is part of the Blacksmith project.

 

The Blacksmith project focuses on providing dynamic, easy-to-use templates for security researchers to model and provision resources, and to automatically deploy applications and small networks in the cloud.

 

Azure Sentinel2Go is a work in progress, and I welcome feedback on what you would like to see deployed along with an Azure Sentinel solution, and which datasets you would like to work with in your lab environment.

Azure Sentinel + Custom Log Pipeline

 

One of the features that I have noticed security analysts get the most interested in while using Azure Sentinel for the first time is the Log Analytics capabilities. Log Analytics is the primary tool in the Azure portal for writing log queries in Kusto Query Language (KQL) to quickly retrieve, consolidate, and analyze security events. Therefore, I decided to find a way for researchers to learn about KQL with pre-recorded datasets.

 

Fortunately, the Log Analytics workspace allows the collection of custom logs via its HTTP Data Collector API. If you want to learn how to do it with code, there are some basic examples in the Azure docs for PowerShell, C#, and Python.
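Under the hood, those examples all do the same thing: build an HMAC-SHA256 signature from the workspace shared key and POST a JSON payload to the workspace endpoint. A minimal Python sketch of that flow (following the documented signature scheme; error handling omitted):

import base64
import datetime
import hashlib
import hmac
import json

import requests

def build_signature(workspace_id, shared_key, date, content_length):
    # Signature scheme documented for the HTTP Data Collector API:
    # HMAC-SHA256 over "POST\n<length>\napplication/json\nx-ms-date:<date>\n/api/logs"
    string_to_hash = f"POST\n{content_length}\napplication/json\nx-ms-date:{date}\n/api/logs"
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), digestmod=hashlib.sha256).digest()
    ).decode("utf-8")
    return f"SharedKey {workspace_id}:{encoded_hash}"

def post_data(workspace_id, shared_key, log_type, events):
    # events: a list of dicts; log_type becomes the custom table name (<log_type>_CL)
    body = json.dumps(events)
    date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(workspace_id, shared_key, date, len(body)),
        "Log-Type": log_type,
        "x-ms-date": date,
    }
    uri = f"https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
    requests.post(uri, data=body, headers=headers).raise_for_status()

Calling post_data(workspace_id, shared_key, "onesample", events) would land the events in a custom table named onesample_CL, which matches what we query later in this post.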

Data Ingestion Pipeline Designs


In this section, I will share a few of my favorite ways to send pre-recorded datasets to a Log Analytics workspace custom log table.

1) Python Script -> Log Analytics Workspace

 

[Diagram: Python script sending data directly to a Log Analytics workspace]

 

This is one of the simplest ways to send data directly to a Log Analytics workspace. I took the basic example available here and extended it a little bit to be able to read from a JSON file or a folder, show a progress bar, and send smaller chunks of up to 5 MB per POST request. Make sure you read the data limits while using a similar approach. I also extended the PowerShell script available and created a proof of concept here.
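The chunking part of that logic boils down to something like this (a sketch, assuming the dataset is already loaded as a list of dicts and reusing post_data from the earlier sketch; the real script in the repo differs in its details):

import json

MAX_CHUNK_BYTES = 5 * 1024 * 1024  # stay under a ~5 MB per-request budget

def chunk_events(events, max_bytes=MAX_CHUNK_BYTES):
    # Yield batches of events whose serialized size stays under max_bytes.
    batch, size = [], 0
    for event in events:
        event_size = len(json.dumps(event).encode("utf-8"))
        if batch and size + event_size > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(event)
        size += event_size
    if batch:
        yield batch

# for batch in chunk_events(events):
#     post_data(workspace_id, shared_key, "onesample", batch)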

 

The script is available here, and all the information you will need from the Log Analytics workspace can be found in Azure Portal > Log Analytics Workspace > Advanced Settings.

 

[Screenshot: Workspace ID and primary key under Advanced Settings]

 

Next, we need a data sample for this exercise. The project comes with a few data samples in this folder. Download dataset-sample-small.tar.gz to your local computer and decompress it:

 

tar -xzvf dataset-sample-small.tar.gz

 

Next, send it over by running this command on your local computer:

 

python3 ala-python-data-producer.py -w <WorkspaceID> -k <SharedKey> -l "onesample" -f dataset-sample-small.json -v

 

Once it completes, go to your Azure Sentinel interface and click on Logs. You might see no events at first; it usually takes 5–10 minutes for them to show up.

 

[Screenshot: the Logs view before events arrive]

 

You will then see a new table under Custom Logs with the event schemas. Remember that not every event will have the same schema. Make sure you understand the schema of your events before running queries.

 

[Screenshot: the new custom log table and its schema]

 

Based on the event schemas, we can run the following query to see what events we are working with:

 

onesample_CL
| summarize count() by winlog_channel_s, winlog_event_id_d, winlog_task_s

 

[Screenshot: query results summarized by channel, event ID, and task]

 

That’s it! This is a very practical way to ingest custom logs, but it might not scale with larger files or hundreds of files in a loop. Therefore, I wanted to also provide another option that would allow me to send events to a more robust pipeline and let it handle the whole process. This is a proof of concept, and it works very well in a lab environment.

2) Azure Event Hubs -> Logstash -> Log Analytics


I like to use existing tools that are proven to work at scale, and this is no exception. TL;DR: I use Kafkacat to read JSON files stored locally and send them over to an Azure Event Hub. Next, Logstash reads them from the Azure Event Hub and sends them over to a Log Analytics workspace.

 

[Diagram: Kafkacat -> Azure Event Hubs -> Logstash -> Log Analytics workspace]

 

In more detail, this is what is happening in the diagram above:

  • First, I use Kafkacat in Producer mode to read contents of a JSON file and send them over to a Kafka server. Kafkacat is a generic non-JVM producer and consumer for Apache Kafka.
  • Instead of a Kafka server, I use Azure Event Hubs with Kafka features enabled to receive and store events from Kafkacat. Azure Event Hubs is a serverless big data streaming platform and event ingestion service.
  • Next, I use a Linux VM with Logstash installed as a docker container to read events from the Azure Event Hub. Logstash is an open source data collection engine with real-time pipelining capabilities.
  • Finally, I use the same Logstash server to send events collected from the Azure Event Hub to the Azure Sentinel’s workspace for further analysis.

I already provide the following configurations as part of the Azure Sentinel2Go project.

 

Event Hub -> Logstash Conf

This is the Logstash input config file to consume events from an Azure Event Hub. The plugin used is the Logstash Azure Event Hubs input plugin.

 

input {
  azure_event_hubs {
    event_hub_connections => ["${EVENTHUB_CONNECTIONSTRING}"]
    threads => 2
    initial_position => "end"
    #codec => "json"
  }
}

 

I do not use the input codec => "json" property because I do not want to unpack the event Message field and exceed the maximum number (500) of custom fields per data type in the Log Analytics workspace.

 

Logstash Conf -> Log Analytics Workspace

 

This is the Logstash output config file to send the events that it collects from the Azure Event Hub to a Log Analytics workspace. The plugin used is the Azure Log Analytics output plugin for Logstash, developed by Microsoft.

 

output {
   microsoft-logstash-output-azure-loganalytics {
      workspace_id => "${WORKSPACE_ID}"
      workspace_key => "${WORKSPACE_KEY}"
      custom_log_table_name => "prerecorded"
      plugin_flush_interval => 5
   }
   #stdout { codec => rubydebug }
}
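For context, the deployment runs Logstash as a Docker container with config files like the two above mounted as its pipeline. Conceptually that looks something like the following (a sketch, not the exact command the templates execute, and it assumes an image that already has the Log Analytics output plugin installed). Logstash resolves the ${...} references from environment variables:

# Run Logstash with the input/output configs in ./pipeline mounted as the pipeline,
# passing the secrets referenced by ${...} in the configs as environment variables.
docker run -d --name logstash \
  -e EVENTHUB_CONNECTIONSTRING="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>;EntityPath=<eventhub>" \
  -e WORKSPACE_ID="<workspace-id>" \
  -e WORKSPACE_KEY="<workspace-key>" \
  -v "$(pwd)/pipeline":/usr/share/logstash/pipeline \
  docker.elastic.co/logstash/logstash:7.6.2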

 

ARM Template Deployment

 

One thing I added to the Azure Sentinel2Go repository is a “Deploy to Azure” badge, as used on Azure quickstart templates, to upload the ARM template directly to the Azure portal. Very convenient! Go to Azure-Sentinel2Go > grocery-list > custom-log-pipeline and click on the “Deploy to Azure” badge to deploy Azure Sentinel along with a custom logs pipeline:

 

[Screenshot: the “Deploy to Azure” badge in the repository]

 

You will be taken to the interface to set deployment parameters. Set the Deploy Custom Logs Pipeline parameter to Logstash-EventHub. One thing to pay attention to is the virtual machine size; if you are in westus, you need to switch it to Standard_A3. Also, make sure you set the AllowedIPAddresses parameter to restrict access to the Logstash box. Add your company's or your home public IP address.

 

[Screenshot: deployment parameters for the custom logs pipeline]

 

Monitor your deployment. It should take around 8–10 minutes.

 

[Screenshot: deployment progress]

 

Once it completes, you should be able to send prerecorded data from your local computer to the Azure Event Hub.

 

Sending events to the Azure Event Hub

 

First, create a local Kafkacat configuration file to define a few properties needed to access the Azure Event Hub. I created one for you.

 

You will need to get the following values and paste them in the config file (a sketch of the config follows the list):

  • Event Hub namespace: Get it from the Event Hub resource.
  • Event Hub Connection String: You can get it following these steps.
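With those two values, the config file boils down to a few librdkafka properties (a sketch; Event Hubs' Kafka endpoint authenticates with the literal username $ConnectionString and the connection string itself as the password):

metadata.broker.list=<EVENTHUB-NAMESPACE>.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanisms=PLAIN
sasl.username=$ConnectionString
sasl.password=Endpoint=sb://<EVENTHUB-NAMESPACE>.servicebus.windows.net/;SharedAccessKeyName=<POLICY-NAME>;SharedAccessKey=<KEY>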

 

Second, we need a sample dataset to send over to our Azure Event Hub. We can use the same dataset we used earlier with the Python script.

 

Next, on your local computer, run Kafkacat in producer mode as shown below:

 

kafkacat -b <EVENTHUB-NAMESPACE>.servicebus.windows.net:9093 -t <EVENTHUB-NAME> -F <KAFKACAT-FILE>.conf -P -l dataset-sample.json
  • -b: Bootstrap broker(s) (host[:port]). Your Event Hub namespace.
  • -t: Topic to produce/send events to. The name of your Event Hub.
  • -F: Read configuration properties from the Kafkacat .conf file.
  • -P: Producer mode; produce/send events.
  • -l: Send messages from a file. The pre-recorded dataset.

 

Once you run that command, you can check the events flowing through the Azure Event Hub. Go to Azure Portal > Resource Group Name > Event Hub Namespace and filter the Show Metrics view to show Messages only. It might take a few minutes for the view to update.

 

[Screenshot: Messages metrics in the Event Hub namespace]

 

The Azure Sentinel view will also take a few minutes to update.

 

[Screenshot: events arriving in Azure Sentinel]

 

Explore the Custom Logs

 

As you already know, click on Logs (Log Analytics) to explore the custom logs and their schema. One thing to remember is that the events flowing through this pipeline are packed inside of the Message field. As I mentioned before, this is to avoid exceeding the max number (500) of custom fields per data type in case you send a lot of events with different schemas.

 

[Screenshot: custom log records with events packed in the Message field]

 

You can unpack the Message field and get to specific nested fields with the Kusto query function parse_json(). This function interprets a string as a JSON value and returns the value as dynamic.

 

prerecorded_CL
| extend m=parse_json(Message)
| summarize count() by EventID=tostring(m.winlog.event_id),EventProvider=tostring(m.winlog.channel),Task=tostring(m.winlog.task)

 

[Screenshot: query results after unpacking the Message field]

 

Remember that not every event will have the same schema. Make sure you understand the schema of your events before running queries.

(Optional) Loading Pre-Recorded Datasets

 

Azure Sentinel2Go also comes with the option to load pre-recorded datasets from the Mordor project right at deployment time. It leverages the same Logstash VM for the data ingestion, so you do not have to send anything from your local computer; the data from Mordor is downloaded and imported all via ARM templates.

 

[Diagram: Mordor datasets downloaded and ingested through the Logstash VM at deployment time]

Downloading & Decompressing Mordor Datasets

 

I use the following commands to download and decompress all of the small Mordor datasets. The commands are part of the deployment and are executed inside of the Linux VM when you choose to add the item "mordor-small-datasets" to the Add to cart parameter while deploying Azure Sentinel2Go. You do not have to run anything on your local computer.

 

git clone https://github.com/OTRF/mordor.git
cd mordor/datasets/small/
find . -type f -name "*.zip" | grep -i 'host' | while read filename; do unzip -o -d /opt/logstash/datasets/ $filename; done;

 

If you choose to add the item "mordor-large-apt29" to your Add Mordor Dataset parameter while deploying Azure Sentinel2Go, the following commands are executed inside of the Linux VM:

 

git clone https://github.com/OTRF/mordor.git
cd mordor/datasets/large/apt29
find . -type f -name "*_manual.zip" -print0 | xargs -0 -I{} unzip {} -d /opt/logstash/datasets/

 

JSON files -> Logstash Conf

 

This is the additional Logstash input config to read all the JSON files. The plugin used is the Logstash File Input plugin.

 

input {
  file {
    path => "/usr/share/logstash/datasets/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    #codec => "json"
  }
}

 

ARM Template Deployment

 

If you still have resources running from the earlier deployment, I recommend deleting them (this is a lab environment). Similar to our previous deployment, go to Azure-Sentinel2Go > grocery-list > custom-log-pipeline. Select Logstash for the Deploy Custom Logs Pipeline parameter as shown below, and add a Mordor dataset to your cart (Add Mordor Dataset). For this example, we are going to use the mordor-small-datasets scenario. Also, once again, make sure you set the AllowedIPAddresses parameter to restrict access to the Logstash box. Add your company's or your home public IP address.

 

[Screenshot: deployment parameters with a Mordor dataset added to the cart]

 

Monitor the deployment. It might take around 8–10 minutes to complete. When it is done, go to your Azure Sentinel interface and give it 2–3 minutes for events to start showing. You will start getting thousands of events (300K+).

 

[Screenshot: events ingested in Azure Sentinel]

 

What I do while I wait for all the events (200k+) to be ingested :) 

 


 

Playing and exercising in my backyard with my dog Pedro while the horses watch and datasets get ingested.

Take advantage of the time you have and stretch a little bit! Take a break!

What can we do with the data?

 

We can do the same as before and explore a few events to understand the event schemas. Also, since those events were generated as part of the Mordor project, you can focus on datasets mapped to specific ATT&CK tactics and techniques. The project comes with a Navigator view for the specific platforms that it supports (currently only Windows).

 

[Screenshot: ATT&CK Navigator view of Mordor datasets]

 

Let’s Look for Potential Lateral Movement Techniques

 

One thing I like to hunt for when looking for lateral movement techniques is processes created under logon sessions that were initially established as part of a network authentication event (logon type 3). One example is adversaries leveraging Windows Management Instrumentation (WMI) and the Win32_Process class to execute commands over the network. This behavior would generate something similar to this:

 

[Screenshot: a 4624 network logon followed by a 4688 process creation sharing the same logon ID]

 

We can use KQL and its join operator to look for similar behavior without filtering on the parent process wmiprvse.exe. We can use events 4624 (An account was successfully logged on) and 4688 (A new process has been created) from the Microsoft-Windows-Security-Auditing event provider.

 

Use the following query and run it in log analytics as shown below:

 

prerecorded_CL
| extend a=parse_json(Message)
| where a.EventID == 4624 and a.LogonType == 3 and a.TargetUserName !endswith "$"
| project path_s, TargetLogonId=tostring(a.TargetLogonId)
| join kind=inner
(
    prerecorded_CL
    | extend b=parse_json(Message)
    | where b.EventID == 4688 and b.TargetLogonId != "0x3e4"
    | project ParentProcessName=b.ParentProcessName,
        NewProcessname=b.NewProcessName,
        CommandLine=b.CommandLine,
        TargetLogonId=tostring(b.TargetLogonId)
)
on TargetLogonId
| project-away TargetLogonId, TargetLogonId1

 

As you can see in the image below, that query got some hits from a few datasets that were created by emulating adversaries using WMI and PowerShell Remoting to execute commands over the network.

 

[Screenshot: query results showing processes created over network logon sessions]

 

 

That’s it for this first part! I hope you enjoyed it and found the design and deployment of Azure Sentinel2Go helpful. In the next post, I will show you how to deploy additional resources along with an Azure Sentinel solution to focus on a few use cases that go beyond just using the Log Analytics features. I want to make sure Azure Sentinel2Go also allows the exploration of other capabilities provided by Azure Sentinel.

 

 

Additional Large Open Datasets to test:

 

References

https://mordordatasets.com/introduction

https://docs.microsoft.com/en-us/azure/azure-monitor/faq

https://docs.microsoft.com/en-us/azure/azure-monitor/terminology

https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-platform

https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources#custom-sources

https://docs.microsoft.com/en-us/azure/sentinel/overview

https://techcommunity.microsoft.com/t5/azure-sentinel/deploying-and-managing-azure-sentinel-as-code/...

https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/overview

https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview

https://docs.microsoft.com/en-us/azure/azure-monitor/insights/solutions

https://azuremarketplace.microsoft.com/en-us/marketplace/apps/Microsoft.SecurityOMS?tab=Overview

https://azure.microsoft.com/en-us/pricing/details/azure-sentinel/

https://azure.microsoft.com/en-us/pricing/details/monitor/

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-azure_event_hubs.html

https://azure.microsoft.com/en-us/services/event-hubs/

https://github.com/yokawasa/logstash-output-azure_loganalytics

https://docs.microsoft.com/en-us/azure/kusto/query/

 
