Deploying and Managing Microsoft Sentinel as Code
Published Jan 28 2020

This blog post is co-authored by Philippe Zenhaeusern and Javier Soriano.

 

The content of this blog is no longer up to date. The new recommended way to manage content as code in Microsoft Sentinel is Repositories.

 

In the last few months working on Microsoft Sentinel, we have talked to many partners and customers about ways to automate Microsoft Sentinel deployment and operations.

 

These are some of the typical questions: How can I automate customer onboarding into Sentinel? How can I programmatically configure connectors? As a partner, how do I push to my new customer all the custom analytics rules/workbooks/playbooks that I have created for other customers?

 

In this post, we will answer all these questions, not only describing how to do it but also giving you a head start with a repository that contains a minimum viable product (MVP) for building a full Sentinel as Code environment.

 

The post will follow this structure:

 

  1.  Infrastructure as Code
  2.  Microsoft Sentinel Automation Overview
  3.  Automating the deployment of specific Microsoft Sentinel components
  4.  Building your Sentinel as Code in Azure DevOps

We recommend you go through these sections in order to fully understand how everything works.

 

Infrastructure as Code

You might be familiar with the Infrastructure as Code concept. Have you heard about Azure Resource Manager, Terraform, or AWS CloudFormation? They are all ways to describe your infrastructure as code so that you can treat it as such: put it under source control (e.g., Git, SVN) so you can track changes to your infrastructure the same way you track changes in your code. You can use any source control platform, but in this article we will use GitHub.

 

Besides treating your infrastructure as code, you can also use DevOps tooling to test that code and deploy that infrastructure into your environment, all in a programmatic way. This is also referred to as Continuous Integration/Continuous Delivery (CI/CD). Please take a look at this article if you want to know more. This post will use Azure DevOps as our DevOps tool, but the concepts are the same for any other tool.

 

In the Sentinel context, the whole idea is to codify your Microsoft Sentinel deployment and put it in a code repository. Every time there is a change in the files that define this Sentinel environment, the change will trigger a pipeline that verifies the changes and deploys them into your Sentinel environment. But how do we make these changes to Sentinel programmatically?

 

Microsoft Sentinel Automation Overview

As you probably know, there are different components inside Microsoft Sentinel: Connectors, Analytics Rules, Workbooks, Playbooks, Hunting Queries, Notebooks, and so on.

These components can be managed easily through the Azure Portal, but what can we use to modify them programmatically?

 

Here is a table that summarizes what can be used for each:

 

Component          Automated with
---------          --------------
Onboarding         API, PowerShell, ARM
Alert Rules        API, PowerShell
Hunting Queries    API, PowerShell
Playbooks          ARM
Workbooks          ARM
Connectors         API
 

  • PowerShell: Special thanks to Wortell for writing the AzSentinel module, which greatly simplifies many of these tasks. We will use it for the three components that support it (Onboarding, Alert Rules, Hunting Queries).
  • API: Some components don’t currently have a PowerShell module and can only be configured programmatically via the API. The Sentinel API is now public, and its details can be found here. We will use it to enable Connectors.
  • ARM: This is Azure’s native management and deployment service. You can use ARM templates to define Azure resources as code. We will use it for Playbooks and Workbooks.

How to structure your Sentinel code repository

Here is what we think is the recommended way to structure your repository:

|
|- contoso/                          # Root folder for customer
|  |- AnalyticsRules/                # Subfolder for Analytics Rules
|     |- analytics-rules.json        # Analytics Rules definition file (JSON)
|
|  |- Connectors/                    # Subfolder for Connectors
|     |- connectors.json             # Connectors definition file (JSON)
|
|  |- HuntingRules/                  # Subfolder for Hunting Rules
|     |- hunting-rules.json          # Hunting Rules definition file (JSON)
|
|  |- Onboard/                       # Subfolder for Onboarding
|     |- onboarding.json             # Onboarding definition file (JSON)
|
|  |- Pipelines/                     # Subfolder for Pipelines
|     |- pipeline.yml                # Pipeline definition files (YAML)
|
|  |- Playbooks/                     # Subfolder for Playbooks
|     |- playbook.json               # Playbooks definition files (ARM)
|
|  |- Scripts/                       # Subfolder for script helpers
|     |- CreateAnalyticsRules.ps1    # Script files (PowerShell)
|
|  |- Workbooks/                     # Subfolder for Workbooks
|     |- workbook-sample.json        # Workbook definition files (ARM)

You can find a sample repository with this structure here.

 

We will use this same repository throughout this post, as we have placed the whole testing environment there. Note that this is just a minimum viable product and is subject to improvements. Feel free to clone it and enhance it.

 

Automating deployment of specific Microsoft Sentinel components

Now that we have a clear view of what to use to automate each component and how to structure our code repository, we can start creating things. Let’s go component by component, detailing how to automate its deployment and operation.

 

Onboarding

Thanks to the AzSentinel PowerShell module by Wortell, we have a command that simplifies this process. We just need to execute the following command to enable Sentinel on a given Log Analytics workspace:

 

Set-AzSentinel [-SubscriptionId <String>] -WorkspaceName <String> [-WhatIf] [-Confirm] [<CommonParameters>]

We have created a script (InstallSentinel.ps1) with some more logic in it so we can use it in our pipelines. This script takes a configuration file (JSON) as input, where we specify the different workspaces where the Sentinel (SecurityInsights) solution should be enabled. The file has the following format:

{
    "deployments": [
        {
            "resourcegroup": "<rgname>",
            "workspace": "<workspacename>"
        },
        {
            "resourcegroup": "<rgname2>",
            "workspace": "<workspacename2>"
        }
    ]
}

The InstallSentinel.ps1 script is located in our repo here and has the following syntax:

 

InstallSentinel.ps1 -OnboardingFile <String>

We will use this script in our pipeline.
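
For illustration, here is a minimal sketch of the kind of loop such a script performs; this is not the actual InstallSentinel.ps1, and parameter validation and error handling are omitted:

# Minimal onboarding sketch, assuming the onboarding file format shown above
param ([Parameter(Mandatory)][string]$OnboardingFile)

Import-Module AzSentinel

$config = Get-Content -Path $OnboardingFile -Raw | ConvertFrom-Json
foreach ($deployment in $config.deployments) {
    # Enable the Sentinel (SecurityInsights) solution on each workspace
    Set-AzSentinel -WorkspaceName $deployment.workspace -Confirm:$false
}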

 

Connectors

Sentinel Data Connectors can currently only be automated via the API, which is not officially documented yet. However, with Developer Tools enabled in your browser, it is quite easy to catch the related connector calls. Please take into account that this API might change in the future without notice, so be cautious when using it.

 

The following script runs through an example connecting the “Azure Security Center” and “Azure Activity Logs” sources to the Sentinel workspace. Both are very common connectors to collect data from your Azure environments. Be aware that some connectors require additional rights; connecting the “Azure Active Directory” source, for instance, requires additional AAD Diagnostic Settings permissions besides the “Global Administrator” or “Security Administrator” permissions on your Azure tenant.

The “EnableConnectorsAPI.ps1” script is located inside our repo here and has the following syntax:

 

EnableConnectorsAPI.ps1 -TenantId <String> -ClientId <String> -ClientSecret <String> -SubscriptionId <String> -ResourceGroup <String> -Workspace <String> -ConnectorsFile <String>

The ConnectorsFile parameter references a JSON file that specifies all the data sources you want to connect to your Sentinel workspace. Here is a sample file:

{
    "connectors": [
    {
        "kind": "AzureSecurityCenter",
        "properties": {
            "subscriptionId": "subscriptionId",
            "dataTypes": {
                "alerts": {
                    "state": "Enabled"
                }
            }
        }
    },
    {
        "kind": "AzureActivityLog",
        "properties": {
            "linkedResourceId": "/subscriptions/subscriptionId/providers/microsoft.insights/eventtypes/management"
        }
    }]
}

The script will iterate through this JSON file and enable the data connectors one by one. This JSON file should be placed into the Connectors directory so the script can read it.
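
To give you an idea of what happens under the hood, here is a hedged sketch of the pattern such a script follows: authenticate as the service principal, then PUT each connector definition against the dataConnectors endpoint. The endpoint shape and api-version below are assumptions based on the current undocumented API and may change:

# Sketch only: acquire a management token for the service principal
$tokenBody = @{
    grant_type    = 'client_credentials'
    client_id     = $ClientId
    client_secret = $ClientSecret
    resource      = 'https://management.azure.com/'
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/token" `
    -Body $tokenBody).access_token

# PUT each connector from the connectors file (endpoint and api-version assumed)
$connectors = (Get-Content -Path $ConnectorsFile -Raw | ConvertFrom-Json).connectors
foreach ($connector in $connectors) {
    $uri = "https://management.azure.com/subscriptions/$SubscriptionId" +
           "/resourceGroups/$ResourceGroup" +
           "/providers/Microsoft.OperationalInsights/workspaces/$Workspace" +
           "/providers/Microsoft.SecurityInsights/dataConnectors/$(New-Guid)" +
           "?api-version=2019-01-01-preview"
    Invoke-RestMethod -Method Put -Uri $uri `
        -Headers @{ Authorization = "Bearer $token" } `
        -ContentType 'application/json' `
        -Body ($connector | ConvertTo-Json -Depth 10)
}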

 

As you can imagine, some connectors cannot be automated, like the ones based on Syslog/CEF, as they require installing an agent.

 

Analytics Rules

The AzSentinel PowerShell module provides a command to create new Analytics Rules (New-AzSentinelAlertRule), passing a set of parameters to define the rule’s characteristics. An even more interesting command lets you create analytics rules based on an input file where all the rules' properties are specified: Import-AzSentinelAlertRule.

 

We have created a script that takes the workspace and rules file and creates the analytics rules accordingly.

 

The script is located inside our repo here and has the following syntax:

CreateAnalyticsRules.ps1 -Workspace <String> -RulesFile <String>

 

As you can see, one of the parameters is a rules file (in JSON format) where you will specify all the rules (of any type) that need to be added to your Sentinel environment. Here is a sample file:

{
  "Scheduled": [
    {
      "displayName": "AlertRule01",
      "description": "",
      "severity": "Medium",
      "enabled": true,
      "query": "SecurityEvent | where EventID == \"4688\" | where CommandLine contains \"-noni -ep bypass $\"",
      "queryFrequency": "5H",
      "queryPeriod": "6H",
      "triggerOperator": "GreaterThan",
      "triggerThreshold": 5,
      "suppressionDuration": "6H",
      "suppressionEnabled": false,
      "tactics": [
        "Persistence",
        "LateralMovement",
        "Collection"
      ],
      "playbookName": "",
      "aggregationKind": "SingleAlert",
      "createIncident": true,
      "groupingConfiguration": {
        "enabled": false,
        "reopenClosedIncident": false,
        "lookbackDuration": "PT5H",
        "entitiesMatchingMethod": "All",
        "groupByEntities": [
          "Account",
          "Ip",
          "Host",
          "Url"
        ]
      }
    },
    {
      "displayName": "AlertRule02",
      "description": "",
      "severity": "Medium",
      "enabled": true,
      "query": "SecurityEvent | where EventID == \"4688\" | where CommandLine contains \"-noni -ep bypass $\"",
      "queryFrequency": "5H",
      "queryPeriod": "6H",
      "triggerOperator": "GreaterThan",
      "triggerThreshold": 5,
      "suppressionDuration": "6H",
      "suppressionEnabled": false,
      "tactics": [
        "Persistence",
        "LateralMovement",
        "Collection"
      ],
      "playbookName": ""
    }
  ],
  "Fusion": [
    {
      "displayName": "Advanced Multistage Attack Detection",
      "enabled": true,
      "alertRuleTemplateName": "f71aba3d-28fb-450b-b192-4e76a83015c8"
    }
  ],
  "MLBehaviorAnalytics": [
    {
      "displayName": "(Preview) Anomalous SSH Login Detection",
      "enabled": true,
      "alertRuleTemplateName": "fa118b98-de46-4e94-87f9-8e6d5060b60b"
    }
  ],
  "MicrosoftSecurityIncidentCreation": [
    {
      "displayName": "Create incidents based on Azure Active Directory Identity Protection alerts",
      "description": "Create incidents based on all alerts generated in Azure Active Directory Identity Protection",
      "enabled": true,
      "productFilter": "Microsoft Cloud App Security",
      "severitiesFilter": [
        "High",
        "Medium",
        "Low"
      ],
      "displayNamesFilter": null
    }
  ]
}

As you can see, Fusion and MLBehaviorAnalytics rules need a field called alertRuleTemplateName. This ID is consistent across all Sentinel environments, so you should use the same values in your own files. As Sentinel grows, more MLBehaviorAnalytics rules are being added, so you might need to look up their alertRuleTemplateName values in order to add them to your rules JSON file. To get these values, you can execute the following command, available in AzSentinel:

 

Get-AzSentinelAlertRuleTemplates -WorkspaceName <workspace_name> -Kind MLBehaviorAnalytics

The output will contain a name field that holds the alertRuleTemplateName value.

 

The script will iterate through this JSON file and create/enable the analytics rules. It also supports updating existing alerts that are already enabled. This JSON file should be placed into the AnalyticsRules directory so the script can read it. The script also supports attaching playbooks for automated response to an alert; this is specified in the playbookName property of each alert in the JSON file.
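
At its core, the script can delegate most of the work to AzSentinel. Here is a minimal sketch, assuming the module's Import-AzSentinelAlertRule command and its -SettingsFile parameter behave as described above:

# Sketch: import every rule (of any kind) defined in the rules JSON file
Import-Module AzSentinel

Import-AzSentinelAlertRule -WorkspaceName $Workspace -SettingsFile $RulesFile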

 

Workbooks

Workbooks are a native object in Azure and, therefore, can be created through an ARM template. The idea is that you place all the custom workbooks that you have developed inside the Workbooks folder in your repo; any change to these will trigger a pipeline that creates them in your Sentinel environment.

 

We have created a script (placed in the same repo here) that can be used to automate this process. It has the following syntax:

CreateWorkbooks.ps1 -SubscriptionId <String> -ResourceGroup <String> -WorkbooksFolder <String> -Workspace <String>

The script will iterate through all the workbooks in the WorkbooksFolder and deploy them into your Microsoft Sentinel instance.
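
A minimal sketch of that loop, assuming each workbook template is self-contained (real templates may need extra parameters, such as the workbook display name or the target workspace resource ID):

# Sketch: deploy every workbook ARM template found in the folder
Get-ChildItem -Path $WorkbooksFolder -Filter '*.json' | ForEach-Object {
    New-AzResourceGroupDeployment -ResourceGroupName $ResourceGroup `
        -TemplateFile $_.FullName
}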

 

Also, be aware that the deployment will fail if a workbook with the same name already exists.

If you're building your own workbook ARM template, make sure you add "sentinel" as the workbookType in the template (see our examples here).

 

Hunting Rules

To automate the deployment of Hunting Rules, we will use the AzSentinel module.

We have created another script that takes as input a JSON file where all the Hunting Rules are defined. The script will iterate over them and create/update them accordingly.

 

The syntax for this script is the following:

CreateHuntingRulesAPI.ps1 -Workspace <String> -RulesFile <String>
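
As a rough sketch of what such a script does (the JSON schema and property names below are assumptions for illustration, not the actual file format):

# Sketch: iterate the hunting rules file and create each rule via AzSentinel
Import-Module AzSentinel

$rules = (Get-Content -Path $RulesFile -Raw | ConvertFrom-Json).hunting
foreach ($rule in $rules) {
    New-AzSentinelHuntingRule -WorkspaceName $Workspace `
        -DisplayName $rule.displayName `
        -Description $rule.description `
        -Query $rule.query `
        -Tactics $rule.tactics
}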

 

Playbooks

This works the same way as Workbooks. Playbooks use Azure Logic Apps to respond to incidents automatically. Logic Apps are a native ARM resource, and therefore we can automate their deployment with ARM templates. The idea is that you place all the custom playbooks that you have developed inside the Playbooks folder in your repo; any change to these will trigger a pipeline that creates them in your Sentinel environment.

 

We have created a script (placed in the same repo here) that can be used to automate this process. It has the following syntax:

CreatePlaybooks.ps1 -ResourceGroup <String> -PlaybooksFolder <String>

This script will succeed even if the playbooks are already there.
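
That behavior comes from ARM itself: deployments run in incremental mode by default, so re-deploying an existing Logic App updates it in place instead of failing. A minimal sketch of the deployment loop:

# Sketch: deploy every playbook ARM template; incremental mode makes re-runs safe
Get-ChildItem -Path $PlaybooksFolder -Filter '*.json' | ForEach-Object {
    New-AzResourceGroupDeployment -ResourceGroupName $ResourceGroup `
        -TemplateFile $_.FullName `
        -Mode Incremental
}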

 

Building your Sentinel as Code in Azure DevOps

Now that we have a clear view of how to structure our code repository and what to use to automate each Sentinel component, we can start creating things in Azure DevOps. This is a high-level list of tasks that we will perform:

 

  • Create an Azure DevOps organization
  • Create a project in Azure DevOps
  • Create a service connection to your Azure environment(s)
  • Create variables
  • Connect your existing code repository with your Az DevOps project
  • Create pipelines

Let’s review them one by one.

 

Create an Azure DevOps organization

This is the first step in setting up your Azure DevOps environment. You can see the details on how to do this here.

 

Create a project in Azure DevOps

A project provides a repository for source code and a place for a group of people to plan, track progress, and collaborate on building software solutions. It will be the container for your code repository, pipelines, boards, etc. See instructions on how to create it here.

 

Create a service connection to your Azure environment

In order to talk to our Azure environment, we need to create a connection with specific Azure credentials. In Azure DevOps, this is called a service connection. The credentials you use to create this service connection are typically those of a service principal defined in Azure.

 

Full details on how to create a service connection are available here. Once you have created the principal, you will need to grant it access to the Azure environment where Sentinel will live.

 

These are the fields you need to provide to create your service connection:

 

[Screenshot: fields required to create the service connection]

Take note of the Connection name you provide, as you will need to use this name in your pipelines.

 

Create variables

We are going to need several variables defined in the Azure DevOps environment so they can be passed to our scripts to specify the Sentinel workspace, resource group, config files, and API connection information.

 

As we will need these variables across all our pipelines, the best option is to create an Azure DevOps variable group. With this, we define the variable group once and then reuse it in different pipelines across our project. Here you have instructions on how to do it.

 

We have called our variable group “Az connection settings”; this is important because we will reference this name in our pipelines. Here is a screenshot of the variables that we will need to define:

[Screenshot: variables defined in the "Az connection settings" variable group]

 

Connect your existing code repository with your Az DevOps project

You can import an existing repo into Azure DevOps from GitHub, Bitbucket, GitLab, and other locations. See instructions here.

 

Create pipelines

There are two ways to create our Azure Pipelines: in classic mode or as YAML files. We are going to create them as YAML files because that way we can place them in our code repository, where they can be easily tracked and reused anywhere. Here are the basic steps to create a new pipeline.

 

In the new pipeline wizard, select GitHub YAML in the Connect step:

[Screenshot: selecting GitHub YAML in the Connect step]

Then select your repository and choose Starter pipeline if you want to build your own pipeline, or Existing Azure Pipelines YAML file if you want to use the ones we already have in the repository:

[Screenshot: choosing Starter pipeline or Existing Azure Pipelines YAML file]

We are going to create one CI (build) pipeline for Scripts and several CICD (build+deploy) pipelines (one for each Sentinel component).

 

Create a CI pipeline for Scripts

We will treat Scripts slightly differently from the rest, because they are not a Sentinel component and the scripts themselves won’t get deployed to Azure; we will just use them to deploy other things.

Because of this, the only thing we need to do with scripts is make sure they are available to the other pipelines as artifacts. To accomplish this, we just need two tasks in our CI pipeline: Copy Files and Publish Pipeline Artifact.

 

Update! We have now added a syntax validator to our pipelines, based on the Files Validator task available in the Visual Studio Marketplace. You will need to install this task if you want to use our templates.

 

Here is an example of the YAML code that will define this pipeline:

# Scripts build pipeline
# Copies script files to the agent and publishes an artifact with them

trigger:
 paths:
   include:
     - Scripts/*

pool:
  vmImage: 'windows-2019'

steps:
- task: CopyFiles@2
  displayName: 'Copy Scripts'
  inputs:
    SourceFolder: Scripts
    TargetFolder: '$(build.artifactstagingdirectory)'
- task: Files-Validator@1
  inputs:
    rootDir: '$(build.artifactstagingdirectory)/*.ps1'
    validateXML: false
    validateJSON: false
    validateYAML: false
    validatePS: true
- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    targetPath: Scripts
    artifact: Scripts

As you can see, we have added three tasks: one to copy the script files, another that checks the PowerShell syntax, and a last one to publish the pipeline artifact. You can find this pipeline in our GitHub repo here.

 

Create CICD pipelines for each Sentinel component

With the Scripts now available as an artifact, we can use them in our Sentinel component pipelines. These pipelines will differ from the previous one because we will do both CI and CD (build+deploy). We define these in our YAML pipeline file as stages.

 

Here is one sample pipeline for Analytics Rules:

# Analytics Rules build and deploy pipeline
# This pipeline publishes the rules file as an artifact and then uses a powershell task to deploy

name: build and deploy Alert Rules
resources:
 pipelines:
   - pipeline: Scripts
     source: 'scriptsCI'
trigger:
 paths:
   include:
     - AnalyticsRules/*

stages:
- stage: build_alert_rules

  jobs:
    - job: AgentJob
      pool:
       name: Azure Pipelines
       vmImage: 'vs2017-win2016'
      steps:
       - task: CopyFiles@2
         displayName: 'Copy Alert Rules'
         inputs:
          SourceFolder: AnalyticsRules
          TargetFolder: '$(Pipeline.Workspace)'
       - task: Files-Validator@1
         inputs:
           rootDir: '$(Pipeline.Workspace)/*.json'
           validateXML: false
           validateJSON: true
           validateYAML: false
           validatePS: false
       - task: PublishBuildArtifacts@1
         displayName: 'Publish Artifact: RulesFile'
         inputs:
          PathtoPublish: '$(Pipeline.Workspace)'
          ArtifactName: RulesFile

- stage: deploy_alert_rules
  jobs:
    - job: AgentJob
      pool:
       name: Azure Pipelines
       vmImage: 'windows-2019'
      variables: 
      - group: Az connection settings
      steps:
      - download: current
        artifact: RulesFile
      - download: Scripts
        patterns: '*.ps1'
      - task: AzurePowerShell@4
        displayName: 'Create and Update Alert Rules'
        inputs:
         azureSubscription: 'Soricloud Visual Studio'
         ScriptPath: '$(Pipeline.Workspace)/Scripts/Scripts/CreateAnalyticsRules.ps1'
         ScriptArguments: '-Workspace $(Workspace) -RulesFile analytics-rules.json'
         azurePowerShellVersion: LatestVersion
         pwsh: true

As you can see, we now have two stages: build and deploy. We also had to define resources to reference the artifact that we need from our Scripts build pipeline.

 

The build stage is the same as the one we created for scripts, the only difference being that here we validate the JSON file syntax (again, using the Files Validator task).

 

In the deployment stage, we have a couple of new things. First, we point to the variable group we defined a few minutes ago, using the variables keyword. Then we download the artifacts needed in our deployment task, using the download keyword.

 

As the last step in our CI/CD pipeline, we use an Azure PowerShell task, pointing to our script and specifying any parameters needed. As you can see, we reference the imported variables here. One last peculiarity of this pipeline is that we need to use PowerShell Core (required by AzSentinel), so we specify that with pwsh: true.

 

If everything went correctly, we can now run this pipeline and verify that our Sentinel analytics rules get deployed automatically. :smile:

 

This and all the other pipelines for the rest of the components are in our repo, inside the Pipelines folder.

For Onboarding, the pipeline has no automatic triggers, as we expect it to be executed only once, at installation time.

 

Working with multiple workspaces

 

Whether you are a customer with a Microsoft Sentinel environment containing multiple workspaces, or a partner that needs to operate several customer environments, you need a strategy to manage more than one workspace.

 

As you have seen throughout the article, we have used a variable group to store details like resource group and workspace name. These values will change if we need to manage multiple workspaces, so we will need more than one variable group: for example, one for customer A and another for customer B, or one for Europe and one for Asia.

 

After that’s done, we can choose between two approaches:

  1. Add more stages to your current pipelines. Until now, we only had one deploy stage that deployed to our only Sentinel environment, but we can add additional stages (with the same steps and tasks) that deploy to other resource groups and workspaces.
  2. Create new pipelines. We can clone our existing pipelines and simply modify the variable group to point to a different target environment.

 

In Summary

We have shown you how to describe your Microsoft Sentinel deployment using code and then use a DevOps tool to deploy that code into your Azure environment.
