Deploying and Managing Microsoft Sentinel as Code
Published Jan 28 2020 12:28 AM

 

This blog post was co-authored by Philippe Zenhaeusern and Javier Soriano.

 

The content of this blog is no longer up to date. The new recommended way to manage content as code in Microsoft Sentinel is Repositories.

 

Over the last few months working on Microsoft Sentinel, we have talked to many partners and customers about ways to automate Microsoft Sentinel deployment and operations.

 

These are some of the typical questions: How can I automate customer onboarding into Sentinel? How can I programmatically configure connectors? As a partner, how do I push to my new customer all the custom analytics rules/workbooks/playbooks that I have created for other customers?

 

In this post, we will try to answer all these questions, not only describing how to do it, but also giving you a head start with a repository that contains a minimum viable product (MVP) for building a full Sentinel as Code environment.

 

The post will follow this structure:

 

  1.  Infrastructure as Code
  2.  Microsoft Sentinel Automation Overview
  3.  Automating the deployment of specific Microsoft Sentinel components
  4.  Building your Sentinel as Code in Azure DevOps

We recommend going through them in order to fully understand how everything works.

 

Infrastructure as Code

You might be familiar with the Infrastructure as Code concept. Have you heard about Azure Resource Manager, Terraform, or AWS CloudFormation? Well, they are all ways to describe your infrastructure as code so that you can treat it as such: put it under source control (e.g., git, svn) so you can track changes to your infrastructure the same way you track changes in your code. You can use any source control platform, but in this article, we will use GitHub.

 

Besides treating your infrastructure as code, you can also use DevOps tooling to test that code and deploy that infrastructure into your environment, all in a programmatic way. This is also referred to as Continuous Integration/Continuous Delivery (CICD). Please take a look at this article if you want to know more. This post will use Azure DevOps as our DevOps tool, but the concepts are the same for any other tool.

 

The whole idea is to codify your Microsoft Sentinel deployment and put it in a code repository. Every time there is a change in the files that define this Sentinel environment, the change will trigger a pipeline that verifies the changes and deploys them into your Sentinel environment. But how do we programmatically make these changes in Sentinel?

 

Microsoft Sentinel Automation Overview

As you probably know, there are different components inside Microsoft Sentinel…we have Connectors, Analytics Rules, Workbooks, Playbooks, Hunting Queries, Notebooks, and so on.

These components can be managed easily through the Azure Portal, but what can we use to modify them programmatically?

 

Here is a table that summarizes what can be used for each:

 

Component          Automated with
---------          --------------
Onboarding         API, PowerShell, ARM
Alert Rules        API, PowerShell
Hunting Queries    API, PowerShell
Playbooks          ARM
Workbooks          ARM
Connectors         API

 

  • PowerShell: Special thanks to Wortell for writing the AzSentinel module, which greatly facilitates many of the tasks. We will use it for the three components that support it (Onboarding, Alert Rules, Hunting Queries).
  • API: Some components don’t currently have a PowerShell module and can only be configured programmatically via API. The Sentinel API is now public, and its details can be found here. We will use it to enable Connectors.
  • ARM: This is Azure’s native management and deployment service. You can use ARM templates to define Azure resources as code. We will use it for Playbooks and Workbooks.

How to structure your Sentinel code repository

Here is what we consider the recommended way to structure your repository.

 

|
|- contoso/  ________________________ # Root folder for customer
|  |- AnalyticsRules/  ______________ # Subfolder for Analytics Rules
|  |  |- analytics-rules.json _______ # Analytics Rules definition file (JSON)
|  |
|  |- Connectors/  __________________ # Subfolder for Connectors
|  |  |- connectors.json ____________ # Connectors definition file (JSON)
|  |
|  |- HuntingRules/ _________________ # Subfolder for Hunting Rules
|  |  |- hunting-rules.json _________ # Hunting Rules definition file (JSON)
|  |
|  |- Onboard/  _____________________ # Subfolder for Onboarding
|  |  |- onboarding.json ____________ # Onboarding definition file (JSON)
|  |
|  |- Pipelines/ ____________________ # Subfolder for Pipelines
|  |  |- pipeline.yml _______________ # Pipeline definition files (YAML)
|  |
|  |- Playbooks/  ___________________ # Subfolder for Playbooks
|  |  |- playbook.json ______________ # Playbook definition files (ARM)
|  |
|  |- Scripts/ ______________________ # Subfolder for helper scripts
|  |  |- CreateAnalyticsRules.ps1 ___ # Script files (PowerShell)
|  |
|  |- Workbooks/  ___________________ # Subfolder for Workbooks
|  |  |- workbook-sample.json _______ # Workbook definition files (ARM)

You can find a sample repository with this structure here.

 

We will use this same repository throughout this post, as it contains the whole testing environment. Note: this is just a minimum viable product (MVP) and is subject to improvement. Feel free to clone it and enhance it.

 

Automating deployment of specific Microsoft Sentinel components

Now that we have a clear view of what to use to automate each component and how to structure our code repository, we can start creating things. Let’s go component by component, detailing how to automate deployment and operation of each.

 

Onboarding

Thanks to the AzSentinel PowerShell module by Wortell, we have a command that simplifies this process. We just need to execute the following command to enable Sentinel on a given Log Analytics workspace:

 

Set-AzSentinel [-SubscriptionId <String>] -WorkspaceName <String> [-WhatIf] [-Confirm] [<CommonParameters>]

We have created a script (InstallSentinel.ps1) with some more logic in it, so we can use it in our pipelines. This script takes a configuration file (JSON) as input, where we specify the different workspaces on which the Sentinel (SecurityInsights) solution should be enabled. The file has the following format:

 

{
    "deployments": [
        {
            "resourcegroup": "<rgname>",
            "workspace": "<workspacename>"
        },
        {
            "resourcegroup": "<rgname2>",
            "workspace": "<workspacename2>"
        }
    ]
}

The InstallSentinel.ps1 script is located in our repo here and has the following syntax:

 

InstallSentinel.ps1 -OnboardingFile <String>

We will use this script in our pipeline.
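
For reference, here is a minimal sketch of what the core of such a script can look like (simplified; the real InstallSentinel.ps1 in the repo has more logic, such as error handling and module checks):

# Minimal onboarding sketch; requires the AzSentinel module and an authenticated Az session
param(
    [Parameter(Mandatory = $true)]
    [string]$OnboardingFile
)

# Read the list of deployments from the JSON configuration file shown above
$config = Get-Content -Path $OnboardingFile -Raw | ConvertFrom-Json

foreach ($deployment in $config.deployments) {
    Write-Host "Enabling Sentinel on workspace $($deployment.workspace)..."
    # Set-AzSentinel enables the Sentinel (SecurityInsights) solution on the workspace;
    # the resourcegroup value from the config file is used by the real script for extra checks
    Set-AzSentinel -WorkspaceName $deployment.workspace -Confirm:$false
}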

 

Connectors

Sentinel Data Connectors can currently only be automated via the API, which is not officially documented yet. However, with Developer Tools enabled in your browser, it is quite easy to catch the related connector calls. Take into account that this API might change in the future without notice, so be cautious when using it.

 

The following script runs through an example, connecting “Azure Security Center” and “Azure Activity Logs” to the Sentinel workspace; both are very common connectors to collect data from your Azure environments. (Be aware that some connectors require additional rights; connecting the “Azure Active Directory” source, for instance, requires additional AAD Diagnostic Settings permissions besides the “Global Administrator” or “Security Administrator” permissions on your Azure tenant.)

The “EnableConnectorsAPI.ps1” script is located inside our repo here and has the following syntax:

 

EnableConnectorsAPI.ps1 -TenantId <String> -ClientId <String> -ClientSecret <String> -SubscriptionId <String> -ResourceGroup <String> -Workspace <String> -ConnectorsFile <String>

The ConnectorsFile parameter references a JSON file that specifies all the data sources you want to connect to your Sentinel workspace. Here is a sample file:

 

{
    "connectors": [
    {
        "kind": "AzureSecurityCenter",
        "properties": {
            "subscriptionId": "subscriptionId",
            "dataTypes": {
                "alerts": {
                    "state": "Enabled"
                }
            }
        }
    },
    {
        "kind": "AzureActivityLog",
        "properties": {
            "linkedResourceId": "/subscriptions/subscriptionId/providers/microsoft.insights/eventtypes/management"
        }
    }]
}

The script will iterate through this JSON file and enable the data connectors one by one. This JSON file should be placed into the Connectors directory so the script can read it.
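
To make the pattern concrete, here is a hedged sketch of the core REST call such a script can make for a single connector. The endpoint shape and api-version are assumptions based on calls observed in the portal (remember: this API is undocumented and may change), and the token acquisition uses the standard client-credentials flow for the service principal:

param(
    [string]$TenantId, [string]$ClientId, [string]$ClientSecret,
    [string]$SubscriptionId, [string]$ResourceGroup, [string]$Workspace
)

# Client-credentials token for the ARM audience
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$TenantId/oauth2/token" `
    -Body @{
        grant_type    = "client_credentials"
        client_id     = $ClientId
        client_secret = $ClientSecret
        resource      = "https://management.azure.com/"
    }

$baseUri = "https://management.azure.com/subscriptions/$SubscriptionId" +
           "/resourceGroups/$ResourceGroup/providers/Microsoft.OperationalInsights/workspaces/$Workspace"

# One connector definition, shaped like an entry in connectors.json
$connector = @{
    kind       = "AzureActivityLog"
    properties = @{ linkedResourceId = "/subscriptions/$SubscriptionId/providers/microsoft.insights/eventtypes/management" }
}

# Each connector instance is addressed by its own GUID (assumed URI shape, preview api-version)
$connectorId = (New-Guid).Guid
Invoke-RestMethod -Method Put `
    -Uri "$baseUri/providers/Microsoft.SecurityInsights/dataConnectors/${connectorId}?api-version=2019-01-01-preview" `
    -Headers @{ Authorization = "Bearer $($tokenResponse.access_token)" } `
    -ContentType "application/json" `
    -Body ($connector | ConvertTo-Json -Depth 5)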

 

As you can imagine, there are some connectors that cannot be automated, like all the ones based on Syslog/CEF, as they require installing an agent.

 

Analytics Rules

The AzSentinel PowerShell module provides a command to create new Analytics Rules (New-AzSentinelAlertRule), passing a bunch of parameters to define the rule characteristics. An even more interesting command, Import-AzSentinelAlertRule, allows you to create analytics rules based on an input file where all the rules' properties are specified.
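
For instance, a single scheduled rule could be created directly like this (the values are illustrative; the parameters match the New-AzSentinelAlertRule syntax discussed in the comments below):

New-AzSentinelAlertRule -WorkspaceName "myworkspace" -DisplayName "AlertRule01" `
    -Description "Suspicious process launch" -Severity "Medium" -Enabled $true `
    -Query 'SecurityEvent | where EventID == "4688"' -QueryFrequency "5H" -QueryPeriod "6H" `
    -TriggerOperator "GreaterThan" -TriggerThreshold 5 `
    -SuppressionDuration "6H" -SuppressionEnabled $false `
    -Tactics @("Persistence") -PlaybookName ""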

 

We have created a script that takes the workspace and rules file and creates the analytics rules accordingly.

 

The script is located inside our repo here and has the following syntax:

CreateAnalyticsRules.ps1 -Workspace <String> -RulesFile <String>

 

As you can see, one of the parameters is a rules file (in JSON format) where you will specify all the rules (of any type) that need to be added to your Sentinel environment. Here is a sample file:

 

{
  "Scheduled": [
    {
      "displayName": "AlertRule01",
      "description": "",
      "severity": "Medium",
      "enabled": true,
      "query": "SecurityEvent | where EventID == \"4688\" | where CommandLine contains \"-noni -ep bypass $\"",
      "queryFrequency": "5H",
      "queryPeriod": "6H",
      "triggerOperator": "GreaterThan",
      "triggerThreshold": 5,
      "suppressionDuration": "6H",
      "suppressionEnabled": false,
      "tactics": [
        "Persistence",
        "LateralMovement",
        "Collection"
      ],
      "playbookName": "",
      "aggregationKind": "SingleAlert",
      "createIncident": true,
      "groupingConfiguration": {
        "enabled": false,
        "reopenClosedIncident": false,
        "lookbackDuration": "PT5H",
        "entitiesMatchingMethod": "All",
        "groupByEntities": [
          "Account",
          "Ip",
          "Host",
          "Url"
        ]
      }
    },
    {
      "displayName": "AlertRule02",
      "description": "",
      "severity": "Medium",
      "enabled": true,
      "query": "SecurityEvent | where EventID == \"4688\" | where CommandLine contains \"-noni -ep bypass $\"",
      "queryFrequency": "5H",
      "queryPeriod": "6H",
      "triggerOperator": "GreaterThan",
      "triggerThreshold": 5,
      "suppressionDuration": "6H",
      "suppressionEnabled": false,
      "tactics": [
        "Persistence",
        "LateralMovement",
        "Collection"
      ],
      "playbookName": ""
    }
  ],
  "Fusion": [
    {
      "displayName": "Advanced Multistage Attack Detection",
      "enabled": true,
      "alertRuleTemplateName": "f71aba3d-28fb-450b-b192-4e76a83015c8"
    }
  ],
  "MLBehaviorAnalytics": [
    {
      "displayName": "(Preview) Anomalous SSH Login Detection",
      "enabled": true,
      "alertRuleTemplateName": "fa118b98-de46-4e94-87f9-8e6d5060b60b"
    }
  ],
  "MicrosoftSecurityIncidentCreation": [
    {
      "displayName": "Create incidents based on Azure Active Directory Identity Protection alerts",
      "description": "Create incidents based on all alerts generated in Azure Active Directory Identity Protection",
      "enabled": true,
      "productFilter": "Microsoft Cloud App Security",
      "severitiesFilter": [
        "High",
        "Medium",
        "Low"
      ],
      "displayNamesFilter": null
    }
  ]
}

As you can see, Fusion and MLBehaviorAnalytics rules need a field called alertRuleTemplateName. This ID is consistent across all Sentinel environments, so you should use the same values in your own files. As Sentinel grows, we are adding more MLBehaviorAnalytics rules, so you might need to retrieve new alertRuleTemplateName values to add them to your rules JSON file. To get these values, you can execute the following command available in AzSentinel:

 

Get-AzSentinelAlertRuleTemplates -WorkspaceName <workspace_name> -Kind MLBehaviorAnalytics

The output contains a name field that holds the alertRuleTemplateName value.
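
For instance, to print just those IDs (the workspace name is illustrative):

(Get-AzSentinelAlertRuleTemplates -WorkspaceName "myworkspace" -Kind MLBehaviorAnalytics).name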

 

The script will iterate through this JSON file and create/enable the analytics rules. It also supports updating existing rules that are already enabled. This JSON file should be placed in the AnalyticsRules directory so the script can read it. Finally, the script supports attaching playbooks for automated response to an alert; this is specified in the playbookName property for each alert in the JSON file.

 

Workbooks

Workbooks are a native object in Azure and can therefore be created through an ARM template. The idea is that you place all the custom workbooks that you have developed inside the Workbooks folder in your repo, and any change to these will trigger a pipeline that creates them in your Sentinel environment.

 

We have created a script (placed in the same repo here) that can be used to automate this process. It has the following syntax:

CreateWorkbooks.ps1 -SubscriptionId <String> -ResourceGroup <String> -WorkbooksFolder <String> -Workspace <String>

The script will iterate through all the workbooks in the WorkbooksFolder and deploy them into your Microsoft Sentinel instance.
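
A minimal sketch of that loop (simplified; the real script also wires SubscriptionId and Workspace into the template parameters):

param(
    [string]$ResourceGroup,
    [string]$WorkbooksFolder
)

# Deploy every workbook ARM template found in the folder
foreach ($template in Get-ChildItem -Path $WorkbooksFolder -Filter *.json) {
    Write-Host "Deploying workbook template $($template.Name)..."
    New-AzResourceGroupDeployment -Name $template.BaseName `
        -ResourceGroupName $ResourceGroup `
        -TemplateFile $template.FullName
}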

 

Also, be aware that the deployment will fail if a workbook with the same name already exists.

If you're building your own workbook ARM template, make sure that you add "sentinel" as the workbookType in the template (look at our examples here).

 

Hunting Rules

In order to automate the deployment of Hunting Rules, we will use the AzSentinel module.

We have created another script that takes as input a JSON file where all the Hunting Rules are defined. The script will iterate over them and create/update them accordingly.

 

The syntax for this script is the following:

CreateHuntingRulesAPI.ps1 -Workspace <String> -RulesFile <String>
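
A minimal sketch of that loop, assuming a rules file with a top-level hunting array and an AzSentinel New-AzSentinelHuntingRule cmdlet with parameters analogous to New-AzSentinelAlertRule (verify the exact signature against the module before using this):

param(
    [string]$Workspace,
    [string]$RulesFile
)

# Read the hunting rule definitions from the JSON file (the top-level property name is an assumption)
$rules = (Get-Content -Path $RulesFile -Raw | ConvertFrom-Json).hunting

foreach ($rule in $rules) {
    Write-Host "Creating/updating hunting rule $($rule.displayName)..."
    # Assumed cmdlet and parameters; check the AzSentinel documentation
    New-AzSentinelHuntingRule -WorkspaceName $Workspace `
        -DisplayName $rule.displayName `
        -Description $rule.description `
        -Query $rule.query `
        -Tactics $rule.tactics
}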

 

Playbooks

This works the same way as Workbooks. Playbooks use Azure Logic Apps to automatically respond to incidents. Logic Apps are a native resource in ARM, and therefore we can automate their deployment with ARM templates. The idea is that you place all the custom playbooks that you have developed inside the Playbooks folder in your repo, and any change to these will trigger a pipeline that creates them in your Sentinel environment.

 

We have created a script (placed in the same repo here) that can be used to automate this process. It has the following syntax:

CreatePlaybooks.ps1 -ResourceGroup <String> -PlaybooksFolder <String>

This script will succeed even if the playbooks are already there.

 

Building your Sentinel as Code in Azure DevOps

Now that we have a clear view of how to structure our code repository and what to use to automate each Sentinel component, we can start creating things in Azure DevOps. This is a high-level list of tasks that we will perform:

 

  • Create an Azure DevOps organization
  • Create a project in Azure DevOps
  • Create a service connection to your Azure environment/s
  • Create variables
  • Connect your existing code repository with your Az DevOps project
  • Create pipelines

Let’s review them one by one.

 

Create an Azure DevOps organization

This is the first step in order to have your Azure DevOps environment. You can see the details on how to do this here.

 

Create a project in Azure DevOps

A project provides a repository for source code and a place for a group of people to plan, track progress, and collaborate on building software solutions. It will be the container for your code repository, pipelines, boards, etc. See instructions on how to create it here.

 

Create a service connection to your Azure environment

In order to talk to our Azure environment, we need to create a connection with specific Azure credentials. In Azure DevOps, this is called a service connection. The credentials that you will use to create this service connection are typically those of a service principal defined in Azure.

 

You can find full details on how to create a service connection here. Once you have created the principal, you will need to grant it access to the Azure environment where Sentinel will live.

 

These are the fields you need to provide to create your service connection:

 

[Screenshot: service connection fields]

Take note of the Connection name you provide, as you will need to use this name in your pipelines.

 

Create variables

We are going to need several variables defined in the Azure DevOps environment so they can be passed to our scripts to specify the Sentinel workspace, resource group, config files, and API connection information.

 

As we will need these variables across all our pipelines, the best thing to do is create an Azure DevOps variable group. With this, we can define the variable group once and then reuse it in different pipelines across our project. Here you have instructions on how to do it.

 

We have called our variable group “Az connection settings”; this is important because we will reference this name in our pipelines. Here is a screenshot of the variables that we will need to define:

[Screenshot: variable group definition]

 

Connect your existing code repository with your Az DevOps project

You can import an existing repo into Azure DevOps from GitHub, Bitbucket, GitLab, and other locations. See instructions here.

 

Create pipelines

There are two ways to create our Azure Pipelines: in classic mode or as YAML files. We are going to create them as YAML files because that way, we can place them into our code repository so they can be easily tracked and reused anywhere. Here you have the basic steps to create a new pipeline.

 

In the new pipeline wizard, select GitHub YAML in the Connect step:

[Screenshot: new pipeline wizard, Connect step]

Then select your repository and choose Starter pipeline if you want to build your own pipeline, or Existing Azure Pipelines YAML file if you want to use the ones we already have in the repository:

[Screenshot: pipeline configuration options]

We are going to create one CI (build) pipeline for Scripts and several CICD (build+deploy) pipelines (one for each Sentinel component).

 

Create a CI pipeline for Scripts

We will treat Scripts slightly differently from the rest, because scripts are not a Sentinel component and won’t themselves get deployed to Azure; we will just use them to deploy other things.

Because of this, the only thing we need to do with scripts is make sure they are available to the other pipelines as artifacts. To accomplish this, we just need two tasks in our CI pipeline: Copy Files and Publish Pipeline Artifact.

 

Update! We have now added a syntax validator in our pipelines based on the Files Validator task available in the Visual Studio Marketplace. You will need to install this task if you want to use our templates.

 

Here is an example of the YAML code that will define this pipeline:

 

# Scripts build pipeline
# Copies script files to the agent and publishes an artifact with them

trigger:
  paths:
    include:
      - Scripts/*

pool:
  vmImage: 'windows-2019'

steps:
- task: CopyFiles@2
  displayName: 'Copy Scripts'
  inputs:
    SourceFolder: Scripts
    TargetFolder: '$(build.artifactstagingdirectory)'
- task: Files-Validator@1
  inputs:
    rootDir: '$(build.artifactstagingdirectory)/*.ps1'
    validateXML: false
    validateJSON: false
    validateYAML: false
    validatePS: true
- task: PublishPipelineArtifact@1
  displayName: 'Publish Pipeline Artifact'
  inputs:
    targetPath: Scripts
    artifact: Scripts

As you can see, we have added three tasks: one to copy the script files, another to check the PowerShell syntax, and the last to publish the pipeline artifact. You can find this pipeline in our GitHub repo here.

 

Create CICD pipelines for each Sentinel component

With the Scripts now available as an artifact, we can now use them in our Sentinel component pipelines. These pipelines will be different from the previous one because we will do CI and CD (build+deploy). We define these in our YAML pipeline file as stages.

 

Here is one sample pipeline for Analytics Rules:

 

# Analytics Rules build and deploy pipeline
# This pipeline publishes the rules file as an artifact and then uses a powershell task to deploy

name: build and deploy Alert Rules
resources:
  pipelines:
    - pipeline: Scripts
      source: 'scriptsCI'
trigger:
  paths:
    include:
      - AnalyticsRules/*

stages:
- stage: build_alert_rules

  jobs:
    - job: AgentJob
      pool:
       name: Azure Pipelines
       vmImage: 'vs2017-win2016'
      steps:
       - task: CopyFiles@2
         displayName: 'Copy Alert Rules'
         inputs:
          SourceFolder: AnalyticsRules
          TargetFolder: '$(Pipeline.Workspace)'
       - task: Files-Validator@1
         inputs:
           rootDir: '$(Pipeline.Workspace)/*.json'
           validateXML: false
           validateJSON: true
           validateYAML: false
           validatePS: false
       - task: PublishBuildArtifacts@1
         displayName: 'Publish Artifact: RulesFile'
         inputs:
          PathtoPublish: '$(Pipeline.Workspace)'
          ArtifactName: RulesFile

- stage: deploy_alert_rules
  jobs:
    - job: AgentJob
      pool:
       name: Azure Pipelines
       vmImage: 'windows-2019'
      variables: 
      - group: Az connection settings
      steps:
      - download: current
        artifact: RulesFile
      - download: Scripts
        patterns: '*.ps1'
      - task: AzurePowerShell@4
        displayName: 'Create and Update Alert Rules'
        inputs:
         azureSubscription: 'Soricloud Visual Studio'
         ScriptPath: '$(Pipeline.Workspace)/Scripts/Scripts/CreateAnalyticsRules.ps1'
         ScriptArguments: '-Workspace $(Workspace) -RulesFile analytics-rules.json'
         azurePowerShellVersion: LatestVersion
         pwsh: true

As you can see, we now have two stages: build and deploy. We also had to define resources to reference the artifact that we need from our Scripts build pipeline.

 

The build stage is the same as the one we did for scripts, the only difference being that we validate the JSON file syntax (again, using the Files Validator task).

 

In the deployment stage, we have a couple of new things. First, we point to the variable group that we defined earlier, using the variables keyword. Then we download the artifacts that we will use in our deployment task, using the download keyword.

 

As the last step in our CICD pipeline, we use an Azure PowerShell task that points to our script and specifies any parameters needed. As you can see, we reference the imported variables here. One last peculiarity of this pipeline is that we need to use PowerShell Core (required by AzSentinel), so we specify that with pwsh: true.

 

If everything went correctly, we can now run this pipeline and verify that our Sentinel analytics rules are deployed automatically. :smile:

 

This and all the other pipelines for the rest of the components are in our repo inside the Pipelines folder.

For Onboarding, the pipeline has no automatic triggers, as we consider that this would be executed only once at installation time.

 

Working with multiple workspaces

 

Whether you are a customer with a Microsoft Sentinel environment containing multiple workspaces or a partner that needs to operate Sentinel for several customers, you need a strategy to manage more than one workspace.

 

As you have seen throughout the article, we have used a variable group to store details like resource group and workspace name. These values will change if we need to manage multiple workspaces, so we would need more than one variable group: for example, one for customer A and another for customer B, or one for Europe and one for Asia.

 

After that’s done, we can choose between two approaches:

  1. Add more stages to your current pipelines. Until now, we only had one deploy stage that deployed to our only Sentinel environment, but we can add additional stages (with the same steps and tasks) that deploy to other resource groups and workspaces.
  2. Create new pipelines. We can clone our existing pipelines and modify the variable group to point to a different target environment.

 

In Summary

We have shown you how to describe your Microsoft Sentinel deployment using code and then use a DevOps tool to deploy that code into your Azure environment.

180 Comments
Copper Contributor

Great article! Regarding Connectors, you can use this script here https://github.com/azsec/azure-sentinel-tools/blob/master/scripts/Connect-AzureSecurityCenter.ps1 to connect ASC from a list of target subscriptions.

 

I hope the API will be released soon. The real-world issue is not the creation but the update and continuous change of analytics rules, which requires a little trickery in the pipeline. A use case is that you would want to tune an analytics rule to increase fidelity, so having a fixed rule is not something we expect.

 

I'm not sure if you have tested the update case?

Microsoft

Thanks for the comments @azsec !

 

Yes, we have tested the update of Analytics Rules and it works. Of course, there might be corner cases where it doesn't work...but in general you can just update the JSON file; the pipeline will trigger automatically, identify that the rule already exists, and update it accordingly. Take a look at lines 54 to 172 in the script here; this is where existing rules are handled.

 

In any case, as we say in the post, this is an MVP and for sure it can be improved.

 

Regards

Copper Contributor

Thank you for the replies. In a real-world deployment you would probably name your rule with a unique ID (GUID), and when performing an update the pipeline should know which rule it needs to update. This sounds like a chicken-and-egg story. Otherwise, the pipeline checks the display name and gets its unique ID (aka name).

Awesome blog post :cool: Thanks for sharing with the Community!

Brass Contributor

Hi @Javier Soriano,


Thanks for sharing this.
It would be great to be able to replicate our initial deployment and keep improving as required.

 

Regarding the DevOps approach, I was able to import the repo and build the scripts artifact, but I keep getting errors when trying to run the onboardingCICD.yml pipeline.


Error is: "Unable to resolve definition scriptsCI in project ...."

I cannot find any documentation regarding that error. Any ideas?

 

Thanks in advance

Microsoft

Hi @caiodaruizcorrea ,

 

It looks like the pipeline is not able to find the source pipeline (in our case scriptsCI). Here you have the reference documentation on how the pipeline artifact is defined in YAML.

 

Review that the name scriptsCI is the actual name of your pipeline. To do that, click on Pipelines->Pipelines, and then select Edit/Rename for your scripts pipeline. It should look like this:

 

[Screenshot: pipeline rename dialog]

 

Let me know if this doesn't fix it.

 

Regards

Brass Contributor

Hi @Javier Soriano,

 

I ended up figuring it out myself yesterday after a bunch of failures: the pipeline I was initially using to build the scripts artifact was just using a default name instead.

 

Thanks for the screenshot anyway!


Regarding the onboarding YAML file (and the AzSentinel PowerShell module), is it supposed to create the prerequisites for Sentinel, such as the Azure resource group, Log Analytics workspace, and the Sentinel link to the workspace, or does it expect them to be created in advance?

 

Thanks again

Microsoft

hi @caiodaruizcorrea , no, it will not create the workspace or the resource group. The script expects these two things to be already in place.

 

Glad you figured out the issue :)

 

Regards

Copper Contributor

Hi Javier,

 

Thanks for all your help.

 

Is it possible to deploy alert rules via YAML files, similar to the ones available in the Sentinel GitHub with long KQL queries, via the pipeline? Using JSON files makes long queries difficult to write and maintain.

 

Regards

Microsoft

Yes, why not? You just need to build the logic so the script is able to iterate through the YAML file with the rules. You could use other sources too...text files, CSV files, etc.

 

Regards

Copper Contributor

Hi Javier,

 

I can't seem to be able to deploy long queries, e.g. the one below - is there a limit?

 

 

let starttime = 14d;
let endtime = 1d;
// The number of operations below which an IP address is considered an unusual source of role assignment operations
let alertOperationThreshold = 5;
let createRoleAssignmentActivity = AzureActivity
| where OperationName == \"Create role assignment\";
createRoleAssignmentActivity
| where TimeGenerated between (ago(starttime) .. ago(endtime))
| summarize count() by CallerIpAddress, Caller
| where count_ >= alertOperationThreshold
| join kind = rightanti (
createRoleAssignmentActivity
| where TimeGenerated > ago(endtime)
| summarize StartTimeUtc = min(TimeGenerated), EndTimeUtc = max(TimeGenerated), ActivityTimeStamp = makelist(TimeGenerated), ActivityStatus = makelist(ActivityStatus),
OperationIds = makelist(OperationId), CorrelationId = makelist(CorrelationId), ActivityCountByCallerIPAddress = count()
by ResourceId, CallerIpAddress, Caller, OperationName, Resource, ResourceGroup
) on CallerIpAddress, Caller
| extend timestamp = StartTimeUtc, AccountCustomEntity = Caller, IPCustomEntity = CallerIpAddress

 

Microsoft

Mmm, I would need to check that. Adding @Pouyan Khabazi as he built the AZSentinel module and may have more details. 

Brass Contributor

Hi @kay106 , just tested your example query and was able to successfully create the alert rule using the import function; see below the JSON file I have used:

{
  "analytics": [
    {
      "displayName": "AlertRule010001",
      "description": "",
      "severity": "Medium",
      "enabled": true,
      "query": "let starttime = 14d;
      let endtime = 1d;
      // The number of operations below which an IP address is considered an unusual source of role assignment operations
      let alertOperationThreshold = 5;
      let createRoleAssignmentActivity = AzureActivity
      | where OperationName == \"Create role assignment\";
      createRoleAssignmentActivity
      | where TimeGenerated between (ago(starttime) .. ago(endtime))
      | summarize count() by CallerIpAddress, Caller
      | where count_ >= alertOperationThreshold
      | join kind = rightanti (
      createRoleAssignmentActivity
      | where TimeGenerated > ago(endtime)
      | summarize StartTimeUtc = min(TimeGenerated), EndTimeUtc = max(TimeGenerated), ActivityTimeStamp = makelist(TimeGenerated), ActivityStatus = makelist(ActivityStatus),
      OperationIds = makelist(OperationId), CorrelationId = makelist(CorrelationId), ActivityCountByCallerIPAddress = count()
      by ResourceId, CallerIpAddress, Caller, OperationName, Resource, ResourceGroup
      ) on CallerIpAddress, Caller
      | extend timestamp = StartTimeUtc, AccountCustomEntity = Caller, IPCustomEntity = CallerIpAddress",
      "queryFrequency": "5H",
      "queryPeriod": "6H",
      "triggerOperator": "GreaterThan",
      "triggerThreshold": 5,
      "suppressionDuration": "6H",
      "suppressionEnabled": false,
      "tactics": [
        "Persistence",
        "LateralMovement",
        "Collection"
      ],
      "playbookName": ""
    }
  ]
}

Output: 

[Screenshot: successful alert rule creation]

 

Please let me know if you keep experiencing errors; you can also open an issue on GitHub and share your error message etc. with us for further troubleshooting: https://github.com/wortell/AZSentinel/issues

Microsoft

Thanks @Pouyan Khabazi ! I also tested with the pipeline used in our repo and it is successful as well.

 

Let us know if you are still facing the issue

Copper Contributor

Great blog post!

Brass Contributor

I get the below error when I run the pipeline.  Appreciate your advice. 


The Pipeline is not valid. Unable to resolve latest version for pipeline Scripts. This could be due to inaccessible  pipeline or no version is available.

Microsoft

Hi @PrashTechTalk , take a look at the comment from @caiodaruizcorrea above...I think he faced the same issue, and in the end it had to do with the name you provide to the pipeline.

 

Also, make sure that you run the ScriptsCI pipeline first, so the artifacts are available.

 

Regards

Brass Contributor

Thanks Javier. File naming was the issue, as you clearly pointed out. I progressed a step ahead in onboarding Sentinel on the workspace; although the job results are successful, I don't see Sentinel being enabled on my workspace. Trying to figure out what the issue is....

[Screenshot: pipeline job output]

Microsoft

It looks like your onboarding file was not there...if you look at the script here, you can see that if you have workspaces to act on, it would get into the for loop and write a message for each workspace that is being processed, and you didn't get anything. 

Copper Contributor

Hi @Javier Soriano,

I am working to enable more connectors, such as Threat Intelligence - TAXII, Azure AD, and Threat Intelligence Platforms.
After modifying the JSON with the necessary parameters, I am struggling with the PowerShell script to add those connectors.

On one of those lines I saw this comment:
#unknown ID, clarify with Javi
$curiousId = "1e1b282a-ce14-4feb-8bc1-48249fab9109"
$uri = "$baseUri/providers/Microsoft.SecurityInsights/dataConnectors/${curiousId}?api-version=2019-01-01-preview"


It is used to call the API; could you clarify it?
Another point: do you have good documentation about that API?

 
Copper Contributor

Hi Javier,

 

I can deploy the Alert Rules to Dev, QA, Prod, and various other tenants. The only issue is the naming; I need the naming to reflect the environment, e.g. RO-001-APGDev... RO-002-APGQa... RO-002-APGPrd etc. What is the best way to reflect this naming convention in the Alert Rule name?

 

Thanks, 

Kay

Microsoft

Hi @kay106 , the way I would do it is by creating a new variable in your variable group that contains the environment (Dev, Prod, etc.) and then using that variable in your script, where you append it to the alert rule name for whatever environment is being pushed at that time.

 

Hope this helps.

 

Regards

Copper Contributor

Hi Javier,

 

Thanks, I got this to work by modifying the script.

 

I now face another challenge. I have backslashes in my KQL query, e.g.:

 

SecurityEvent
| where EventID == \"5145\"
| where AccountType == \"User\"
| where ShareName == \"\\\\*\\SYSVOL$\"

 

^ I need the four backslashes before the * and the two after it.

 

Whenever I use a backslash, including double backslashes to escape, the alert rule isn't created. Please let me know how I can counter this problem.

 

Thank you in advance.

 

Brass Contributor

@Javier Soriano - Sadly, I am beating around the bush trying to understand the error. My script is able to call and execute the script file from Git, but the line below from the file doesn't work... The Write-Host line "Processing workspace..." before it works, but the later one doesn't write anything nor fail with any error. The resource group and workspace information in the onboarding.json file is also correct.

 

What other issues can you point to in this case?

 

 $solutions = Get-AzOperationalInsightsIntelligencePack -resourcegroupname $item.resourcegroup -WorkspaceName $item.workspace -WarningAction:SilentlyContinue 

 

[Screenshot: pipeline log output]

 

Thanks.

Microsoft

Hi @PrashTechTalk , add some debugging to the script, for example, print the workspace variable to see if it contains anything

Brass Contributor

@Javier Soriano - Thank you for your response.

 

Debugging the script to print workspace variables prints all values before this line.

 $solutions = Get-AzOperationalInsightsIntelligencePack -resourcegroupname $item.resourcegroup -WorkspaceName $item.workspace -WarningAction:SilentlyContinue 

 

When I tried executing this line in a PowerShell console it works perfectly fine, but not when executed in the DevOps pipeline.

However, I noticed the following when I enabled diagnostics. I am not sure if the issue is related to the .NET Framework version; appreciate your response.

 

[Screenshot: DevOps diagnostics output]

Thanks

Microsoft

Hi @PrashTechTalk , from your previous screenshot I see that at least there was one workspace to process...once there, it should either install Sentinel or discard it. Did you check the agent that you're using in the pipeline? In our example we are using the windows-2019 image and PowerShell Core. See the YAML pipeline definition here: https://github.com/javiersoriano/sentinelascode/blob/master/Pipelines/onboardingCICD.yml

Copper Contributor

Hi,

About the Workbooks ...

I understand that to create new ones I need to change the workbook ID inside the JSON file and change the serializedData value in this JSON as well.

My question is: how can I convert the JSON samples I can find in the Sentinel GitHub to this serializedData format?

Could you help me?

Microsoft

Hi @alexlimabh ,

 

No, you don't have to change the workbookId in the JSON file. The workbook ID is passed as a parameter that you add to your variable group in Azure DevOps.

You have two ways to get the JSON data from a workbook: full ARM template or Gallery template. I recommend using the full ARM template and just placing it in your Workbooks folder in your repo...that should work just fine. If you choose Gallery template, it will contain just the serialized parameter contents and you will have to do more copy/paste. See screenshot below:

[Screenshot: workbook export options]

 

 

Copper Contributor

Hi @Javier Soriano,

Thank you for the tip.

When I execute the script manually, everything works.

However, after changing the workbookSourceId section in the JSON to use variables, as you can see below:

 

"workbookSourceId": {
"type": "string",
"defaultValue": "/subscriptions/${SubscriptionId}/resourcegroups/${ResourceGroup}/providers/microsoft.operationalinsights/workspaces/${Workspace}",
"metadata": {
"description": "The id of resource instance to which the workbook will be associated"
}
},

 

The pipeline executed successfully and the logs look good, but the workbook wasn't created.

 

PS: I removed the subscriptionId manually.

 

2020-04-07T13:01:59.7495956Z Folder is: D:\a\1/Workbooks
2020-04-07T13:01:59.7536587Z Files are:  D:\a\1\Workbooks\securityalert.json
2020-04-07T13:02:18.5001829Z 
2020-04-07T13:02:18.5019374Z DeploymentName          : securityalert
2020-04-07T13:02:18.5037233Z ResourceGroupName       : ***
2020-04-07T13:02:18.5057194Z ProvisioningState       : Succeeded
2020-04-07T13:02:18.5082536Z Timestamp               : 4/7/2020 1:02:17 PM
2020-04-07T13:02:18.5116416Z Mode                    : Incremental
2020-04-07T13:02:18.5117206Z TemplateLink            : 
2020-04-07T13:02:18.5135246Z Parameters              : 
2020-04-07T13:02:18.5153321Z                           Name                   Type                       Value     
2020-04-07T13:02:18.5205181Z                           =====================  =========================  ==========
2020-04-07T13:02:18.5219399Z                           workbookDisplayName    String                     Azure Activity
2020-04-07T13:02:18.5236614Z                           workbookType           String                     sentinel  
2020-04-07T13:02:18.5255757Z                           workbookSourceId       String                     /subscriptions/$SubscriptionId/resourcegrou
2020-04-07T13:02:18.5276810Z                           ps/$ResourceGroup/providers/microsoft.operationalinsights/workspaces/$Workspace
2020-04-07T13:02:18.5297694Z                           workbookId             String                     202bf405-ea37-4056-8a92-7727de4dc790
2020-04-07T13:02:18.5318274Z                           
2020-04-07T13:02:18.5332367Z Outputs                 : 
2020-04-07T13:02:18.5353029Z                           Name             Type                       Value     
2020-04-07T13:02:18.5372683Z                           ===============  =========================  ==========
2020-04-07T13:02:18.5391000Z                           workbookId       String                     /subscriptions/XXXXXX-XXX-XXXX-XXX-XX
2020-04-07T13:02:18.5410061Z                           /resourceGroups/***/providers/microsoft.insights/workbooks/202bf405-ea37-4056-8a
2020-04-07T13:02:18.5444302Z                           92-7727de4dc790
2020-04-07T13:02:18.5446523Z                           
2020-04-07T13:02:18.5457200Z DeploymentDebugLogLevel : 

If I hard-code everything (SubscriptionId, Workspace, ResourceGroup) I can deploy without any issue.

Any clue to fix it?

 

Brass Contributor

@Javier Soriano, the issue was with the service connection, as I established the automatic service principal connection (because the tool says it's recommended) instead of manual. It was resolved upon creating the service principal with manual configuration. Thank you.

Microsoft

@alexlimabh you don't have to modify the workbook JSON file to enter the workbookSourceId. The script takes care of that. You just need to add a new variable to the variable group that contains the workbookId and the script will take care of the rest.

 

If you're grabbing the workbook from outside Sentinel, make sure that the workbookType parameter is set to sentinel. If you don't do this it won't be created within Sentinel and you won't see it.

 

Regards

Copper Contributor

Hi @Javier Soriano ,
I am here again; I am trying to create a new connector, but I am experiencing some issues.
I modified the connector JSON and the EnableConnectorsAPI.ps1 script.
When I tried to create the connector for Azure AD and Threat Intelligence, the API returned the message: "Internal server error (HTTP status code: 500)". First, I thought it was something about permissions. I checked, and the Azure DevOps service connection has Contributor rights on the subscription, the Security Administrator role in AD, and the user_impersonation API permission. I tried again and got the same message. I added the user as Owner and GA and the same error continued.
However, when I tested with my user, I was successful, so I assume my script was good.
PS: I used this tutorial to generate a token for my user.

(https://www.sepago.de/blog/how-to-generate-a-bearer-access-token-for-azure-rest-access-with-username...).
After trying a few other things, I don't know how I can move forward. Do you have any tips to help me?

Microsoft

Hi @alexlimabh , automating certain connectors via service principal is not something that Sentinel supports today. Those are connectors that need Azure AD-level permissions (instead of Azure-only permissions). There's ongoing work to enable this scenario, but as of now, you will have to enable those connectors with your user identity.

 

Regards

Brass Contributor

@Javier Soriano - Sadly, AzSentinel cmdlets do not support PowerShell versions below 6.2. When trying to execute a PowerShell .ps1 file from a local machine, it's a pain to make sure the PowerShell version is upgraded to at least 6.2. Not really practical, as most local machines with Windows 10 and supported OSes have PowerShell 5.x. It would be good if these cmdlets supported previous PowerShell versions.

It works absolutely fine when run in an Azure PowerShell console or through DevOps pipelines.

 

Set-AzSentinel [-SubscriptionId <String>] -WorkspaceName <String> [-WhatIf] [-Confirm] [<CommonParameters>]

 

Microsoft

Tagging @Pouyan Khabazi in case he can comment. @PrashTechTalk you can also look at opening an issue in the AzSentinel PowerShell project on GitHub.

Brass Contributor

@Javier Soriano : How can we protect the intellectual property of rules, queries, playbooks, etc. from end customers in case we provide Sentinel as a managed service?

Microsoft

Hi @Deepanshu_Marwah take a look at minute 51:20 on this webinar: https://www.youtube.com/watch?v=hwahlwgJPnE&feature=youtu.be where @Ofer_Shezaf explains how this scenario would work. We are also working on a blog post summarizing the different scenarios.

Brass Contributor

@Javier Soriano In the Lighthouse model, where we might leverage the customer's subscription instead of creating a new subscription via CSP, there is supposed to be a black-box capability that protects partners' analytics rules. Is there any ETA on that?

Microsoft

Hi @Deepanshu_Marwah , yes, as of today you can create an analytics rule in your own tenant querying the customer tenant. That way the customer won't be able to see it.

Deleted
Not applicable

Hi guys, 

 

First of all, great article. I've used this as a reference for our design concept to deploy workbooks.

Now I actually have a question about maintaining/creating/updating workbooks (please redirect me if this is not the place to ask this question).

In this article you have set up a PowerShell script that deploys per workbook, each workbook in a separate JSON file that actually contains the workbook data (queries etc.).

 

Is there another approach to how you could do this?

I'd just like to know. Here is a sample of how we have done this: https://raw.githubusercontent.com/joerianto83/templates/master/sampleworkbook

In my opinion this is not efficient, doesn't give a good overview, and is very fault-sensitive. I'd like to know your opinion about it.

What I would prefer is the method you are using here: create separate workbooks and keep the logic and data separated.

 

 

Microsoft

Hi @Deleted , interesting approach. We chose the other approach because it's easier, IMO. You can just grab the workbooks from GitHub or from the Azure portal, place them in a folder, and they will get deployed.

 

Your approach would work as well; what I would do, though, is parametrize it a little bit more. Basically, modify the template to be able to deploy an array of workbooks that are passed in a separate parameters file. That way the same template would work for any number of workbooks with any kind of queries. Makes sense? If you get to do it, I'd like to see it!

Iron Contributor

Hi @Javier Soriano I'd love to get this working.

I've installed AzSentinel and I can use the READ commands, but any write-related commands return an error 400.

I'm logged into Azure as the Global Admin.

I'm running this from the Azure PowerShell window.

eg:

New-AzSentinelAlertRule -WorkspaceName "dbazLAW3" -DisplayName "test1" -Description "blah" -Severity "High" -Enabled $true -Query 'blah' -QueryFrequency "5M" -QueryPeriod "5M" -TriggerOperator "GreaterThan" -TriggerThreshold 0 -SuppressionDuration "" -SuppressionEnabled $false -Tactics @("Collection") -PlaybookName ""
New-AzSentinelAlertRule: Unable to invoke webrequest with error message: Response status code does not indicate success: 400 (Bad Request).

Any idea on how to troubleshoot this?

Microsoft

Hi @SocInABox , I just tried with the latest version from my windows terminal and it worked fine. Could you try from your local machine? From Azure DevOps it will work fine for sure because it uses an agent that is consistent across any environment.

 

Regards

Iron Contributor

From my local machine I'm getting a token-expired error.

I see the PowerShell module is talking to this URL:
https://management.azure.com/subscriptions/<your  subscription id>/providers/microsoft.insights/alertrules?api-version=2016-03-01

What controls my access to this URL?

 

 

Brass Contributor

@SocInABox I am using the bearer token generated from the Connect-AzAccount command in the AzSentinel module. When do you get the timeout message? Because there is an auto-refresh token function which should prevent this from happening.

Iron Contributor

Well, that gives a clue related to the problem.

If I run Connect-AzAccount manually it returns my Account, but not the SubscriptionName or TenantId:

sentinel-analytics-library> Connect-AzAccount

WARNING: To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code ERW9XXXXX to authenticate.

 

Account           SubscriptionName TenantId Environment

-------           ---------------- -------- -----------

xxxxx@gmail.com                           AzureCloud

 

And I should have mentioned, the error after the token error is related to the missing SubscriptionID:

Write-Error: /Users/xxxxx/.local/share/powershell/Modules/AzSentinel/0.6.4/AzSentinel.psm1:456

Line |

456 |         Get-LogAnalyticWorkspace @arguments

     |         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

     | No SubscriptionID provided

 

 

Brass Contributor

@SocInABox so is there an Azure subscription associated with the account that you are using to log in? And is Azure Sentinel deployed in the same subscription? If you need more help, please create an issue on GitHub so that I can track the status: https://github.com/wortell/azsentinel/issues (please also share the verbose output for troubleshooting purposes).

Copper Contributor

Hey Pouyan, thanks for your reply.

I suspect my issue lies in the Azure AD application permissions.

This article doesn't touch on any of the access requirements needed, so I'm looking into the app registration API permissions.

I'm also trying some other alternatives like using the 'az rest' command, which handles all of the token handshaking.

And I'm working with the Resource Explorer (resources.azure.com) to understand the different API-related resources.

I.e., I need to understand all of the API permission fundamentals before digging into the DevOps side of things.

Any resources/tips you have for the above topics are appreciated.

(Oh, and a tip for anyone working with the AZSentinel PowerShell commands: your best options are -Debug and -Verbose!)

Copper Contributor

I have several JSON templates for Playbooks and Logic Apps. I can deploy them successfully without any issues. However, I have to manually authorize the API connections used in Sentinel Playbooks.

 

Is there a script/solution to authorize API connections without user interaction?
