Closing an Incident in Azure Sentinel and Dismissing an Alert in Azure Security Center
Published Feb 19 2020 06:57 AM

As we explore automation scenarios within Azure security, we come across an unsolved mystery: when I use the Azure Security Center connector in Azure Sentinel to generate Incidents, what happens when I close my Azure Sentinel Incident? Does it close the related Azure Security Center alert?

The short answer is that it does not. The Azure Sentinel Incident closes, but the ASC Alert remains active. You then have the option to dismiss the ASC Alert within the ASC portal. That sounds like an extra step just to keep these systems synchronized.

 

Here is where Logic Apps comes in to solve the mystery. In the next portion of the article we will set the stage, go into depth on the problem, and show how to solve it. If you want to just deploy the Azure Sentinel Playbook that closes the Incident and dismisses the ASC Alert at the same time, click here.

 

Setting up base configuration of Azure Sentinel and Azure Security Center

 

To get started, note that Azure Sentinel is Microsoft’s cloud-native SIEM and SOAR tool; within a few clicks in the Azure portal you can spin it up. Once it is running, one of the first things customers want to do is bring in Data Connectors to start sending logs and alerts from Microsoft and 3rd-party systems. One such popular connector is the Azure Security Center connector.

 

Azure Security Center has two awesome pillars for securing your Azure investments: posture management, in the form of security recommendations, and workload protection, using threat detections that alert on IaaS and PaaS services. Check out the full list of detections that generate alerts; it is impressive and a game changer for cloud security. Azure Security Center was the first public cloud offering to provide recommendations and threat detection natively within a public cloud provider, years before the other cloud vendors built comparable services.

 

After the ASC connector is established in Azure Sentinel, you can generate Incidents based off those ASC Alerts, giving your SOC SIEM and SOAR capabilities on top of ASC Alerts. To do this in Azure Sentinel, follow the steps below:

 

  1. Go to the Analytics blade > Create > Microsoft incident creation rule. Sentinelrulecreate.png 
  2. You will be taken to a setup wizard; fill it in for your needs. To start, I chose High and Medium severity alerts from Azure Security Center. 
    Sentinelrulecreate2.png 

  3.  On the next screen we will validate and review our choices before telling Azure Sentinel to generate Incidents on ASC Alerts.

     
    Sentinelrulecreate3.png

     

  4. Once created, I can see our rule alongside any others brought in from Azure Sentinel’s templates or from Microsoft security products like Azure Security Center or Microsoft Defender ATP. Remember, you can always click back into the rule and edit it to tune further.

     

    Sentinelrulecreate4.png 
  5. On the Azure Security Center side, ensure your subscriptions are set to the Standard tier. On each Azure subscription Microsoft offers a 30-day opt-out trial of the ASC Standard tier; if you are evaluating ASC, be sure to mark your calendar about 25 days in, before the trial ends. asc.png 

  6. Finally, on the Data Collection blade, ensure that auto provisioning is turned on. asc3.png 
  7. For testing purposes we will spin up a Windows Server 2012 R2 VM in Azure, isolated within its own VNet. Log into the VM and download the following ACS.zip   

  8. Unzip and place AlertGeneration in the root of C:\. From C:\AlertGeneration, run the PS script ScheduledTasksCreation.ps1 to create scheduled tasks that will generate Azure Security Center alerts every 5 minutes (a minimal sketch of one such task follows the screenshots below).  

alert1.png

alert2.png
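For a sense of what the script sets up, here is a minimal PowerShell sketch, not the actual ScheduledTasksCreation.ps1, of a scheduled task that re-runs the suspicious command line seen later in the alert payload every 5 minutes. The task name is made up for illustration.

# Minimal sketch only - the downloadable ScheduledTasksCreation.ps1 is the script used in this post.
# Registers a task that re-runs a suspicious-looking command line every 5 minutes so that
# Azure Security Center's VM threat detection has something to alert on.
$argument = '/c net user&echo "123"&echo "open 123 127.0.0.1" > c:\alertgeneration\dummy.txt&exit'

$action  = New-ScheduledTaskAction -Execute 'c:\windows\system32\cmd.exe' -Argument $argument
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
           -RepetitionInterval (New-TimeSpan -Minutes 5) `
           -RepetitionDuration (New-TimeSpan -Days 1)

# 'ASC-AlertGeneration-SuspiciousCmd' is an illustrative task name, not taken from the original script.
Register-ScheduledTask -TaskName 'ASC-AlertGeneration-SuspiciousCmd' `
    -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest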

 

At this point we can test by closing an Azure Sentinel Incident; you will observe that the corresponding ASC Alert is still active under Threats. You can manually dismiss the ASC Alert by clicking the ellipsis (...) and choosing Dismiss.

 

alerdismiss.png

 

Because of this extra process step a SOC analyst must take to keep Azure Security Center in sync with Azure Sentinel, I took on the task of investigating how these two systems store information, to see whether we could match on any fields, directly or indirectly, and pass the necessary information to the Azure Security Center Alert API to dismiss the alert.

 

Lining up data sources to find potential matches

 

While examining the Get Azure Sentinel Incident response, I noticed some interesting property fields that could be useful for matching.

 

 

{
      "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.OperationalInsights/workspaces/{workspacename}/providers/Microsoft.SecurityInsights/Cases/7e2100df-d301-4cac-a9f7-9a7e66903b33",
      "name": "7e2100df-d301-4cac-a9f7-9a7e66903b33",
      "etag": "\"04001f78-0000-0100-0000-5e4611150000\"",
      "type": "Microsoft.SecurityInsights/Cases",
      "properties": {
        "title": "Suspicious command execution",
        "description": "Machine logs indicate a suspicious command line execution by user %{user name}.",
        "severity": "High",
        "status": "Closed",
        "labels": [],
        "closeReason": "TruePositive",
        "closedReasonText": "Closed via Playbook to Dismiss ASC Alert",
        "endTimeUtc": "2020-02-13T21:37:44Z",
        "startTimeUtc": "2020-02-13T21:37:44Z",
        "owner": {
          "objectId": null,
          "email": null,
          "name": null
        },
        "lastUpdatedTimeUtc": "2020-02-14T03:16:37Z",
        "createdTimeUtc": "2020-02-13T21:38:26.3121407Z",
        "relatedAlertIds": [
          "8e724b86-039c-48b1-9ef8-75b9d05c2b43"
        ],
        "relatedAlertProductNames": [
          "Azure Security Center"
        ],
        "tactics": [
          "Execution"
        ],
        "caseNumber": 372,
        "lastComment": "Via RDP session from IP: 136.57.182.66 Eagle logged in and enabled a task and executed a run runing cmd line script. Confirmed with Developer this is just a test.",
        "totalComments": 1,
        "metrics": {
          "SecurityAlert": 1
        },
        "firstAlertTimeGenerated": "2020-02-13T21:38:22.5983349Z",
        "lastAlertTimeGenerated": "2020-02-13T21:38:22.5983349Z"
      }
    }

 

 

title – matches the alertDisplayName of the ASC Alert, but is not unique enough when duplicate alerts exist

 

Status – Closed

 

relatedAlertProductNames – this lets us distinguish which product generated the alert

 

startTimeUtc – the time the Incident was created in Azure Sentinel. This correlates fairly well with ASC’s detectedTimeUtc, but not exactly; in testing it can be off by milliseconds, and potentially by seconds as well.

 

relatedAlertIds – a correlating GUID also used in the SecurityAlert Log Analytics table.
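For reference, here is a minimal PowerShell sketch of pulling an incident like this back through the REST API. The resource path is the id field from the response above; the api-version shown is an assumption, so check the current Microsoft.SecurityInsights API reference.

# Sketch: retrieve an Azure Sentinel incident (Case) by the resource ID returned above.
# The api-version is an assumption - adjust to the current Microsoft.SecurityInsights API.
$caseId = '/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.OperationalInsights/workspaces/{workspacename}/providers/Microsoft.SecurityInsights/Cases/7e2100df-d301-4cac-a9f7-9a7e66903b33'

$response = Invoke-AzRestMethod -Method GET -Path ("{0}?api-version=2019-01-01-preview" -f $caseId)
($response.Content | ConvertFrom-Json).properties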

 

The Get Azure Security Center Alert response also gives us some interesting information in its property fields.

 

 

      "id": "/subscriptions/{subscriptionid}/resourceGroups/{resourcegroupname}/providers/Microsoft.Security/locations/centralus/alerts/2518206709356240952_6010a57b-91e0-422f-8f35-9587f5330771",
      "name": "2518206709356240952_6010a57b-91e0-422f-8f35-9587f5330771",
      "type": "Microsoft.Security/Locations/alerts",
      "properties": {
        "systemSource": "Azure",
        "vendorName": "Microsoft",
        "alertDisplayName": "Suspicious command execution",
        "alertName": "SuspiciousCMDExecution",
        "detectedTimeUtc": "2020-02-13T21:37:44.3759047Z",
        "description": "Machine logs indicate a suspicious command line execution by user %{user name}.",
        "remediationSteps": "1. Validate with '%{user name}' if this action was intentional. If not, Escalate the alert to the information security team\r\n2. Make sure the machine is completely updated and has an updated Anti-Virus installed\r\n3. Install and run Microsoft’s Malicious Software Removal Tool (see https://www.microsoft.com/en-us/download/malicious-software-removal-tool-details.aspx)\r\n4. Run Microsoft’s Autoruns utility and try to identify unknown applications that are configured to run at login (see https://technet.microsoft.com/en-us/sysinternals/bb963902.aspx)\r\n5. Run Process Explorer and try to identify unknown running processes (see https://technet.microsoft.com/en-us/sysinternals/bb896653.aspx)",
        "actionTaken": "Undefined",
        "reportedSeverity": "High",
        "compromisedEntity": "LEVIATHAN",
        "associatedResource": "/subscriptions/{subscriptionid}/resourceGroups/rgHacks4Claps/providers/Microsoft.Compute/virtualMachines/Leviathan",
        "subscriptionId": "{subscriptionid}",
        "instanceId": "6010a57b-91e0-422f-8f35-9587f5330771",
        "extendedProperties": {
          "user name": "WORKGROUP\\Leviathan$",
          "process name": "c:\\windows\\system32\\cmd.exe",
          "command line": "c:\\windows\\system32\\cmd.exe /c net user&echo \"123\"&echo \"open 123 127.0.0.1\" > c:\\alertgeneration\\dummy.txt&exit",
          "process id": "0x398",
          "account logon id": "0x3e7",
          "user SID": "S-1-5-18",
          "parent process id": "0x338",
          "enrichment_tas_threat__reports": "{\"Kind\":\"MultiLink\",\"DisplayValueToUrlDictionary\":{\"Report: Suspicious Command Execution\":\"https://interflowwebportalext.trafficmanager.net/reports/DisplayReport?callerIdentity=ddd5443d-e6f4-441c-b52b-5278d2f21dfa&reportCreateDateTime=2020-02-13T03%3a02%3a12&reportName=MSTI-TS-Suspicious-Command-Execution.pdf&tenantId=96d75fc3-d906-40b3-b826-52a32e91eb61&urlCreateDateTime=2020-02-13T03%3a02%3a12&token=LBN9K4RWKAuxbc%20Sl05Z/Mx1N1XRrrclhLvrIv715a0=\"}}",
          "resourceType": "Virtual Machine"
        },
        "state": "Dismissed",
        "reportedTimeUtc": "2020-02-13T21:38:24.5447196Z",
        "workspaceArmId": "/subscriptions/{subscriptionid}/resourcegroups/rgswiftoperations/providers/microsoft.operationalinsights/workspaces/swiftenvlogs",
        "confidenceScore": 0.348,
        "confidenceReasons": [
          {
            "type": "Process",
            "reason": "Suspicious process cmd.exe triggered this alert on multiple machines in this subscription"
          },
          {
            "type": "Computer",
            "reason": "This alert was triggered multiple times on machine LEVIATHAN in the last 24 hours"
          },
          {
            "type": "User",
            "reason": "User ANONYMOUS LOGON was involved in multiple alerts on machine LEVIATHAN in the last 7 days"
          }
        ],
        "canBeInvestigated": true,
        "isIncident": false,
        "entities": [
          {
            "$id": "centralus_551",
            "dnsDomain": "",
            "ntDomain": "",
            "hostName": "LEVIATHAN",
            "netBiosName": "LEVIATHAN",
            "azureID": "/subscriptions/{subscriptionid}/resourceGroups/rgHacks4Claps/providers/Microsoft.Compute/virtualMachines/Leviathan",
            "omsAgentID": "006d32ae-345d-480c-ae25-c6c183cca496",
            "osFamily": "Windows",
            "osVersion": "Windows",
            "isDomainJoined": false,
            "type": "host"
          },
          {
            "$id": "centralus_552",
            "name": "Leviathan$",
            "ntDomain": "WORKGROUP",
            "host": {
              "$ref": "centralus_551"
            },
            "sid": "S-1-5-18",
            "isDomainJoined": true,
            "type": "account",
            "LogonId": "0x3e7"
          },
          {
            "$id": "centralus_553",
            "processId": "0x338",
            "commandLine": "",
            "host": {
              "$ref": "centralus_551"
            },
            "type": "process"
          },
          {
            "$id": "centralus_554",
            "directory": "c:\\windows\\system32",
            "name": "cmd.exe",
            "type": "file"
          },
          {
            "$id": "centralus_555",
            "processId": "0x398",
            "commandLine": "c:\\windows\\system32\\cmd.exe /c net user&echo \"123\"&echo \"open 123 127.0.0.1\" > c:\\alertgeneration\\dummy.txt&exit",
            "elevationToken": "Default",
            "creationTimeUtc": "2020-02-13T21:37:44.3759047Z",
            "imageFile": {
              "$ref": "centralus_554"
            },
            "account": {
              "$ref": "centralus_552"
            },
            "parentProcess": {
              "$ref": "centralus_553"
            },
            "host": {
              "$ref": "centralus_551"
            },
            "type": "process"
          },
          {
            "$id": "centralus_556",
            "sessionId": "0x3e7",
            "startTimeUtc": "2020-02-13T21:37:44.3759047Z",
            "endTimeUtc": "2020-02-13T21:37:44.3759047Z",
            "type": "host-logon-session",
            "host": {
              "$ref": "centralus_551"
            },
            "account": {
              "$ref": "centralus_552"
            }
          }
        ]
      }
    }

 

 

name – this is the alert name you pass to the API to dismiss the ASC alert (a sketch of that call appears after these field notes)

 

alertDisplayName – matches the title of the Azure Sentinel Incident, but is not unique enough when duplicate alerts exist

 

detectedTimeUtc – correlates fairly well with Azure Sentinel’s startTimeUtc, but not exactly; in testing it was off by milliseconds, and could potentially be off by seconds as well.

 

State – active or dismissed
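To make the name field concrete, here is a minimal PowerShell sketch of dismissing an ASC alert by name through the REST API. The api-version and the exact /dismiss action path are assumptions to verify against the current Microsoft.Security alerts API; the subscription, location, and alert name mirror the resource ID shown in the response above.

# Sketch: dismiss an Azure Security Center alert by its name (the long 'name' value above).
# The api-version and '/dismiss' action path are assumptions - verify against the current
# Microsoft.Security alerts API reference before relying on this.
$subscriptionId = '{subscriptionid}'
$resourceGroup  = '{resourcegroupname}'
$ascLocation    = 'centralus'
$alertName      = '2518206709356240952_6010a57b-91e0-422f-8f35-9587f5330771'

$path = "/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Security/locations/$ascLocation/alerts/$alertName/dismiss?api-version=2019-01-01"

Invoke-AzRestMethod -Method POST -Path $path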

 

Direct Match Conclusion

 

What we are left with is a non-unique direct match that could produce false positives:

 

title == alertDisplayName

startTimeUtc == detectedTimeUtc (only matches down to the minute)

status == state

 

Given this, we could build matching logic, but questions arise: what if there is a delay in the timestamps? They are already off by milliseconds; what if they are off by seconds? How wide a time window would we use to match, 1 second, 3 seconds? The alert names are generic, with no unique resource or identifier within the name. And what if there are multiple alerts with the same name within seconds of each other?
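To make the timing problem concrete, here is a tiny PowerShell illustration using the two timestamps from the responses above: exact equality fails, and a tolerance window "works" but forces you to pick an arbitrary width.

# Illustration only: the same alert, yet the two timestamps never line up exactly.
$sentinelStart = [datetime]'2020-02-13T21:37:44Z'           # Sentinel startTimeUtc
$ascDetected   = [datetime]'2020-02-13T21:37:44.3759047Z'   # ASC detectedTimeUtc

$sentinelStart -eq $ascDetected                              # False - off by milliseconds

# A window-based match "succeeds", but how wide is safe - 1 second, 3 seconds?
[math]::Abs(($sentinelStart - $ascDetected).TotalSeconds) -le 3   # True, yet still ambiguous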

 

You can see that a direct match, while possible, may not be ideal in larger and more complex environments. So how can we solve this mystery? Perhaps with an indirect matching method.

 

The Clue

 

Enter the Azure Sentinel GUI. While clicking around in the Incident’s full details, I noticed it displayed an alert ID, the same value returned by the API call as relatedAlertIds, but one that didn’t show up in the ASC Alert API call.

 

incidentfulldetails.png

The Alert ID even has a link that takes us to Log Analytics, specifically the SecurityAlert table, which both Azure Sentinel and Azure Security Center write into.

 

Within that generated KQL query lay the clue that could tie Azure Sentinel to ASC in a more precise manner.

 

SystemAlertID == AlertLink

 

indirectmatch.png
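The deployed playbook ships with its own query, but the idea looks roughly like this hedged sketch: take the relatedAlertIds value from the Sentinel incident, find the matching SecurityAlert row, and read its AlertLink, which points back at the underlying ASC alert. The workspace GUID placeholder and projected columns are illustrative, and the Az.OperationalInsights module is assumed to be installed.

# Sketch: correlate the Sentinel incident's relatedAlertIds value against the SecurityAlert table.
# The GUID below is the relatedAlertIds value from the incident JSON earlier in this post.
$kql = @"
SecurityAlert
| where SystemAlertId == '8e724b86-039c-48b1-9ef8-75b9d05c2b43'
| where ProductName == 'Azure Security Center'
| project SystemAlertId, AlertLink
"@

# Requires the Az.OperationalInsights module; replace the placeholder with your workspace GUID.
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId '<workspace guid>' -Query $kql
$result.Results | Format-List

In the playbook, the query goes a step further and surfaces the pieces the dismiss call needs; those show up later as the ascsubid, ascrgname, and ascalertname columns referenced in the Logic App expressions.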

 

Many times it pays to go beyond the code and look at the portal UI, where hints can be observed visually. Armed with this knowledge and some testing, I went to work on an Azure Sentinel Playbook that a SOC analyst can run when they are ready to close their Incident and dismiss the corresponding Azure Security Center alert.

 

Designing Azure Sentinel Playbook

 

Designing the Logic App took a few iterations, but I finally landed on a pattern that worked in testing. The Logic App pattern is as follows.

 

logicapp1.png

 logicapp2.png

 

logicapp3.png

logicapp4a.png

 

Remember, clicking on a dynamic content \ expression will reveal the coded expression used. Recall that the Parse JSON step still gives us an array; rather than using a for-each, when we click in and select the value, we use index 0 of the array to pass the strings in:

 

body('Run_query_and_list_results')['value'][0]['ascsubid']

body('Run_query_and_list_results')['value'][0]['ascrgname']

body('Run_query_and_list_results')['value'][0]['ascalertname']

 

logicapp4.png

 

Conclusion

 

When designing or solving for a problem, look at the data structures involved behind the scenes and determine: can I make a direct match? If I can’t make a direct match, can I use another source of truth, a data table somewhere, and match indirectly? Logic Apps are very powerful and easy to use for solving these problems. You don’t need to rely on code dependencies or PowerShell modules. A Logic App can run under a managed identity and be given specific rights against the Azure subscription or resources, making it easy and secure to obtain authorization without having to use a Service Principal and pass an AppID and key to obtain a bearer token for the API call.
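For instance, once the playbook’s system-assigned managed identity is enabled, granting it rights might look like the sketch below; the Security Admin role and subscription-wide scope are assumptions, so scope to the least privilege that works in your environment.

# Sketch: grant the Logic App's system-assigned managed identity rights so it can dismiss
# Security Center alerts. Copy the object ID from the Logic App's Identity blade.
# 'Security Admin' and the subscription-wide scope are assumptions - narrow them where possible.
New-AzRoleAssignment -ObjectId '<logic app managed identity object id>' `
    -RoleDefinitionName 'Security Admin' `
    -Scope '/subscriptions/{subscriptionid}'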

 

If you want to try this out, you can deploy the Azure Sentinel playbook containing this logic from the repository on GitHub here: https://github.com/Azure/Azure-Sentinel/tree/master/Playbooks/Close-Incident-ASCAlert 

 

deploy.png

 

Special thanks to:

Yuri Diogenes for reviewing this post.

 

9 Comments

Awesome blog post! Thank you :cool:


Cool @Nathan Swift very useful!

Following your concept, we could create the same playbook for Defender ATP, Azure ATP, Cloud App Security, and third parties if they expose an API.

Is it correct?


@sifriger Thank you. That is correct; a similar concept could be used if APIs and correlation exist in the Microsoft security products and their alerts. As an example, MDATP has an API to update alerts - https://docs.microsoft.com/en-us/windows/security/threat-protection/microsoft-defender-atp/update-al... | and could be correlated via the SecurityAlert KQL table: {field}VendorOriginalId == MDATP {field}Id


Can I close/manage in Sentinel as well?

A sort of bidirectional sync: if I am in Defender I can close the incident in Sentinel, and if I am in Sentinel I can close it in Defender? 


@sifriger - possibly, though it gets very complex. Logic Apps may not be the best programming model to build that bidirectional synchronization via event-driven closure across multiple products. Feel free to DM me in Private Messages or Teams if you would like to discuss more.

 

A slightly different concept was explored, though it was a timer-based, one-way authoritative sync: Sentinel Close Incident -> ASC Dismiss: https://github.com/swiftsolves-msft/Azure-Sentinel-Playbooks/tree/master/IncidentClosed-Sync-ASCAler...


@PriscilaViana wrote an awesome blog and playbook extending this capability from Sentinel to dismiss alerts generated on MCAS - https://medium.com/@priscilaviana/playbook-for-azure-sentinel-mcas-integration-f939746d3209


@Nathan Swift are there any plans to make this a background item for closing out alerts? If it is closed in ASC, does it get closed in the rest?


Thanks - I would check out the private previews and the Azure roadmap on this item. At this time I am not working on a sync between both systems; it would have to correlate and prevent race conditions between the two platforms in the Logic App code. One might be able to build a one-way sync in the other direction, though: scan dismissed ASC alerts, or trigger off the Azure Activity Logs for a dismissed alert, then correlate the SecurityAlertId to the Azure Sentinel incident and close the Sentinel incident.


@Nathan Swift Awesome. I will look into building that out and testing what we can do.
