Blog Post

FastTrack for Azure
6 MIN READ

Generate Azure Policy Compliance Alerts By Sending Custom Data to Log Analytics - Part 2 - Automated

DJBartles
Microsoft
Feb 09, 2024

Purpose

This article is Part 2 of a series that shows you how to set up alerting when an Azure Policy compliance state changes.  Part 1 of this series (https://techcommunity.microsoft.com/t5/fasttrack-for-azure/generate-azure-policy-compliance-alerts-by-sending-custom-data) demonstrated the architecture and the deployment of the resources through the Azure Portal.  This Part 2 shows how to automate the deployment of the resources for this Policy Compliance Alerting solution.

 

Assumptions

General knowledge of Azure, PowerShell, and the appropriate role-based access control assignments for resource creation, both in the portal and with command-line interfaces.

 

Challenge

The challenge is the same as stated in Part 1 of this series.  The ability to have event-driven alerting on Azure Policy compliance changes is important to organizations for varying reasons, and this solution can help close that gap.

 

Solution

The solution described in this article follows an architecture very similar to the one described in Part 1, with a few changes that we will describe here.  In this Part 2, we will show how to automate the deployment of the resources needed for this Policy Alerting solution using PowerShell and Bicep.

 

Requirements

  1. Code Editor:  You will need a code editor for editing the PowerShell and Bicep code.  We suggest Visual Studio Code.
  2. PowerShell:  We recommend that you install PowerShell version 7.4.1 (HERE) (the latest version at the time of this writing).
    1. PowerShell Modules:  We recommend that you install the latest Az module, 11.2 (the latest version at the time of this writing).
  3. Bicep:  You will need Bicep installed.  We recommend the latest version (HERE).
  4. Download Code:  Download the code for this solution from GitHub HERE.  Once you download the code, keep the file/directory structure intact: the script that orchestrates the deployment assumes the structure is the same as it was before download.
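
Before proceeding, you can quickly confirm your tooling meets these requirements. A minimal check (the versions shown in your environment will differ):

```powershell
# Confirm the PowerShell version (7.x recommended)
$PSVersionTable.PSVersion

# Confirm the installed Az module version
Get-InstalledModule -Name Az

# Confirm the Bicep CLI is installed and on PATH
bicep --version
```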

 

Differences (From Part 1)

  1. Automation:  This approach is automated.  The Part 1 article used a step-by-step Portal approach, while this one is a mostly automated deployment.
  2. Authentication:  This automated solution does not use an Entra App registration or a Key Vault to store secrets for authentication.  Instead, it uses the Azure Function App's service principal to access the Data Collection Rule so it can write data to the Log Analytics Workspace.  This is done by assigning the "Monitoring Metrics Publisher" role to the Function's service principal.
  3. Resources:  Because of the new authentication approach, deploying this solution does not require an Entra App, a Key Vault resource, or a Key Vault role.
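
To make the authentication difference concrete, the role assignment performed by the automation is conceptually equivalent to the following sketch (the resource names and scope are illustrative placeholders, not the actual values from the repo):

```powershell
# Illustrative sketch: grant the Function App's system-assigned identity the
# "Monitoring Metrics Publisher" role on the Data Collection Rule so it can
# write to the Log Analytics custom table. All names/scopes are placeholders.
$principalId = (Get-AzFunctionApp -ResourceGroupName "rg-policyalert" `
                   -Name "FNApp-policyalert").IdentityPrincipalId

New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName "Monitoring Metrics Publisher" `
    -Scope "/subscriptions/<sub-id>/resourceGroups/rg-policyalert/providers/Microsoft.Insights/dataCollectionRules/DCR-policyalert"
```

Because the identity authenticates directly against the DCR, no secret ever needs to be stored or rotated, which is why the Key Vault from Part 1 drops out of the design.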

 

Execution

Once you have installed and configured all of the items specified in the Requirements section, the next step is to prepare the code for execution.  The code leverages both PowerShell and Bicep.  The script you will launch is the PolicyAlert-Launcher.ps1 file.  This PowerShell script orchestrates the deployment of the resources needed for this Policy Alerting solution.  The script calls other PowerShell and Bicep files during this process, so keeping the directory structure intact is important.

 

Open the PolicyAlert-Launcher.ps1 file in your code editor; we used VS Code.  At the top of the script you will see the "param" section that holds all of the required parameters.  You will need to either update these parameter values inside the script in the editor (and save them) or pass them as command-line arguments when you execute the script (e.g., .\PolicyAlert-Launcher.ps1 -RGName "My-RG-Name").  I find it easier to just change the parameter values inside the script.  Here is the parameter block and what to expect:

param(
    [string]$AzureEnvironment = "AzureCloud",
    [string]$SubscriptionId = "",
    [string]$RGName = "rg-",
    [string]$Location = "eastus",
    [string]$functionAppName = "FNApp-",
    [string]$functionTriggerName = "PolicyAlertTrigger",
    [string]$appServicePlanName = "ASP-",
    [string]$appInsightsName = "AI-",
    [string]$storageAccountName = "",
    [string]$storageSku = "Standard_LRS",
    [string]$appServicePlanSku = "Y1",
    [string]$LAWName = "LAW-",
    [string]$eventGridSubName = "",
    [string]$topicName  = "",
    [string]$dcrName = "DCR-",
    [string]$dceName = "DCE-",
    [string]$customTableName = "PolicyAlert",
    [string]$alertRuleName = "AR-",
    [string]$actionGroupName = "AG-",
    [string]$actionGroupEmail = "joe@contoso.com",
    [string]$functionBicep = ".\function-app\main.bicep",
    [string]$eventGridBicep = ".\event-grid\main.bicep",
    [string]$dcrBicep = ".\data-collection-rule\main.bicep",
    [string]$alertBicep = ".\alert-rules\main.bicep",
    [string]$reminders = ".\reminders.txt",
    [string]$OutputFile = ".\PolicyAlert-Launcher-Log.log",
    [string]$ScriptVer = "v1.0.4"
)
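
For example, a command-line invocation that overrides several of these parameters might look like this (all values below are illustrative; substitute your own IDs and names):

```powershell
# Illustrative invocation; the remaining parameters keep their defaults.
.\PolicyAlert-Launcher.ps1 `
    -SubscriptionId "00000000-0000-0000-0000-000000000000" `
    -RGName "rg-policyalert" `
    -Location "eastus" `
    -functionAppName "FNApp-policyalert01" `
    -storageAccountName "stpolicyalert01abc" `
    -actionGroupEmail "alerts@contoso.com"
```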

NOTES: 

  • This code will create an Event Grid event subscription at the Azure subscription level.  Only one Event Grid system topic can exist at the Azure subscription level, so make sure you do not already have one, delete the existing one, or modify the code to use the existing one.  If you do not do one of those, the deployment will fail at that step.
  • We recommend that you keep the file/directory structure as it is.  Otherwise, you will need to change the paths specified in the parameters section.
  • Make sure your storage account name parameter is globally unique, because the deployment will fail if another storage account in Azure already has that name.  It is best practice to use a highly unique name and to follow the storage account naming restrictions.
  • The script will detect whether the Resource Group name or the Log Analytics Workspace name already exists.  If it does, it will use that resource.  If either does not already exist, the script will create a new one.
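
If you want to verify these conditions before running the script, here are a couple of quick, illustrative checks (the topic-type filter and the Resource Group name below are assumptions, not values from the repo):

```powershell
# Look for an existing subscription-level Event Grid system topic for
# PolicyInsights events (the deployment fails at that step if one exists).
Get-AzEventGridSystemTopic |
    Where-Object { $_.TopicType -like "Microsoft.PolicyInsights*" }

# Check whether the target Resource Group already exists (the script will
# reuse it if it does, or create it if it does not).
Get-AzResourceGroup -Name "rg-policyalert" -ErrorAction SilentlyContinue
```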

Once all the requirements are completed and you have verified the Notes above, it is time to execute the code.  In your PowerShell prompt, change directory to the root of the code, the same level where you stored the PolicyAlert-Launcher.ps1 script.  Note that this is also the directory that should contain the subdirectories, like data-collection-rule, event-grid, and function-app.

 

Now that your PowerShell prompt is at that directory level, you can execute the script with the command ".\PolicyAlert-Launcher.ps1".  Once the code starts running, you will be prompted to log in so the script can make changes to your subscription.  The account you use will need the correct roles to deploy and configure all of the resources.  As the script proceeds, you will see logging on-screen in the PowerShell terminal, and a log file will be created in the same directory by default.

 

When the script completes, all of the resources should be deployed into the Resource Group that was specified in the parameters.  The only remaining task is to update the PowerShell code that is now nested inside the Function App that was created.  The easiest way to do this is to open the Function App in the Azure Portal.  Once you have the Function App open in the portal, click on the "trigger" name, as indicated by your naming in the parameters.  Then, on the next page, click the "Code + Test" button on the left.

This should open an editor where you will see the "run.ps1" PowerShell code within your Function App.  You will need to update the values of the three variables at the top of the code ($Table, $DcrImmutableId, $DceURI).  The PolicyAlert-Launcher.ps1 script should have created a file named "reminders.txt" in the same directory.  Open that text file and you will see the values to use for the three variables in the Function App PowerShell code.  Once you have entered those three values, click Save.
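
For context, here is a rough sketch of how those three values are typically combined into a Logs Ingestion API endpoint; the actual run.ps1 in the repo may differ, and the values shown are placeholders:

```powershell
# Placeholder values; use the real ones from reminders.txt.
$Table          = "PolicyAlert_CL"
$DcrImmutableId = "dcr-00000000000000000000000000000000"
$DceURI         = "https://dce-policyalert.eastus-1.ingest.monitor.azure.com"

# Logs Ingestion API endpoint built from those values. Custom Log Analytics
# tables carry a _CL suffix, and their DCR stream name is prefixed "Custom-".
$uri = "$DceURI/dataCollectionRules/$DcrImmutableId/streams/Custom-${Table}?api-version=2023-01-01"
# The Function POSTs the policy event JSON to $uri, authenticating with a
# bearer token obtained through its identity.
```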

 

At this point, Policy Compliance change data should start flowing into your custom Log Analytics table.  It may take a little time before the data starts flowing, depending on what kinds of policies you have deployed and when the last policy compliance scan happened.

 

The code will also have created a query-based Alert Rule that sends email notifications when a policy changes compliance state.  This can be tuned to your preference in the Alert Rules.  The query implemented by the code matches what was implemented in the Portal in Part 1 of this series.
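
As an illustration only (the column names below are assumptions about the data the Function writes, not necessarily the exact query from Part 1), a query-based alert over the custom table follows this general shape:

```kusto
// Sketch of the alert query; adjust the table name, lookback window,
// and columns to match your deployment.
PolicyAlert_CL
| where TimeGenerated > ago(15m)
| where ComplianceState == "NonCompliant"
| project TimeGenerated, ResourceId, PolicyDefinitionId, ComplianceState
```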

 

Wrap-Up

This concludes the build of the resources for the Policy Compliance Alerting solution.  Be sure to subscribe and follow the posts in this series, as well as the GitHub repo, as follow-up articles and code updates on this topic will be coming.

References

Part 1: Generate Azure Policy Compliance Alerts By Sending Custom Data to Log Analytics (microsoft.com)

 

Updated Feb 09, 2024
Version 1.0
Comments

  • shaal (Copper Contributor):

    Hi, trying to run it across a Management Group. I deploy the Event Grid at the MG level, but it fails to create the Event Grid subscription with this error:

    Deployment has failed with the following error: {"code":"Publisher Notification Error","message":"Failed to enable publisher notifications.","details":[{"code":"Publisher Provider Error","message":"GET request for https://management.azure.com/tenants/xxx/providers/Microsoft.Management/managementGroups/xxxxx/eventGridFilters/_default?api-version=2020-10-01 failed with status code: Forbidden, code: AuthorizationFailed and message: The client 'xxxx' with object id 'xxxx' does not have authorization to perform action 'microsoft.policyinsights/eventGridFilters/read' over scope '/providers/Microsoft.Management/managementGroups/xxxx/providers/microsoft.policyinsights/eventGridFilters/_default' or the scope is invalid. If access was recently granted, please refresh your credentials.."}]}

    The client 'xxxx' with object id 'xxxx' is an enterprise app. Any idea?

  • DJBartles:

    I do not have this lab running at the moment, but I do not recall seeing an Enterprise App in my identity tenant, and I am not sure it is related to this solution. I see yours is dated in 2020.

    Based on the error you posted, I would first look at the scope part of it. You mentioned that you set up the Event Grid at the MG level. How did you do that? The templates/code were written to deploy Event Grid to the subscription, so you may need to adjust the scope in the code so that the Event Grid subscription knows how to find the Event Grid. The error indicates a failure trying to access the eventGridFilters, which is where the code subscribes to the event types "Microsoft.PolicyInsights.PolicyStateChanged" and "Microsoft.PolicyInsights.PolicyStateCreated". The Event Grid and subscription are created in the template "\event-grid\main.bicep", and the subscription ID is in the parameter "egSubscriptionSource".

    If everything looks OK with the scope, then I would focus on access control roles. Maybe the account running this code does not have the proper RBAC roles at the MG level; I would give that a check as well.

    Based on the information you provided, that is where I would look first. We did not test this code for deployment at the MG level, so I cannot validate whether there are any other adjustments required, but in theory it should work with the right adjustments.

  • shaal (Copper Contributor):

    Hi, thanks. Yes, it is a bit strange that the enterprise app is old while I am creating things now. I went to your code, assigned egSubscriptionSource to the MG ID, and then deployed. Everything was set up except the Event Grid subscription, which failed with this error. It is doable via the portal as well.

    I agree it could be an RBAC thing. I have to go to IAM on the MG and add roles for the object ID that is showing up, right? I tried to add the EventGrid Contributor and EventGrid EventSubscription Contributor roles; what role do you think it needs? Do you think I need to create an identity (as for the Function), or another app registration, or stay with the object ID shown in the error message?

  • DJBartles:

    I really do not know what role is needed for which principal in your situation. As I said, I do not currently have this set up this way to test, so it is hard for me to say without seeing it.

    If you are in a time crunch, you could do those couple of steps in the portal or manually to move forward. That may actually show you more about what you are missing as well.

    I would also check to make sure that you have the resource providers registered. Given your error message, it is not likely the cause, but it is easy enough to check. I would make sure that all of the RPs are registered for everything mentioned in your error message(s).

  • shaal (Copper Contributor):

    Hi, I have a question about the logs and alert. After setting up the solution, the information flows into the table, but then nothing more comes. I set it up on 30 July, so I get the events of that day and no more (nothing has changed in policy status since that time). Should it not keep scanning every 24 hours and logging, even without outside changes to the infrastructure?

  • DJBartles:

    If set up properly, anytime the PolicyInsights data is published, the Event Grid should capture it and send it through. You can look at the Event Grid system topic to see if there has been any PolicyInsights traffic. If there is traffic, then you can look at the log on the Function to see when the code was triggered and what logging is there. In the Function App PowerShell code, there should be an example logging section that you can uncomment for additional data in the logs.

    Those are just a couple of examples of how to look for the data if it is not coming in as expected. If you suspect the Event Grid or Function App are set up correctly but are not functioning, then you could open a support ticket for assistance.

  • shaal (Copper Contributor):

    DJBartles Hi, I have had your solution implemented for some time now and have a good amount of logs. The logs are showing resources that do not exist, or that exist in the wrong RG under "subject", with the wrong compliance state as well, and much more. I tried to contact Azure and they are trying to debug it. I wonder if you have faced any such thing?

  • DJBartles:

    The Event Grid is where the basic filtering is set up for what data is captured. As I recall, there are three different PolicyInsights data feeds, and this article recommends selecting two of those three. There are additional filters at the Event Grid subscription level, but I do not think we implement any of those in this article. I do not know how PolicyInsights would publish data for resources that do not exist. Why do you say the data shows the wrong compliance state? Also, I do not understand your comment about "wrong RG under subject".

    I am not sure I completely understand the problem you are having, but I do not think I have seen anything like what you describe. Essentially, the Event Grid subscription publishes the data and the Function pushes that data into Log Analytics. The data can be filtered at the Event Grid subscription level or in the Azure Function code. My approach in these articles has been to send the data into the LAW and then use refined queries to pull the data you need.