General knowledge of Azure, PowerShell, and the appropriate role-based access control assignment for resource creation, both in the portal and with command-line interfaces.
The challenge is the same as the one stated in Part 1 of this series. Event-driven alerting on Azure Policy compliance changes is important to organizations for a variety of reasons, and this solution can help close that gap.
The solution described in this article follows an architecture very similar to the one described in Part 1, with a few changes that we describe below. In this Part 2, we show how to automate the deployment of the resources needed for this Policy Alerting solution using PowerShell and Bicep.
Code Editor: You will need a code editor for editing the PowerShell and Bicep code. We suggest Visual Studio Code.
PowerShell: We recommend that you install PowerShell version 7.4.1 (HERE) (latest version at the time of this writing).
PowerShell Modules: We recommend you install the latest Az module, 11.2 (latest version at the time of this writing).
Bicep: You will need Bicep installed. We recommend the latest version (HERE).
Download Code: Download the code for this solution from GitHub HERE. Once you download all the code/files/directories, it is important to keep the file/directory structure intact. The script that orchestrates the deployment assumes the file/directory structure is the same as it was before download.
Differences (From Part 1)
Automation: This approach is automated. The Part 1 article used a step-by-step Portal approach while this one is a mostly automated deployment.
Authentication: This automated solution does not use an Entra App registration or a Key Vault to store secrets for authentication. Instead, it uses the Azure Function App's service principal to access the Data Collection Rule so it can write data to the Log Analytics Workspace. This is done by assigning the "Monitoring Metrics Publisher" role to the Function App's service principal.
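As a rough illustration, that role assignment could be expressed in Bicep along these lines. The parameter names, API versions, and DCR-level scoping here are assumptions for illustration; the actual templates in the repo may be structured differently:

```bicep
// Illustrative sketch only — the repo's actual template may differ.
param dcrName string
param functionAppPrincipalId string

// Built-in role definition ID for "Monitoring Metrics Publisher"
var monitoringMetricsPublisher = subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '3913510d-42f4-4e42-8a64-420c390055eb')

// Reference the Data Collection Rule the Function App must write to
resource dcr 'Microsoft.Insights/dataCollectionRules@2022-06-01' existing = {
  name: dcrName
}

// Grant the Function App's managed identity permission to publish to the DCR
resource metricsPublisher 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(dcr.id, functionAppPrincipalId, monitoringMetricsPublisher)
  scope: dcr
  properties: {
    roleDefinitionId: monitoringMetricsPublisher
    principalId: functionAppPrincipalId
    principalType: 'ServicePrincipal'
  }
}
```

Scoping the assignment to the DCR (rather than the whole Resource Group) keeps the grant to the least privilege the Function App needs.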
Resources: Because of the new authentication approach, deploying this solution requires no Entra App, Key Vault resource, or Key Vault role assignment.
Once you have installed/configured all of the items specified in the Requirements section, the next step is to prepare the code for execution. The code leverages both PowerShell and Bicep. The script you will launch is the PolicyAlert-Launcher.ps1 file. This PowerShell script will orchestrate the process of deploying the resources needed for this Policy Alerting solution. The script will call other PowerShell and Bicep files during this process so keeping the directory structure intact is important.
Open the PolicyAlert-Launcher.ps1 file in your code editor; we used VS Code. At the top of the script you will see the "param" section that holds all of the required parameters. You will need to either update these parameter values inside the script (and save it) or pass them as command-line arguments when you execute the script (e.g., .\PolicyAlert-Launcher.ps1 -RGName "My-RG-Name"). We find it easier to change the parameter values inside the script in the editor. Here is a breakdown of the parameters and what to expect:
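For example, a command-line invocation might look like the following. Aside from -RGName, which the article mentions, the parameter names here are illustrative; check the param block in the script for the actual names:

```powershell
# Illustrative invocation — verify parameter names against the script's param block.
.\PolicyAlert-Launcher.ps1 `
    -RGName "PolicyAlert-RG" `
    -Location "eastus" `
    -StorageAccountName "stpolicyalert01234"
```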
This code will create an Event Grid event subscription at the Azure Subscription level. Only one Event Grid system topic can exist at the Azure Subscription level, so you will need to either confirm you do not already have one, delete the existing one, or modify the code to reuse the existing one. If you do none of those, the deployment will fail at that step.
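To check for an existing subscription-scoped system topic before you deploy, a quick check along these lines can help (this sketch assumes the Az.EventGrid module is installed and you are logged in to Azure):

```powershell
# List Event Grid system topics whose source is the current subscription.
# If this returns a topic, delete it or adapt the code to reuse it first.
$subId = (Get-AzContext).Subscription.Id
Get-AzEventGridSystemTopic | Where-Object { $_.Source -eq "/subscriptions/$subId" }
```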
We recommend you keep the file/directory structure as it is. Otherwise, you will need to change the paths specified in the Parameters section.
Make sure your storage account name parameter is unique; the deployment will fail if any storage account in Azure already uses that name, because storage account names must be globally unique. It is a best practice to use a highly distinctive name and follow the storage account naming restrictions.
The script will detect whether the Resource Group or the Log Analytics Workspace already exists. If a resource exists, the script will use it; if either does not already exist, the script will create a new one.
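Conceptually, that check behaves like the following sketch. The variable names are hypothetical and the launcher's actual implementation may differ:

```powershell
# Use the Resource Group if it exists; otherwise create it.
$rg = Get-AzResourceGroup -Name $RGName -ErrorAction SilentlyContinue
if (-not $rg) {
    $rg = New-AzResourceGroup -Name $RGName -Location $Location
}

# Same pattern for the Log Analytics Workspace.
$law = Get-AzOperationalInsightsWorkspace -ResourceGroupName $RGName -Name $LAWName -ErrorAction SilentlyContinue
if (-not $law) {
    $law = New-AzOperationalInsightsWorkspace -ResourceGroupName $RGName -Name $LAWName -Location $Location
}
```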
Once all the requirements are completed and you have reviewed the notes above, it is time to execute the code. In your PowerShell prompt, change directory to the root of the code, the same level where the PolicyAlert-Launcher.ps1 script is stored. Note that this is also the directory that should contain the subdirectories, such as data-collection-rule, event-grid, and function-app.
Now that your PowerShell prompt is at that directory level, you can execute the script with the command ".\PolicyAlert-Launcher.ps1". Once the code starts running, you will be prompted to log in so the script can make changes to your subscription. The account you use will need the correct roles to deploy and configure all of the resources. As the script proceeds, you will see logging on-screen in the PowerShell terminal, and a log file will be created in the same directory by default.
When the script completes, all of the resources should be deployed into the Resource Group that was specified in the parameters. The only remaining task is to update the PowerShell code nested inside the Function App that was created. The easiest way to do this is to open the Function App in the Azure Portal. Once you have the Function App open in the portal, click on the "trigger" name (as determined by your naming in the parameters), then on the next page click the "Code + Test" button on the left.
This should open an editor where you will see the "run.ps1" PowerShell code within your Function App. You will need to update the values of the three variables at the top of the code ($Table, $DcrImmutableId, $DceURI). The PolicyAlert-Launcher.ps1 script should have created a file named "reminders.txt" in the same directory; open that text file to find the values to use for those three variables. Once you have entered the three values, click Save.
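After editing, the top of run.ps1 will look something like the following. The values below are placeholders for illustration only; use the real values from reminders.txt:

```powershell
# Placeholders — replace each value with the one from reminders.txt.
$Table          = "PolicyCompliance_CL"                  # example custom table name
$DcrImmutableId = "dcr-00000000000000000000000000000000" # DCR immutable ID
$DceURI         = "https://<your-dce-endpoint>.ingest.monitor.azure.com" # DCE ingestion URI
```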
At this point, Policy Compliance Change data should start flowing into your custom Log Analytics table. It may take a little time before the data starts flowing, depending on which policies you have deployed and when the last policy compliance scan occurred.
The code will also have created a query-based Alert Rule that sends email notifications when a policy changes compliance state. This can be tuned to your preference in the Alert Rules. The query implemented by the code matches the one built in the Portal in Part 1 of this series.
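As a sketch, an alert query of this kind typically filters the custom table for new compliance-state changes. The table and column names below are illustrative assumptions; the actual names are defined by the deployment and by the query from Part 1:

```kql
// Illustrative only — table and column names depend on your deployment.
PolicyCompliance_CL
| where ComplianceState == "NonCompliant"
| project TimeGenerated, PolicyDefinitionName, ResourceId, ComplianceState
```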
This concludes the build-out of the resources for the Policy Compliance Alerting Solution. Be sure to subscribe and follow this series, as well as the GitHub repo, as follow-up articles and code updates on this topic are coming.