Intune

Gpresult-Like Tool For Intune
Hi, Jonas here! Or as we say in the north of Germany: "Moin Moin!"

I had to troubleshoot a lot of Intune policies lately and used a variety of tools for that. In the end, I built my own script to produce a result similar to what "GPresult /h" creates for on-premises group policies. The script is inspired by the following article by Ondrej Sebela: https://doitpshway.com/get-a-better-intune-policy-report-part-2. It follows a similar approach, but without any module dependencies and with fewer output options, as my script only generates an HTML page. What started as a script is now a module which might gain more functions in the future. Feel free to read any of my other articles here: https://aka.ms/JonasOhmsenBlogs

How to get the module

The PowerShell module is called "IntuneDebug" and can be installed or downloaded from the PowerShell Gallery. Install the module by running the following command:

Install-Module -Name IntuneDebug

The module repository can be found at https://aka.ms/IntuneDebug in case you want to download the module manually or want to contribute to it. The command to get the report is called "Get-MDMPolicyReport".

How to use Get-MDMPolicyReport

The function can run without administrative permissions and without any parameters on a Windows machine. You can also start the function with administrative permissions to get more data about Intune Win32Apps and their install status. Use the parameter "-MDMDiagReportPath" to load MDM report data captured on a remote machine; more on that in the section "How to use parameter -MDMDiagReportPath".

So, in summary, the function can run locally to output information specific to that device, or it can parse already captured data via the "-MDMDiagReportPath" parameter. It cannot gather data remotely, though.

The function output

As mentioned earlier, the only output of the function is an HTML file which will automatically open in Edge. The output is grouped into sections to make the report easier to read.
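Putting the usage patterns above together, here is a minimal quick-start sketch; the diagnostics folder path is a placeholder for wherever you unpacked captured MDM data:

```powershell
# Install the module from the PowerShell Gallery (current-user scope avoids elevation)
Install-Module -Name IntuneDebug -Scope CurrentUser

# Generate the HTML report for the local device
# (run from an elevated session to also get Win32App install status)
Get-MDMPolicyReport

# Or parse already captured MDM diagnostics data from another machine
Get-MDMPolicyReport -MDMDiagReportPath 'C:\Temp\MDMDiagnosticsData'
```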
The page looks like this when all sections are collapsed:

Section: "DeviceInfo <Devicename>"

DeviceInfo shows general information about the device and the Intune sync status.

Section: "PolicyScope: Device"

This section shows all the settings applied to the device, grouped by area/product.

Note: If you're coming from ConfigMgr, you might expect a policy ID in the report. While an Intune policy has an ID, the ID is not stored on the device. That's by design, and it's the reason why we only see the settings that apply to a device in this report.

The following example shows some basic Defender and Delivery Optimization settings grouped together. You can also see the system's default value, if there is one, and the winning settings provider. This should typically be the MDM provider, i.e. Intune, but depending on the setup it could also be a different provider for some settings.

Section: "PolicyScope: <SID> <UPN>"

This section shows all the policies applied to a user. The user's SID and UPN (UPN only when run locally) are visible in the policy-scope header. If multiple users work on a machine, each user gets their own section in the report.

Section: "PolicyScope: EnterpriseDesktopAppManagement"

This section shows all MSI installation policies from Intune.

NOTE: Win32 and store apps are visible in the "Win32Apps" section.

The application name is not available; instead, the MSI filename is shown to give an indication of what type of app it is.

Section: "PolicyScope: Resources"

Under resources we see policies which typically contain some sort of payload, like a certificate or a Defender firewall rule. I tried to make each section as readable as possible, so the output varies by type. Certificates, for example, are shown in a different format than Defender firewall rules.

NOTE: If the function runs without the parameter "-MDMDiagReportPath", it will try to enrich the policy info with as much data as possible.
This is not possible when working with captured MDM reports from a remote machine; the output might be limited in that case.

Section: "PolicyScope: Local Admin Password Solution (LAPS)"

This section shows all the settings applied to the device coming from a LAPS policy, as well as some local settings.

Section: "PolicyScope: Win32Apps"

This section shows all available Win32App policies. Those apps can be installed already or just assigned as available. If you need more information about the installation status, you need to run the function with administrative permissions. This only works locally and cannot be used with the parameter "-MDMDiagReportPath", since the extra data comes from the local registry. If a script is used for the detection or requirement, the script will be parsed and shown as-is. Use the copy button to copy the script and test it locally if needed.

When the script is run as administrator locally, it will try to get more information about the actual installation status of an application:

Section: "PolicyScope: Intune Scripts"

Intune Scripts will show script policies and their current state. The example below shows a remediation script with the detection output string "Found". It does not have a remediation action and therefore no data for the related properties. Unfortunately, the script name is not part of the policy and cannot be shown here. But you can use Graph Explorer (https://aka.ms/ge) with the following endpoint to get the script name by entering the script ID of your script:

"https://graph.microsoft.com/beta/deviceManagement/deviceHealthScripts/<ScriptID>?$select=id,displayName"

Where the data comes from

The function will use the following command to generate an MDM report:

MdmDiagnosticsTool.exe -out "C:\Users\PUBLIC\Documents\MDMDiagnostics\<DateTime>"

NOTE: The tool MdmDiagnosticsTool.exe is part of the Windows operating system.
More about it can be found HERE. The tool will export the data to C:\Users\PUBLIC\Documents\MDMDiagnostics into a folder named in the following format: "yyyy-MM-dd_HH-mm-ss"

The function will then parse the following two files to extract the required data without administrative privileges:

MDMDiagReport.html
MDMDiagReport.xml

Some data is read directly from the registry to enrich the output, and in some cases administrator permissions are required. The Win32Apps and Intune script policy data comes from the Intune Management Extension logfiles:

C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\AppWorkload*.log
C:\ProgramData\Microsoft\IntuneManagementExtension\Logs\HealthScripts*.log

NOTE: The folders under "C:\Users\PUBLIC\Documents\MDMDiagnostics" will be deleted when their creation time is older than one day. This can be changed by setting the parameter "-CleanUpDays" to a value higher than one day.

How to use parameter "-MDMDiagReportPath"

Simply generate MDM report data, either with MdmDiagnosticsTool.exe, via the Settings app, or via Intune. Then copy the files to a system with the IntuneDebug module on it and unpack the report data. You can now run the function with the parameter "-MDMDiagReportPath" and point it to the unpacked report data.

NOTE: The report header will contain the following when the parameter was used: "Generated from captured MDM Diagnostics Report"

MdmDiagnosticsTool.exe example:

mdmdiagnosticstool.exe -area "DeviceEnrollment;DeviceProvisioning;Autopilot" -zip C:\temp\MDMDiagnosticsData.zip

Settings app example:

Intune example:

I hope you find this tool helpful. In case of any issues or suggestions, head over to GitHub via https://aka.ms/IntuneDebug and create an issue or pull request.

Stay safe!
Jonas Ohmsen

Code disclaimer

This sample script is not supported under any Microsoft standard support program or service. This sample script is provided AS IS without warranty of any kind.
Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of this sample script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of this script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use this sample script or documentation, even if Microsoft has been advised of the possibility of such damages.

Update Entra ID Device Extension Attributes via PowerShell & Create Dynamic Security Groups
2) Overview of Extension Attributes and Updating via PowerShell

What Are Extension Attributes?

Extension attributes (1–15) are predefined string fields available on Entra ID device objects. They are exposed to Microsoft Graph as the extensionAttributes property. These attributes can store custom values like department, environment tags (e.g., Prod, Dev), or ownership details.

Why Use Them?

Dynamic Group Membership: Use extension attributes in membership rules for security or Microsoft 365 groups.
Policy Targeting: Apply Defender for Endpoint (MDE) policies, Conditional Access, or Intune policies to devices based on custom tags.

For details on configuring these policies, refer to the documentation links below:
https://learn.microsoft.com/en-us/defender-endpoint/manage-security-policies
https://learn.microsoft.com/en-us/intune/intune-service/
https://learn.microsoft.com/en-us/entra/identity/conditional-access/

Updating Extension Attributes via PowerShell and Graph API

Use Microsoft Graph PowerShell to authenticate and update device properties. Required permission: "Device.ReadWrite.All".

3) Using PowerShell to Update Extension Attributes

Create an app registration in Entra ID with the permission Device.ReadWrite.All and grant admin consent.

Register an app: How to register an app in Microsoft Entra ID - Microsoft identity platform | Microsoft Learn
Graph API permissions reference: Microsoft Graph permissions reference - Microsoft Graph | Microsoft Learn

To update Entra ID device properties you need the "Device.ReadWrite.All" permission and the Intune Administrator role to run the script.

Below is the script. Important things to note; update the script with your custom values:

a) Update the path of the Excel file in the script. The column header is 'DeviceName'. Note: You may want to use a CSV file instead of Excel if Excel is not available on the admin workstation running this process.
b) Update the credential details - tenantId, clientId & clientSecret - in the script.
Client id and client secret are created as part of the app registration.

c) Update the extension attribute and value in the script. This is the value of the extension attribute you want to use when creating the dynamic membership rule.

___________________________________________________________________________

# Acquire token
$tenantId = "xxxxxxxxxxxxxxxxxxxxx"
$clientId = "xxxxxxxxxxxxxxxx"
$clientSecret = "xxxxxxxxxxxxxxxxxxxx"
$excelFilePath = "C:\Temp\devices.xlsx" # Update with actual path

# Build the client-credentials token request body
# (this assignment was missing in the original listing; it is the standard OAuth 2.0 client credentials payload)
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://graph.microsoft.com/.default"
}
$tokenResponse = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Method POST -Body $tokenBody
$accessToken = $tokenResponse.access_token

# Import Excel module and read device names
Import-Module ImportExcel
$deviceList = Import-Excel -Path $excelFilePath

foreach ($device in $deviceList) {
    $deviceName = $device.DeviceName # Assumes column header is 'DeviceName'

    # Get device ID by name
    $headers = @{ "Authorization" = "Bearer $accessToken" }
    $deviceLookupUri = "https://graph.microsoft.com/beta/devices?`$filter=displayName eq '$deviceName'"
    try {
        $deviceResponse = Invoke-RestMethod -Uri $deviceLookupUri -Headers $headers -Method GET
    }
    catch {
        Write-Host "Error querying device: $deviceName - $_"
        continue
    }

    if ($null -eq $deviceResponse.value -or $deviceResponse.value.Count -eq 0) {
        Write-Host "Device not found: $deviceName"
        continue
    }
    $deviceId = $deviceResponse.value[0].id

    # Prepare PATCH request
    $uri = "https://graph.microsoft.com/beta/devices/$deviceId"
    $headers["Content-Type"] = "application/json"
    $body = @{
        extensionAttributes = @{
            extensionAttribute6 = "MDE"
        }
    } | ConvertTo-Json -Depth 3

    try {
        $response = Invoke-RestMethod -Uri $uri -Method Patch -Headers $headers -Body $body
        Write-Host "Updated device: $deviceName"
    }
    catch {
        Write-Host "Failed to update device: $deviceName - $_"
    }
}
Write-Host "Script execution completed."
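As a possible alternative to raw REST calls, the same update can be sketched with the Microsoft Graph PowerShell SDK. This is a minimal sketch, assuming the SDK modules are installed; the device name and attribute value are placeholders:

```powershell
# Requires the Microsoft Graph PowerShell SDK
Import-Module Microsoft.Graph.Identity.DirectoryManagement

# Delegated sign-in requesting the permission described above
Connect-MgGraph -Scopes "Device.ReadWrite.All"

# Look up the directory device object by display name (placeholder name)
$device = Get-MgDevice -Filter "displayName eq 'PC-001'"

# Patch extensionAttribute6 on the device object
Update-MgDevice -DeviceId $device.Id -BodyParameter @{
    extensionAttributes = @{ extensionAttribute6 = "MDE" }
}
```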
________________________________________________________________________________________________________________________

Here's a simple summary of what the script does:

1. Gets an access token from Microsoft Entra ID using the app's tenant ID, client ID, and client secret (OAuth 2.0 client credentials flow).
2. Reads an Excel file (update the path in $excelFilePath, and ensure the column header is DeviceName) to get a list of device names.
3. Loops through each device name from the Excel file: calls the Microsoft Graph API to find the device ID by its display name and, if the device is found, sends a PATCH request to Microsoft Graph to update extensionAttribute6 with the value "MDE".
4. Logs the result for each device (success or failure) and prints messages to the console.

4) Using Extension Attributes in Dynamic Device Groups

Once extension attributes are set, you can create a dynamic security group in Entra ID:

1. Go to Microsoft Entra admin center → Groups → New group.
2. Select Security as the group type and choose Dynamic Device membership.
3. Add a membership rule, for example: (device.extensionAttributes.extensionAttribute6 -eq "MDE")
4. Save the group. Devices with extensionAttribute6 = MDE will automatically join.

5) Summary

Extension attributes in Entra ID allow custom tagging of devices for automation and policy targeting. You can update these attributes using Microsoft Graph PowerShell. These attributes can be used in dynamic device group rules, enabling granular MDE policies, Conditional Access, and Intune deployments.

Disclaimer

This script is provided "as-is" without any warranties or guarantees. It is intended for educational and informational purposes only. Microsoft and the author assume no responsibility for any issues that may arise from the use or misuse of this script.
Before deploying in a production environment, thoroughly test the script in a controlled setting and review it for compliance with your organization's security and operational policies.

Creating Custom Intune Reports with Microsoft Graph API
Systems administrators often need to report on data that is not available in the native reports in the Intune console. In many cases this data is available to them through Microsoft Graph. However, in some instances administrators may need to pull data from other sources or store it for tracking trends over time. For example, generating a custom dashboard to track Windows 365 license costs requires pulling data from Microsoft Graph and combining it with licensing details that are not available in Graph, but may be stored in another location (an IT Asset Management tool, for example).

The Windows 365 Cost Dashboard is an example of how you can combine Intune data from Microsoft Graph with information pulled from another source. This guide provides step-by-step instructions to pull data from the Microsoft Graph API, ingest it into Azure Log Analytics, and connect to your workspace with Power BI. This solution demonstrates how to gather and store Graph API data externally for richer reporting and integrate it with data from an additional data source to produce a dashboard tailored to your unique needs. By using this dashboard as an example, administrators can unlock deeper insights while leveraging Intune's powerful foundation.

The solution:

This dashboard and the accompanying PowerShell script are meant to demonstrate an end-to-end example of gathering data from Microsoft Graph and ultimately visualizing it in a Power BI dashboard. While it does create the Azure infrastructure needed to complete the scenario in the demonstration, it can be extended to gather and report additional information.

What does this do?

This example consists of two separate pieces: the Power BI dashboard and a PowerShell script that creates all the Azure resources needed to gather data from Microsoft Graph and ingest it into a Log Analytics workspace.
This post will discuss all of the infrastructure elements that are created and the steps to get your data from Log Analytics into the Power BI dashboard, but I want to strip away all of the "extra" elements and talk about the most important part of the process first.

Prerequisites

The scripts shared in this blog post assume that you already have an Azure subscription and a resource group configured. You need an account with the "Owner" role on the resource group (or equivalent permissions) to create resources and assign roles. The account will also need the "Application Developer" role in Microsoft Entra ID to create an App Registration. To run the resource creation script, you will need several modules available in PowerShell; for the full list, please review the script on GitHub.

From Microsoft Graph API to Log Analytics: How we get there

The Microsoft Graph API can give us a picture of what our environment looks like right now. Reporting on data over time requires gathering data from Graph and storing it in another repository. This example uses a PowerShell script running in Azure Automation, but there are several different ways to accomplish this task. Let's explore the underlying process first, and then we can review the overall scope of the script used in the example.

The Azure Automation runbook [CloudPCDataCollection] calls the Graph API to return details about each Windows 365 Cloud PC. It does this by making GET requests to the following endpoints:

https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs
https://graph.microsoft.com/v1.0/users/<userPrincipalName>

As a best practice, we should only return the properties from an API endpoint that we need. To do that, we can append a select query to the end of the URI. Queries allow us to customize requests that are made to Microsoft Graph. You can learn more about Select (and other query operators) here.
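One practical PowerShell detail when building such URIs: `$select` must be kept literal so PowerShell does not expand it as a variable. A small illustrative sketch (the UPN is a placeholder); both lines below produce the same string:

```powershell
$upn = 'user@contoso.com'  # placeholder UPN

# Single quotes keep $select literal...
$uri1 = 'https://graph.microsoft.com/v1.0/users/' + $upn + '?$select=userPrincipalName,department'

# ...while inside a double-quoted string the $ of $select must be escaped with a backtick
$uri2 = "https://graph.microsoft.com/v1.0/users/${upn}?`$select=userPrincipalName,department"

$uri1
$uri2
```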
The example dashboard allows you to report on Windows 365 cost over time based on properties of the device (the provisioning policy, for example) or the primary user (department). We will request the Cloud PC's id, display name, primary user's UPN, the service plan name and id (needed to cross-reference our pricing table in Power BI), the provisioning policy name, and the provisioning type (Enterprise, Frontline dedicated, or Frontline shared). The complete URI to return a list of Cloud PCs is:

https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs?$select=id,displayName,userPrincipalName,servicePlanName,servicePlanId,ProvisioningPolicyName,ProvisioningType

Once we have a list of Cloud PCs, we need to find the primary user for each device. We can return a specific user by replacing the <userPrincipalName> value in the users URI above with the primary user UPN for a specific Cloud PC. Since we only need the department, we will minimize the results by selecting only the userPrincipalName (for troubleshooting) and department. The complete URI is:

https://graph.microsoft.com/v1.0/users/<userPrincipalName>?$select=userPrincipalName,department

Data sent to a data collection endpoint needs to be formatted correctly. Requests that don't match the required format will fail. In this case, we need to create a JSON payload. The properties in the payload need to match the order of the properties in the data collection rule (explained later), and the property names are case sensitive.
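To make the case and ordering requirement concrete, here is a sketch of what a single record in the payload could look like. All values are placeholders; the actual script builds these objects from live Graph data:

```powershell
# One record shaped the way the data collection rule's stream expects it.
# Property names are case sensitive and must match the DCR's column names.
$record = [pscustomobject]@{
    Id                     = '00000000-0000-0000-0000-000000000000'   # placeholder
    DisplayName            = 'CPC-example-01'                         # placeholder
    UserPrincipalName      = 'user@contoso.com'                       # placeholder
    ServicePlanName        = 'example plan'                           # placeholder
    ServicePlanId          = '00000000-0000-0000-0000-000000000001'   # placeholder
    ProvisioningPolicyName = 'example policy'                         # placeholder
    ProvisioningType       = 'dedicated'                              # placeholder
    Department             = 'Finance'                                # placeholder
    TimeGenerated          = (Get-Date).ToUniversalTime().ToString('o')
}

# The ingestion endpoint expects a JSON array, even for a single record
$json = ConvertTo-Json -InputObject @($record) -Depth 10
$json
```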
The automation script handles the creation of the JSON object, including matching the case and order requirements, as shown here:

# Get Cloud PCs from Graph
try {
    $payload = @()
    $cloudPCs = Invoke-RestMethod -Uri 'https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs?$select=id,displayName,userPrincipalName,servicePlanName,servicePlanId,ProvisioningPolicyName,ProvisioningType' -Headers @{Authorization="Bearer $($graphBearerToken.access_token)"}

    $CloudPCArray = @()
    $CloudPCs.value | ForEach-Object {
        $CloudPCArray += [PSCustomObject]@{
            Id                     = $_.id
            DisplayName            = $_.displayName
            UserPrincipalName      = $_.userPrincipalName
            ServicePlanName        = $_.servicePlanName
            ServicePlanId          = $_.servicePlanId
            ProvisioningPolicyName = $_.ProvisioningPolicyName
            ProvisioningType       = $_.ProvisioningType
        }
    }

    # Prepare payload
    foreach ($CloudPC in $CloudPCArray) {
        If ($null -ne $CloudPC.UserPrincipalName) {
            try {
                $UPN = $CloudPc.userPrincipalName
                $URI = "https://graph.microsoft.com/v1.0/users/$UPN" + '?$select=userPrincipalName,department'
                $userObj = Invoke-RestMethod -Method GET -Uri $URI -Headers @{Authorization="Bearer $($graphBearerToken.access_token)"}
                $userDepartment = $UserObj.Department
            }
            catch {
                $userDepartment = "[User department not found]"
            }
        }
        else {
            $userDepartment = "[Shared - Not Applicable]"
        }
        $CloudPC | Add-Member -MemberType NoteProperty -Name Department -Value $userDepartment
        $CloudPC | Add-Member -MemberType NoteProperty -Name TimeGenerated -Value (Get-Date).ToUniversalTime().ToString("o")
        $payload += $CloudPC
    }
}
catch {
    throw "Error retrieving Cloud PCs or user department: $_"
}

After the payload has been generated, the script sends it to a data collection endpoint using a URI that is generated by the setup script.
# Send data to Log Analytics
try {
    $ingestionUri = "$logIngestionUrl/dataCollectionRules/$dcrImmutableId/streams/$streamDeclarationName`?api-version=2023-01-01"
    $ingestionToken = (Get-AzAccessToken -ResourceUrl 'https://monitor.azure.com//.default').Token
    Invoke-RestMethod -Uri $ingestionUri -Method Post -Headers @{Authorization="Bearer $ingestionToken"} -Body ($payload | ConvertTo-Json -Depth 10) -ContentType 'application/json'
    Write-Output "Data sent to Log Analytics."
}
catch {
    throw "Error sending data to Log Analytics: $_"
}

Getting access tokens with a managed identity

Security should be top of mind for any systems administrator. When making API calls to Microsoft Graph, Azure, and other resources, you may need to provide an access token in the request. Access to resources is controlled with an App Registration in Entra. In the past, this required using either a certificate or a client secret. Both options create management overhead, and client secrets that are hard-coded in scripts present a considerable security risk.

Managed identities are managed entirely by Entra. There is no requirement for an administrator to manage certificates or client secrets, and credentials are never exposed. Entra recently introduced the ability to assign a user-assigned managed identity as a federated credential on an App Registration. This means that a managed identity can now be used to generate an access token for Microsoft Graph and other Azure resources. You can read more about adding the managed identity as a federated credential here.

Requesting an access token via federated credentials happens in two steps. First, the script uses the managed identity to request a special token scoped for the endpoint 'api://AzureADTokenExchange'.

#region Step 2 - Authenticate as the user assigned identity
# This is designed to run in Azure Automation; $env:IDENTITY_HEADER and $env:IDENTITY_ENDPOINT are set by the Azure Automation service.
try {
    $accessToken = Invoke-RestMethod $env:IDENTITY_ENDPOINT -Method 'POST' -Headers @{
        'Metadata'          = 'true'
        'X-IDENTITY-HEADER' = $env:IDENTITY_HEADER
    } -ContentType 'application/x-www-form-urlencoded' -Body @{
        'resource'  = 'api://AzureADTokenExchange'
        'client_id' = $UAIClientId
    }
    if (-not $accessToken.access_token) {
        throw "Failed to acquire access token"
    }
    else {
        Write-Output "Successfully acquired access token for user assigned identity"
    }
}
catch {
    throw "Error acquiring access token: $_"
}
#endregion

That token is then exchanged in a second request to the authentication endpoint in the Entra tenant for a token that is scoped to access 'https://graph.microsoft.com/.default' in the context of the App Registration.

#region Step 3 - Exchange the access token from step 2 for a token in the target tenant using the app registration
try {
    $graphBearerToken = Invoke-RestMethod "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" -Method 'POST' -Body @{
        client_id             = $appClientId
        scope                 = 'https://graph.microsoft.com/.default'
        grant_type            = "client_credentials"
        client_assertion_type = "urn:ietf:params:oauth:client-assertion-type:jwt-bearer"
        client_assertion      = $accessToken.access_token
    }
    if (-not $graphBearerToken.access_token) {
        throw "Failed to acquire Bearer token for Microsoft Graph API"
    }
    else {
        Write-Output "Successfully acquired Bearer token for Microsoft Graph API"
    }
}
catch {
    throw "Error acquiring Microsoft Graph API token: $_"
}
#endregion

Azure Resource Creation Script

The PowerShell script included in this example will complete the following tasks:

Create a Log Analytics workspace
Define a custom table in the newly created workspace to store Cloud PC data
Configure a data collection endpoint and data collection rule to ingest data into the custom table
Create an Azure Automation account and runbook to retrieve data from Microsoft Graph and send it to the data collection endpoint
Establish a user-assigned managed identity to run the data collection
script from Azure Automation
Register an app and assign a service principal with the required Microsoft Graph permissions
Add the managed identity as a federated credential within the App Registration
Assign the Workbook Operator and Monitoring Metrics Publisher roles to the managed identity

Steps to Implement:

1. Download the script and Power BI dashboard: Download the Power BI dashboard and PowerShell script from GitHub: Windows 365 Custom Report Dashboard

2. Update variables: Modify the PowerShell script to include your Tenant ID, Resource Group Name, and location. Adjust other variables to fit your specific use case while adhering to Azure naming conventions.

3. Run the PowerShell script: Execute the script to create the necessary Azure resources and configurations.

4. Verify resource creation: Log into the Azure Portal. Navigate to Log Analytics and confirm the creation of the W365CustomReporting workspace. Click on Settings > Tables and confirm the W365_CloudPCs_CL table was created. Search for Automation Accounts and locate AzAut-CustomReporting.

5. Run the runbook and pull data into Log Analytics: Open the CloudPCDataCollection runbook, select Edit > Edit in portal and then click on Test Pane. Click Start to test the CloudPCDataCollection runbook and ensure data ingestion into Log Analytics. The runbook may take several minutes to run. You should see a "Completed" status message, and the output should include "Data sent to Log Analytics." Return to the Log Analytics workspace and select "Logs." Click on the table icon in the upper left corner of the query window. Select Custom Logs > W365_CloudPCs_CL and click on "Run." (Please note: initial data ingestion may take several minutes to complete. If the table is not available, please check back later.) The Logs table should populate with data from the last 24 hours by default. Click on Share > Export to Power BI (as an M query). The file should download.
Open the file to view the completed query. Select the contents of the file and copy it to the clipboard.

6. Import data into the Power BI dashboard: Open the Power BI template. In the table view on the right side of the screen, right-click on the CloudPCs table and select "Edit Query." Click on "Advanced Editor" on the ribbon to edit the query. Paste the contents of the downloaded M query file into the editor and click "Done." A preview of your data should appear. We need to make sure the columns match the data in the template: right-click on the "Time Generated" column and select Transform > Date Only; right-click on the same column, select "Rename," and rename the column to "Date." Click "Close and Apply" to apply your changes and update the dashboard.

7. Update the Pricing and Service Plan Details table (optional): The Pricing and Service Plan Details table was created via manual data entry, which allows it to be updated directly within Power BI. To update the dashboard with your pricing information, right-click on the PricingAndServicePlanDetails table and select "Edit Query." Click on the gear icon to the right of "Source." Find the SKU Id that matches the Windows 365 Enterprise or Frontline licenses in your tenant and update the price column to match your pricing.

8. Update the timespan on the imported M query to view data over a longer period (optional): When we initially viewed the logs in Log Analytics, we left the time period set to the default value, "Last 24 Hours." That means the query that was created will only show data from the last day, even if the runbook has been configured to run on a schedule. We can change that behavior by updating the table query. Edit the CloudPCs table as you did before. In the Advanced Editor, find the "Timespan" property. The Timespan value uses ISO 8601 durations to select data over a specific period. For example, "P1D" will show data from the previous 1 day; the past year would be represented by "P1Y" or "P365D".
Learn more about the ISO 8601 duration format here: ISO 8601 - Wikipedia

Please note that this query can only return data that is stored in Log Analytics. If you set it to "P1Y" but have only collected information for the past month, you will still only see one month's worth of data.

Parting thoughts

This example demonstrates how a systems administrator can leverage Microsoft Graph, Azure Log Analytics, and Power BI to create custom reports. The script provided creates all the required resources to build your own custom reports. You can leverage the concepts used in this example to add additional data sources and expand your Log Analytics workspace (by adding additional columns or tables) to store other data pulled from Microsoft Graph.

By following this example, systems administrators can build custom Intune reports that integrate data from Microsoft Graph and external sources. This solution provides comprehensive, historical reporting, helping organizations gain valuable insights into their IT environments.

Additional credit: The script to create resources was adapted from the process described by Harjit Singh here: Ingest Custom Data into Azure Log Analytics via API Using PowerShell. Please visit that post for additional information on creating the underlying resources.

Limitations: This example is not intended to be ready for production use. While the script creates the underlying infrastructure, it does not automatically schedule the Azure Automation runbook, nor does it change the default retention period in Log Analytics beyond 30 days. The use of Log Analytics and Azure Automation can incur charges. You should follow your organization's guidelines when scheduling runbooks or updating retention policies. The pricing details table was created based on the Windows 365 SKUs listed in "Product names and service plan identifiers for licensing" and the corresponding retail prices for Windows 365 Enterprise and Frontline as of February 26, 2025.
You may need to update the pricing details to match your license costs, or connect to an outside data source where your license details are stored, to accurately reflect your cost details.

Disclaimer

The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Migrating BitLocker Recovery Key Management from ConfigMgr to Intune: A Practical Guide
Hi, I'm Herbert Fuchs, a Cloud Solution Architect. In this blog, I’ll guide you through migrating existing BitLocker recovery keys from Configuration Manager to Intune—especially for scenarios involving already encrypted devices. While many posts cover Intune setup basics for greenfield deployments, this guide dives deeper into real-world considerations for hybrid-joined, co-managed environments.

Current Setup: ConfigMgr BitLocker Management

In many organizations, BitLocker encryption and key management are handled via MBAM Standalone or the Configuration Manager BitLocker feature. In both cases, the MBAM Agent Service is responsible for encrypting devices and configuring key protectors based on policy — either via GPO or Configuration Manager profiles.

You configure a BitLocker policy and assign it to devices. For Configuration Manager, the Configuration tab will show a BitLocker configuration profile once the client receives the policy. Once the encryption process starts, the BitLocker API events show:

Key protector creation
TPM sealing
Encryption initiation

You can check the encryption status via PowerShell or using manage-bde.exe. You can also compare the recovery password with what's available in the MBAM Helpdesk Portal.

PowerShell:

Manage-bde:

Compare Key:

Note: When Configuration Manager escrows the BitLocker key, the information is written to the registry as a Unix timestamp.
Here's how to convert it:

$LastEscrowTime = Get-ItemPropertyValue HKLM:\SOFTWARE\Microsoft\CCM\BLM -Name 'LastEscrowTime'
$oUNIXDate = [System.DateTimeOffset]::FromUnixTimeSeconds($LastEscrowTime)
$oUNIXDate

If your environment is running MECM 2203 or higher, you can test the escrow through the local API, as well as the key rotation:

Function Invoke-CCMBitlockerEscrowKey {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $false)]
        [switch]$rotate
    )
    $ErrorActionPreference = 'stop'

    # Ensure the client agent is at least CB 2203
    if (([wmi]"ROOT\ccm:SMS_Client=@").ClientVersion.Split('.')[2] -lt 9078) {
        Write-Host "Required client version is at least CB 2203! Aborting..." -ForegroundColor Yellow
        break
    }

    if ($rotate) {
        # Remove the escrowed reference to force key rotation
        Write-Verbose "Removing HKLM\SOFTWARE\Microsoft\CCM\BLM\Escrowed key (if it exists), to force key rotation"
        Remove-Item HKLM:\SOFTWARE\Microsoft\CCM\BLM\Escrowed -Recurse -ErrorAction SilentlyContinue
    }

    # Execute the escrow for each encrypted volume
    Try {
        $ReturnObj = New-Object System.Collections.ArrayList

        Write-Verbose "Connect CCM_BLM_KeyEscrow class"
        $CCMBLMSDK = ([WMIClass]'root\ccm\clientsdk:CCM_BLM_KeyEscrow')

        Write-Verbose "Retrieving drive letter(s) of encrypted volumes"
        $EncryptedDrives = (([wmiclass]"ROOT\cimv2\Security\MicrosoftVolumeEncryption:Win32_EncryptableVolume").GetInstances() |
            Where-Object ProtectionStatus -EQ 1).DriveLetter

        # Loop through all encrypted drives and escrow the recovery key
        foreach ($ed in $EncryptedDrives) {
            Write-Verbose "Execute EscrowKey method for drive $ed"
            $Escrow = $CCMBLMSDK.EscrowKey($ed)

            Write-Verbose "Fill hashtable object with information"
            # Note: renamed from $Input, which is a reserved automatic variable in PowerShell
            $Props = @{
                'ReturnValue' = $Escrow.ReturnValue
                'EscrowKey'   = $Escrow.KeyID
                'DriveLetter' = $ed
            }
            $InfoTable = New-Object PSObject -Property $Props
            [Void]$ReturnObj.Add($InfoTable)
        }
        Return $ReturnObj
    }
    Catch {
        Write-Host "Exception Type: $($_.Exception.GetType().FullName)" -ForegroundColor Red
        Write-Host "Exception Message: $($_.Exception.Message)" -ForegroundColor Red
        Write-Host "Exception Stack: $($_.ScriptStackTrace)" -ForegroundColor Red
    }
}

Invoke-CCMBitlockerEscrowKey -rotate -Verbose

Step 1: Identify Co-Managed Devices

In this migration scenario, we're working with Entra hybrid-joined devices that are co-managed. First, set the Endpoint Protection workload authority to Intune and assign your devices to a staging collection. This will not immediately change BitLocker policies on the device, but it prepares the system to receive policy from Intune.

In this registry area you can see the Windows encryption settings that are enforced:

You'll also find the MBAM agent configurations here:

You can verify the workload authority using the co-management flag via the PowerShell function below. You can get the co-management flag from the Configuration Manager control panel or from the registry: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\CCM\CoManagementFlags. You can also find this state in the SQL view vClientCoManagementState.

Function Get-CoMgmtClientFlag {
    [CmdletBinding()]
    Param (
        [Parameter(Mandatory = $True)]
        [Int]$CoMgmtFlag
    )
    $CoMgmtFlagsTable = @{
        'CompliancePolicy'      = 2
        'ConfigurationSettings' = 8
        'Default'               = 8193
        'DiskEncryption'        = 4096
        'EpSplit'               = 8192
        'Inventory'             = 1
        'ModernApps'            = 64
        'None'                  = 0
        'Office365'             = 128
        'ResourceAccess'        = 4
        'Security'              = 32
        'WUfB'                  = 16
    }
    $FlagsObject = [ordered]@{}
    foreach ($FlagType in $CoMgmtFlagsTable.Keys) {
        if (($CoMgmtFlag -band $CoMgmtFlagsTable[$FlagType]) -ne 0) {
            $FlagsObject.Add($FlagType, $True)
        }
    }
    return $FlagsObject
}

Get-CoMgmtClientFlag -CoMgmtFlag 12527

Once the workload is set to Intune, Configuration Manager is no longer responsible for BitLocker. The original configuration item remains visible, but the BitLockerManagementHandler will defer to Intune.

Key insight: Even if you decrypt the disk and re-evaluate the BitLocker CI, ConfigMgr will report it as compliant—but it is no longer enforcing the settings.

In the next step, we will discuss the BitLocker policy in Intune.
In a migration workflow, ensure you set up the same encryption policies in Intune as you had in your Configuration Manager policies, with one exception: the startup PIN. Intune does not require the MBAM agent to manage and control disk encryption; the downside is that, out of the box, you cannot configure silent/unattended encryption with a startup PIN, because no UI is provided for a standard user. For registry-based policies, you might want to deploy the custom CSP setting MDMWinsOverGP. Also be aware that if you define, for instance, a different cipher strength, you will always get a non-compliant state, because remediating it would require decrypting and re-encrypting the system.

Step 2: Create and Assign BitLocker Policy in Intune

You can create BitLocker policies in Intune via:

Endpoint Security > Disk encryption
Device Configuration Templates
Settings Catalog

Each has slightly different UI/UX and wording, so take care during setup.

Recommendation: Use Endpoint Security > Disk encryption—it maps directly to the Settings Catalog, and the UI enforces proper dependencies and validations.

Example: Silent encryption configuration via Endpoint Security > Disk encryption

Configure OS drive encryption settings, cipher strength, and recovery options. Assign the policy to a test group in Entra or to your staged collection via collection/group sync. Once assigned, the device will receive the policy via the MDM channel. You can verify this via the Windows Settings app or the registry:

HKLM\SOFTWARE\Policies\Microsoft\FVE – each configured option is now added to this location, which differs from the Configuration Manager policy item.
HKLM\SOFTWARE\Microsoft\PolicyManager\current\device\BitLocker

The PolicyManager key is, in general, a good reference for tracking which provider is managing a setting.

Important: The RecoveryPassword key protector will not automatically back up to Entra unless a new key protector is created and encryption is re-triggered.
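To spot-check which BitLocker policy values actually landed on the device, a small sketch like the following can dump both registry locations mentioned above. This is a minimal sketch, assuming the Intune policy has already applied; the exact value names depend on which settings your profile configures.

```powershell
# Hedged sketch: enumerate BitLocker policy values written via the MDM channel.
# Value names vary with the settings configured in your Intune profile.
$paths = @(
    'HKLM:\SOFTWARE\Policies\Microsoft\FVE',
    'HKLM:\SOFTWARE\Microsoft\PolicyManager\current\device\BitLocker'
)

foreach ($path in $paths) {
    if (Test-Path $path) {
        Write-Host "== $path ==" -ForegroundColor Cyan
        # Exclude the PS* metadata properties so only the policy values are shown
        Get-ItemProperty -Path $path |
            Select-Object -Property * -ExcludeProperty PS* |
            Format-List
    }
    else {
        Write-Host "== $path == (not present - policy may not have applied yet)" -ForegroundColor Yellow
    }
}
```

If the second path is empty, the device most likely has not yet received the policy via the MDM channel, so trigger a sync before troubleshooting further.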
Step 3: Trigger Backup to Entra or Rotate Key

To ensure key escrow to Entra:

Option 1: Use Intune to Rotate the BitLocker Key

From Intune, trigger a BitLocker key rotation for the device for ad hoc testing. This requires that the Windows Recovery Environment (WinRE) is enabled: Windows Recovery Environment (Windows RE) | Microsoft Learn

The client will receive the notification and execute the rotation.

Successful upload event to Entra:

Successful upload event to Active Directory:

Option 2: Use PowerShell

Use the built-in PowerShell cmdlet BackupToAAD-BitLockerKeyProtector to back up the recovery key manually. It is ideal for scripting or proactive remediation: BackupToAAD-BitLockerKeyProtector (BitLocker) | Microsoft Learn

Here is an example for this purpose:

<#
.SYNOPSIS
    Backup BitLocker recovery key to Entra
.DESCRIPTION
    The script gets all volumes that have BitLocker protection on. For each of these
    volumes it looks for the RecoveryPassword key protector. The ID of this key protector
    is used to execute the built-in cmdlet BackupToAAD-BitLockerKeyProtector. The activities
    are added to a hashtable for a final return that can be displayed in Endpoint Analytics.
    For troubleshooting, Write-Verbose output can be enabled.
.EXAMPLE
    BackupBitlockerKeyToEntra.ps1 -Verbose
.NOTES
    The script execution requires elevated permissions.
#>
[CmdletBinding()]
Param()

Try {
    Write-Verbose "Create empty array object"
    $BLKeyObject = New-Object System.Collections.ArrayList

    Write-Verbose "Get all volumes where BitLocker protection is on"
    $Volumes = Get-BitLockerVolume | Where-Object { $_.ProtectionStatus.value__ -eq 1 }

    If ($Volumes -is [System.Object]) {
        Foreach ($Volume in $Volumes) {
            Write-Verbose "Get RecoveryPassword key protector for drive $($Volume.MountPoint)"
            $KeyProtector = (Get-BitLockerVolume -MountPoint $Volume.MountPoint).KeyProtector |
                Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' }

            If ($KeyProtector) {
                Write-Verbose "Trigger backup of BitLocker recovery key to Entra for drive $($Volume.MountPoint) with ID $($KeyProtector.KeyProtectorId)"
                BackupToAAD-BitLockerKeyProtector -MountPoint $Volume.MountPoint -KeyProtectorId $KeyProtector.KeyProtectorId

                Write-Verbose "Prepare return hashtable"
                # Note: renamed from $Input, which is a reserved automatic variable in PowerShell
                $Props = @{
                    Drive         = $Volume.MountPoint
                    KeyProtector  = $KeyProtector.KeyProtectorId
                    BackupToEntra = $true
                }
                $ResultTable = New-Object PSObject -Property $Props
                [void]$BLKeyObject.Add($ResultTable)
            }
        }
    }
    Else {
        Write-Host "WARNING - The system does not have any BitLocker encrypted drive!!!"
        Exit 1
    }

    Write-Verbose "Backup execution successful"
    Return $BLKeyObject
}
Catch {
    Write-Error $_
}

Note: In hybrid scenarios, keys may be escrowed to both AD and Entra. If Entra is unavailable during encryption and you've set "Do not enable BitLocker until recovery information is stored to AD DS...", the key will be escrowed to AD only.

Recommendation: Use a proactive remediation script to periodically validate and enforce Entra key escrow. You can safely run BackupToAAD-BitLockerKeyProtector multiple times without issues.
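Building on that recommendation, the detection half of such a remediation pair could look like the following sketch. This is a minimal sketch under a simplifying assumption: it only checks that a RecoveryPassword protector exists on each protected volume, and relies on the remediation script (for example, the backup script above) to create and escrow one when missing.

```powershell
# Hedged sketch of a detection script for Intune remediations.
# Exit 0 = compliant (every protected volume has a RecoveryPassword protector).
# Exit 1 = non-compliant (Intune then runs the paired remediation script).
$protectedVolumes = Get-BitLockerVolume | Where-Object { $_.ProtectionStatus -eq 'On' }

foreach ($volume in $protectedVolumes) {
    $recoveryProtector = $volume.KeyProtector |
        Where-Object { $_.KeyProtectorType -eq 'RecoveryPassword' }

    if (-not $recoveryProtector) {
        Write-Output "Drive $($volume.MountPoint): no RecoveryPassword protector found"
        exit 1
    }
}

Write-Output 'All protected volumes have a RecoveryPassword protector'
exit 0
```

Since re-running BackupToAAD-BitLockerKeyProtector is safe, you could also make the detection stricter and always exit 1, effectively re-escrowing the key on every remediation cycle.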
You can verify backup locations using:

manage-bde -protectors -get C: -Type RecoveryPassword

Step 4: Test a Fresh Encryption Cycle

To confirm full Intune-based encryption and key escrow:

Confirm the policies are applied
Decrypt the volume
Remove all key protectors
Trigger an Intune policy sync
Confirm silent encryption with proper key backup

Tip: You can test this with a Generation 2 VM with a virtual TPM.

Key Takeaways

Ensure the BitLocker workload is shifted to Intune before key migration.
Match the Intune configuration profile with your existing Configuration Manager policies – otherwise you will get non-compliance messages. (Note that BitLocker pre-provisioning in a task sequence implies used-space-only encryption.)
Use key rotation or PowerShell scripts to escrow keys to Entra.
Hybrid-joined devices may escrow to both AD and Entra (this is by design; there is no option to escrow only to Entra).
Confirm encryption compliance locally via the Settings app, the registry, and manage-bde.exe – or use the Intune reports.
Consider a proactive remediation script to ensure consistent key backup.
Intune does not offer RBAC for viewing recovery keys. "Show BitLocker recovery key" is an Entra permission: Device management permissions for Microsoft Entra custom roles - Microsoft Entra ID | Microsoft Learn

Thanks for reading! Let me know your feedback or share your own tips and tricks for BitLocker migration from ConfigMgr to Intune!

Disclaimer

The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you.
In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

How to easily apply DISA STIGs with Intune
Introduction

In today's digital landscape, ensuring the security and compliance of IT infrastructure is paramount. The Defense Information Systems Agency (DISA) provides Security Technical Implementation Guides (STIGs) to optimize security for various software and systems. Using Microsoft Intune, administrators can create configuration profiles that adhere to these STIGs, thereby enhancing their organization's security posture. This blog will walk you through the process of creating Intune configuration profiles for DISA STIGs, complete with screenshots and detailed steps.

Prerequisites

Before diving into the configuration process, ensure you have the following:

Access to the Intune admin center.
Appropriate administrative privileges to create and manage configuration profiles.
Familiarity with DISA STIGs and their requirements.

Step-by-Step Guide

Step 1: Access Intune

Acquire DISA STIG files: The first step in this process is to acquire the DISA STIG files from the official website (Group Policy Objects – DoD Cyber Exchange). These files contain the specific security guidelines and requirements you need to implement. Visit the DISA website, locate the relevant STIG files for your systems, and download them to your local machine.

Prep files: Unzip the file you just downloaded; inside, you should find another zipped file named something like “Intune STIG Policy Baselines.” Unzip this file as well.

Log in to Intune with proper permissions: To begin, navigate to the Intune admin center at https://intune.microsoft.com, or https://Intune.microsoft.us for Intune Government GCC-H/DoD (I am using a GCC-H instance of Intune, but these steps should be the same no matter which impact level you are using). Sign in with your administrator credentials. If you are using RBAC and least privilege, you will need at least the “Policy and Profile Manager” role.
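The download-and-prep step above can also be scripted. The following is a rough sketch; the file and folder names are illustrative assumptions, since DISA changes the package names with each quarterly release.

```powershell
# Hedged sketch: unzip the downloaded DISA GPO package and the nested
# 'Intune STIG Policy Baselines' archive. File names below are illustrative
# placeholders and change with each STIG release.
$downloadedZip = "$env:USERPROFILE\Downloads\U_STIG_GPO_Package.zip"   # hypothetical name
$workingDir    = "$env:USERPROFILE\Downloads\STIG"

Expand-Archive -Path $downloadedZip -DestinationPath $workingDir -Force

# Locate and unzip the nested baselines archive
$nestedZip = Get-ChildItem -Path $workingDir -Recurse -Filter '*Intune STIG Policy Baselines*.zip' |
    Select-Object -First 1
Expand-Archive -Path $nestedZip.FullName -DestinationPath "$workingDir\IntuneBaselines" -Force

# The importable policies live under 'Intune Policies\Settings Catalog'
Get-ChildItem -Path "$workingDir\IntuneBaselines" -Recurse -Filter '*.json' |
    Select-Object -ExpandProperty FullName
```

The final listing gives you the exact file paths to feed into the “Browse for files” step of the import described next.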
Step 2: Create a New Configuration Profile

Once logged in, follow these steps to create a new configuration profile:

In the left-hand menu, select Devices -> Configuration profiles.
Click the Create profile button at the top and select “Import policy.”
Select “Browse for files” and browse to the location where you unzipped the Intune STIG Policy Baselines; inside that folder, go to the Intune Policies folder, then Settings Catalog.
Select your STIG of choice, provide a meaningful name and description for the profile, and select Save.

Step 3: Configure Profile Settings

Next, verify the profile settings align with the DISA STIG requirements:

Once the profile has been created, select View policy.
Navigate through the settings and ensure every setting is meticulously configured to meet the STIG compliance guidelines. This may include settings such as password policies, encryption, and network security configurations.
Ensure every setting meets the compliance standards of your organization. For example, Windows Spotlight is a feature that rotates the wallpaper and screensaver randomly; if your organization uses custom wallpapers or screensavers, you may want to disable it completely.

Step 4: Assign the Profile and TEST, TEST, and TEST Again!!

After configuring the profile settings, assign the profile to the appropriate groups:

Next to Assignments, select Edit.
Select the user or device groups that the profile should apply to. This should be a small but diverse group of devices or users that can provide feedback on the user experience of the settings being applied and/or any issues they cause, because STIGs never break anything, right!?
Once you have assigned your groups, click Review & Save, then Save.

Conclusion

Creating Intune configuration profiles for DISA STIGs is a crucial step in maintaining robust security and compliance within your organization.
By following this step-by-step guide, you can effectively configure and deploy profiles that adhere to stringent security standards, safeguarding your IT infrastructure. Stay vigilant and periodically review your profiles to ensure they remain compliant with evolving STIG requirements.

Disclaimer

While DISA has made this a fairly easy process with Microsoft Intune, there are some caveats. In the folder where we found the Intune policies is a “Support files” folder, which holds an Excel spreadsheet with valuable information. There are still several STIG settings that are not natively set by Intune for various reasons (not in a Windows CSP, organization-specific settings, etc.). DISA has also provided Desired State Configuration (DSC) files to set many of these settings, which will need to be deployed as a Win32 app. This is outside the scope of this blog, but stay tuned!

Lastly, the spreadsheet lists STIG settings that will show up as false positives when you use the Security Content Automation Protocol (SCAP) tool. This is because the settings are now set through Configuration Service Providers (CSPs), while the tool is looking at the legacy registry locations. Unfortunately, until the tool is updated to look in the new locations, we will need to provide this information to prove the settings have been configured.

All screenshots and folder paths are from a non-production lab environment and can/will vary per environment. All processes and directions are my own opinion, not Microsoft's, and are based on my years of experience with the Intune product in multiple customer environments.

Additional Resources

Microsoft Intune Documentation: Microsoft Intune documentation | Microsoft Learn
DISA STIGs: Security Technical Implementation Guides (STIGs) – DoD Cyber Exchange
Intune Admin Center: intune.microsoft.com (Commercial/GCC) or Intune.microsoft.us for government (GCC-High/DoD)

Stay tuned for future posts where we delve deeper into advanced configurations and best practices.
Happy securing!