Creating Custom Intune Reports with Microsoft Graph API
Systems administrators often need to report on data that is not available in the native reports in the Intune console. In many cases this data is available to them through Microsoft Graph. In other instances, administrators may need to pull data from additional sources or store it to track trends over time. For example, generating a custom dashboard to track Windows 365 license costs requires pulling data from Microsoft Graph and combining it with licensing details that are not available in Graph but may be stored in another location (an IT asset management tool, for example). The Windows 365 Cost Dashboard is an example of how you can combine Intune data from Microsoft Graph with information pulled from another source.

This guide provides step-by-step instructions to pull data from the Microsoft Graph API, ingest it into Azure Log Analytics, and connect to your workspace with Power BI. This solution demonstrates how to gather and store Graph API data externally for richer reporting and integrate it with data from an additional data source to produce a dashboard tailored to your unique needs. By using this dashboard as an example, administrators can unlock deeper insights while leveraging Intune's powerful foundation.

The solution

This dashboard and the accompanying PowerShell script demonstrate an end-to-end example of gathering data from Microsoft Graph and ultimately visualizing it in a Power BI dashboard. While the script creates the Azure infrastructure needed to complete the scenario in the demonstration, it can be extended to gather and report additional information.

What does this do?

This example consists of two separate pieces: the Power BI dashboard and a PowerShell script that creates all the Azure resources needed to gather data from Microsoft Graph and ingest it into a Log Analytics workspace. This post will discuss all of the infrastructure elements that are created and the steps to get your data from Log Analytics into the Power BI dashboard, but I want to strip away all of the "extra" elements and talk about the most important part of the process first.

Prerequisites

The scripts shared in this blog post assume that you already have an Azure subscription and a resource group configured. You need an account with the "Owner" role on the resource group (or equivalent permissions) to create resources and assign roles. The account will also need the "Application Developer" role in Microsoft Entra ID to create an app registration. To run the resource creation script, you will need several modules available in PowerShell; to see the full list, please review the script on GitHub.

From Microsoft Graph API to Log Analytics: How we get there

The Microsoft Graph API can give us a picture of what our environment looks like right now. Reporting on data over time requires gathering data from Graph and storing it in another repository. This example uses a PowerShell script running in Azure Automation, but there are several other ways to accomplish this task. Let's explore the underlying process first, and then we can review the overall scope of the script used in the example.

The Azure Automation runbook [CloudPCDataCollection] calls the Graph API to return details about each Windows 365 Cloud PC.
It does this by making GET requests to the following endpoints:

https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs
https://graph.microsoft.com/v1.0/users/<userPrincipalName>

As a best practice, we should only return the properties from an API endpoint that we need. To do that, we can append a $select query to the end of the URI. Queries allow us to customize requests that are made to Microsoft Graph. You can learn more about $select (and other query operators) here.

The example dashboard allows you to report on Windows 365 cost over time based on properties of the device (the provisioning policy, for example) or the primary user (department). We will request the Cloud PC's id, display name, primary user's UPN, the service plan name and id (needed to cross-reference our pricing table in Power BI), the provisioning policy name, and the provisioning type (Enterprise, Frontline dedicated, or Frontline shared). The complete URI to return a list of Cloud PCs is:

https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs?$select=id,displayName,userPrincipalName,servicePlanName,servicePlanId,ProvisioningPolicyName,ProvisioningType

Once we have a list of Cloud PCs, we need to find the primary user for each device. We can return a specific user by replacing the <userPrincipalName> value in the users URI above with the primary user UPN for a specific Cloud PC. Since we only need the department, we will minimize the results by selecting only the userPrincipalName (for troubleshooting) and department. The complete URI is:

https://graph.microsoft.com/v1.0/users/<userPrincipalName>?$select=userPrincipalName,department

Data sent to a data collection endpoint needs to be formatted correctly; requests that don't match the required format will fail. In this case, we need to create a JSON payload. The properties in the payload need to match the order of the properties in the data collection rule (explained later), and the property names are case sensitive.
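For illustration, a single record in the payload might look like the following once the runbook has added the Department and TimeGenerated properties. The values below are hypothetical; the property names and their order mirror the script and the data collection rule described in this post:

```json
[
  {
    "Id": "00000000-1111-2222-3333-444444444444",
    "DisplayName": "CPC-contoso-abc123",
    "UserPrincipalName": "user1@contoso.com",
    "ServicePlanName": "Cloud PC Enterprise 2vCPU/8GB/128GB",
    "ServicePlanId": "00000000-aaaa-bbbb-cccc-dddddddddddd",
    "ProvisioningPolicyName": "Sales provisioning policy",
    "ProvisioningType": "dedicated",
    "Department": "Sales",
    "TimeGenerated": "2025-02-26T12:00:00.0000000Z"
  }
]
```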
The automation script handles the creation of the JSON object, including matching the case and order requirements, as shown here:

```powershell
# Get Cloud PCs from Graph
try {
    $payload = @()
    $cloudPCs = Invoke-RestMethod -Uri 'https://graph.microsoft.com/beta/deviceManagement/virtualEndpoint/cloudPCs?$select=id,displayName,userPrincipalName,servicePlanName,servicePlanId,ProvisioningPolicyName,ProvisioningType' -Headers @{Authorization="Bearer $($graphBearerToken.access_token)"}
    $CloudPCArray = @()
    $CloudPCs.value | ForEach-Object {
        $CloudPCArray += [PSCustomObject]@{
            Id                     = $_.id
            DisplayName            = $_.displayName
            UserPrincipalName      = $_.userPrincipalName
            ServicePlanName        = $_.servicePlanName
            ServicePlanId          = $_.servicePlanId
            ProvisioningPolicyName = $_.ProvisioningPolicyName
            ProvisioningType       = $_.ProvisioningType
        }
    }

    # Prepare payload
    foreach ($CloudPC in $CloudPCArray) {
        If ($null -ne $CloudPC.UserPrincipalName) {
            try {
                $UPN = $CloudPC.userPrincipalName
                $URI = "https://graph.microsoft.com/v1.0/users/$UPN" + '?$select=userPrincipalName,department'
                $userObj = Invoke-RestMethod -Method GET -Uri $URI -Headers @{Authorization="Bearer $($graphBearerToken.access_token)"}
                $userDepartment = $userObj.Department
            }
            catch {
                $userDepartment = "[User department not found]"
            }
        }
        else {
            $userDepartment = "[Shared - Not Applicable]"
        }
        $CloudPC | Add-Member -MemberType NoteProperty -Name Department -Value $userDepartment
        $CloudPC | Add-Member -MemberType NoteProperty -Name TimeGenerated -Value (Get-Date).ToUniversalTime().ToString("o")
        $payload += $CloudPC
    }
}
catch {
    throw "Error retrieving Cloud PCs or user department: $_"
}
```

After the payload has been generated, the script sends it to a data collection endpoint using a URI that is generated by the setup script.

```powershell
# Send data to Log Analytics
try {
    $ingestionUri = "$logIngestionUrl/dataCollectionRules/$dcrImmutableId/streams/$streamDeclarationName`?api-version=2023-01-01"
    $ingestionToken = (Get-AzAccessToken -ResourceUrl 'https://monitor.azure.com//.default').Token
    Invoke-RestMethod -Uri $ingestionUri -Method Post -Headers @{Authorization="Bearer $ingestionToken"} -Body ($payload | ConvertTo-Json -Depth 10) -ContentType 'application/json'
    Write-Output "Data sent to Log Analytics."
}
catch {
    throw "Error sending data to Log Analytics: $_"
}
```

Getting access tokens with a managed identity

Security should be top of mind for any systems administrator. When making API calls to Microsoft Graph, Azure, and other resources, you may need to provide an access token in the request. Access to resources is controlled with an app registration in Entra. In the past, this required using either a certificate or a client secret. Both options create management overhead, and client secrets that are hard-coded in scripts present a considerable security risk.

Managed identities are managed entirely by Entra. There is no requirement for an administrator to manage certificates or client secrets, and credentials are never exposed. Entra recently introduced the ability to assign a user-assigned managed identity as a federated credential on an app registration. This means that a managed identity can now be used to generate an access token for Microsoft Graph and other Azure resources. You can read more about adding the managed identity as a federated credential here.

Requesting an access token via federated credentials happens in two steps. First, the script uses the managed identity to request a special token scoped for the endpoint 'api://AzureADTokenExchange'.
```powershell
#region Step 2 - Authenticate as the user-assigned identity
# This is designed to run in Azure Automation; $env:IDENTITY_HEADER and $env:IDENTITY_ENDPOINT are set by the Azure Automation service.
try {
    $accessToken = Invoke-RestMethod $env:IDENTITY_ENDPOINT -Method 'POST' -Headers @{
        'Metadata'          = 'true'
        'X-IDENTITY-HEADER' = $env:IDENTITY_HEADER
    } -ContentType 'application/x-www-form-urlencoded' -Body @{
        'resource'  = 'api://AzureADTokenExchange'
        'client_id' = $UAIClientId
    }
    if (-not $accessToken.access_token) {
        throw "Failed to acquire access token"
    }
    else {
        Write-Output "Successfully acquired access token for user assigned identity"
    }
}
catch {
    throw "Error acquiring access token: $_"
}
#endregion
```

That token is then exchanged in a second request to the authentication endpoint in the Entra tenant for a token that is scoped to access 'https://graph.microsoft.com/.default' in the context of the app registration.

```powershell
#region Step 3 - Exchange the access token from step 2 for a token in the target tenant using the app registration
try {
    $graphBearerToken = Invoke-RestMethod "https://login.microsoftonline.com/$TenantId/oauth2/v2.0/token" -Method 'POST' -Body @{
        client_id             = $appClientId
        scope                 = 'https://graph.microsoft.com/.default'
        grant_type            = "client_credentials"
        client_assertion_type = "urn:ietf:params:oauth:client-assertion-type:jwt-bearer"
        client_assertion      = $accessToken.access_token
    }
    if (-not $graphBearerToken.access_token) {
        throw "Failed to acquire Bearer token for Microsoft Graph API"
    }
    else {
        Write-Output "Successfully acquired Bearer token for Microsoft Graph API"
    }
}
catch {
    throw "Error acquiring Microsoft Graph API token: $_"
}
#endregion
```

Azure Resource Creation Script

The PowerShell script included in this example completes the following tasks:

- Creates a Log Analytics workspace
- Defines a custom table in the newly created workspace to store Cloud PC data
- Configures a data collection endpoint and data collection rule to ingest data into the custom table
- Creates an Azure Automation account and runbook to retrieve data from Microsoft Graph and send it to the data collection endpoint
- Establishes a user-assigned managed identity to run the data collection script from Azure Automation
- Registers an app and assigns its service principal the required Microsoft Graph permissions
- Adds the managed identity as a federated credential within the app registration
- Assigns the Workbook Operator and Monitoring Metrics Publisher roles to the managed identity

Steps to Implement:

1. Download the script and Power BI dashboard: Download the Power BI dashboard and PowerShell script from GitHub: Windows 365 Custom Report Dashboard.

2. Update variables: Modify the PowerShell script to include your tenant ID, resource group name, and location. Adjust other variables to fit your specific use case while adhering to Azure naming conventions.

3. Run the PowerShell script: Execute the script to create the necessary Azure resources and configurations.

4. Verify resource creation: Log into the Azure portal. Navigate to Log Analytics and confirm the creation of the W365CustomReporting workspace. Click on Settings > Tables and confirm the W365_CloudPCs_CL table was created. Search for Automation Accounts and locate AzAut-CustomReporting.

5. Run the runbook and pull data into Log Analytics: Open the CloudPCDataCollection runbook, select Edit > Edit in portal, and then click on Test Pane. Click Start to test the CloudPCDataCollection runbook and ensure data ingestion into Log Analytics. The runbook may take several minutes to run.
You should see a "Completed" status message, and the output should include "Data sent to Log Analytics." Return to the Log Analytics workspace and select "Logs." Click on the table icon in the upper left corner of the query window, select Custom Logs > W365_CloudPCs_CL, and click "Run." (Please note: initial data ingestion may take several minutes to complete. If the table is not available, check back later.) The table should populate with data from the last 24 hours by default. Click on Share > Export to Power BI (as an M query). The file should download. Open the file to view the completed query, then select the contents of the file and copy them to the clipboard.

6. Import data into the Power BI dashboard: Open the Power BI template. In the table view on the right side of the screen, right-click on the CloudPCs table and select "Edit Query." Click on "Advanced Editor" on the ribbon to edit the query. Paste the contents of the downloaded M query file into the editor and click "Done." A preview of your data should appear. We need to make sure the columns match the data in the template: right-click on the "Time Generated" column and select Transform > Date Only. Right-click on the same column, select "Rename," and rename the column to "Date." Click "Close and Apply" to apply your changes and update the dashboard.

7. Update the Pricing and Service Plan Details table (optional): The Pricing and Service Plan Details table was created via manual data entry, which allows it to be updated directly within Power BI. To update the dashboard with your pricing information, right-click on the PricingAndServicePlanDetails table and select "Edit Query." Click on the gear icon to the right of "Source." Find the SKU ID that matches the Windows 365 Enterprise or Frontline licenses in your tenant and update the price column to match your pricing.

8. Update the timespan on the imported M query to view data over a longer period (optional): When we initially viewed the logs in Log Analytics, we left the time period set to the default value, "Last 24 Hours." That means the query that was created will only show data from the last day, even if the runbook has been configured to run on a schedule. We can change that behavior by updating the table query. Edit the CloudPCs table as you did before. In the Advanced Editor, find the "Timespan" property. The Timespan value uses ISO 8601 durations to select data over a specific period. For example, "P1D" will show data from the previous day, while the past year would be represented by "P1Y" or "P365D." Learn more about the ISO 8601 duration format here: ISO 8601 - Wikipedia. Please note that this query can only return data that is stored in Log Analytics: if you set it to "P1Y" but have only collected information for the past month, you will still see only one month's worth of data.
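For orientation, the relevant fragment of the exported query looks roughly like the sketch below. This is a hypothetical approximation, not the verbatim export; your file may differ, and only the timespan value needs to change:

```
// In the exported M query's options record, change the ISO 8601 duration, e.g.:
[Query=[#"query"="W365_CloudPCs_CL", #"timespan"="P1Y"]]
```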
Parting thoughts

This example demonstrates how a systems administrator can leverage Microsoft Graph, Azure Log Analytics, and Power BI to create custom reports. The script provided creates all the resources required to build your own custom reports. You can apply the concepts used in this example to add additional data sources and expand your Log Analytics workspace (by adding additional columns or tables) to store other data pulled from Microsoft Graph. By following this example, systems administrators can build custom Intune reports that integrate data from Microsoft Graph and external sources. This solution provides comprehensive, historical reporting, helping organizations gain valuable insights into their IT environments.

Additional credit: The script to create resources was adapted from the process described by Harjit Singh here: Ingest Custom Data into Azure Log Analytics via API Using PowerShell. Please visit that post for additional information on creating the underlying resources.

Limitations: This example is not intended to be production-ready. While the script creates the underlying infrastructure, it does not automatically schedule the Azure Automation runbook, nor does it extend the default 30-day retention period in Log Analytics. The use of Log Analytics and Azure Automation can incur charges, so you should follow your organization's guidelines when scheduling runbooks or updating retention policies. The pricing details table was built from the Windows 365 SKUs listed in "Product names and service plan identifiers for licensing" and the corresponding retail prices for Windows 365 Enterprise and Frontline as of February 26, 2025. You may need to update the pricing details to match your license costs, or connect to an outside data source where your license details are stored, to accurately reflect your costs.

Disclaimer: The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.
Lesson Learned #527: Calling Azure OpenAI with Managed Identity via sp_invoke_external_rest_endpoint

A day ago, I was working on a service request where our customer got the following error message:

```json
{"response":{"status":{"http":{"code":401,"description":""}},"headers":{"Date":"Mon, 07 Jul 2025 19:36:30 GMT","Content-Length":"297","Content-Type":"application\/json","apim-request-id":"cabfb91a-5ede-4371-91d5-XXX","x-ms-client-request-id":"Not-Set","strict-transport-security":"max-age=31536000; includeSubDomains; preload","x-content-type-options":"nosniff"}},"result":{"error":{"code":"PermissionDenied","message":"The principal `XXXX-YYYY-4d9e-8e70-13c98fb84e7a` lacks the required data action `Microsoft.CognitiveServices/accounts/OpenAI/deployments/chat/completions/action` to perform `POST /openai/deployments/{deployment-id}/chat/completions` operation."}}}
```

Following, I would like to share my experience resolving this issue. The first step was to reproduce the issue in our lab, so I began integrating Azure OpenAI with Azure SQL Database to perform sentiment analysis based on fictitious feedback. I created a SQL table called CustomersInfo which contains fictitious customer feedback:

```sql
CREATE TABLE CustomersInfo (
    CustomerID INT PRIMARY KEY IDENTITY(1,1),
    Name NVARCHAR(100),
    Feedback NVARCHAR(MAX),
    Sentiment NVARCHAR(20) NULL
);

INSERT INTO CustomersInfo (Name, Feedback) VALUES
('Anna', 'The product arrived damaged and no one responded to my messages.'),
('John', 'I loved the service, it was fast and the product is excellent.'),
('Emily', 'It was okay, but I think packaging could be better.'),
('David', 'I will never buy here again, terrible service.'),
('Sophia', 'Everything was perfect, thank you for the follow-up.'),
('Michael', 'Delivery time was average, but the product did not meet expectations.'),
('Laura', 'Great overall experience, I would recommend it.'),
('James', 'I expected more quality for the price I paid.'),
('Isabella', 'Easy to order, great customer support.'),
('Robert', 'I didn''t like it, but at least they gave me a refund.');
```

I configured Azure OpenAI and permissions by creating an Azure OpenAI resource:

- Endpoint: https://openaiexample.openai.azure.com
- Model: gpt-4
- Roles granted to the Entra ID user that connects to the database: Cognitive Services OpenAI User, Cognitive Services User

I then enabled the SQL database to call the OpenAI endpoint using Managed Identity:

```sql
CREATE DATABASE SCOPED CREDENTIAL [https://openaiexample.openai.azure.com]
WITH IDENTITY = 'Managed Identity',
     SECRET = '{"resourceid":"https://cognitiveservices.azure.com"}';

DECLARE @response NVARCHAR(MAX);
DECLARE @payload NVARCHAR(MAX) = '{
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Classify the sentiment of the following customer feedback as Positive, Negative, or Neutral. Feedback: The product arrived damaged and no one responded to my messages." }
  ],
  "max_tokens": 50
}';

EXEC sp_invoke_external_rest_endpoint
    @url = 'https://openaiexample.openai.azure.com/openai/deployments/gpt-4.1-jmjurado/chat/completions?api-version=2025-01-01-preview',
    @method = 'POST',
    @credential = [https://openaiexample.openai.azure.com],
    @payload = @payload,
    @response = @response OUTPUT;

DECLARE @json NVARCHAR(MAX) = @response;

SELECT JSON_VALUE(c.value, '$.message.content') AS Respuesta
FROM OPENJSON(@json, '$.result.choices') AS c;
```

To encapsulate this logic, we created a stored procedure:

```sql
CREATE OR ALTER PROCEDURE AnalizarSentimiento
    @Texto NVARCHAR(MAX),
    @Sentimiento NVARCHAR(50) OUTPUT
AS
BEGIN
    DECLARE @response NVARCHAR(MAX);
    DECLARE @payload NVARCHAR(MAX) = '{
      "messages": [
        { "role": "system", "content": "You are a helpful assistant." },
        { "role": "user", "content": "Classify the sentiment of the following customer feedback as Positive, Negative, or Neutral. Feedback: ' + @Texto + '" }
      ],
      "max_tokens": 50
    }';

    EXEC sp_invoke_external_rest_endpoint
        @url = 'https://openaiexample.openai.azure.com/openai/deployments/gpt-4.1-jmjurado/chat/completions?api-version=2025-01-01-preview',
        @method = 'POST',
        @credential = [https://openaiexample.openai.azure.com],
        @payload = @payload,
        @response = @response OUTPUT;

    SELECT @Sentimiento = JSON_VALUE(c.value, '$.message.content')
    FROM OPENJSON(@response, '$.result.choices') AS c;
END;
```

Now I'm ready to execute the procedure and retrieve the data:

```sql
DECLARE @Sentimiento NVARCHAR(50);
EXEC AnalizarSentimiento
    @Texto = 'The product arrived damaged and no one responded to my messages.',
    @Sentimiento = @Sentimiento OUTPUT;
SELECT @Sentimiento AS Resultado;
```

However, I got the following result:

```json
{"response":{"status":{"http":{"code":401,"description":""}},"headers":{"Date":"Mon, 07 Jul 2025 19:36:30 GMT","Content-Length":"297","Content-Type":"application\/json","apim-request-id":"cabfb91a-5ede-4371-91d5-XXX","x-ms-client-request-id":"Not-Set","strict-transport-security":"max-age=31536000; includeSubDomains; preload","x-content-type-options":"nosniff"}},"result":{"error":{"code":"PermissionDenied","message":"The principal `XXXX-YYYY-4d9e-8e70-13c98fb84e7a` lacks the required data action `Microsoft.CognitiveServices/accounts/OpenAI/deployments/chat/completions/action` to perform `POST /openai/deployments/{deployment-id}/chat/completions` operation."}}}
```

Analyzing the error message, it appears that the principal ID used by this Managed Identity to perform the call does not have access to the endpoint. Even adding the client_id keyword didn't resolve the issue:

```sql
CREATE DATABASE SCOPED CREDENTIAL [https://openaiexample.openai.azure.com]
WITH IDENTITY = 'Managed Identity',
     SECRET = '{"client_id":"YYYYY-ZZZZ-4b75-XXX-9ab8e8c14c1e", "resourceid":"https://cognitiveservices.azure.com"}';
```

After several troubleshooting steps, and by analyzing the error message, I identified that the principal referenced in the error response belongs to the system-assigned managed identity of the Azure SQL server. Once I granted the necessary permissions to this identity, I was able to connect and successfully perform the operation.
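For reference, that grant can also be scripted. The following is a hedged PowerShell sketch (the resource group, server, and subscription names are placeholders, and the post doesn't state which role was granted; "Cognitive Services OpenAI User" carries the data action named in the error). It looks up the SQL server's system-assigned managed identity and assigns the role on the Azure OpenAI resource:

```powershell
# Requires the Az.Accounts, Az.Sql, and Az.Resources modules; all names below are placeholders.
Connect-AzAccount

$server = Get-AzSqlServer -ResourceGroupName 'my-rg' -ServerName 'my-sql-server'
$principalId = $server.Identity.PrincipalId   # system-assigned managed identity of the logical server

# Scope of the Azure OpenAI (Cognitive Services) account
$openAiScope = '/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.CognitiveServices/accounts/openaiexample'

# "Cognitive Services OpenAI User" includes the chat/completions data action from the error message
New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName 'Cognitive Services OpenAI User' `
    -Scope $openAiScope
```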
DMS - Support for Managed Identity for Azure SQL Managed Instance migration

Azure Database Migration Service (DMS) has introduced a new feature that supports the use of Managed Identity for migrating to Azure SQL Managed Instance. This enhancement simplifies the migration process and ensures secure and seamless integration with Azure Database Migration Service. In this blog post, we will dive into the prerequisites, the permissions and roles required, and how to use the associated Managed Identity when migrating to Azure SQL Managed Instance. Currently, this feature is supported through the Azure portal, PowerShell, and Az cmdlets.

Prerequisites

Before you begin the migration to Azure SQL Managed Instance using Managed Identity, ensure that the following prerequisites are in place:

1. The target Azure SQL Managed Instance's associated Managed Identity: Azure Database Migration Service only supports the Managed Identity that is associated with the target Azure SQL Managed Instance.

How do you identify the associated Managed Identity? Once you start the migration to Azure SQL Managed Instance using Azure Database Migration Service and, on the second page, select the target Azure SQL Managed Instance, its associated Managed Identity will be displayed if "Use Managed Identity" is selected (the default). Alternatively, you can follow these steps:

a) Go to the target Azure SQL Managed Instance's home page.
b) On the left menu, under Security > Identity:
- If a User-assigned Managed Identity is present, the associated Managed Identity is the one selected under Primary Identity.
- If there is no User-assigned Managed Identity and only the System-assigned Managed Identity is enabled, the associated Managed Identity is the System-assigned Managed Identity, which has the same name as the Azure SQL Managed Instance. For example, for an Azure SQL Managed Instance named ABCSQLMI, the System-assigned Managed Identity will be "ABCSQLMI".

2. Permissions: Assign the "Storage Blob Data Reader" role on the storage account to the target instance's associated Managed Identity.

Steps to assign permissions (for a scripted alternative, see the PowerShell sketch after the Limitations list below):

- In the Azure portal, go to the storage account that will hold the backup files used in the migration.
- On the left menu under Access Control (IAM), click on "+ Add" > "Add role assignment".
- Select or search for the built-in role "Storage Blob Data Reader" and click Next.
- Assign this role to the Managed Identity by selecting the associated Managed Identity identified in the previous step as the member.

Note: Ensure that the storage account has "Allow storage account key access" enabled.

How to use the associated Managed Identity for migration

Upon initiating the migration to Azure SQL Managed Instance using Azure Database Migration Service, navigate to the second page and select the target Azure SQL Managed Instance. If the "Use Managed Identity" option is selected (the default), the associated Managed Identity is displayed and used for the migration. Once the Managed Identity is used for the migration, DMS uses it to read the backup files in Azure Blob Storage, removing the need for SAS keys.

Limitations:

- Azure Database Migration Service supports only the Managed Identity that is associated with the target Azure SQL Managed Instance. It can be either a User-assigned or a System-assigned Managed Identity.
- Currently, this feature is supported through the Azure portal only.
- Ensure that the storage account has "Allow storage account key access" enabled.
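The portal steps above can also be scripted. Below is a hedged PowerShell sketch (resource names are placeholders, and it assumes the instance uses its system-assigned identity) that reads the managed instance's identity and grants it Storage Blob Data Reader on the storage account:

```powershell
# Requires the Az.Sql, Az.Storage, and Az.Resources modules; all names below are placeholders.
$mi = Get-AzSqlInstance -ResourceGroupName 'my-rg' -Name 'ABCSQLMI'
$principalId = $mi.Identity.PrincipalId   # system-assigned identity of the managed instance

$storage = Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mybackupstorage'

# Grant the identity read access to the backup files
New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName 'Storage Blob Data Reader' `
    -Scope $storage.Id
```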
Benefits of using Managed Identity

Using Managed Identity for Azure SQL Managed Instance migrations offers several security benefits:

- Enhanced security: Managed identities eliminate the need to use SAS keys, reducing the risk of SAS token exposure.
- Simplified management: Because the target Azure SQL Managed Instance's associated Managed Identity is used, it allows for seamless integration with Azure Database Migration Service, making it easier to manage access permissions and roles.
- Improved efficiency: The streamlined authentication process speeds up migrations and reduces the complexity of managing SAS keys.
- Improved compliance: By using Managed Identity, users can ensure that they adhere to security best practices and compliance requirements, as the identity is managed securely by Azure.

All of the above make Managed Identity a better option than SAS tokens. Learn more.

Conclusion

The new Managed Identity feature in Azure Database Migration Service for Azure SQL Managed Instance migrations offers a secure and efficient way to manage permissions during the migration process. By following the steps outlined above and leveraging the security benefits of Managed Identity, you can ensure a smooth and secure migration to Azure SQL Managed Instance.
Configuring Azure Blob Trigger Identity Based Connection

If you are tired of having to manage connection strings and secrets for your blob-triggered Azure Functions, then you will be glad to know that, as of Azure Blobs extension version 5.0.0, you can configure these connections using managed identities.
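As a hedged illustration of what that configuration can look like (the connection name, function app, and storage account below are hypothetical), an identity-based connection replaces the secret-bearing connection string with "<connectionName>__serviceUri"-style app settings, and the function app's managed identity authenticates the connection:

```powershell
# Requires the Az.Functions module; all names below are placeholders.
# Blob triggers need both the blob and queue endpoints, since the extension uses queues internally.
Update-AzFunctionAppSetting -Name 'my-function-app' -ResourceGroupName 'my-rg' -AppSetting @{
    'MyBlobConnection__blobServiceUri'  = 'https://mystorageaccount.blob.core.windows.net'
    'MyBlobConnection__queueServiceUri' = 'https://mystorageaccount.queue.core.windows.net'
}
```

The trigger then references the setting prefix (for example, Connection = "MyBlobConnection" on the blob trigger attribute), and the function app's identity needs appropriate data-plane roles on the storage account, such as Storage Blob Data Owner for blob triggers.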
External Table in ADX

Hi, I'm trying to create an external table in ADX which uses a Synapse Analytics (SA) database view (called undelivered). The undelivered view itself queries data from a Cosmos DB analytical store. So far I have:

- Created a user-defined identity
- Added the identity to the ADX cluster, SA, and Cosmos
- Updated the ADX database:

```kusto
.alter-merge cluster policy managed_identity
[
  {
    "ObjectId": "a3d7ddcd-d625-4715-be6f-c099c56e1567",
    "AllowedUsages": "ExternalTable"
  }
]
```

- Created the database users in SA:

```sql
-- Create a database user for the ADX Managed Identity
CREATE USER [adx-synapse-identity] FROM EXTERNAL PROVIDER;

-- Grant read permissions
ALTER ROLE db_datareader ADD MEMBER [adx-synapse-identity];
GRANT SELECT ON OBJECT::undelivered TO [adx-synapse-identity];
```

From within SA I can run "SELECT * FROM undelivered" and the correct information is returned. But when I come to create the external table in ADX:

```kusto
.create-or-alter external table MyExternalTable (
    Status: string
)
kind=sql
table=undelivered
(
    h@'Server=tcp:synapse-xxxxx.sql.azuresynapse.net,1433;Database="Registration";ManagedIdentityClientId=<key>;Authentication=Active Directory Managed Identity;'
)
with ( managed_identity = "<key>" )
```

I get the error:

Managed Identity 'system' is not allowed by the managed_identity policy for usage: ExternalTable

So even though I specify the managed identity I want to use, it is still trying to use the system one. How can I get the external table created with the correct managed identity? Any questions, please just ask. Thanks
API Guide: Resubmitting from a specific Action in Logic Apps Standard

In collaboration with Sofia Hubendick

This how-to article explains the process of resubmitting a Logic App Standard workflow from a specific action via the API. If you want to resubmit the workflow from the beginning, you can use the Workflow Trigger Histories - Resubmit operation instead (https://learn.microsoft.com/sv-se/rest/api/appservice/workflow-trigger-histories/resubmit?view=rest-appservice-2024-04-01&tabs=HTTP).

Authentication

I used a managed identity for authentication, which simplifies the process by eliminating the need to obtain a token manually. Additionally, I used the new Logic App Standard Operator role.

URL

The URL for resubmitting an action looks like this:

https://management.azure.com/subscriptions/[subscriptionId]/resourceGroups/[resourceGroupName]/providers/Microsoft.Web/sites/[logicAppName]/hostruntime/runtime/webhooks/workflow/api/management/workflows/[workflowName]/runs/[runId]/resubmit?api-version=2022-03-01

Mandatory URL path parameters

| Name | Description |
| --- | --- |
| subscriptionId | The Azure subscription ID |
| resourceGroupName | The name of the resource group containing the Logic App |
| logicAppName | The name of the Logic App |
| workflowName | The name of the workflow |
| runId | The ID of the workflow run to be resubmitted |

Request body

The API request body is structured as follows; replace the placeholder with the name of the action:

```json
{
  "actionsToResubmit": [
    {
      "name": "[action name]"
    }
  ]
}
```

Response

| Name | Description |
| --- | --- |
| 202 Accepted | OK |
| Other status codes | Error response describing why the operation failed. |
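To tie the pieces together, here is a hedged PowerShell sketch of the call. The subscription, resource group, app, workflow, run ID, and action name are placeholders, and it authenticates with the signed-in Az context; when running inside Azure, a managed identity could supply the token instead:

```powershell
# Placeholders throughout; requires the Az.Accounts module and a signed-in context.
$subscriptionId    = '<subscription-id>'
$resourceGroupName = 'my-rg'
$logicAppName      = 'my-logicapp'
$workflowName      = 'my-workflow'
$runId             = '<run-id>'

$uri = "https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName" +
       "/providers/Microsoft.Web/sites/$logicAppName/hostruntime/runtime/webhooks/workflow/api/management" +
       "/workflows/$workflowName/runs/$runId/resubmit?api-version=2022-03-01"

# Token for Azure Resource Manager; inside Azure this could come from a managed identity instead
$token = (Get-AzAccessToken -ResourceUrl 'https://management.azure.com').Token

$body = @{ actionsToResubmit = @(@{ name = '[action name]' }) } | ConvertTo-Json -Depth 3

Invoke-RestMethod -Uri $uri -Method Post -Headers @{ Authorization = "Bearer $token" } `
    -Body $body -ContentType 'application/json'   # expect HTTP 202 Accepted
```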
London Reactor Meetup September 2024 - Cyber Security

Hey everyone! Thanks for joining the London Reactor Meetup today. Here you can find the resources that were shared during the meetup and the speakers' contact details.

Upcoming

You can find all upcoming Reactor events HERE.

Speaker contact and resources:

- Rafah Knight, CEO @ SecureAI (SecureAI LinkedIn)
- Chris Noring (LinkedIn)
- Liam Hampton (LinkedIn)
How to use Sqlpackage with Managed Identity

To export an Azure SQL database using SqlPackage and a managed identity:

Step 1: Enable the system-assigned managed identity on an Azure VM.

Step 2:
- Enable Microsoft Entra (Azure AD) authentication on the Azure SQL server.
- Connect to the Azure SQL database as the Entra admin.
- Create a contained user for the managed identity (using the Azure VM name as the contained user name):

```sql
create user <vmname> from external provider;
alter role db_owner add member <vmname>;
```

Step 3: On the Azure VM where we enabled the system-assigned managed identity, execute the following to test getting an access token:

```powershell
# Using PowerShell's Invoke-WebRequest, make a request to the local managed identity endpoint
# to get an access token for Azure SQL:
$response = Invoke-WebRequest -Uri 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fdatabase.windows.net%2F' -Method GET -Headers @{Metadata="true"}

# Convert the response from a JSON object to a PowerShell object:
$content = $response.Content | ConvertFrom-Json

# Extract the access token from the response:
$AccessToken = $content.access_token
```

Step 4: Run SqlPackage with the managed identity token to export the database (the $AccessToken_Object alternative is sketched below, after the references):

```powershell
./sqlpackage.exe /at:$AccessToken /Action:Export /TargetFile:"C:\AdventureWorksLT.bacpac" `
    /SourceConnectionString:"Server=tcp:{yourserver}.database.windows.net,1433;Initial Catalog=AdventureWorksLT;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

# OR
./sqlpackage.exe /at:$($AccessToken_Object.Token) /Action:Export /TargetFile:"C:\AdventureWorksLT.bacpac" `
    /SourceConnectionString:"Server=tcp:{yourserver}.database.windows.net,1433;Initial Catalog=AdventureWorksLT;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"
```

Reference:
- Tutorial: Use a managed identity to access Azure SQL Database - Windows - Azure AD - Microsoft Entra | Microsoft Learn
- How managed identities for Azure resources work with Azure virtual machines - Microsoft Entra | Microsoft Learn
- SqlPackage Export - SQL Server | Microsoft Learn
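The second SqlPackage invocation above references an $AccessToken_Object variable that the post never defines; presumably it comes from the Az PowerShell module. A hedged sketch of that alternative:

```powershell
# Assumes the Az.Accounts module is installed and that the VM's system-assigned
# managed identity is enabled, so Connect-AzAccount -Identity succeeds.
Connect-AzAccount -Identity

# Get-AzAccessToken returns an object whose .Token property holds the raw JWT,
# matching the $AccessToken_Object.Token usage in the example above.
$AccessToken_Object = Get-AzAccessToken -ResourceUrl 'https://database.windows.net/'
```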
Query Azure Monitor Logs with Managed Identity

Hi, it would seem that when using the built-in connector for querying Azure Monitor Logs with KQL, you cannot use a managed identity, only a service principal. Am I missing something here? I have my Logic App working using an HTTP request to the Log Analytics API, authenticating with a managed identity, but it is far from ideal: I then have to parse the JSON output instead of just being able to attach the query output as a table to the resulting email. Does anyone know if/when the Azure Monitor connector might be updated to support managed identities?
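For readers hitting the same limitation, the HTTP workaround described above looks roughly like this in PowerShell form. The workspace ID and query are placeholders, and this targets the same Log Analytics query API that a Logic App HTTP action with managed identity authentication would call:

```powershell
# Placeholders; requires the Az.Accounts module. In a Logic App, the HTTP action's
# built-in managed identity authentication replaces this explicit token acquisition.
$workspaceId = '<workspace-id>'
$token = (Get-AzAccessToken -ResourceUrl 'https://api.loganalytics.io').Token

$body = @{ query = 'AzureActivity | take 10' } | ConvertTo-Json

$result = Invoke-RestMethod -Uri "https://api.loganalytics.io/v1/workspaces/$workspaceId/query" `
    -Method Post -Headers @{ Authorization = "Bearer $token" } `
    -Body $body -ContentType 'application/json'

# The response is tables -> columns/rows that must be parsed manually,
# which is the extra work the built-in connector would otherwise handle.
$result.tables[0].rows
```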