First published on MSDN on Apr 02, 2018
How do you monitor Azure Analysis Services? How many users are connected, and who are they? These are important questions to ask about any AAS environment. Metric data is exposed via the Metrics blade for Azure Analysis Services in the portal, which offers a quick way to answer some of them. But what if you wanted to combine this information with other operational data within your organization, especially around QPU (query processing units, the unit by which AAS is priced)? While extended events work in Azure Analysis Services, parsing the resulting XML files into human-readable form is cumbersome, difficult, and time consuming. Just as many aspects of the cloud require some thought about how systems are designed, it's prudent to rethink the monitoring aspect as well. By using Microsoft Log Analytics, it's possible to build a complete monitoring solution for Azure Analysis Services, with a simple Power BI dashboard that can be accessed alongside the rest of the operational information system administrators need. Log Analytics provides near real-time data exploration for telemetry; there is a great reference here from Brian Harry. The remainder of this post details how to set this process up. One of the elegant aspects of this solution is that once it is set up, apart from the refresh of the Power BI report, no maintenance is required between Azure Analysis Services and the monitoring step (as opposed to extended events). Without further delay, let's look at the steps required.
The first item we need to create is a Log Analytics instance, which is a part of Microsoft OMS (Operations Management Suite). For anyone unfamiliar, OMS is essentially Bing for your telemetry analytics data. For more information, see the " What is Log Analytics " page on the Microsoft docs. In the Azure portal, simply select "Create a resource", and then type Log Analytics:
After clicking Create, choose whether to create a new OMS workspace or to link to an existing one within the tenant. Choose either an existing resource group or create a new one, and then specify a location. Finally, select the pricing tier and click OK:
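If you prefer scripting over the portal, the workspace can also be created with the AzureRM PowerShell modules the rest of this post uses. A minimal sketch, where the resource group name, workspace name, location, and SKU are placeholder values (substitute your own):

```powershell
# Hypothetical names/values -- replace with your own
Login-AzureRmAccount

New-AzureRmResourceGroup -Name "aas-monitoring-rg" -Location "East US"

# Create the Log Analytics (OMS) workspace; the free tier is used here
New-AzureRmOperationalInsightsWorkspace `
    -ResourceGroupName "aas-monitoring-rg" `
    -Name "aas-monitoring-ws" `
    -Location "East US" `
    -Sku "Free"
```

Either route ends in the same place: a workspace that diagnostics can be pointed at in the next step.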
Once this is complete, you'll be presented with the OMS workspace node, which is essentially just a summary page. For now, just leave it as is. Next up, we'll configure the magic. One of the major benefits of OMS is the ability to configure on-premises machines to forward information to Log Analytics, or to configure PaaS applications to transmit data, which is what we'll do here. The doc is relatively straightforward; the only issue that I encountered when setting it up was that my subscription did not have the Microsoft.Insights provider namespace registered. Below is the complete PowerShell script that I ran against my subscription, utilizing the script outlined in the Microsoft docs:
```powershell
## Followed the doc outlined here:
## https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-collect-azurepass-posh

## Save the Enable-AzureRMDiagnostics script from the PowerShell Gallery locally
#Save-Script -Name Enable-AzureRMDiagnostics -Path "C:\PowerShellScripts\"

## Create a connection to Azure
#Login-AzureRmAccount

## Install any modules that don't already exist
Install-Module AzureRM.Insights
Install-Module AzureRM.OperationalInsights
Install-Module AzureRM.Resources
Install-Module AzureRM.Profile

## Check the current state of the provider namespace for Microsoft.Insights
#Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

## Register it if needed
#Register-AzureRmResourceProvider -ProviderNamespace Microsoft.Insights

## Run the saved script from earlier
C:\PowerShellScripts\Enable-AzureRMDiagnostics.ps1
# Step 1. Select the subscription where the PaaS offering resides
# Step 2. Select the resource type that you want to send to Log Analytics;
#         in this case, find the resource type Microsoft.AnalysisServices/servers
# Step 3. Provide the category of logs to process, or type ALL
```

Allow the script to run. While it runs, note when it asks for the category of logs to track. For Azure AS, there are two options: Engine and Service. The Engine category is what traditionally would be tracked under either Profiler or extended events, and the Service category is used for monitoring and tracking traditional perfmon counters (aka metrics).
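Under the hood, the Enable-AzureRMDiagnostics script is wiring up a diagnostic setting on the server. If you'd rather do that one step directly, something like the following sketch should work; the server name, resource group, and workspace name here are hypothetical, and the cmdlets shown are from the same AzureRM.Insights and AzureRM.OperationalInsights modules installed above:

```powershell
# Hypothetical resource names -- replace with your AAS server and workspace
$aas = Get-AzureRmResource -ResourceType "Microsoft.AnalysisServices/servers" `
    -ResourceGroupName "aas-monitoring-rg" -ResourceName "myaasserver"
$ws  = Get-AzureRmOperationalInsightsWorkspace `
    -ResourceGroupName "aas-monitoring-rg" -Name "aas-monitoring-ws"

# Send both the Engine and Service categories to the workspace
Set-AzureRmDiagnosticSetting -ResourceId $aas.ResourceId `
    -WorkspaceId $ws.ResourceId `
    -Categories Engine, Service `
    -Enabled $true
```

The interactive script is more convenient across many resources; the direct call is easier to drop into an automated deployment.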
```
//AASEvents:
AzureDiagnostics
| extend LocalTime = TimeGenerated - 4h
| order by LocalTime desc

//AASMetrics:
AzureMetrics
| extend LocalTime = TimeGenerated - 4h
| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES"
| order by LocalTime desc
```

Running these queries results in the data that AAS has sent showing on screen. The next step is to export the data to Power BI. Fortunately, Log Analytics again makes this easy:
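With the data queryable, the opening questions of this post (how many users, and who are they?) can also be answered directly in Log Analytics. The sketch below assumes the diagnostic schema flattens a user column into something like `User_s`; column names vary, so verify against your own workspace before relying on it:

```
// Sketch: query counts per user over the last day
// (column name User_s is an assumption -- check your AzureDiagnostics schema)
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES"
| where TimeGenerated > ago(1d)
| summarize QueryCount = count() by User_s
| order by QueryCount desc
```

The same pattern (filter, summarize, order) is the basis for most of the charts you'd pin to a Power BI dashboard.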
All of these scenarios could be accomplished using the Azure Analysis Services PowerShell cmdlets .
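As a sketch of what those cmdlets look like in practice (the server name and resource group below are hypothetical), pausing, resuming, and resizing a server are each a one-liner from the AzureRM.AnalysisServices module:

```powershell
Login-AzureRmAccount

# Pause the server so it stops accruing QPU charges
Suspend-AzureRmAnalysisServicesServer -Name "myaasserver" -ResourceGroupName "aas-monitoring-rg"

# Resume it, then scale to a different tier
Resume-AzureRmAnalysisServicesServer -Name "myaasserver" -ResourceGroupName "aas-monitoring-rg"
Set-AzureRmAnalysisServicesServer -Name "myaasserver" -ResourceGroupName "aas-monitoring-rg" -Sku "S1"
```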