Power BI Reports for Azure Lab Services - Part 2: Visualizing Lab Information
Published Aug 25 2021

In the last blog post, we used Power BI to visualize data from Cost Management for your labs.  This blog post focuses on how to gather information about your labs and visualize that data.  We'll create a sample report that answers the questions "What is the total time a student has used a VM for a lab?" and "How many students have claimed a VM for a lab?"

 

planetmaher_0-1629871220409.png

To create this report, here are the steps we'll follow:

  1. Gather the data.  We need to query Azure for information about our lab-related resources and store it in a format usable by Power BI.
  2. Get the data. We need to import data into Power BI.
  3. Transform the data.   We will be gathering data from a few types of Lab Services resources and need to update the data schema to show how each piece is related.
  4. Create the data visualization.
  5. Publish the report for others to see.

Gather the data

Most of the information we need can be gathered with the Az.LabServices PowerShell module (preview).  If you've never used Az.LabServices, see the LabServices PowerShell Module blog post.

 

To gather the information, we'll need to run a PowerShell script routinely.  There are a few options to choose from.  You can create an Azure PowerShell function that has a timer trigger.  Alternately, you can create a PowerShell runbook in Azure Automation.  In this blog post, I'll go over running a script in an Azure PowerShell function.

 

The Azure Function will navigate through the subscription and find

  • all labs in the subscription
  • all lab virtual machines in a lab
  • any lab virtual machines assigned to/claimed by a student in a lab
  • all users for each lab.

Information will be saved in CSV files in separate blob containers in Azure Storage: one blob container for lab-related information, one for lab virtual machine information, and one for lab user information.
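Conceptually, the gathering logic boils down to a few pipeline calls.  Here is a minimal sketch, assuming the Az.LabServices and Az.Storage modules are loaded; the actual Publish-LabServicesData.ps1 used in the steps below also keeps historical snapshots and uploads all three data sets.

# Minimal sketch of the gathering logic (see Publish-LabServicesData.ps1 for the real script).
$labs  = Get-AzLabAccount | Get-AzLab   # every lab in the subscription
$vms   = $labs | Get-AzLabVm            # every lab VM, including its IsClaimed state
$users = $labs | Get-AzLabUser          # every user registered to a lab

# Upload each data set as a CSV to its blob container (labinfo shown here;
# vminfo and userinfo follow the same pattern).
$ctx = (Get-AzStorageAccount -ResourceGroupName $env:StorageAccountResourceGroupName `
        -Name $env:StorageAccountName).Context
$labs | Export-Csv -Path "$env:TEMP\labinfo-latest.csv" -NoTypeInformation
Set-AzStorageBlobContent -File "$env:TEMP\labinfo-latest.csv" `
    -Container 'labinfo-dailyexports' -Blob 'labinfo-latest.csv' -Context $ctx -Force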

 

  1. Create an Azure Function App
    1. For Runtime Stack, select PowerShell Core.
    2. For Publish setting, select Code.
    3. By default, Azure Function Apps use the Consumption (Serverless) plan.  The consumption plan has a maximum timeout of 10 minutes.  If you have many labs, consider using premium or dedicated plans, which have longer timeouts.
  2. Create a Timer triggered function.
    1. When creating the function, you'll be prompted to enter a CRON expression.  Enter 0 0 5 * * * to have the function run once a day at 5:00 AM (second 0, minute 0, hour 5, every day).  The default time zone is UTC.  To change the time zone, see NCRONTAB time zones.
  3. Copy code from Publish-LabServicesData.ps1 into the run.ps1 file for the newly created trigger function.  Don't forget to change the parameter signature to match the expected inputs for the function.  In our example, the param block should be param($Timer).
  4. The code for run.ps1 requires the Az module.  Azure PowerShell functions manage their dependencies in the requirements.psd1 file.  Uncomment the line to use the Az module, as shown below.

planetmaher_1-1629871220430.png
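For reference, here is what the uncommented dependency entry in requirements.psd1 looks like; the exact version range in your generated file may differ, and that default is fine.

@{
    # Installs the latest 6.x release of the Az module on first run.
    'Az' = '6.*'
}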

  5. The code for run.ps1 uses the Az.LabServices.psm1 module.  We need to add the psm1 and psd1 files to the Function App to use the Az.LabServices module.  Custom modules need to be placed in a specific directory pattern.  The final pattern for the files will be:


wwwroot
 | - MyFunction
 | | - run.ps1
 | | - function.json
 | - Modules
 | | - Az.LabServices
 | | | - Az.LabServices.psd1
 | | | - Az.LabServices.psm1
 | - local.settings.json
 | - host.json
 | - requirements.psd1


      1. Open the Kudu console for the Function App.
      2. Create a Modules folder under the wwwroot folder, if it doesn't exist already.

planetmaher_2-1629871220437.png

      3. Create an Az.LabServices folder under the Modules folder.  It should be on the same level as the folder for the functions.

planetmaher_3-1629871220440.png

      4. Upload the Az.LabServices.psm1 file to the Az.LabServices folder.  You can drag the file from your file explorer to the directory structure in Kudu.

planetmaher_4-1629871220448.png

      5. The Az.LabServices module needs a module manifest to be recognized by the Function App, so let's create that now.  Run the following command from the site/wwwroot/Modules/Az.LabServices folder:
        New-ModuleManifest -Path Az.LabServices.psd1 -ModuleVersion "1.0" -Author "Microsoft Corporation" -RootModule 'Az.LabServices.psm1' -CmdletsToExport "*"

planetmaher_5-1629871220471.png

planetmaher_6-1629871220483.png
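As an optional sanity check (not part of the original steps), you can validate the new manifest from the same Kudu console before moving on:

# Run from site/wwwroot/Modules/Az.LabServices.
# Throws an error if the manifest is malformed; otherwise prints the module info.
Test-ModuleManifest -Path .\Az.LabServices.psd1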

  6. The function gathers all the information about your labs and saves it in blob storage of a Storage Account.  Before we run the function, we need to give it the values for our storage account and its resource group.  We'll do that by setting the values in the application settings for the Function App.  Add StorageAccountName and StorageAccountResourceGroupName application settings to identify where the data should be saved.

planetmaher_7-1629871220510.png
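If you prefer scripting to the portal, the same application settings can be added with the Az.Functions module.  The resource names below are placeholders for your own Function App and storage account.

# Add the settings the function reads at run time via $env:StorageAccountName, etc.
Update-AzFunctionAppSetting -Name 'MyLabDataFunctionApp' -ResourceGroupName 'MyFunctionAppRG' `
    -AppSetting @{
        StorageAccountName              = 'mylabdatastorage'
        StorageAccountResourceGroupName = 'MyStorageRG'
    }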

 

  7. Lastly, we need to enable a Managed Identity for the Azure Function App.  This will grant the app the permissions needed to execute the Az.LabServices and Az.Storage cmdlets used in run.ps1.  Create a system-assigned identity for the Function App or add a user-assigned identity to the Function App. 

planetmaher_8-1629871220527.png
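Enabling the system-assigned identity can also be scripted; a minimal sketch with the same placeholder names:

# Turn on a system-assigned managed identity for the Function App.
Update-AzFunctionApp -Name 'MyLabDataFunctionApp' -ResourceGroupName 'MyFunctionAppRG' `
    -IdentityType SystemAssigned -Force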

Important! Don't forget to create a role assignment for the managed identity.  I allowed Contributor access to the subscription, but you can create more restrictive assignments. 

 

To create a role assignment, click the Azure role assignments button, then choose Add Role Assignment (preview).  On the flyout menu, set the Scope to Subscription, the Subscription to your subscription, and the Role to Contributor.  Click Save.  Note, it will take a few minutes for the permission to propagate.

planetmaher_9-1629871220547.png
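The same role assignment can be made from PowerShell.  This sketch assumes a system-assigned identity and the placeholder names used earlier; tighten -Scope if you want a more restrictive assignment.

# Grant the Function App's identity Contributor at subscription scope.
$app = Get-AzFunctionApp -Name 'MyLabDataFunctionApp' -ResourceGroupName 'MyFunctionAppRG'
New-AzRoleAssignment -ObjectId $app.IdentityPrincipalId `
    -RoleDefinitionName 'Contributor' `
    -Scope "/subscriptions/$((Get-AzContext).Subscription.Id)"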

 

The function is ready to go!  Run the function to publish our lab data so we can start creating our Power BI report.

 

Note: There is some personally identifiable information being recorded, including user email and name.  Information will be gathered from resources you specify and saved in storage accounts you specify.  You have control over where the data is stored, how it is used, and when it is purged.

Get the data

In Power BI Desktop, we will use the Azure Blob Storage connector to connect to the data.  This can be done in the same report used in the last blog post or an entirely new report. 

  1. Enter the name of the storage account specified in the previous step.  Select Next.
  2. Enter the storage account key for the storage account.
  3. Select the containers to load.  The default names for the containers (as specified in run.ps1) are labinfo-dailyexports, vminfo-dailyexports, and userinfo-dailyexports.
  4. Select Transform Data.  This will open the Power Query editor.
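If you need to look up the account key for step 2, you can pull it with Az.Storage instead of the portal (placeholder names again):

# Returns the first access key for the storage account.
(Get-AzStorageAccountKey -ResourceGroupName 'MyStorageRG' -Name 'mylabdatastorage')[0].Value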

Transform the data

We now have data for labs, virtual machines, and users.  Let's clean up the data so we can create our reports.

 

The data from the blobs contains both the historical and latest data for labs in the subscription.  The first thing we need to do is prune the data to the set we want to report on.  Optionally, you can create duplicates of all the tables and use one set for historical data and one for the latest data.

 

For each table (labinfo-dailyexports, vminfo-dailyexports, and userinfo-dailyexports), transform the data as follows.

  1. If you want to work with the latest data, filter the Name column to keep only rows that equal the name of the file that contains the latest data.  In our example, that would be labinfo-latest.csv, vminfo-latest.csv, or userinfo-latest.csv, depending on which table you are in.

planetmaher_10-1629871220553.png

If you want to work with the historical data, keep only the files that do not equal the latest file. 

  2. Select Combine files.  Power BI will automatically change data types for the date and duration columns.
  3. Save the transformations.

Before creating any visualizations, let's make sure Power BI has correctly detected how the lab, virtual machine, and user tables are related.  To do this, open the model view in Power BI Desktop.  If working with historical data, all relationships will be many-to-many.  If working with the latest data, the relationships should be:

Source                         Relationship    Destination
labinfo-dailyexports[LabId]    1 to many       userinfo-dailyexports[LabId]
labinfo-dailyexports[LabId]    1 to many       vminfo-dailyexports[LabId]

 

The detected relationships only use the LabId of each table.  Consider modifying some relationships manually to include all columns that match up.  Depending on what type of reports are created, the minimum relationship description may not be enough.  The model should look like the following.

planetmaher_15-1629872577148.png

Create the data visualization

Now that we have all the relationships set, let's cover just a couple of the questions we can answer with this information.

 

How many students have claimed a virtual machine for a lab?

We'll create a table that shows the lab name, the maximum number of users the lab can hold (lab capacity), the number of users added to the lab, and the number of users that have registered for the lab (count of claimed virtual machines).  Note, in our data model, when a student is assigned a virtual machine by registering for a course, the virtual machine is considered claimed.  Therefore, counting the IsClaimed property on the virtual machines will tell us how many students have a lab virtual machine for their exclusive use.

 

We need to create a measure on the table for count of the claimed virtual machines.


ClaimedVMs = CALCULATE( COUNTROWS( 'vminfo-dailyexports' ), 'vminfo-dailyexports'[IsClaimed] = TRUE() )


For the table, set the values to

  • 'vminfo-dailyexports'[LabName]
  • count of 'vminfo-dailyexports'[VmId]
  • 'labinfo-dailyexports'[ClaimedVMs]

planetmaher_12-1629871220572.png

planetmaher_16-1629872748283.png

 

What is the total time a student has used a VM for a lab?

To see the total usage for a student over all their labs, we'll use the virtual machine usage information.  The total usage is stored as a duration type in the VmUsageString column.  It's important to note that durations are visualized as the total number of days in decimal form.  For our visualization, let's display the usage in hours.  We'll need to create a new column in the vminfo-dailyexports table that converts the values from days to hours; for example, a stored value of 0.25 days becomes 6 hours.  Don't forget to set the data type to decimal before saving the column formula.


VMUsageInHours = 'vminfo-dailyexports'[VmUsageString] * 24


For the table, set the values to

  • 'vminfo-dailyexports'[UserEmail]
  • sum of 'vminfo-dailyexports'[VMUsageInHours]

Rename the VMUsageInHours column to Usage (in hours) for this visual.

 

planetmaher_13-1629871220574.png

 

planetmaher_14-1629871220576.png

Publish the report

The last step is to publish the report!  See Publish datasets and reports from Power BI Desktop for instructions.

 

Happy Reporting!

The Lab Services Team
