The growing number of devices connecting to the cloud is producing treasure troves of information. This information can be used to understand not only the state of a device, but also the state of an entire environment such as a building, energy grid, or factory. However, this vast amount of data produces its own challenges. Because each device generates a time series stream that is often labeled with an encoded tag, it can be hard to understand how different streams are related, or how an overall environment is behaving.
This is where Azure Digital Twins comes in. Azure Digital Twins enables you to create digital models of assets, places, people, and processes, and then relate these items based on their real-world relationships. Now, you can use the Azure Digital Twins plugin for Azure Data Explorer (ADX) to combine the digital models of your environment with time series data from your devices. More specifically, you can use the plugin to contextualize disparate time series data in ADX by reasoning across digital twins and their relationships, gaining insights into the behavior of your modeled environments over time.
For example, with the new Azure Digital Twins plugin for ADX, you can write a query that selects a set of twins from your twin graph (such as all consumers fed by a particular substation) and joins them against their historized telemetry in ADX to chart power usage or surface anomalies over time.
To get you up and running with the new plugin, we’ve created an example scenario where you will combine data in a sample twin graph in Azure Digital Twins with sample time series data in ADX. More specifically, you’ll use joint Azure Digital Twins/ADX queries to understand the operational behavior of various portions of a power distribution grid.
az account set --subscription <your-subscription-ID>
az dt create -n <instance-name> -g <resource-group>
Note: Ensure the resource group exists before running this command. If it doesn't, you can create it with az group create -n <resource-group> -l <location>.
az dt role-assignment create -n <instance-name> --assignee "<owneruser@microsoft.com>" --role "Azure Digital Twins Data Owner"
To upload the models:
i. Go to the folder where you downloaded the repository. Navigate into /model/energy-grid-example.
ii. Select all the JSON files (models) and upload them.
To import the twin graph:
i. Within /model/energy-grid-example, select distributionGrid.xlsx to import it.
The Azure Digital Twins endpoint is the host name of your instance, prefixed with https://. For example:
let ADTendpoint = "https://MyExampleADTinstance.api.wcus.digitaltwins.azure.net";
// Get all twins in your Azure Digital Twins instance
let ADTendpoint = "<your_ADT_endpoint>";
let ADTquery = "SELECT T FROM DIGITALTWINS T";
evaluate azure_digital_twins_query_request(ADTendpoint, ADTquery)
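The plugin returns each twin as a JSON (dynamic) column named after the variable projected in the ADT query (T here). As a sketch (assuming that column shape), you could extract just the twin IDs from the result:

```kusto
// Hypothetical follow-on: pull each twin's ID out of the returned dynamic column
let ADTendpoint = "<your_ADT_endpoint>";
let ADTquery = "SELECT T FROM DIGITALTWINS T";
evaluate azure_digital_twins_query_request(ADTendpoint, ADTquery)
| extend twinId = tostring(T["$dtId"])  // bracket notation, since the property name contains '$'
| project twinId
```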
// Get the twin ID of all consumers fed by the Fall Street substation
let ADTendpoint = "<your_ADT_endpoint>";
let ADTquery = "SELECT CSMR.$dtId as tid FROM DIGITALTWINS SUB JOIN CSMR RELATED SUB.feeds WHERE SUB.$dtId = 'sub_fall_street' AND IS_PRIMITIVE(CSMR.$dtId)";
evaluate azure_digital_twins_query_request(ADTendpoint, ADTquery)
// Chart the average hourly power usage of consumers fed by the Fall Street substation ('sub_fall_street') during week 10
let ADTendpoint = "<your_ADT_endpoint>";
let ADTquery = "SELECT CSMR.$dtId as tid FROM DIGITALTWINS SUB JOIN CSMR RELATED SUB.feeds WHERE SUB.$dtId = 'sub_fall_street' AND IS_PRIMITIVE(CSMR.$dtId)";
let weekNumber = 10;
let startDate = datetime_add('week',weekNumber - 1, make_datetime(2021,1,1));
let endDate = datetime_add('week',1, startDate);
evaluate azure_digital_twins_query_request(ADTendpoint, ADTquery)
| extend twinId = tostring(tid)
| join kind=inner (SamplePowerRequirementHistorizedData) on twinId
| project timestamp, twinId, value
| where timestamp between (startDate .. endDate)
| summarize avg(value) by bin(timestamp,1h),twinId
| render timechart
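Because the ADT query is just a string, you can parameterize it with ADX string functions rather than hard-coding the twin ID. A minimal sketch, assuming the same sample graph, that builds the query for an arbitrary substation using strcat:

```kusto
// Sketch: parameterize the substation twin ID instead of hard-coding it
let substationId = "sub_fall_street";  // swap in any substation twin ID from the sample graph
let ADTquery = strcat(
    "SELECT CSMR.$dtId as tid FROM DIGITALTWINS SUB JOIN CSMR RELATED SUB.feeds ",
    "WHERE SUB.$dtId = '", substationId, "' AND IS_PRIMITIVE(CSMR.$dtId)");
```

The rest of the pipeline stays the same; only the let statements at the top change.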
// Identify any anomalies in the data
let ADTendpoint = "<your_ADT_endpoint>";
let ADTquery = "SELECT PLANT.$dtId as tid FROM DIGITALTWINS PLANT JOIN GRID RELATED PLANT.feeds WHERE GRID.$dtId = 'pl_solar_gen' AND IS_PRIMITIVE(PLANT.$dtId)";
let weekNumber = 10;
let startDate = datetime_add('week',weekNumber - 1, make_datetime(2021,1,1));
let endDate = datetime_add('week',1, startDate);
evaluate azure_digital_twins_query_request(ADTendpoint, ADTquery)
| extend twinId = tostring(tid)
| join kind=inner (SamplePowerRequirementHistorizedData) on twinId
| project timestamp, twinId, value
| where timestamp between (startDate .. endDate)
| make-series AvgPower=avg(value) on timestamp step 10m by twinId
| extend NoGapsTemp=series_fill_linear(AvgPower)
| project timestamp, NoGapsTemp, twinId
| extend anomalies = series_decompose_anomalies(NoGapsTemp,4)
| render anomalychart with(anomalycolumns=anomalies)
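If you want the anomalous points as a table rather than a chart, one option (a sketch over the same series columns produced above) is to replace the final render operator with an mv-expand that unpacks the series and keeps only nonzero anomaly scores:

```kusto
// Sketch: tabulate anomalous points instead of charting them
| mv-expand timestamp to typeof(datetime), NoGapsTemp to typeof(double), anomalies to typeof(double)
| where anomalies != 0
| project timestamp, twinId, NoGapsTemp, anomalies
```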
When you're finished, follow these steps to delete the resources used in this walkthrough.
az dt delete -n <instance-name>
If you created the resource group solely for this walkthrough, you can also remove it (and everything in it) with az group delete -n <resource-group>.