Deriving real-time intelligence about the physical world with Azure Percept and Azure Digital Twins
Published Jul 27 2021 08:00 AM

Imagine that you represent a company that manages facilities for various conferences and exhibitions. Part of your job is to ensure that products exhibited by participants (or customers) are displayed at the location assigned to each participant. Not only would you need a view of the facilities, but you would also require real-time intelligence about the displayed products and their locations. This is now possible by combining the real-time intelligence of Azure Percept with the real-world view of Azure Digital Twins.

 

Azure Percept is a comprehensive platform for creating edge AI solutions. It runs AI models directly on edge devices and communicates the results back to Azure IoT Hub.

Azure Digital Twins is a comprehensive platform for creating digital representations of real-world things, providing a live model of the physical world.

 

When you combine the real-time intelligence of Azure Percept with the real-world view of Azure Digital Twins, you get an intelligent digital representation of the real world.

 

In this post we will see how to collect real-time AI inference data at the edge with Azure Percept and send it to Azure Digital Twins, adding real-time intelligence to the digital representations of real-world entities such as places, devices, and people.

 

The following figure shows how real-time AI at the edge can be combined with Azure Digital Twins to yield real-time data visualization:

 

[Image: overview of real-time AI at the edge feeding Azure Digital Twins for real-time data visualization]

The goal of this post is to show how to build an end-to-end solution that leverages Azure Percept and Azure Digital Twins to build a real-time intelligent data visualization.

 

Architecture

 

[Image: solution architecture diagram]

The main components for the architecture are:

  • The Azure Percept DK that is configured to run AI on the Edge.
  • Azure IoT Hub that is configured to receive messages from the Azure Percept device.
  • An Azure Digital Twins instance deployed with DTDL models representing the physical entities monitored by the solution.
  • Azure Event Hub set up to receive events and updates from the Azure Digital Twins instance. Azure Event Hub makes telemetry and event data available to various stream-processing infrastructures and can receive millions of events per second, which is exactly what we need to process a constant stream of events from the Azure Percept DK.
  • Two Azure Functions: one that serves as a bridge between Azure Percept and the Azure Digital Twins instance, and another that listens to Azure Event Hub for updates from the Azure Digital Twins instance.
  • A front-end Dashboard application that receives the updates from Azure Digital Twins through Event Hub. Any front-end platform can be used to receive these updates and provide data visualization. In our example, we are using a SignalR-based Dashboard application.

 

Setup

 

Azure Percept DK Setup

If you do not have Azure Percept DK, you can use any IoT device (edge or leaf) that sends inference data to Azure IoT Hub.

To learn more about how to configure Azure Percept devices running AI visit Create a no-code vision solution in Azure Percept

The Azure Percept device’s job is to look for objects of interest using an AI model that detects various objects such as people and bottles. For this post we are using an example with two use cases:

  • Detect objects that are compliant: the presence of “bottle” is expected.
  • Detect objects that are not compliant: the presence of a “person” is not expected as this would raise safety concerns.

The Azure Percept DK is constantly looking for compliant objects (example: bottle) and non-compliant objects (example: person).
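
As a rough illustration (not part of the sample code), a consumer of the inference data, such as the dashboard shown later in this post, could map a detected label to one of these use cases with logic along the following lines:

// Minimal sketch: classify a detected label against the two use cases above.
// The label strings ("bottle", "person") come from the example scenario.
public static class ComplianceRules
{
    public static string Classify(string label) => label switch
    {
        "bottle" => "Compliant",        // expected product on display
        "person" => "Non-compliant",    // unexpected presence that raises a safety concern
        _ => "Unknown"                  // any other detected object
    };
}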

Here is how Azure Percept is detecting a “bottle”:

[Image: Azure Percept detecting a “bottle”]

Azure Percept constantly sends the inference data (information about detected objects such as a bottle or a person) to Azure IoT Hub as device-to-cloud messages.

Here is an example of data sent from the Azure Percept DK to Azure IoT Hub:

Data coming from Azure Percept

 

 

 

{ 
  "body": { 
    "NEURAL_NETWORK": [ 
      { 
        "bbox": [ 
          0.32, 
          0.084, 
          0.636, 
          1 
        ], 
        "label": "person", 
        "confidence": "0.889648", 
        "timestamp": "1622658821393258467" 
      } 
    ] 
  }, 
  "enqueuedTime": "2021-06-02T18:33:41.866Z" 
}
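
Note that the ingestion function shown later in this post reads the NEURAL_NETWORK array directly from the message body, so the device sends just that inner object; the outer body and enqueuedTime fields above are added when the message is captured for display. If you do not have a Percept DK, a minimal sketch like the following (using the Azure IoT device SDK; the connection string is a placeholder) can simulate an equivalent device-to-cloud message:

// Sketch only: simulate a Percept-style inference message with the Azure IoT device SDK.
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices.Client;

class SimulatedPercept
{
    static async Task Main()
    {
        // Placeholder: use the connection string of a device registered in your IoT Hub
        var deviceClient = DeviceClient.CreateFromConnectionString("<device-connection-string>");

        // Same shape as the NEURAL_NETWORK payload shown above
        string payload = "{\"NEURAL_NETWORK\":[{\"bbox\":[0.32,0.084,0.636,1]," +
                         "\"label\":\"person\",\"confidence\":\"0.889648\"," +
                         "\"timestamp\":\"1622658821393258467\"}]}";

        using var message = new Message(Encoding.UTF8.GetBytes(payload));
        await deviceClient.SendEventAsync(message);
    }
}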

 

 

 

This data will be paired with Azure Digital Twins to add real-time AI insight to the digital representation of the real world.

 

Azure Digital Twins Setup

Azure Digital Twins setup is divided into two distinct parts.

  • The first part deals with the setup of a model for Azure Digital Twins. The Digital Twins Definition Language (DTDL) models are used to define the digital representation of real-world entities indicating their properties and relationships. The digital entities can represent people, places, and things. For more details on DTDL models visit the Azure Digital Twins documentation.
  • The second part is the provisioning of the Azure Digital Twins instance.

In our example scenario we developed a model that has two components. The first component is the site (named “PerceptSite” in the model) where the exhibition is taking place. The second component is the floor (“PerceptSiteFloor”), which is assigned to a particular exhibition participant.
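
The model files themselves (SiteInterface.json and SiteFloorInterface.json) are referenced by the setup script below. As a rough sketch, the floor interface could look like the following; the property names mirror the twin-patch paths used by the Azure Functions later in this post, and the exact files in the sample may differ:

{
  "@id": "dtmi:percept:DigitalTwins:SiteFloor;1",
  "@type": "Interface",
  "@context": "dtmi:dtdl:context;2",
  "displayName": "PerceptSiteFloor",
  "contents": [
    { "@type": "Property", "name": "FloorId", "schema": "string" },
    { "@type": "Property", "name": "FloorName", "schema": "string" },
    { "@type": "Property", "name": "Label", "schema": "string" },
    { "@type": "Property", "name": "Confidence", "schema": "string" },
    { "@type": "Property", "name": "timestamp", "schema": "string" }
  ]
}

The site interface would additionally declare a Relationship (named rel_has_floors in the script below) that targets this floor model.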

 

Azure Digital Twins Instance Setup

In the following steps, we are using the CLI to set up the Azure Digital Twins instance. You can find details for this at: Set up an instance and authentication (CLI) - Azure Digital Twins | Microsoft Docs.

For steps using the Azure Portal visit Set up an instance and authentication (portal) - Azure Digital Twins | Microsoft Docs

 

 

 

$rgname = "<your-prefix>" 
$random = "<your-prefix>" + $(get-random -maximum 10000) 
$dtname = $random + "-digitaltwin" 
$location = "westus2" 
$username = "<your-username>@<your-domain>" 
$functionstorage = $random + "storage" 
$telemetryfunctionname = $random + "-telemetryfunction" 
$twinupdatefunctionname = $random + "-twinupdatefunction" 

# Create resource group 
az group create -n $rgname -l $location 

# Create Azure Digital Twins instance 
az dt create --dt-name $dtname -g $rgname -l $location 

# Create role assignment for user needed to access Azure Digital Twins instance 
az dt role-assignment create -n $dtname -g $rgname --role "Azure Digital Twins Data Owner" --assignee $username -o json 

 

 

 

Model Setup

 

 

 

$sitemodelid = "dtmi:percept:DigitalTwins:Site;1" 

# Deleting the Site model if it already exists so the script can be re-run 
az dt model delete --dt-name $dtname --dtmi $sitemodelid 

# Creating Azure Digital Twins model for Site 
$sitemodelid = $(az dt model create -n $dtname --models .\SiteInterface.json --query [].id -o tsv) 
$sitefloormodelid = "dtmi:percept:DigitalTwins:SiteFloor;1" 

# Creating Azure Digital Twins model for Site floor 
$sitefloormodelid = $(az dt model create -n $dtname --models .\SiteFloorInterface.json --query [].id -o tsv) 

# Creating twin: PerceptSite 
az dt twin create -n $dtname --dtmi $sitemodelid --twin-id "PerceptSite" 

# Creating twin: PerceptSiteFloor 
az dt twin create -n $dtname --dtmi $sitefloormodelid --twin-id "PerceptSiteFloor" 
$relname = "rel_has_floors"

# Creating the relationship between the Site and the Site floor 
az dt twin relationship create -n $dtname --relationship $relname --twin-id "PerceptSite" --target "PerceptSiteFloor" --relationship-id "Site has floors"
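
Optionally, you can verify the result before moving on. The following quick sanity check (not required by the sample) lists the uploaded models and queries the twin graph:

# List the uploaded model IDs 
az dt model list -n $dtname --query "[].id" -o tsv 

# Confirm that both twins exist 
az dt twin query -n $dtname -q "SELECT * FROM digitaltwins"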

 

 

 

Here is what this basic model looks like once it is created in Azure Digital Twins:

[Image: the PerceptSite and PerceptSiteFloor twins and their relationship shown in Azure Digital Twins Explorer]

 

The above screenshot is taken from Azure Digital Twins Explorer view. Azure Digital Twins Explorer is a developer tool for the Azure Digital Twins service. It lets you connect to an Azure Digital Twins instance to understand, visualize, and modify your digital twin data.

Function Apps Setup

 

Azure Digital Twins Ingestion App

[Image: ingestion flow from Azure IoT Hub through the Azure Function to Azure Digital Twins]

 

The Azure Digital Twins Ingestion Function app will receive updates from Azure IoT Hub and forward those updates to Azure Digital Twins. The Azure Digital Twins client types used in the code come from the Azure.DigitalTwins.Core library.

 

Here is the source code for the Azure Digital Twins Ingestion App:

 

 

 

namespace TwinIngestionFunctionApp 
{ 
    using Azure; 
    using Azure.Core.Pipeline; 
    using Azure.DigitalTwins.Core; 
    using Azure.Identity; 
    using Microsoft.Azure.EventHubs; 
    using Microsoft.Azure.WebJobs; 
    using Microsoft.Extensions.Logging; 
    using Newtonsoft.Json; 
    using Newtonsoft.Json.Linq; 
    using System; 
    using System.Net.Http; 
    using System.Text; 
    using System.Threading.Tasks; 
    using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute; 

 
    public class TwinsFunction 
    { 
        private static readonly string adtInstanceUrl = Environment.GetEnvironmentVariable("ADT_SERVICE_URL"); 
        private static HttpClient httpClient = new HttpClient(); 

 
        [FunctionName("TwinsFunction")] 
        // async Task (rather than async void) lets the Functions runtime observe exceptions 
        public async Task Run([IoTHubTrigger("messages/events", Connection = "EventHubConnectionString")] EventData message, ILogger log) 
        { 

 
            // Only attempt the twin update when the Azure Digital Twins URL is configured 
            if (adtInstanceUrl == null) log.LogError("Application setting \"ADT_SERVICE_URL\" not set"); 
            else 
            { 
                try 
                { 
                    //Authenticate with Digital Twins 
                    ManagedIdentityCredential cred = new ManagedIdentityCredential("https://digitaltwins.azure.net"); 
                    DigitalTwinsClient client = new DigitalTwinsClient(new Uri(adtInstanceUrl), cred, new DigitalTwinsClientOptions { Transport = new HttpClientTransport(httpClient) }); 
                    if (message != null && message.Body != null) 
                    { 

                        // Decode the device-to-cloud message (respecting the buffer offset and count) 
                        string rawMessage = Encoding.UTF8.GetString(message.Body.Array, message.Body.Offset, message.Body.Count); 
                        log.LogInformation(rawMessage); 

                        // Reading AI data from the IoT Hub JSON 
                        JObject deviceMessage = (JObject)JsonConvert.DeserializeObject(rawMessage); 
                        string label = deviceMessage["NEURAL_NETWORK"][0]["label"].ToString(); 
                        string confidence = deviceMessage["NEURAL_NETWORK"][0]["confidence"].ToString(); 
                        string timestamp = deviceMessage["NEURAL_NETWORK"][0]["timestamp"].ToString(); 
                        // Update the twin only when all three values are present 
                        if (!(string.IsNullOrEmpty(label) || string.IsNullOrEmpty(confidence) || string.IsNullOrEmpty(timestamp))) 
                        { 
                            var updateTwinData = new JsonPatchDocument(); 
                            updateTwinData.AppendAdd("/Label", label); 
                            updateTwinData.AppendAdd("/Confidence", confidence); 
                            updateTwinData.AppendAdd("/timestamp", timestamp); 
                            await client.UpdateDigitalTwinAsync("PerceptSiteFloor", updateTwinData); 
                            log.LogInformation($"Updated Device: PerceptSiteFloor with { updateTwinData} at: {DateTime.Now.ToString()}"); 
                        } 
                    } 
                } 
                catch (Exception e) 
                { 
                    log.LogError(e.Message); 
                } 

 
            } 
        } 
    } 
} 
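
Before this function can authenticate, the Function App hosting it needs a system-assigned managed identity with access to the Azure Digital Twins instance, plus the ADT_SERVICE_URL application setting read at the top of the code. A sketch of that configuration with the CLI, reusing the variables from the setup script and assuming the function was deployed to a Function App named $telemetryfunctionname, could look like this:

# Give the Function App a system-assigned managed identity 
$principalid = $(az functionapp identity assign -g $rgname -n $telemetryfunctionname --query principalId -o tsv) 

# Allow that identity to update twins 
az dt role-assignment create -n $dtname -g $rgname --role "Azure Digital Twins Data Owner" --assignee $principalid 

# Point the function at the Azure Digital Twins instance 
$dthostname = $(az dt show -n $dtname --query hostName -o tsv) 
az functionapp config appsettings set -g $rgname -n $telemetryfunctionname --settings "ADT_SERVICE_URL=https://$dthostname"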

 

 

 

Here is a video that shows how the inference data from Azure Percept is received by the Azure Digital Twins instance:

 

 

Publishing messages from Azure Digital Twins to Event Hub

 

In the previous sections we have seen how the messages from Azure Percept are forwarded to Azure Digital Twins through Azure IoT Hub using an Azure Function. In this section we will see how to prepare the Azure Digital Twins instance to send messages to Event Hub. This involves the following:

  • Setting up Event Hub
  • Setting up Azure Digital Twins to send messages to Event Hub.

Setting up Event Hub

Setting up the Event Hub involves creating an Event Hub namespace and then creating an event hub within it.

To create the Event Hub namespace and the event hub, refer to the details at Azure Quickstart - Create an event hub using the Azure portal - Azure Event Hubs | Microsoft Docs.
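
If you prefer the CLI, a sketch along the following lines creates the same resources; the namespace, hub, and policy names are illustrative and continue the variables from the earlier setup script:

$ehnamespace = $random + "-ehnamespace" 
$ehname = "twins-event-hub" 

# Create the Event Hub namespace and the event hub 
az eventhubs namespace create -g $rgname -n $ehnamespace -l $location 
az eventhubs eventhub create -g $rgname --namespace-name $ehnamespace -n $ehname 

# Authorization rule used later by the Azure Digital Twins endpoint and the update function 
az eventhubs eventhub authorization-rule create -g $rgname --namespace-name $ehnamespace --eventhub-name $ehname -n "dt-ehub-policy" --rights Send Listen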

Once the Event Hub is created, we can proceed to set up the route from Azure Digital Twins.

 

Setting up Azure Digital Twins to send messages to Event Hub

Use the following steps to set up Azure Digital Twins to send messages to Event Hub:

  1. Create Azure Digital Twins Endpoint
  2. Create Azure Digital Twins Event Route

 

Create Azure Digital Twins Endpoint

For this step, we will be using the Event Hub namespace and event hub created in the previous steps.

The following image illustrates the details of creating an Azure Digital Twins endpoint:

[Image: creating an Azure Digital Twins endpoint in the Azure portal]
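
If you are scripting the setup instead, the equivalent endpoint can be created with the CLI; the names below assume the illustrative Event Hub resources from the sketch above (substitute your own):

az dt endpoint create eventhub --dt-name $dtname --endpoint-name "twins-endpoint" --eventhub-resource-group $rgname --eventhub-namespace $ehnamespace --eventhub $ehname --eventhub-policy "dt-ehub-policy"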

 

Create Azure Digital Twins Event Route

The following image illustrates the details of creating an Azure Digital Twins Event Route:

[Image: creating an Azure Digital Twins event route in the Azure portal]
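
The equivalent CLI sketch, matching the endpoint name used above, routes every event (the "true" filter) to that endpoint:

az dt route create -n $dtname --endpoint-name "twins-endpoint" --route-name "twins-route" --filter "true"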

The above steps will result in our Azure Digital Twins instance routing messages to our Event hub.

 

Azure Digital Twins Update App

 

The previous sections explained how messages from Azure Percept reach Event Hub by way of Azure IoT Hub and Azure Digital Twins. In this section we focus on the Azure Function that receives these events from Event Hub; they reflect the updates being applied to the Azure Digital Twins instance.

[Image: update flow from Azure Digital Twins through Event Hub to the update Function app]

The purpose of the Azure Function that receives Azure Digital Twins updates through Event Hub is to create a source for further processing. For example, it could feed a 3D modeling platform or an API back end.

 

Here is the source code for the Azure Function that processes the Event Hub events coming from Azure Digital Twins:

 

 

 

namespace TwinsUpdateFunctionApp 
{ 
    using System; 
    using System.Collections.Generic; 
    using System.Linq; 
    using System.Net.Http; 
    using System.Text; 
    using System.Threading.Tasks; 
    using Microsoft.Azure.EventHubs; 
    using Microsoft.Azure.WebJobs; 
    using Microsoft.Extensions.Logging; 
    using Newtonsoft.Json; 
    using Newtonsoft.Json.Linq; 
    using TwinsUpdateFunctionApp.model; 
 
    public static class TwinsUpdateFunction 
    { 
        private static readonly string twinReceiverUrl = Environment.GetEnvironmentVariable("TWINS_RECEIVER_URL"); 

        [FunctionName("TwinsUpdateFunction")] 
        public static async Task Run([EventHubTrigger("[your-digitaltwin-eventhub]", Connection = "EventHubConnectionString")] EventData[] events, ILogger log) 
        { 
            var exceptions = new List<Exception>(); 
            List<TwinUpdate> twinUpdates = new List<TwinUpdate>(); 
            foreach (EventData eventData in events) 
            { 
                try 
                { 
                    string messageBody = Encoding.UTF8.GetString(eventData.Body.Array, eventData.Body.Offset, eventData.Body.Count); 
                    JObject twinMessage = (JObject)JsonConvert.DeserializeObject(messageBody); 
                    if (twinMessage["patch"] != null) 
                    { 
                        TwinUpdate twinUpdate = new TwinUpdate(); 
                        twinUpdate.ModelId = twinMessage["modelId"].ToString(); 
                        foreach (JToken jToken in twinMessage["patch"]) 
                        { 
                            if (jToken["path"].ToString().Equals("/FloorId", StringComparison.InvariantCultureIgnoreCase)) 
                            { 
                                twinUpdate.Floor = jToken["value"].ToString(); 
                            } 
                            if (jToken["path"].ToString().Equals("/FloorName", StringComparison.InvariantCultureIgnoreCase)) 
                            { 
                                twinUpdate.FloorName = jToken["value"].ToString(); 
                            } 
                            if (jToken["path"].ToString().Equals("/Label", StringComparison.InvariantCultureIgnoreCase)) 
                            { 
                                twinUpdate.Label = jToken["value"].ToString(); 
                            } 
                            if (jToken["path"].ToString().Equals("/Confidence", StringComparison.InvariantCultureIgnoreCase)) 
                            { 
                                twinUpdate.Confidence = jToken["value"].ToString(); 
                            } 
                            if (jToken["path"].ToString().Equals("/timestamp", StringComparison.InvariantCultureIgnoreCase)) 
                            { 
                                twinUpdate.Timestamp = jToken["value"].ToString(); 
                            } 
                        } 
 
                        // Forward the update to the dashboard's receiver endpoint 
                        using (HttpClient httpClient = new HttpClient()) 
                        { 
                            var requestURl = new Uri($"{twinReceiverUrl}?label={twinUpdate.Label}&confidence={twinUpdate.Confidence}&timestamp={twinUpdate.Timestamp}&floorId={twinUpdate.Floor}&floorName={twinUpdate.FloorName}"); 
                            StringContent queryString = new StringContent(messageBody); 
                            var response = await httpClient.PostAsync(requestURl, queryString); 
                        } 
 
                        twinUpdates.Add(twinUpdate); 
                    } 
                     
                    await Task.Yield(); 
                } 
                catch (Exception e) 
                { 
                    exceptions.Add(e); 
                } 
            } 
 
            if (exceptions.Count > 1) 
                throw new AggregateException(exceptions); 

 
            if (exceptions.Count == 1) 
                throw exceptions.Single(); 
        } 
    } 
} 
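
The TwinUpdate type referenced above lives in the TwinsUpdateFunctionApp.model namespace and is not shown in the post; a minimal sketch consistent with the properties the function assigns would be:

namespace TwinsUpdateFunctionApp.model 
{ 
    // Simple carrier for the values extracted from an Azure Digital Twins patch event 
    public class TwinUpdate 
    { 
        public string ModelId { get; set; } 
        public string Floor { get; set; } 
        public string FloorName { get; set; } 
        public string Label { get; set; } 
        public string Confidence { get; set; } 
        public string Timestamp { get; set; } 
    } 
}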

 

 

 

The code above is an Azure Function triggered by an Event Hub trigger. After deserializing the Azure Digital Twins update data, the function posts the update to a SignalR front-end application using the following code:

 

 

 

                        using (HttpClient httpClient = new HttpClient()) 
                        { 
                            var requestURl = new Uri($"{twinReceiverUrl}?label={twinUpdate.Label}&confidence={twinUpdate.Confidence}&timestamp={twinUpdate.Timestamp}&floorId={twinUpdate.Floor}&floorName={twinUpdate.FloorName}"); 
                            StringContent queryString = new StringContent(messageBody); 
                            var response = await httpClient.PostAsync(requestURl, queryString); 
                        } 

 

 

 

Front-end Dashboard App

 

The last and most important piece of our architecture is the front-end Dashboard app that will be used to visualize the Azure Digital Twins updates. This is where the value of combining Azure Percept and Azure Digital Twins is revealed. Any front-end platform can be used for this part, depending on the requirements and constraints. To demonstrate the front-end Dashboard app, we are using a SignalR-based application with a single page. SignalR makes web pages more dynamic by offering an eventing mechanism from server to client. Here is a link to a great tutorial on SignalR: Get started with ASP.NET Core SignalR | Microsoft Docs
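
The full dashboard implementation is in the sample repository linked at the end of this post. As a rough sketch of the receiving side only, assuming an ASP.NET Core minimal-hosting app with a hypothetical DashboardHub and a /twinupdate endpoint (which is what TWINS_RECEIVER_URL would point to), the update could be fanned out to connected browsers like this:

// Sketch only: receive the POST from the update function and broadcast it over SignalR. 
// DashboardHub, the /twinupdate route, and the "twinUpdate" client method are illustrative names. 
using Microsoft.AspNetCore.Builder; 
using Microsoft.AspNetCore.Http; 
using Microsoft.AspNetCore.SignalR; 
using Microsoft.Extensions.DependencyInjection; 

var builder = WebApplication.CreateBuilder(args); 
builder.Services.AddSignalR(); 

var app = builder.Build(); 
app.MapHub<DashboardHub>("/dashboardHub"); 

// The TwinsUpdateFunction posts twin updates here; push the values to every connected dashboard 
app.MapPost("/twinupdate", async (HttpRequest request, IHubContext<DashboardHub> hub) => 
{ 
    await hub.Clients.All.SendAsync("twinUpdate", 
        request.Query["label"].ToString(), 
        request.Query["confidence"].ToString(), 
        request.Query["timestamp"].ToString(), 
        request.Query["floorId"].ToString(), 
        request.Query["floorName"].ToString()); 
    return Results.Ok(); 
}); 

app.Run(); 

public class DashboardHub : Hub { }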

 

Once the Dashboard app receives the data, it automatically updates its view. The view contains a top section showing “Data visualization through Azure Digital Twins updates” and a bottom section showing “Raw Azure Digital Twins updates”. The top section is an example of how you can present a data visualization using the raw data shown in the bottom section.

 

[Image: front-end Dashboard app showing the data visualization (top) and the raw Azure Digital Twins updates (bottom)]

 

For our example we had two use cases representing compliant and non-compliant object detection. Here is what the data visualization on the front-end app looks like for compliant and non-compliant objects:

 

Compliant object detection:

[Image: Dashboard view for compliant object detection]

 

Non-compliant object detection:

[Image: Dashboard view for non-compliant object detection]

The source code for the example that we have used for data visualization can be found at: https://aka.ms/AAd96fy

 

Conclusion

Azure Percept and Azure Digital Twins are two distinct Azure services that each provide unique value. In this post we have seen how real-time AI with Azure Percept can be combined with the real-world digital representations of Azure Digital Twins to derive pertinent data visualization from real-time intelligence. This post has shown how simple it is to combine both technologies to achieve the data visualization we need.

Here is what the final application looks like:

Compliant use case

[Image: final application, compliant use case]

 

Non-compliant use case

[Image: final application, non-compliant use case]

The following video shows how Azure Percept runs AI at the edge to identify objects and how the front-end Dashboard app displays the updates flowing from Azure Digital Twins:

 

 

Let us know what you think by commenting below. To stay informed, subscribe to our blog here and follow us on Twitter @MicrosoftIoT.
