If you recently bought an Azure Percept Development Kit, or you are about to buy one, you are probably wondering what kind of solutions you can build out of the box. In this blog post, I explain how to use the standard people detection model and how to build a room occupancy detection solution with it.
With the Azure Percept DK and its camera, I can detect people in a room, upload the detection data to Azure IoT Hub, and stream it through an Azure Stream Analytics job into a Power BI dataset for a nice visualization in a Power BI report. The architecture looks like this:
In Azure Percept Studio you can easily deploy a sample model to your device. For this project I am using the "People detection" model.
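Once the model is running, the device publishes each inference result to your IoT hub as a JSON message. The exact schema depends on the model and module version, but a people detection message looks roughly like the following (values are illustrative):
{
  "NEURAL_NETWORK": [
    {
      "bbox": [0.404, 0.369, 0.676, 0.984],
      "label": "person",
      "confidence": "0.984",
      "timestamp": "1606853218844076"
    }
  ]
}
Each detected person appears as one entry in the NEURAL_NETWORK array, which is what the Stream Analytics query below relies on.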
To consume this data from another service, you need to create a view into the data stream called a "consumer group". In the Azure portal, on the IoT Hub page, under "Built-in endpoints", you can add a new consumer group by typing in a new name.
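If you prefer the command line, the Azure CLI can create the consumer group as well. This is just a sketch; replace <hub-name> with your IoT hub name, and "perceptstream" is a placeholder group name:
# Create a consumer group on the IoT hub's built-in Event Hub-compatible endpoint
az iot hub consumer-group create --hub-name <hub-name> --name perceptstream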
Next, create a new Azure Stream Analytics job that will stream data from the IoT Hub into a Power BI dataset. Define the job name, resource group, and location, and click the "Create" button.
For the job you need to define the input (where the data comes from), the output (where the data should be sent), and a query defining how the data is transformed between input and output.
First, add a new input: under "Job topology", click "Inputs" and then "Add stream input". Set a name for your input, select the IoT Hub you are using, select the consumer group you just created, and use "service" as the "Shared access policy name". The IoT Hub "service" policy is created by default; you can learn more about it here.
In the left pane, select "Outputs" and, in a similar way, add an output by defining a name, a Power BI group workspace, and a dataset and table name. Click "Authorize" to authorize the connection between Stream Analytics and Power BI.
Next, you need to configure a query for the job. A simple query for this solution selects the detections, the number of people detected, and the time the event was processed:
SELECT
    NEURAL_NETWORK AS Detections,
    GetArrayLength(NEURAL_NETWORK) AS PersonCount,
    EventProcessedUtcTime AS Time
INTO
    perceptstreamoutput
FROM
    perceptstreaminput
WHERE
    NEURAL_NETWORK IS NOT NULL AND
    GetArrayLength(NEURAL_NETWORK) > 0
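If you prefer occupancy over time rather than one row per detection event, a windowed variant of the query can average the person count per minute. This is a sketch using the same input and output names as above; the one-minute tumbling window is an arbitrary choice you can tune:
SELECT
    -- Average number of people detected within the window
    AVG(GetArrayLength(NEURAL_NETWORK)) AS AveragePersonCount,
    -- End time of the window
    System.Timestamp() AS Time
INTO
    perceptstreamoutput
FROM
    perceptstreaminput
WHERE
    NEURAL_NETWORK IS NOT NULL
GROUP BY
    TumblingWindow(minute, 1)
Note that if you change the output columns, you should also use a new Power BI dataset (or delete the existing one) so that the dataset schema matches the query.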
Save the query and start the Stream Analytics job. Sign in to Power BI and go to the workspace you defined as the output for the job. Create a new report and pick the dataset the job is streaming data into.
In your report, add the visualization you prefer and add the "PersonCount" and "Time" fields to it. If you choose a "Stacked column chart", you should be able to visualize your detections as shown below:
You can see the full solution I built in the video below, in which you can also hear a bit about the use case of this solution in the real estate business:
If you'd like to learn more about the use case of this solution feel free to check out the blog post "Using Azure Percept to build the next Smart Building".
Resources
Information about the Azure Percept development kit can be found here: http://aka.ms/getazurepercept.
Azure Percept Studio, which allows you to connect, build, customize and manage your Azure Percept edge solutions is described here: https://aka.ms/azureperceptbuild.
Azure Percept devices come with built-in security on every device and information about it can be found here: https://aka.ms/azureperceptsecure.
Azure Percept library of AI models is available here: https://aka.ms/azureperceptexplore.