Workplace safety is a concern that spans every industry and geography. While there are many innovations in this space, organizations still face significant challenges. One of these challenges is that many aspects of workplace safety are difficult to measure, monitor, and enforce.
Azure Percept provides a set of capabilities that may help organizations address some of these workplace safety challenges. Most notably, the Azure Percept dev kit makes it easy to implement advanced vision AI by providing a platform with development tools, modular components, and a robust messaging infrastructure that can bring object detection and processing capabilities to the edge. These capabilities allow organizations to more easily build solutions that can address situations which are easy to detect visually, but difficult to measure with conventional hardware.
The solution, developed by KPMG as part of an Azure Percept Bootcamp, explores a scenario involving a fictional organization named Worldwide Widgets. Worldwide Widgets operates several regional distribution centers and would like to automate the enforcement of their safety protocols in order to provide a safe working environment for their employees and reduce the financial and operational risk that may result from a safety incident.
Worldwide Widgets would like to explore the possibility of using the Azure Percept dev kit in a proof of concept to monitor compliance with one of their safety regulations. The organization requires that loading docks be kept clear of obstructions while the dock is not in use. Lately, Worldwide Widgets has observed an increase in situations where the docks in one of their facilities were not cleared after use. Obstructions were being left behind after loading trailers for distribution, and these obstructions present an ongoing safety hazard. Worldwide Widgets is a rapidly growing organization and its distribution staff is always busy, which makes it difficult to monitor compliance with this regulation. The organization would like to automate this task so its employees can focus on shipping products safely and on time.
A successful proof of concept will use Azure Percept to meet the following goals:
Additionally, a successful solution must minimize false positives: it should only identify situations where obstructions have truly been left unattended. Since this is an active loading dock, pallets, boxes, tools, and equipment will be detected while the dock is in use, so the solution must be able to distinguish between active loading scenarios and idle time and must not alert on obstructions detected while the dock is actively being used.
The overall architecture for the solution is shown below.
The solution uses Azure Percept to detect the presence of obstructions and people from a live video stream of the loading dock. The solution processes this feed directly on the Azure Percept dev kit using an object detection model developed in Azure Cognitive Services and deployed over the air to the device. This model is trained to identify obstructions and people at the loading dock. The device processes the live video stream at the edge and streams the resulting telemetry to deployed business logic, the Count Module, installed on the same device. This module reduces the model inference output to a simple count of detected objects. The output from the Count Module is streamed to Azure IoT Hub.
Azure IoT Hub aggregates this streaming telemetry and pipes it to Azure Stream Analytics for processing and smoothing. Azure Stream Analytics smooths the telemetry by averaging it over tumbling windows in order to reduce the sensitivity of the system, and then writes the smoothed telemetry to an Azure SQL Database for persistence and visualization.
Once the smoothed telemetry data has been persisted to Azure SQL Database, the solution uses Power BI to visualize the data and present a live compliance dashboard. The solution applies a set of rules to monitor compliance. If an out-of-compliance scenario is identified, the dashboard indicates the type, location, and severity of the violation. Additionally, the dashboard tracks compliance trends.
This data flow is summarized below.
This solution is implemented as a proof of concept using a residential garage as a substitute for a loading dock. After all, just like a safe loading dock is important for an efficient warehouse, a clean garage is part of an efficiently operating home. The specifics of the project implementation are described below.
The object detection model was trained using Azure Cognitive Services and deployed to the Azure Percept device using images captured in the simulated loading dock. Azure Percept Studio works with the Azure Percept dev kit to streamline this training process by using the Azure Percept dev kit to automatically capture and upload training images at a regular interval. These images are tagged and the object detection model can be trained, refined, and deployed using a no-code user interface.
Documentation for the training procedure is available here.
A sample video of the model applied against a video stream is provided below.
The Count Module was adapted from the example code provided here, with CountModule/main.py modified to include the following code to count people and obstructions.
# Requires "from datetime import datetime" at the top of CountModule/main.py.
print(f'inference list: {inference_list}')
count_ppl = 0
count_obs = 0
if isinstance(inference_list, list) and inference_list:
    # Trim the trailing digits so the value parses as a Unix timestamp in seconds.
    now = datetime.fromtimestamp(int(inference_list[0]['timestamp'][:-9]))
    # Count detections by the labels produced by the object detection model.
    count_ppl = len([x for x in inference_list if x["label"] == "person"])
    count_obs = len([x for x in inference_list if x["label"] == "obstruction"])
The telemetry payload is then generated with the following code:
json_data = {
    'date': f'{now}',
    'people_count': count_ppl,
    'obstruction_count': count_obs
}
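In the sample module, this payload is forwarded to Azure IoT Hub through the module's IoT Edge output. As a rough sketch of that step, assuming the azure-iot-device SDK and an output route named output1 (assumptions for illustration, not details confirmed by this project), the code might look like the following:

import json
from azure.iot.device import IoTHubModuleClient, Message

# Connect using the IoT Edge environment variables injected into the module container.
module_client = IoTHubModuleClient.create_from_edge_environment()
module_client.connect()

# Wrap the JSON payload built above in an IoT Hub message and route it upstream.
message = Message(json.dumps(json_data))
message.content_type = 'application/json'
message.content_encoding = 'utf-8'
module_client.send_message_to_output(message, 'output1')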
An Azure Stream Analytics job is used to process telemetry data as it arrives in Azure IoT Hub. The Stream Analytics job uses the following query, where iot-hub-source and sql-database-target are configured in Azure Stream Analytics to point to the solution's Azure IoT Hub and a table in Azure SQL Database, respectively.
SELECT
    AVG(people_count) AS people_count,
    AVG(obstruction_count) AS obstruction_count,
    MAX(date) AS ts
INTO
    [sql-database-target]
FROM
    [iot-hub-source]
WHERE
    people_count IS NOT NULL
GROUP BY
    TumblingWindow(second, 60)
This query averages the reported person and obstruction counts over 60-second tumbling windows as the data arrives in Azure IoT Hub, and writes the results to a table in Azure SQL Database.
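For intuition, and to test the downstream dashboard rules without a live device, the same smoothing can be approximated in Python. The sketch below is illustrative only and assumes telemetry records shaped like the json_data payload above; the deployed solution performs this step in Azure Stream Analytics:

from datetime import datetime, timedelta

def smooth(telemetry, window_seconds=60):
    """Group records into tumbling windows and average the counts per window."""
    windows = {}
    for record in telemetry:
        ts = datetime.fromisoformat(record['date'])
        # Align the timestamp to the start of its tumbling window.
        window_start = ts - timedelta(seconds=ts.timestamp() % window_seconds)
        windows.setdefault(window_start, []).append(record)

    rows = []
    for window_start, records in sorted(windows.items()):
        rows.append({
            'people_count': sum(r['people_count'] for r in records) / len(records),
            'obstruction_count': sum(r['obstruction_count'] for r in records) / len(records),
            'ts': max(r['date'] for r in records),
        })
    return rows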
Power BI is configured to use the targeted Azure SQL Database table for reporting. The dashboard tracks the output and identifies situations where the average counts in a tumbling window indicate that obstructions are detected without the presence of workers. When these conditions occur, the dashboard will report the presence of unattended hazards in the loading dock. The dashboard also infers the severity of the violation based on factors including the number of obstructions, elapsed time unattended, and scheduled dock usage.
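The core rule behind these alerts can be expressed simply: flag any smoothed window in which obstructions are detected but no people are present. A hypothetical Python version of that check is shown below; the actual implementation lives in the Power BI dashboard, and the threshold parameter is an illustrative assumption rather than a value taken from the project:

def unattended_hazard(row, obstruction_threshold=0.5):
    """Flag a smoothed window where obstructions appear without any workers present.

    obstruction_threshold is a hypothetical sensitivity knob: because counts are
    averaged over each tumbling window, requiring a minimum average filters out
    windows where an obstruction appeared only briefly.
    """
    return row['obstruction_count'] >= obstruction_threshold and row['people_count'] == 0

# 'rows' are the smoothed windows read back from Azure SQL Database
# (or produced by the smooth() sketch above).
violations = [row for row in rows if unattended_hazard(row)]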
The dashboard tracks compliance performance over daily, weekly, and monthly windows. The type, location, and severity of violations are reported on a facility map.
Even though this scenario is an intentionally simplified representation of a complex workplace safety use case, it can serve as a starting point for a more sophisticated solution. The Azure Percept dev kit and the corresponding Azure services allow an organization to quickly bring powerful object detection and processing capabilities to live video streams at the edge. These capabilities enable an organization to monitor, analyze, and quickly take action on potential workplace safety incidents using data that was previously difficult to gather through conventional means.
Resources for learning more about Azure Percept: