As a commercial FAA Part 107 drone pilot, I look for ways to use drones to perform tasks that are dangerous, repetitive, or otherwise not well suited to humans.
One of these dangerous tasks is inspecting rooftops for damage or deficiencies.
Even routine inspections can be dangerous, but after a storm or other event leaves the structure of the roof in question, a drone is a great fit for getting eyes on the rooftop.
I frequently fly drones using an autonomous flight path and I was interested in adding artificial intelligence to the drone to identify damage in real-time.
The Azure Percept DK is purpose-built for rapid prototyping, and by pairing it with an Azure Custom Vision model I was able to use artificial intelligence to recognize damage on a roof in under a week.
Using the Azure Percept DK was by far the most rapid prototyping experience and, quite frankly, the simplest path from concept to execution.
Learn more about Azure Percept here.
There are commercially available Artificial Intelligence systems that are used with drones, but the focus has been on improving flight autonomy.
Unmanned Aerial Systems, typically known as UAS or “drones,” are the go-to platform for gathering environmental data. This data can be video, photos, or other environmental properties captured via a sensor array.
As UAS use cases expand, the need for intelligent systems to assist UAS operators continues to grow.
I’ve created a walkthrough that you can use to get started with the Azure Percept DK and Azure Custom Vision to recognize objects.
I’ll also demonstrate the Azure Percept DK mounted to a drone performing a real inspection of a roof using the Azure Custom Vision model I created using this walkthrough.
The Azure Percept DK is a development kit that can rapidly accelerate prototyping of AI at the Edge solutions.
AI at the Edge is a concept where all processing of gathered data happens on a device.
There is no need for an uplink to another location to process the gathered data.
Typically, the data processing happens in real time on the device.
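To make the edge-processing idea concrete, here is a minimal Python sketch. The `detect` stub and the frame format are hypothetical stand-ins, not part of Azure Percept; the point is that raw frames are analyzed on the device and only compact detection records are kept, so no video needs an uplink.

```python
def detect(frame):
    """Stub for an on-device vision model. A real deployment runs the
    Custom Vision model on the Percept DK's hardware instead."""
    # Hypothetical rule: treat dark frames as containing a defect.
    if frame["brightness"] < 80:
        return [{"label": "missing-shingle", "confidence": 0.91}]
    return []

def process_on_device(frames):
    """AI at the Edge: raw frames stay on the device; only small
    detection records (not video) are retained or transmitted."""
    results = []
    for i, frame in enumerate(frames):
        for det in detect(frame):
            results.append({"frame": i, **det})
    return results

frames = [{"brightness": 120}, {"brightness": 60}, {"brightness": 75}]
print(process_on_device(frames))  # detections for frames 1 and 2 only
```

The same loop shape applies whether the "frames" come from a camera feed or a recorded flight; only the detector changes.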
What I’ve done is mount an Azure Percept DK to a custom-built UAS platform.
You will see that the Azure Percept DK is modular and fits on a small footprint.
The UAS is a medium-sized 500 mm platform, very similar to a Holybro X500.
There are two typical ways a UAS will fly over a target area such as a residential roof: piloted manually by the operator, or autonomously along a pre-programmed flight path. Using either method, the UAS operator makes several passes over the target area, looking for defects or damage in the surface or structure of the roof.
Typically, the operator will take photos or a video of the roof as the UAS passes overhead.
The visual data would then be processed later and an inspection report would be delivered to the customer highlighting what the inspection found.
Reviewing the visual data and compiling the inspection report is time-consuming, and the expertise needed to spot damage and defects can take a long time to acquire.
Now imagine if you could collect visual data and have your UAS identify and catalog damage and defects on a residential roof as it passes overhead.
Mounting an Azure Percept DK on your UAS brings AI at the Edge to your inspection project, spotting and highlighting damage and defects without specialized AI skills.
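The "identify and catalog" step can be sketched in a few lines of Python. The JSON message below imitates the kind of detection telemetry a Percept vision module emits, but its exact shape here (the `NEURAL_NETWORK` key, string confidences, the defect labels) is an assumption for illustration; check the messages your device actually sends.

```python
import json
from collections import Counter

# Hypothetical telemetry message; a real Azure Percept message may differ.
message = json.dumps({
    "NEURAL_NETWORK": [
        {"bbox": [0.1, 0.2, 0.3, 0.4], "label": "missing-shingle", "confidence": "0.88"},
        {"bbox": [0.5, 0.5, 0.7, 0.8], "label": "hail-damage", "confidence": "0.42"},
        {"bbox": [0.2, 0.6, 0.4, 0.9], "label": "missing-shingle", "confidence": "0.91"},
    ]
})

def catalog_defects(raw_message, min_confidence=0.6):
    """Keep only confident detections and tally them per defect tag."""
    detections = json.loads(raw_message).get("NEURAL_NETWORK", [])
    confident = [d for d in detections if float(d["confidence"]) >= min_confidence]
    return Counter(d["label"] for d in confident)

print(catalog_defects(message))
# → Counter({'missing-shingle': 2}); the 0.42 hail-damage hit is filtered out
```

Accumulating these counters across a whole flight gives you the raw material for an inspection report without any manual frame-by-frame review.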
Using Azure Percept Studio, UAS operators can explore the library of pre-built AI models or build custom models themselves without coding.
Let’s walk through the steps to create a custom vision model to identify defects on a residential roof.
You will see that there is no need for specialized AI skills, just the knowledge of how to tag photos with attributes.
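In the Custom Vision portal you tag images interactively, but it helps to organize photos by defect type before uploading. The one-folder-per-tag layout below is my own convention for illustration, not a Custom Vision requirement; this stdlib-only helper just groups photos by that layout (the training SDK or portal handles the actual upload).

```python
from pathlib import Path

def photos_by_tag(root):
    """Group training photos by defect tag, assuming one subfolder per
    tag, e.g. roof_photos/missing-shingle/*.jpg (an illustrative
    convention, not a Custom Vision requirement)."""
    return {
        folder.name: sorted(p.name for p in folder.glob("*.jpg"))
        for folder in sorted(Path(root).iterdir())
        if folder.is_dir()
    }
```

For example, `photos_by_tag("roof_photos")` would return a dict like `{"hail-damage": ["c.jpg"], "missing-shingle": ["a.jpg", "b.jpg"]}`, which maps directly onto the tag names you create in the portal.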
Deploy the Custom Vision model to your Azure Percept DK device:
1. Make sure you select the IoT Hub you created in the prerequisite step.
2. Select the Azure Percept DK device you are using.
3. Verify and select the model iteration you wish to deploy onto the Azure Percept DK device.
4. Click the Deploy model button.
5. Verify the device deployment succeeded by watching for the Azure Portal notification stating that the deployment was successful.
What I just demonstrated is the capability to identify three characteristics of roof damage or deficiencies, in real time, using Artificial Intelligence without coding, a team of data scientists, or a purpose-built companion computer.
The great thing is that you can improve identification over time by tagging additional images, re-training the model, and re-deploying it to the Azure Percept DK.
Using the Azure Percept DK to rapidly prototype UAS use cases that take advantage of Artificial Intelligence will put you at an advantage when incorporating new capabilities into your workflow.
Think about this rapid prototype and how easily Artificial Intelligence could be incorporated into your own systems and workflows, for example:
- Monitoring and detection
- Pedestrian and vehicle counting