Exploring what Azure Percept can accomplish in the hands of creative developers
Published Jun 03 2021 09:22 AM

We knew when we unveiled Microsoft Azure Percept and its development kit that developers and solution builders would use the technology in innovative ways we couldn’t even imagine. In the months since introducing Azure Percept at Microsoft Ignite in March, people have indeed thought up ways to use it that are useful, inspiring, and even fun.


Azure Percept transforms the process of adding AI at the edge by dramatically reducing the technical know-how required. With help from Azure Cognitive Services and Azure Machine Learning models, as well as open-source AI models, the developer kit can go from unboxing to performing AI tasks in 30 minutes, with no coding skills needed.


Our goal is to make it easy for enterprises of any size to extend AI capabilities to the edge with a single, end-to-end platform. Integrated hardware accelerators and Azure AI and Azure IoT services make the platform simple to use and ready to adapt to customers’ needs without sacrificing security. The Azure Percept development kit by ASUS is a camera-enabled system-on-module built for rapid customization; it includes all the basics and works immediately with Azure AI, Azure Machine Learning, and Azure IoT management services.


Adapting Azure Percept for inventive uses

Combining capability with simplicity makes Azure Percept a highly adaptable platform, so it’s no surprise that developers immediately began experimenting with the developer kit to see what they could create. The creativity and innovation of the past three months alone are truly inspiring.


For example, one developer has shown how the Azure Percept Vision camera and Azure Vision can be trained, with no coding skills, to recognize American Sign Language (ASL) and translate it into written words. Another developer showed how he could train Azure Percept to recognize ASL letters in less than 30 minutes. After he had worked with the kit for a few days, the roles reversed: instead of the user teaching ASL to Azure Percept, the AI tool was teaching him.
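
For developers who want to go beyond the no-code path, a similar vision model can also be built programmatically. The following is a minimal sketch, assuming an Azure Custom Vision training resource and the azure-cognitiveservices-vision-customvision Python package; the endpoint, key, letter set, and image folder layout are illustrative placeholders, not details from the projects above.

```python
# Minimal sketch: training a Custom Vision classifier on ASL letter images.
# The endpoint, training key, and image folders are placeholders.
import os
import time

from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from azure.cognitiveservices.vision.customvision.training.models import (
    ImageFileCreateBatch,
    ImageFileCreateEntry,
)
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
credentials = ApiKeyCredentials(in_headers={"Training-key": "<training-key>"})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

project = trainer.create_project("asl-letters")

# One tag per letter, each with a folder of sample images
# (the service caps an upload batch at 64 images).
for letter in ["A", "B", "C"]:
    tag = trainer.create_tag(project.id, letter)
    entries = []
    folder = os.path.join("asl_images", letter)  # placeholder layout
    for filename in os.listdir(folder):
        with open(os.path.join(folder, filename), "rb") as image:
            entries.append(ImageFileCreateEntry(
                name=filename, contents=image.read(), tag_ids=[tag.id]))
    trainer.create_images_from_files(
        project.id, ImageFileCreateBatch(images=entries))

# Kick off training and poll until the iteration completes.
iteration = trainer.train_project(project.id)
while iteration.status != "Completed":
    time.sleep(5)
    iteration = trainer.get_iteration(project.id, iteration.id)
print("Training finished:", iteration.status)
```

From there, a trained iteration can be published and exported for deployment to the edge, much as the no-code workflow does behind the scenes.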


Designed for easy deployment

These and other projects demonstrate what Azure Percept does best. Bringing AI to the edge enables real-time processing and analysis of data close to where it is generated. Beyond faster decision-making, training and processing data at the edge rather than in the cloud addresses some data privacy concerns, allows for quick scaling when training connected devices, and reduces network, cloud, and data center costs. It also creates new opportunities for intelligent, automated decision-making in places without reliable internet connections.


Azure Percept Studio helps by offering pre-trained AI models and solutions for people detection, people counting, vehicle detection, general object detection, and products-on-shelf detection. The studio also hosts guided workflows that simplify integrating edge AI hardware with Azure AI and IoT cloud services, including Azure IoT Hub, Custom Vision, Speech Studio, and Azure Machine Learning.
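
Once one of these models is deployed, its inference results flow to Azure IoT Hub as device-to-cloud telemetry. As a rough sketch, assuming the azure-eventhub Python package and the hub’s built-in Event Hubs-compatible endpoint, watching that stream might look like this (the connection string is a placeholder, and the exact message schema depends on the deployed model):

```python
# Minimal sketch: reading device-to-cloud telemetry from an IoT hub's
# Event Hubs-compatible endpoint. The connection string is a placeholder.
import json

from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "<event-hubs-compatible-connection-string>"  # placeholder

def on_event(partition_context, event):
    # Each event is one message from an edge module, e.g. a detection result.
    payload = event.body_as_str()
    try:
        print(json.dumps(json.loads(payload), indent=2))
    except json.JSONDecodeError:
        print(payload)

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default"
)
with client:
    # starting_position="-1" reads from the earliest available event;
    # receive() blocks until the process is interrupted.
    client.receive(on_event=on_event, starting_position="-1")
```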


Taken together, these capabilities allow rapid prototyping and deployment of AI applications that use both vision and speech, without users needing significant technical knowledge. Developers already familiar with Azure services can also connect to and modify existing Azure resources outside of Azure Percept Studio, as the sketch below illustrates.
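
As a hypothetical example, a developer could adjust a deployed edge module’s configuration through its module twin with the azure-iot-hub package; the device ID, module ID, and desired-property name below are illustrative placeholders, not documented Percept settings.

```python
# Minimal sketch: patching an edge module's desired properties from code,
# outside Azure Percept Studio. IDs and the property name are placeholders.
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import Twin, TwinProperties

CONNECTION_STR = "<iothub-service-connection-string>"  # placeholder
DEVICE_ID = "<percept-device-id>"                      # placeholder
MODULE_ID = "<edge-module-id>"                         # placeholder

registry_manager = IoTHubRegistryManager(CONNECTION_STR)

# Read the current module twin, then apply a one-property desired patch.
twin = registry_manager.get_module_twin(DEVICE_ID, MODULE_ID)
patch = Twin(properties=TwinProperties(desired={"TelemetryIntervalSec": 10}))
updated = registry_manager.update_module_twin(DEVICE_ID, MODULE_ID, patch, twin.etag)
print("Desired properties now:", updated.properties.desired)
```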


Try out Azure Percept yourself

To get started building your own solutions with Azure Percept, you can purchase a developer kit and try out pilot projects before deciding to deploy at scale. You can also visit the Azure Percept YouTube channel for videos about getting started with the developer kit.


As businesses find new ways to deploy Azure Percept tools and services, everywhere from retail stores to factory floors, we hope to see developers explore even more creative, innovative uses for Azure Percept at the edge.
