Azure Percept is a platform of hardware and services that simplifies use of Azure AI technologies on the edge. The development kit comes with an intelligent camera, Azure Percept Vision, and it can a...
Since I'm fairly new to a number of things in the robotics stack, IoT Edge being one of them, it seems I need the deployment manifest that was used to define the modules and their publish/subscribe relationships.
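For context on what I'm looking for: a minimal IoT Edge deployment manifest defines the modules under `$edgeAgent` and the publish/subscribe routes under `$edgeHub`. A rough sketch (module and registry names here are placeholders, not the actual Percept modules):

```json
{
  "modulesContent": {
    "$edgeAgent": {
      "properties.desired": {
        "schemaVersion": "1.1",
        "runtime": { "type": "docker", "settings": { "minDockerVersion": "v1.25" } },
        "systemModules": {
          "edgeAgent": { "type": "docker", "settings": { "image": "mcr.microsoft.com/azureiotedge-agent:1.1" } },
          "edgeHub": { "type": "docker", "settings": { "image": "mcr.microsoft.com/azureiotedge-hub:1.1" } }
        },
        "modules": {
          "visionmodule": {
            "type": "docker",
            "status": "running",
            "restartPolicy": "always",
            "settings": { "image": "myregistry.azurecr.io/visionmodule:latest" }
          }
        }
      }
    },
    "$edgeHub": {
      "properties.desired": {
        "schemaVersion": "1.1",
        "routes": {
          "visionToCloud": "FROM /messages/modules/visionmodule/outputs/* INTO $upstream"
        },
        "storeAndForwardConfiguration": { "timeToLiveSecs": 7200 }
      }
    }
  }
}
```

The routes in `$edgeHub` are what express the publish/subscribe relationships between modules; that's the part I'd most like to see for the Percept's default pipeline.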
I know this from doing the hello-world tutorial in Visual Studio Code.
My guess is that I somehow need to find where the out-of-box (OOB) experience runs, whether that's a container or a systemd service in the CBL-Mariner operating system.
My guess is that the website that runs the OOB experience has the Azure IoT Edge project and manifest file at its disposal? Or does it call out to GitHub, or maybe an Azure DevOps pipeline?
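In the meantime, I believe one way to recover whatever deployment was actually pushed is that the `$edgeAgent` module twin's desired properties carry the applied manifest. A sketch, assuming the `azure-iot` CLI extension is installed and with placeholder hub/device names:

```shell
# On the device itself: list the IoT Edge modules that are running
sudo iotedge list

# From a dev machine: the deployment the device received is stored in the
# $edgeAgent module twin's desired properties (hub/device names are placeholders)
az iot hub module-twin show \
  --hub-name my-percept-hub \
  --device-id my-percept-device \
  --module-id '$edgeAgent' \
  --query 'properties.desired'
```

If that's right, it would at least let me reconstruct the manifest the OOB experience generated, even without knowing where it came from.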
That is what gives you the default vision-processing pipeline, which the docs then instruct us to manipulate via module twin properties to run a differently trained computer vision model.
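If I understand the docs correctly, that model swap amounts to patching the vision module's twin with the location of a new model. A sketch (the `ModelZipUrl` property name is my reading of the Percept docs, and the URL is a placeholder):

```json
{
  "properties": {
    "desired": {
      "ModelZipUrl": "https://mystorageaccount.blob.core.windows.net/models/my-retrained-model.zip"
    }
  }
}
```

Being able to trace that twin patch back through the full deployment manifest is exactly the kind of guts-level walkthrough I'm asking for.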
It's great value to have a ready-to-go experience like this. But it's also very important that newbie developers and vets alike can follow a multi-part training where we dig into the guts of this thing, so that we can make it our own and learn the Azure IoT Edge way.
This device is great for decision makers and POC demonstrations.
But if you also want grass-roots adoption from the army of developers out there, then we need that middle ground.
NVIDIA Jetson is the other end of the spectrum, but somewhere in the middle is the sweet spot. Maybe you can put me in touch with the team so I can work with them on developing these kinds of docs, samples, tutorials, and POCs, internally as well as for clients? mailto:juan.suero@gmail.com
I'm working on combining computer vision on intelligent machines with HoloLens 2 for collaboration with and training of robots.