Forum Discussion
Vision AI dev kit not working when specifying "ModelZipUrl"
I ran into the same problem. I did a factory reset a couple of times, setting everything up from scratch. Neither of our two cameras will even run the default model anymore after the factory reset and setting up the Azure resources from scratch per the quick start. Everything seems to be running OK on the camera and in Azure, but no bounding boxes are being drawn around objects.
- Nikunja, Sep 23, 2020
Former Employee
Kristian_Heim, Sridhar_Kothalanka, we're looking into this with another team; I'll follow up once we find out more.
- Sridhar_Kothalanka, Nov 26, 2020
Microsoft
- eem1776, Dec 02, 2020
Copper Contributor
I would like to know as well, since I am seeing the same issue with a custom model generated for AISDK from the Custom Vision portal. The sample SSD model worked fine and sent human-readable text... it's almost as if the tensor names are off in the camera va- config, or the JSON serialization in the sample is not pointing to the right tensor name for custom models, only for the COCO SSD from the sample. Anyway, I'm seeing the same output:
\x89\x80\x00\x85\x86M.... or other variants...
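One way to narrow down this symptom is to read the module's device-to-cloud messages from the IoT Hub's built-in Event Hub-compatible endpoint and try a UTF-8 decode of each body, which separates "the module is sending raw bytes" from "the payload is text but mangled downstream". This is only a minimal sketch, assuming the azure-eventhub Python package; the connection string and consumer group are placeholders, not values from this thread:

```python
from azure.eventhub import EventHubConsumerClient

# Event Hub-compatible connection string for the IoT Hub (portal > Built-in endpoints).
# Placeholder value - replace with your own.
EVENTHUB_COMPATIBLE_CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=..."

def on_event(partition_context, event):
    raw = b"".join(event.body)  # full message body as bytes
    try:
        # If this succeeds, the module is sending text (e.g. the JSON inference results).
        print("text payload:", raw.decode("utf-8"))
    except UnicodeDecodeError:
        # Binary payload, e.g. the \x89\x80\x00... bytes quoted above.
        print("binary payload (first 32 bytes):", raw[:32])

client = EventHubConsumerClient.from_connection_string(
    EVENTHUB_COMPATIBLE_CONN_STR, consumer_group="$Default"
)
with client:
    # "-1" reads from the beginning of each partition; use "@latest" for new events only.
    client.receive(on_event=on_event, starting_position="-1")
```

A payload that consistently fails to decode points at the serialization inside the module (for example, the wrong output tensor being packed into the message, as speculated above) rather than at the IoT Hub route or the listener.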
- Kristian_Heim, Sep 25, 2020
Microsoft
Nikunja - I found one problem. There was no more space available on the camera for some reason. I reflashed it using the fastboot procedure listed in troubleshooting. The default model OOBE deployment now works. However, I'm still running into the same issue where the model does not run when specifying ModelZipUrl, as mentioned in this thread.
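For reference on what "specifying ModelZipUrl" involves: the camera module picks the custom model up from its module twin desired properties. Below is a minimal sketch of setting that property from the service side, assuming the azure-iot-hub Python package; the device id, the module name AIVisionDevKitGetStartedModule, and the blob URL are assumptions to adapt, not values taken from this thread:

```python
from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import Twin, TwinProperties

# Placeholders - replace with your own hub, device, and storage details.
IOTHUB_CONNECTION_STRING = "HostName=<your-hub>.azure-devices.net;SharedAccessKeyName=...;SharedAccessKey=..."
DEVICE_ID = "my-vision-ai-devkit"              # hypothetical device id
MODULE_ID = "AIVisionDevKitGetStartedModule"   # assumed module name from the getting-started sample

registry_manager = IoTHubRegistryManager(IOTHUB_CONNECTION_STRING)

# Fetch the current module twin so the update can be made against its etag.
module_twin = registry_manager.get_module_twin(DEVICE_ID, MODULE_ID)

# Patch only the desired property the camera module watches for.
twin_patch = Twin(
    properties=TwinProperties(
        desired={"ModelZipUrl": "https://<storage-account>.blob.core.windows.net/models/model.zip"}
    )
)
registry_manager.update_module_twin(DEVICE_ID, MODULE_ID, twin_patch, module_twin.etag)
```

If the module reports twin properties back, comparing the reported section against the desired section after the update is one way to confirm the new URL was actually picked up on the camera.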