Mixed Reality
Experience Microsoft HoloLens 2 and Dynamics 365 Guides: Revolutionize Your Work
Free event, reserve your seat quickly! Learn how Dynamics 365 Guides revolutionizes frontline workers' communication, training, and knowledge transfer with Mixed Reality apps on HoloLens 2.

Date: Tuesday, September 19th
Time: 10-11am PDT
Reserve your spot (limited seats): https://www.eventbrite.com/e/experience-microsofts-dynamics-365-guides-tickets-696644631847

Hosted by Mixed Reality Specialists from SphereGen and Microsoft HoloLens 2. SphereGen Technologies, a certified Mixed Reality Partner with Microsoft HoloLens 2, is hosting this live demo event. Having worked with the HoloLens since its introduction in 2016, we have successfully implemented Dynamics 365 Guides for many clients - with groundbreaking results. Microsoft, the innovator behind HoloLens 2 technology, will be co-hosting this event with featured speakers. This webinar will include a live demonstration of Dynamics 365 Guides using Microsoft Teams and the HoloLens 2.

What will you learn about Dynamics 365 Guides?

Experience a live demonstration of a trainee using Guides to learn how to repair a piece of equipment. See how Guides overlays virtual annotations directly onto the instrument to provide clear and accurate instructions. Follow along as the trainee is guided through operating the device with step-by-step instructions.

See how a technician communicates with a Specialist or Subject Matter Expert (SME) while remaining hands-free to focus on the subject at hand.

Understand how Guides is being adopted across multiple industries as a collaborative tool that provides immediate remote guidance, in-field support, and training - no travel time or expense required. Remote Assist is now a part of Guides.

Learn about NEW Microsoft features, such as Guides for Mobile and Generative AI in Copilot.

Guides for Mobile: step-by-step instructions, images, and animations on field technicians' mobile devices for faster and more accurate task completion.
Generative AI in Copilot: an AI assistant that automates tasks and creates content across business productivity and collaboration apps.

Learn more about Microsoft Dynamics 365 Guides use cases: https://www.spheregen.com/guides/

Engage in a Q&A session with our SMEs who have implemented this solution for several clients.

Who should attend this event?
Innovation Leaders/CIOs
Directors of Training
Managers of Field Support
Production Management

Don't miss your chance to experience Dynamics 365 Guides in action! This event has a limited number of free tickets available for an engaging and dynamic experience.

(Guest Blog) From MRTK2 to MRTK3 - going cross platform with HoloLens 2 and Quest 2
This is a Guest Blog Post from Joost van Schaik, MVP.

Back in 2016, I published my first real HoloLens app in the Microsoft Store, back then for HoloLens 1: AMS HoloATC. Since then, this app has been my litmus test for everything I tried to do in Mixed Reality, as it is relatively simple, yet uses almost all aspects of Mixed Reality. So, when a new platform/toolkit/device arrives, I take this app and try to adapt it for that first. When the Windows Mixed Reality headsets came along, I made it run on those as well. Then came Mixed Reality Toolkit, then Mixed Reality Toolkit 2, and I converted the app to work on those - and the last step gave me a pretty easy conversion to HoloLens 2 when it popped up. In fact, I made the app run on that device without even having access to one; I only had a few minutes with a prototype, yet the app was ready before HoloLens 2 was released.

Fast forward to earlier this year: MVPs were invited to a preview of the Mixed Reality Toolkit 3 - a radical departure from the past. And I went at it the usual way: after a bit of mucking around, I took my AMS HoloATC app, yanked the MRTK2 out of it, and put in the MRTK3. Anyone who knows anything about coding knows that this approach tends to break a lot of things. Visual Studio reported a lot of errors - not quite surprising when you pull the rug out from under the app's feet. But the fun thing about this head-first approach is that it shows you clearly where things break or have changed, and there is a clearly limited set of things you want to achieve. It is basically reconnecting a lot of wires that you know must go somewhere. How do you deal with the Spatial Map here? How does interaction work here? How can you build a user interface with these new components? That is where my inner geek kicks in - with years of experience. In a couple of days' worth of finicking, I had my app working again.
Mind you - nearly without documentation, just peeking into the code of the MRTK3 and the few samples that came with it.

Then came a defining moment: at the Mixed Reality Developer Days 2022, Microsoft showed an interesting app: Zappy's Playground. If you have not seen it - it's basically a simple kind of game, where you have to provide a robot with enough power to operate. You do this by picking up, placing, and orienting small wind generators. Now this may sound a bit weird and even a bit lame, but the main point of the game is not the game itself, but to demonstrate how to implement all kinds of typical techniques, interactions, and other 'building blocks' in a Mixed Reality application using MRTK3. And as every developer knows: ten lines of example code working in context, with all nuts and bolts around it in place, are worth more than a thousand lines of documentation. The most intriguing thing about the app: on stage, it ran both on HoloLens and the Meta Quest 2. Unchanged. While HoloLens is Windows-based, and the Quest runs Android.

Now, was this true or a stage trick? I have a Quest 2 - I had bought it for a WebXR development experiment early this year, which was completed successfully. After that... it was basically sitting there, gathering dust. The first thing to find out was whether Microsoft's claim about Zappy's cross-platform play was the real thing or not. So, I dusted off the Quest to see if I could make it work on that device. Spoiler alert: it turned out the most challenging part of the whole operation was actually getting the Quest 2 into 'developer mode'. Although it's not complicated once you know what to do, there are several steps required. First you need to set the device in developer mode using the Oculus app on your phone; this requires you to create an 'organization' to associate your account with, your account needs to be verified, drivers need to be installed...
It's quite a process, sometimes a bit confusing, but in the end, I apparently had the Quest 2 in developer mode. I had already cloned the Zappy's Playground code from GitHub to run it on HoloLens. I needed to install the Android tools next to the UWP tools; then I opened the app in Unity, switched to the Android build platform, hit "build and run"... and basically, that was it. It ran on the Oculus, without any changes. The stage demo had definitely not been staged - it was real! Before my very eyes.

I flexed my knuckles. It was time to let good old AMS HoloATC make the jump to Quest 2! I repeated the procedure: switched to the Android build, copied over all the Android settings from the Zappy app, and deployed. Yay! It ran. In a flat floating window, instead of as a VR app. I double-checked my settings - I had forgotten one. Take 2: now I saw my app appear as a VR app, I saw the airport around me, and planes appeared... but I could not interact with it. I could not touch airplanes, nor could I look at them and see the pictures. After some careful studying, it looked like the MRTK3 developers had created a lot of extra input settings for the XR Interaction Toolkit, on which MRTK3 is built. This is a concept I am still getting familiar with, but I could determine which files were involved. I copied those over as well, from Zappy to my app. I checked and double-checked. Everything was in order. And then it was time for deploy 3... And everything worked. I could gaze at airplanes and see pictures, and select them with the controller ray - even though about four hours earlier, I had never deployed a native app on a Quest 2. In that time, I had learned how to put it into developer mode, how to deploy apps to it, and what I needed to do to adapt a working MRTK3 HoloLens app into a Quest 2 app. Literally only configuration. Zero lines of extra code. For a native app. Color me impressed.
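Input configuration copied between projects like this can be hard to verify by eye. As a minimal diagnostic sketch - the component and property names come from the XR Interaction Toolkit and the Unity Input System, but the check itself is my own illustration, not from the original post - one could log which input action assets are actually wired up in the scene:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.Interaction.Toolkit.Inputs;

// Illustrative diagnostic: list the input action assets referenced by each
// InputActionManager in the scene, so a missing or disabled asset stands out
// in the log instead of silently breaking interaction.
public class InputActionAudit : MonoBehaviour
{
    void Start()
    {
        foreach (var manager in FindObjectsOfType<InputActionManager>())
        {
            foreach (InputActionAsset asset in manager.actionAssets)
            {
                Debug.Log($"{manager.name}: action asset '{asset.name}', " +
                          $"enabled: {asset.enabled}");
            }
        }
    }
}
```

Attached to any GameObject, this prints one line per referenced asset on startup; an empty log would suggest the copied configuration files never made it into the scene.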
Now, I realize that for more complex apps, some more device-specific code may be needed. In fact, shortly after the first run I added a few platform-specific lines of code - to make stuff appear bigger. The Quest 2 is a VR device - basically a screen close to your eyes, with not nearly the resolution of a HoloLens. If you show 10 cm (about 4") airplanes, they tend to look pretty crappy and pixelated. My rule of thumb is to scale everything up 3.5x in VR with respect to HoloLens. To that end, I could simply re-use code I wrote for Windows Mixed Reality headsets. So, although I did add some platform-specific code, it was not any new code.

Long story short: there is some work to do to move your app from MRTK2 to MRTK3. Depending on how you have built your app, this may be a bit of work or quite a lot. Especially if you have built a lot of user interface using buttons and panels from the MRTK2, you will have some work to do. However, if you have used some kind of halfway decent architecture, porting functionality is quite doable. After all, most of it is Unity and plain C#. In addition, there is a lot less custom code in MRTK3 - it builds very much on standard Unity stuff. And once you are on MRTK3, the step to Oculus Quest is ridiculously small. I think this is the big prize. There are still some pretty rough edges - MRTK3 is in preview, after all, yet it is already in much better shape than when I first saw it - but it seems MRTK3 is aimed at making the promise of OpenXR come true: a generalized, somewhat device-abstracted way of building cross-platform Mixed Reality apps. A kind of MAUI for Mixed Reality, built on top of Unity. This is great news for developers who want to build business apps, but whose customers feel that buying a fleet of HoloLens devices is just too steep a hill to climb all at once - for now. It also paves the road for future devices which may be more advanced and show HoloLens-like capabilities, like the Meta Cambria.
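A minimal sketch of that 3.5x rule of thumb, assuming Unity's XRDisplaySubsystem as the way to tell VR from see-through AR (the component name and constant are illustrative, not from the original app):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: scale content up 3.5x on opaque (VR) displays such as
// the Quest 2, and leave it at 1x on see-through devices like HoloLens 2.
public class PlatformScaler : MonoBehaviour
{
    const float VrScaleFactor = 3.5f;

    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        // displayOpaque is true for VR headsets, false for see-through AR,
        // so the same build can size itself correctly on both device types.
        bool isOpaqueVrDisplay = displays.Count > 0 && displays[0].displayOpaque;
        if (isOpaqueVrDisplay)
            transform.localScale *= VrScaleFactor;
    }
}
```

Keying off the display's opacity rather than the build platform keeps the check working on future OpenXR devices as well.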
Or whatever Microsoft itself, or other companies, may have up their sleeves. However this pans out, the area of Mixed Reality will be even more exciting in the near future than it already is today!

Accelerating sales cycle for faster deal closure with mixed reality
To stand apart from their competition, the Tinsley Equipment team wanted to provide their customers with a real-time quote for the equipment under consideration. Moving on from traditional simulation techniques, Softweb Solutions' AR product visualization solution enhances the sales experience by offering an immersive 360-degree perspective of the subject or the equipment.

How to include SysWOW64 DLLs on HoloLens
Hello, I am trying to connect a Tello drone to HoloLens and display the camera feed from the drone directly on the HoloLens. The problem is that when I try to deploy the project to the HoloLens, I get the following errors regarding the loading of the DLLs:

Loading advapi32.dll
Plugins: Failed to load 'advapi32' with error 'Operation has failed with error 0x7f: The specified procedure could not be found.'
Loading TelloVideoDecoder.dll
Plugins: Failed to load 'TelloVideoDecoder' because one or more of its dependencies could not be loaded.

TelloVideoDecoder.dll has the following dependencies:

SysWOW64\OPENGL32.dll
SysWOW64\WS2_32.dll
SysWOW64\kernel32.dll
SysWOW64\MSVCP140.dll
SysWOW64\VCRUNTIME140.dll
SysWOW64\ucrtbase.dll

Do you know if there is a way to include these Windows DLLs in my project? Or is there any way to have these DLLs installed on the HoloLens?
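One hedged diagnostic sketch (not from the question above): SysWOW64 binaries are 32-bit x86, while HoloLens 2 runs ARM64 UWP, so a useful first check is the native plugin's target machine type. The following uses .NET's System.Reflection.PortableExecutable reader; the file path on the command line is illustrative:

```csharp
using System;
using System.IO;
using System.Reflection.PortableExecutable;

// Illustrative check of a native DLL's target architecture. A plugin whose
// Machine is I386 (32-bit x86) cannot load on an ARM64 device such as
// HoloLens 2; it would need to be rebuilt for the device's architecture.
class DllArchCheck
{
    static void Main(string[] args)
    {
        // e.g. dotnet run -- TelloVideoDecoder.dll
        using var stream = File.OpenRead(args[0]);
        using var reader = new PEReader(stream);
        Console.WriteLine($"Machine: {reader.PEHeaders.CoffHeader.Machine}");
    }
}
```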