OpenXR
9 Topics

Unity App Stuck Loading On HoloLens 2
I am following this document and this tutorial to test AR on a HoloLens 2. I got to the part where I have a Unity game with all the proper setup (i.e. build settings are correct, I used the Unity AR feature tool, etc.) and with 2 squares. I built the project and opened the solution, set the configuration to "Release," the platform to "ARM64," and the target to "Device" because I have the HoloLens directly connected to my laptop. I selected the Unity app solution, right-clicked it, and chose "Set as Startup Project." Then I pressed the green button next to "Device" to build and deploy it.

The first time, I was prompted to enter a PIN after a bit, which I did. Once the deployment was done, the app showed up in the "All Apps" section, but when I clicked on it, I got a window with a white background and a box with an X in the middle of the screen. On top, there was a blue circle with a white loading spinner that kept loading forever. I tried rebuilding multiple times, recreating the Unity project in a new project, and double-checking my Unity build settings, but those were correct. How can I fix this issue? I'm using Unity version 2022.3.20f1.

Thank you, Avinash Devineni

Edit: Now it has started doing the environment-scanning visual, but it still shows a white box, although there is no blue loading circle anymore. I didn't change anything or redeploy it to the HoloLens 2, so I'm not sure why it started doing something different.

Objects not anchored in Hololens 2 after building from Unity
Hello all, I hope I found the right spot to put this. I am having an issue porting one of our apps from Magic Leap 2 to HoloLens 2 in Unity. After setting up MRTK in the project and hooking a few things up, I build the project for UWP and deploy to the device. When running the app, all objects in the scene move with the camera instead of staying anchored in position. If I set up MRTK in an empty project and deploy, however, objects in the scene are anchored appropriately. I am not sure what could be causing this, but I imagine there is some setting or error within my project. I do not have much experience with HoloLens development.

Things I have tried:
- Disabling unnecessary DLLs from being loaded for UWP
- Fixing any exceptions shown in the Player log file
- Updating conditional compilation tags so code from other platforms does not run unintentionally
- Checking MRTK project/build/prefab settings against the empty project that works

Perhaps someone here is aware of what triggers this and could point me in the right direction. I am using Unity 2022.3.11. Thanks in advance!

Build error "The local machine does not support running projects compiled for the ARM64 architecture"
Hello, I've been getting errors recently where I cannot build an ARM64 project for UWP in Unity3D. In the Build Settings, Unity states that "The local machine does not support running projects compiled for the ARM64 architecture". And when I try to actually build the application, I get these error messages (second image). I have all the correct packages from Visual Studio 2022 and this still doesn't work for me. Does anyone know what the issue is? Thank you!

Hololens/ UXTools - How To Activate/Deactivate a Non-Scene Component (UNREAL)
Hello, I am working with the UXTools Bounds Control non-scene component, developing for the HoloLens in Unreal Engine 4.27. I am using this component to let the user grab, scale, and rotate an item, but I need it to become active only once certain conditions have been met. Presently the component is visible and active as soon as it is added to a blueprint.

Things I have tried to dynamically activate it:
- Using the Activate/Deactivate nodes, which seemingly had no effect
- Using Set Visibility and Set Actor Hidden In Game, which had no effect
- Trying to add, attach, and destroy the component at runtime, which seems to be designed to work only with scene components
- Setting the static meshes of the box to None at runtime to at least hide its visible appearance, which also did not work

I am working in a blueprint-only project, so I cannot edit the component's native C++. Are there any other options available to me for activating/deactivating, attaching/detaching, or showing/hiding a non-scene component within blueprints?

World Locking Tools and Cross Session Object Persistence
Hello all, I am porting to the HoloLens 2 an application previously developed for the HoloLens 1 (if curious, here you can find a video description of the app). Most elements have been straightforward to port, but the new persistence management is causing me many issues (and headaches). I have 2 key requirements:

1. Be able to move and rotate objects in the environment at runtime within the HoloLens 2, and then, each time the application starts, have it recognize the room and reposition all objects as they were placed in previous application runs.
2. Share the real-environment position/rotation of my 3D objects with other HoloLens 2 devices, so that all objects are correctly aligned for all users. Sharing is done on a local network, which might not have access to the internet.

When I developed (a long time ago) this system for the HoloLens 1, everything worked fine, and I relied on the WorldAnchorStore and Spatial Anchors (https://docs.microsoft.com/en-us/windows/mixed-reality/develop/unity/persistence-in-unity). However, with the latest developments to the Mixed Reality API I have understood and tested the two new available approaches:

a) World Locking Tools
b) ARAnchorManager, using the OpenXR plugin and the ARFoundation API

A first consideration is that the World Locking Tools approach seems far more stable than the ARAnchorManager approach, and from the documentation it seems to be the approach Microsoft suggests. Therefore I would like to achieve requirement 1 with the World Locking Tools. I seem to have it working, but (1) my solution seems quite hacky, and (2) I am unable to share my information with other devices (failing requirement 2). In the World Locking Tools methodology, I understand that there are no more "explicit" anchors, as the WLT system instantiates them in the background. Still, it is possible to create Attachment Points, which help stabilize objects (this is my understanding).
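Requirement 1 essentially amounts to persisting each movable object's pose across sessions. As a minimal plain-C# sketch of what such pose (de)serialization could look like (all type and member names here are hypothetical; in Unity the values would come from each object's Transform, and the XML string would be written to a file under Application.persistentDataPath):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical pose record; in Unity this would be filled from
// transform.position and transform.rotation of each movable object.
public class ObjectPose
{
    public string Name;
    public float Px, Py, Pz;        // position
    public float Qx, Qy, Qz, Qw;    // rotation quaternion
}

public class PoseStore
{
    public ObjectPose[] Poses;

    // Serialize all poses to an XML string (on device this string
    // would be written to a file under Application.persistentDataPath).
    public static string Save(PoseStore store)
    {
        var ser = new XmlSerializer(typeof(PoseStore));
        using (var sw = new StringWriter())
        {
            ser.Serialize(sw, store);
            return sw.ToString();
        }
    }

    // Deserialize a previously saved XML string back into poses.
    public static PoseStore Load(string xml)
    {
        var ser = new XmlSerializer(typeof(PoseStore));
        using (var sr = new StringReader(xml))
            return (PoseStore)ser.Deserialize(sr);
    }
}
```

Because the file is plain XML, the same payload could also be sent over the local network to other devices, which relates to requirement 2 (though aligning coordinate frames across devices is a separate problem).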
Therefore, using attachment points, to load objects in the same position at each application run, what I do is:

- read from a local XML file a list of all my movable objects and their positions and rotations (saved in previous runs)
- assign to each of these movable objects an Attachment Point, using AttachmentPointManager.CreateAttachmentPoint() (https://docs.microsoft.com/en-us/dotnet/api/microsoft.mixedreality.worldlocking.core.iattachmentpointmanager.createattachmentpoint?view=wlt-unity-1.5)
- each time an object is moved, save (or update) its position and rotation in the local file

Although this solution seems to work (i.e., when I load the application again, objects are where I previously positioned them), there are a couple of open points that I can't understand and that are not exhaustively explained in the documentation:

1. What is the point of creating an attachment point at runtime if there is no link with the game object associated with it?
2. I understand there are different coordinate spaces (https://docs.microsoft.com/en-us/mixed-reality/world-locking-tools/documentation/concepts/advanced/coordinatespaces). Do I need to adjust (during Update) the position of the objects according to the other coordinate spaces (Frozen, Locked)? And if so, how should I do it?
3. Am I in charge of serializing and deserializing each object's position and rotation to keep them persistent across sessions?
4. How can I share this serialized information with other devices so they load objects in the same position/rotation for everyone? I understand there are Azure Spatial Anchors; however, I need to share the objects' real-world positioning without accessing the internet (I only have a local wireless network).

I hope someone can help me, and I am available to provide further details and code if necessary. Thanks a lot!

Shared MR: Local Anchor Transfers in Unity
I'm trying to set up a common experience between two HoloLens 2 devices in the same room, following https://docs.microsoft.com/en-us/windows/mixed-reality/out-of-scope/local-anchor-transfers-in-unity. I made a fresh Unity 2020.3.26f1 project, added and configured MRTK 1.0.2203.0, made a GameObject with a WorldAnchorManager component (not yet sure what to do with that), and then another GameObject with this script attached to it:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.WSA.Sharing;

public class anchorscript : MonoBehaviour
{
    public GameObject rootGameObject;
    private UnityEngine.XR.WSA.WorldAnchor gameRootAnchor;

    // Start is called before the first frame update
    void Start()
    {
        gameRootAnchor = rootGameObject.GetComponent<UnityEngine.XR.WSA.WorldAnchor>();
        if (gameRootAnchor == null)
        {
            gameRootAnchor = rootGameObject.AddComponent<UnityEngine.XR.WSA.WorldAnchor>();
        }
    }

    // Update is called once per frame
    void Update()
    {
    }
}

This led to the following error in the Unity editor:

Assets\anchorscript.cs(4,26): error CS0234: The type or namespace name 'Sharing' does not exist in the namespace 'UnityEngine.XR.WSA' (are you missing an assembly reference?)

I'd appreciate any help understanding why this happens and what I can do to make the script compile.

Hololens 2 app works with unity game mode, but not when deployed to hololens2.
I am developing an app for HoloLens 2 in Unity 2020.3.33f1 using MRFT v1.0.2203.0 and the XR SDK. I have done all of the basic MRTK setup (I added MRTK Foundation, Extensions, Tools, and Standard Assets using the Mixed Reality Feature Tool; I used the "Configure Project for MRTK" utility in Unity under Mixed Reality > Toolkit > Utilities; I installed the XR Plugin Manager and the Windows XR Plugin and adjusted their settings in Project Settings). This is the first time I am working on a HoloLens 2 project.

Initially, I was able to render objects on the HoloLens 2. Then I modified my scene. Now I have a parent object and a child object in this scene, with a script (attached to the child object) that loads a medical image into the scene. The image is visualized when game mode is enabled in Unity. Unfortunately, on the HoloLens 2 I was able to visualize both 3D objects, but not the 3D image. My build settings are set to Universal Windows Platform, HoloLens target device, ARM64, the latest installed Windows SDK and Visual Studio, and Release mode. Any idea what could be causing this issue?

Hololens2 and PUN setup
Hi, I am creating a PUN2 app for HoloLens and getting the following errors in the Unity development environment:

- The type or namespace name 'CloudSpatialAnchor' could not be found
- The type or namespace name 'Azure' does not exist in the namespace 'Microsoft'
- The type or namespace name 'AnchorLocatedEventArgs' could not be found

Any inputs? Thanks

Kill App When Tile Closed
Hi MR Devs, I'm looking for a way to kill the application when the 2D Tile/Panel/Slate is closed in the MR home. I'm using Unity to create a Win32 app for the HP Reverb G2. When I start the app from the 3D launcher in windowed mode, a 2D view is created; however, closing the view lets the app survive, and a user would need to switch back to the desktop to close the application. Is there a WMR event of some sort my application can listen for to kill the application when its WMR 2D view is closed?
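Absent a documented WMR "view closed" event, one generic fallback pattern is a watchdog that quits the process once the 2D view has been reported hidden for longer than a grace period. This is a sketch, not a known WMR API: the visibility probe below is a hypothetical delegate that would need to be wired to whatever signal Unity actually exposes for the app (e.g. Application.focusChanged or a MonoBehaviour's OnApplicationFocus), and the quit action would typically call Application.Quit().

```csharp
using System;

// Hedged sketch: invokes a quit action once a "view visible" probe has
// returned false for longer than a grace period. Both delegates are
// hypothetical and must be wired to real signals by the caller.
public class ViewWatchdog
{
    private readonly Func<bool> isViewVisible;
    private readonly Action quit;
    private readonly TimeSpan grace;
    private DateTime? hiddenSince;

    public ViewWatchdog(Func<bool> isViewVisible, Action quit, TimeSpan grace)
    {
        this.isViewVisible = isViewVisible;
        this.quit = quit;
        this.grace = grace;
    }

    // Call periodically (e.g. from a MonoBehaviour's Update());
    // 'now' is injected rather than read from the clock for testability.
    public void Tick(DateTime now)
    {
        if (isViewVisible())
        {
            hiddenSince = null;                    // view is back: reset the timer
        }
        else
        {
            if (hiddenSince == null)
                hiddenSince = now;                 // first tick with the view hidden
            if (now - hiddenSince.Value >= grace)
                quit();                            // hidden long enough: kill the app
        }
    }
}
```

The grace period matters because a brief focus loss (e.g. the user glancing at another slate) should not terminate the app; only a sustained "hidden" state should.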