Oct 09 2023 02:25 AM - edited Oct 09 2023 02:50 AM
Going by the promotional materials, Mesh has a concept of the "stage" for a shared screen and gallery (where those on a standard Teams call, i.e. with no avatar, appear). I don't see any way in the Mesh toolkit or the Mesh 101 example to add that stage screen. Obviously, when building a custom environment, one would expect to be able to place that stage somewhere, so you would expect to see it either as a game object or as a render texture to place on a game object. However, I don't see any such thing. Any ideas?

I strongly suspect that "under the hood" the "stage" is an Azure Communication Services endpoint (Teams is built on top of ACS, so it is logical that that is how it connects to Unity). It would be great if that is the case: the ability to simply add such an ACS endpoint to the Unity scene would enable huge flexibility, as ACS has support (currently in private preview) for bringing in an SRT stream feed, for example. Part of the reason I suspect that the "stage" is ACS under the hood is that the Package Details - Azure Artifacts SDK, released in beta in June, brings ACS calling to Unity.

It would be very helpful if someone from Microsoft could provide some guidance, as it would help us know which parts of the learning curve to direct our attention to.
Oct 17 2023 05:20 AM
@CameronMicka Thanks Cameron, but that's not going to cut it. I get the idea that it might be convenient to allow a degree of post-build environment customization via the event customization tool for non-technical users. However, it is absolutely essential that EVERYTHING that is possible in the customization tool also be possible via code in Unity / cloud scripting for the pro developer, so it can be baked into the environment and the relevant URLs etc. supplied via code.
For context, the scenario I am talking about involves accessing medical data, including videos and 3D models of patient anatomy built from CT scans, selected at runtime from Dataverse / blob storage-hosted data accessed via cloud scripting, and pulling it into the Unity scene at runtime to facilitate case reviews. If it is not going to be possible to do that sort of thing, and the event customization route is the only way to bring GameObjects into the environment at runtime (your "artifacts"), then Mesh will very quickly be relegated to being an office toy rather than a serious business tool.
I can already load objects at runtime using a tool like TriLib 2 - Model Loading Package | Modeling | Unity Asset Store and point it at blob storage. However, the promise of Mesh was that it was all secure and closely coupled to Azure, the Microsoft Graph etc., and that you would be able to bring back-end data into the Mesh environment at runtime. Do I need to look outside Mesh at packages like TriLib 2 to code bringing in my Azure-hosted data and to load / manipulate "artifacts", or is Mesh actually going to deliver the ability for pro developers to build proper business tools with all the security etc.?
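For anyone curious what the plain-Unity baseline looks like, here is a minimal sketch of runtime loading from blob storage, assuming a SAS-protected blob URL issued by some back end (the URL is a made-up example, and the `LoadModel` hand-off is a hypothetical placeholder for a runtime importer such as TriLib 2, not the actual TriLib API):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch only: downloads a model file from Azure Blob Storage at runtime.
// The SAS URL and the LoadModel() hand-off are placeholders for illustration.
public class RuntimeModelLoader : MonoBehaviour
{
    // Hypothetical SAS-protected blob URL, e.g. issued by a cloud scripting back end.
    [SerializeField]
    private string blobSasUrl = "https://example.blob.core.windows.net/models/anatomy.glb?sv=...";

    private IEnumerator Start()
    {
        using (UnityWebRequest request = UnityWebRequest.Get(blobSasUrl))
        {
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError($"Download failed: {request.error}");
                yield break;
            }

            byte[] modelBytes = request.downloadHandler.data;

            // Hand the bytes to a runtime importer (e.g. TriLib 2's stream-based
            // loaders). LoadModel() here is a stand-in, not a real TriLib method.
            LoadModel(modelBytes);
        }
    }

    private void LoadModel(byte[] bytes)
    {
        Debug.Log($"Downloaded {bytes.Length} bytes; pass to your runtime importer here.");
    }
}
```

Whether a custom script like this survives Mesh's upload pipeline is exactly the open question in this thread, since Mesh restricts arbitrary C# scripts; this illustrates the plain-Unity approach, not a supported Mesh path.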
If there is a way, and I'd be truly shocked if there isn't, to make the various runtime "artifact" elements available to those building the environments, I would very strongly suggest that Microsoft get it out in front of Unity developers and let them realize its potential ASAP, because right now Mesh appears to be positioned at the toy end of the spectrum.
The surgeons I work with aren't too interested in holding their pre-operative planning meetings on the flight deck of a spaceship while toasting marshmallows; they are interested in serious collaboration tools which have the potential to improve patient outcomes by ensuring that all involved can see the relevant data and collaborate over it in a 3D context. I appreciate that it is at the preview stage, but it has been a VERY long wait for Mesh to hit public preview, and what has been released so far in terms of documentation, tutorials etc. is frankly pitiful. It makes it look like Microsoft isn't really interested in Mesh anymore, which I very much hope isn't the case.
Oct 17 2023 11:28 AM - edited Oct 17 2023 12:24 PM
Thank you, Nicholas, for your detailed feedback and for sharing your concerns. I completely understand your perspective and the need for pro developers to have more control and flexibility in building robust business tools using Mesh.
I hear your frustration about the current limitations and appreciate your suggestions about ensuring Unity developers can easily orchestrate video playback. While I can't provide immediate solutions, I assure you that your feedback is invaluable. Microsoft is actively working to enhance Mesh and provide more comprehensive documentation and tools for developers like you.
In the meantime you may be able to achieve your desired outcome using a mix of Visual Scripting and the Video Player component, using shared script variables to sync when the video starts. But I agree: this is an area where we need to improve docs and samples coverage.
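A minimal sketch of the plain-Unity half of that suggestion: driving Unity's built-in `VideoPlayer` so that late joiners seek to a shared playback position. The shared-variable plumbing itself would live in Mesh Visual Scripting, so the `OnSharedStartTimeChanged` entry point below is a hypothetical hook, not a Mesh API:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch only: seeks the local VideoPlayer to match a shared start time so
// that everyone sees roughly the same frame. How the shared value arrives
// (e.g. a Mesh Visual Scripting shared variable) is out of scope here;
// OnSharedStartTimeChanged is a hypothetical hook for illustration.
[RequireComponent(typeof(VideoPlayer))]
public class SyncedVideoPlayback : MonoBehaviour
{
    private VideoPlayer player;

    private void Awake()
    {
        player = GetComponent<VideoPlayer>();
        player.playOnAwake = false;
    }

    // Call this when the shared variable changes, passing the UTC time
    // (seconds since the Unix epoch) at which the host pressed play.
    public void OnSharedStartTimeChanged(double sharedStartUtcSeconds)
    {
        System.DateTime epoch = new System.DateTime(1970, 1, 1, 0, 0, 0, System.DateTimeKind.Utc);
        double nowUtcSeconds = (System.DateTime.UtcNow - epoch).TotalSeconds;
        double offset = System.Math.Max(0, nowUtcSeconds - sharedStartUtcSeconds);

        player.time = offset;   // seek late joiners to the shared position
        player.Play();
    }
}
```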
I appreciate your patience and passion for pushing the boundaries of what Mesh can offer in the realm of serious business applications. Please feel free to reach out if you have any more specific questions or concerns.
Oct 17 2023 12:39 PM
@CameronMicka Thank you for the prompt response. I think that one of the issues is that, in the absence of a roadmap, it is very difficult to see where Microsoft is going with Mesh. I'd urge Microsoft to publish a proper feature roadmap, ideally along the lines of Microsoft 365 Roadmap - See What's Coming | Microsoft 365, as soon as possible.