Remote Physiotherapy in Microsoft Teams using UCL MotionInput and Live Share SDK
About The Team
We are a team of second-year students from University College London with a joint aim: to make physiotherapy more accessible, easier, and more enjoyable. Our project was part of the Industry Exchange Network (IXN), under the supervision of UCL Professor Dean Mohamedally and Microsoft Cloud Advocates Lee Stott and Ayca Bas. Fuelled by our passion for innovation and a strong desire to make a difference, this project was a perfect fit: it allowed us to express our creative side whilst working with new technologies, including the Microsoft Teams Toolkit for developing Microsoft Teams applications, the UCL MotionInput Windows 10/11 application, NDI and the Teams Live Share SDK.
UCL MotionInput 3
UCL MotionInput v3 is UCL-produced software for Touchless Computing interactions. It is a means of interacting with a PC without the need to touch it, using just a webcam. A user interacts with this software on their PC via gestures with their hands, head, face and full body, and with their speech. The software analyses these interactions and converts them into mouse, keyboard and joypad signals, making full use of your existing software. It was developed by academics and students at University College London's Department of Computer Science.
The software contains:
- Hands-based tracking modes - such as in-air multitouch, mouse, keyboard, digital pen and joypad
- Facial Navigation modes - mixing facial switches with nose and eyes navigation
- Exercises in gaming modes - users place hot-spot triggers in the air around them, along with combinations of "walking on the spot" recognition, for first person and third person gaming, retro gaming and more.
- Simultaneous speech recognition alongside all of the above - for mouse events like "click", app events like "show fullscreen" in PowerPoint, operating system events like "volume up", and your own phrases in your games and applications - along with live captioning.
Meet the team:
- Adil Omar-Mufti (Author) - LinkedIn
- Arvind Sethu - LinkedIn
- Youngwoo Jung - LinkedIn
- Setareh Eskandari - LinkedIn
The Project
Recently, we built a Microsoft Teams app to be used for remote physiotherapy by patients worldwide.
The COVID-19 pandemic has presented significant challenges to physiotherapy patients. Traditional tele-health technologies have not adequately met the needs of patients requiring rehabilitation. The gamification of physiotherapy has been shown to increase patient motivation and engagement.
To address this, we developed a solution that allows patients to play games during their physiotherapy sessions using movement, powered by UCL MotionInput. We built a Microsoft Teams app that offers patients a variety of interactive games, playable individually or collaboratively, all through motion using the Teams Live Share SDK. This eliminates the need for physical input devices, making it an ideal remote physiotherapy tool for patients worldwide.
Furthermore, we developed an NDI-based remote Teams solution to be used in tandem with our Microsoft Teams app. This enables patients to remotely access and use MotionInput without any download or installation, significantly increasing accessibility for physiotherapy patients and allowing them to receive virtual care from the comfort of their own homes and on their preferred devices.
Many thanks to Ayca Bas and Lee Stott for their help and invaluable expertise throughout the course of the project!
System Architecture
Both physiotherapist and patient join a meeting where our app is displayed. From there, they can interact with our main subsystems.
We have an extensive selection of classic games integrated into a Microsoft Teams application, alongside embedded GIFs showing how to perform the MotionInput gestures used to play them. The camera system uses the NDI solution: when a patient joins a Teams call with their camera on, the admin of the call can feed that camera into the MotionInput API, allowing patients to use MotionInput without installing anything on their own device. Finally, our data collection system captures movement metrics that are useful to the physiotherapist, which are then automatically stored in a CSV file on the computer running MotionInput.
At present, the data capture is not integrated into the project, as it is still a proof of concept. However, we are confident that it can become part of our MotionInput Teams application in the future.
Our Solution
Our solution is designed with ease of use in mind for the end user. The Microsoft Teams application can simply be launched from within a Teams call, with no installation required on patient devices, so patients can use any Teams-enabled device: a mobile phone, tablet, iPad, laptop or PC. All of the data and the application are hosted on the physician's machine, and telemetry is captured around session usage. Our NDI solution allows users to use MotionInput without requiring any installation: patients simply join the Teams call with their camera switched on, as they would for any other telemedical conference call.
Finally, we have built an initial data collection framework that captures a patient's movement data and helps patients visualise it. What we have built is just the first step towards a solution that can be used in real-world situations once validated.
Our Microsoft Teams app was developed using the Teams Toolkit in Visual Studio Code. Ayca Bas and the Microsoft Teams Toolkit team were particularly helpful with advice and guidance when getting started. There are some amazing resources online, including the Teams App Camp materials, which are a great starting point for building Microsoft Teams apps.
Our application is implemented as a 'Tab' within Teams using JavaScript, in particular using React components in a single-page application. We also used the Live Share SDK, a brand new kit released by Microsoft in 2023. This SDK uses the Fluid Framework to share state between participants in a Teams meeting. Using this framework, we integrated games with Live Share technology and laid out a framework for expansion that can hopefully be developed further by the students who inherit our work. We were also able to implement collaborative multiplayer games in our Teams application, to hopefully make physiotherapy more engaging for patients.
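To give a concrete flavour of the pattern, here is a minimal sketch of how a meeting tab can join a Live Share session and share state through a Fluid SharedMap. The packages and calls (app.initialize, LiveShareHost.create, LiveShareClient.joinContainer, SharedMap) are the documented Live Share SDK APIs at the time of writing; the function name joinGameSession and the key names are purely illustrative and not our production code.

```javascript
// Minimal sketch: join a Live Share session from a Teams meeting tab and
// share game state through a Fluid SharedMap. Names like joinGameSession
// and "gameState" are illustrative, not our production code.
import { app, LiveShareHost } from "@microsoft/teams-js";
import { LiveShareClient } from "@microsoft/live-share";
import { SharedMap } from "fluid-framework";

export async function joinGameSession() {
  // Initialise the Teams client SDK inside the meeting tab.
  await app.initialize();

  // Bind a Live Share client to the current meeting.
  const host = LiveShareHost.create();
  const client = new LiveShareClient(host);

  // Declare the shared objects every participant will use.
  const schema = {
    initialObjects: { gameState: SharedMap },
  };

  // Every participant in the meeting joins the same Fluid container.
  const { container } = await client.joinContainer(schema);
  const gameState = container.initialObjects.gameState;

  // React to state changes made by any participant.
  gameState.on("valueChanged", (changed) => {
    console.log(`"${changed.key}" is now`, gameState.get(changed.key));
  });

  return gameState;
}
```

Because the shared state lives in the Fluid container rather than on any one client, a participant joining mid-game simply reads the current values and is immediately in sync with everyone else.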
For the camera input and the ability to interact with the MotionInput Windows application, we use NDI-Out technology, which is built into Teams. NDI-Out allows us to extract live video from a Teams call for MotionInput to interpret. Initially, we wanted to pair this with NDI-In technology, with ideas of creating some form of graphic overlay, but we ultimately opted for the recently released Live Share SDK instead. The Live Share SDK allowed us to build on the NDI-Out work, which is fully integrated into UCL MotionInput, and we are now working with the UCL MotionInput team to include the NDI feature in future builds of MotionInput, which are available via the Windows Store!
Reflection
Our experience using different Microsoft tools together was extremely smooth. Deploying our Teams application to Azure is built into the Teams Toolkit and therefore only took a few minutes. That said, we were working on a complex problem with multiple frameworks and services, and testing across a number of different devices, so it was a great learning experience as we got to deal with every type of issue! Experiencing some difficulties with a recent release is not uncommon, and it was simply part of the overall journey, albeit with a few bumps along the way; the Microsoft team were amazing mentors throughout the process.
Overall, we relished this rare opportunity to work with brand new technologies such as Live Share, and to be among the first developers to do so. We hold an underlying sense of accomplishment when presenting and discussing our work. Not only did we get to work with exciting new software, but we also had the guidance and assistance of Ayca Bas and Lee Stott, our fantastic project partners at Microsoft. They made the experience easier and more fulfilling, and they are the reason we are able to share our work with so many people. What we will take away is that we have built strong foundations for a revolutionary method of remote physiotherapy, and one that we hope can change many lives for the better.
What’s Next?
Live Share SDK Expandability
We have built a framework in Microsoft Teams to allow collaborative multiplayer games. The Live Share SDK is a new feature; given less than two weeks with the new product, we created a working framework that allows users in a Teams meeting to enjoy a collaborative experience inside our application. Our Live Share integration works for tic-tac-toe using mouse clicks, and similar logic can be translated directly into most other static one-button games that rely on mouse clicks, for example Reversi. Many games can therefore be implemented using our framework and played with MotionInput, as sketched below.
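As an illustration of how the tic-tac-toe pattern generalises to other click-driven games, the sketch below stores each move in the SharedMap from the earlier example and rebuilds the board whenever any participant plays. gameState is the SharedMap returned above, while renderBoard and localPlayerSymbol are hypothetical placeholders rather than our actual code.

```javascript
// Illustrative sketch of the shared tic-tac-toe board. Each cell click
// writes a move into the SharedMap, and every client re-renders on
// "valueChanged". renderBoard and localPlayerSymbol are placeholders.
function onCellClicked(gameState, cellIndex, localPlayerSymbol) {
  // Ignore clicks on cells that have already been played.
  if (gameState.get(`cell-${cellIndex}`)) {
    return;
  }
  // Writing to the SharedMap propagates the move to all participants.
  gameState.set(`cell-${cellIndex}`, localPlayerSymbol);
}

function listenForMoves(gameState, renderBoard) {
  gameState.on("valueChanged", () => {
    // Rebuild the nine cells from shared state so every client stays in sync.
    const cells = Array.from({ length: 9 }, (_, i) =>
      gameState.get(`cell-${i}`) ?? ""
    );
    renderBoard(cells);
  });
}
```

The same read-check-write pattern is all that a game such as Reversi would need; only the board size and the move-validation rule change.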
NDI Extensibility
Another possible extension is an NDI multiplayer feature. Building on our initial work with NDI, this would bring another level of compatibility. By NDI multiplayer we mean being able to switch the NDI video source feeding MotionInput on the fly, all while MotionInput is still running. Currently, if you are running MotionInput using an NDI video from the Teams call and you want to switch to another video, you have to stop and re-run MotionInput.
With NDI multiplayer integrated, all of this could be done while MotionInput is still running, which would make running physiotherapy sessions easier. Exciting features could also be added: for example, when someone clicks on the screen using motion, the NDI camera could switch to the next user in the meeting - imagine this for tic-tac-toe!
Data Tracking Framework
We have built an initial framework that tracks, captures and plots movement data. With extensive testing, we hope this can be used for physiotherapy, and integrated into Teams or MotionInput.
We have enjoyed working on this project, and will now hand over our work to a team of master's students at UCL. We hope they will be able to expand on our work, ultimately making progress towards our goal of making physiotherapy accessible to everyone!
Additional Learning Resources
Build and deploy apps for Microsoft Teams using Teams Toolkit for Visual Studio Code - Training | Microsoft Learn
Teams app camp (microsoft.github.io)
Thanks for reading!
Overall, this project has been an invaluable experience that all members of our team have loved working on. We have all learned an incredible amount, and we look forward to our work being developed into a product that can have a real impact on people's lives.