Academic Project Co-Supervisors at University College London:
Prof Dean Mohamedally, Prof Graham Roberts, Dr Nicholas Gold, Prof Neil Sebire, Prof Yvonne Rogers, Prof Joseph Connor (UCL)
In Collaboration with Microsoft:
Lee Stott, Honorary Associate Professor in Software Systems Engineering (UCL)
The COVID-19 pandemic has renewed interest in touchless computing in healthcare. This is largely motivated by infection prevention and control, and by a growing interest in consulting with and remotely monitoring patients in their own homes. Previous efforts to implement touchless computing have been limited by the cost and installation of new hardware, by software requirements, and by the effort of assembling solutions. I have been leading University College London (UCL)'s MotionInput v2.0 supporting DirectX project with Great Ormond Street Hospital for Children, in which our team was tasked with creating a low-cost and scalable method for enabling touchless computing on existing and new software, using standard webcams.
UCL’s MotionInput v2.0 supporting DirectX is a modular framework with four modules of customised gesture tracking from a webcam, built on open-source libraries: hand gestures, head movement, eye tracking and exercise recognition. These are connected by a common Graphical User Interface (GUI) to enable Windows-based interactions with all existing PC software managed by DirectX, including games, creativity, office and especially healthcare applications. Human motion gestures, such as hand and body motions, head and facial movement, and eye movement, are mapped from an RGB webcam video stream to input commands for existing applications and games. The framework uses published open-source libraries in a federated and locally processed way.
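The article does not detail how a tracked landmark becomes a cursor position, so the following is only an illustrative sketch. It assumes normalized landmark coordinates in the range 0 to 1, as produced by common open-source hand-tracking libraries, and a configurable area of interest; all names here are hypothetical, not MotionInput's actual API.

```python
def map_to_screen(nx, ny, aoi, screen_w, screen_h):
    """Scale a normalized webcam coordinate into screen pixel coordinates.

    aoi is (left, top, right, bottom) in normalized webcam space. A smaller
    area of interest means small hand movements can sweep the whole screen.
    """
    left, top, right, bottom = aoi
    # Clamp the landmark into the area of interest.
    u = min(max(nx, left), right)
    v = min(max(ny, top), bottom)
    # Rescale to 0..1 relative to the area of interest, then to pixels.
    rel_x = (u - left) / (right - left)
    rel_y = (v - top) / (bottom - top)
    return int(rel_x * screen_w), int(rel_y * screen_h)
```

For example, with the full frame as the area of interest, a hand landmark at the centre of the webcam image maps to the centre of a 1920x1080 screen; shrinking the area of interest rescales the same hand travel across the full display.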
System architecture of the prototype software
MotionInput v2.0 is designed to work across a broad range of contexts and with users of differing capabilities. Users can execute cursor commands, moving the cursor with their hands, eyes or head direction, and can perform mouse clicks with hand-pinching, eye-blinking and mouth-opening gestures. They can scroll with a hand or head movement. Physical exercises such as squats, jumps, cycling, air punches and kicks are also detected and measured as inputs for DirectX. New contributions include an idle state for activating and deactivating gestures, a dynamic area of interest, and depth variation captured from a 2D webcam.
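To make the pinch-click and idle-state ideas concrete, here is a minimal sketch, not MotionInput's actual implementation: a pinch is treated as the thumb and index fingertips coming within a threshold distance of each other (in normalized coordinates), and an idle state gates whether any gesture is allowed to emit an input event. The threshold value and all names are hypothetical.

```python
import math

PINCH_THRESHOLD = 0.05  # hypothetical distance threshold in normalized units


def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Detect a pinch as thumb-index fingertip distance below a threshold.

    Fingertips are (x, y) pairs in normalized webcam coordinates.
    """
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    return math.hypot(dx, dy) < threshold


class GestureState:
    """Minimal idle-state gate: gestures only emit events while active."""

    def __init__(self):
        self.active = False

    def toggle(self):
        # An activation gesture or hotkey would flip this in practice.
        self.active = not self.active

    def emit_click(self, thumb_tip, index_tip):
        # Pinches are ignored while the system is idle.
        return self.active and is_pinching(thumb_tip, index_tip)
```

The idle state matters in practice: without it, any incidental hand movement in front of the webcam would be interpreted as input, so users need a deliberate way to switch recognition on and off.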
Healthcare staff can navigate complex user interfaces, such as electronic health records and radiology images, touch-free using mid-air hand gestures. This includes free drawing on DICOM scans, and touchless panning and rotation of 3D imaging and anatomy. There are numerous occasions when a clinician's hands are occupied by clinical tasks, or when it is important to keep peripheral computer interfaces free from contamination.
Student project demo videos
Patients with motor impairments can use head movements or eye tracking to interact with their computer for both functional and creative tasks. This might include navigating web browsers, scrolling through PDF files, and creative expression through painting and playing music.
Student Demo Projects
Full-body rehabilitative and repetitive hand exercises can be performed from home and encouraged through playing existing Windows games, as well as Kinect games with skeletal-tracking capabilities. This might follow hospital treatment, to rebuild strength after an event, to improve motor skills, or as part of targeted conditioning guided by physiotherapists.
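Counting repetitions reliably is a core part of turning exercises into game inputs. The article does not describe MotionInput's exercise-detection logic, so the following is a hypothetical sketch of one common approach: tracking a single normalized body landmark (here, vertical hip position for squats) and using two thresholds with hysteresis so that jitter around a single level cannot double-count a repetition. All names and threshold values are illustrative assumptions.

```python
class RepCounter:
    """Count exercise repetitions (e.g. squats) from a normalized vertical
    hip position, using hysteresis to avoid double counting."""

    def __init__(self, down_level=0.7, up_level=0.5):
        # Image y grows downwards, so a larger hip_y means a lower hip.
        # down_level and up_level differ, giving a hysteresis band.
        self.down_level = down_level
        self.up_level = up_level
        self.is_down = False
        self.reps = 0

    def update(self, hip_y):
        if not self.is_down and hip_y > self.down_level:
            self.is_down = True    # user has squatted below the lower band
        elif self.is_down and hip_y < self.up_level:
            self.is_down = False   # user has risen above the upper band
            self.reps += 1         # one full repetition complete
        return self.reps
```

Each completed repetition could then be forwarded as a keystroke or button press to the game, which is what lets unmodified Windows titles respond to physical exercise.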
We have launched our community beta test programme for the public and enterprise sectors to participate in testing and refining this technology with their existing Windows software.