Nayana Dasgupta, Farhan Mahmood & Guide Limjumroonrat are Computer Science students at University College London and have been working with Microsoft and the UCL IXN on this project.
The aim of the project is to produce a rich and unique eye tracking game for individuals affected by severely disabling conditions such as Amyotrophic Lateral Sclerosis (ALS), Motor Neurone Disease (MND) and spinal cord injuries. Individuals with these conditions often retain good control of their eyes; however, there is a lack of applications and games that are designed for and fully support the use of eye tracking. As a result, we started developing Of Mice and Messages, a pipe-based eye tracking puzzle game where users solve each level to read the next message in an overarching storyline.
The development of eye tracking applications involves solving a range of interesting challenges, from preventing accidental selections to creating an accessible user interface for gaze interaction.
Our game has been released onto the Windows Store: https://www.microsoft.com/en-gb/p/of-mice-and-messages/9nhhnhwpb263?rtc=1&activetab=pivot:overviewta...
Our project website, with in-depth details of our research and the project, is available here: http://students.cs.ucl.ac.uk/2019/group23/index.html
See this video for an overview of the completed game.
The source code for Of Mice and Messages and Eye Tracking 2D is available here: https://github.com/OccularFlow
Creating an accessible and fun eye tracking game required us to not only create an original game concept, but also to ensure that the user interface and mechanics of the game are easy to use with an eye tracker.
After running an ideation session where we brainstormed ideas for the game, we decided to move forward with a pipe-based puzzle game that involved users directing water from one pipe to another by moving and rotating pipes from a dock to the grid.
To make the game interesting and varied, we added challenges such as obstacles and a funnel where users can store water while moving pipes around the grid. We also created a new level type where users rotate pipes already placed on the grid, and added a timer-based achievement system, where users earn gold or silver stars for completing levels within certain time limits and a bronze star for simply completing the level.
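To give a flavour of how pipe rotation and water flow can be modelled, here is a small, engine-free C# sketch. The names and the bitmask encoding are illustrative assumptions, not the game's actual representation: a pipe's openings are stored as a 4-bit mask over the four compass directions, a 90-degree rotation becomes a 4-bit rotation of that mask, and water can flow between two neighbouring cells when their facing sides are both open.

```csharp
using System;

// Illustrative pipe model: bit 0 = north, bit 1 = east,
// bit 2 = south, bit 3 = west opening.
public static class Pipe
{
    public const int N = 1, E = 2, S = 4, W = 8;

    // Rotate the opening mask 90 degrees clockwise (N->E, E->S, S->W, W->N).
    public static int RotateCW(int openings) =>
        ((openings << 1) | (openings >> 3)) & 0b1111;

    // True when `left` opens east and `right` opens west, so water can
    // flow horizontally between two adjacent grid cells.
    public static bool ConnectsHorizontally(int left, int right) =>
        (left & E) != 0 && (right & W) != 0;
}
```

With this encoding, an elbow pipe open to the north and east (`Pipe.N | Pipe.E`) becomes open to the east and south after one clockwise rotation, which is exactly the check a rotation-based level needs.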
Our next step was to create an identity for the game. We achieved this by adding a storyline, which then guided the game's graphic design. The story unfolds as the user delivers and reads messages sent between the game's characters by completing each level, adding an extra incentive to complete every level and finish the storyline.
We reviewed several eye tracking applications (for example, Microsoft Research's Eyes First games and the Eye Writer drawing software) and research papers (for example, Gaze Controlled Games by Isokoski et al.) to investigate the advantages and disadvantages of the various gaze interaction techniques available, e.g. dwell time, gaze and click, blinking and pupil dilation. (A full breakdown of our research can be found on the project website.)
A fundamental challenge in the development of eye tracking applications is the Midas Touch Problem: the difficulty of determining whether a user is selecting something with their eyes or simply looking at it. There are multiple methods to combat this issue, each with its own trade-offs, but one of the most commonly used is dwell-time based activation, where a user selects an object by fixating on it for a defined period of time. However, dwell-time based activation presents further challenges, such as user uncertainty about whether a selection has been made, and the difficulty of choosing a dwell time that avoids accidental selections without slowing down gameplay.
We adopted dwell-time based activation to solve the Midas Touch Problem and applied a range of techniques to overcome the user-uncertainty and dwell-time period issues. These included:
We also made all buttons large and easy to select with gaze, to account for inaccuracies in the gaze data collected from the eye tracker.
To simplify the game screen, we created a pause screen, which enables users to take a break from playing the game and allows them to change settings such as dwell-time, sound effects and background music.
We also created an interactive tutorial to teach users the game's concepts and how to control the game with gaze, and implemented mouse control so that the game can be played by a wider audience.
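The core of dwell-time based activation can be sketched in a few lines of engine-free C#. This is a minimal illustration under our own assumptions, not the game's actual Unity implementation: a per-target timer accumulates while gaze rests on the target, resets when gaze leaves, and fires exactly once when the dwell period is reached. Exposing a `Progress` value lets the UI animate the selection building up, which is one way to ease the user-uncertainty problem described above.

```csharp
using System;

// Illustrative dwell-time selector: call Tick once per frame with the
// frame time and whether the gaze point is currently on this target.
public class DwellSelector
{
    private readonly float dwellSeconds;
    private float gazeTime;

    public DwellSelector(float dwellSeconds) { this.dwellSeconds = dwellSeconds; }

    // Fraction of the dwell period completed (0..1), e.g. to drive a fill animation.
    public float Progress => Math.Min(gazeTime / dwellSeconds, 1f);

    // Returns true only on the frame the selection fires.
    public bool Tick(float deltaTime, bool gazeOnTarget)
    {
        if (!gazeOnTarget) { gazeTime = 0f; return false; } // looking away resets the timer
        bool wasComplete = gazeTime >= dwellSeconds;
        gazeTime += deltaTime;
        return !wasComplete && gazeTime >= dwellSeconds;
    }
}
```

Making the dwell period a constructor parameter mirrors the game's settings screen, where users can tune the dwell time to their own comfort.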
Unity provided the majority of the tools we needed to build this game. Since it is a 2D game, the mechanics are not overly complicated, which helped given that none of us had prior experience with Unity, C# or game development in general. That is not to say there were no hurdles: we came across two main issues while making this game.
The first hurdle was creating a realistic-looking water simulation for our game. This is where our lack of experience with Unity slowed our progress. Fortunately, Unity has an Asset Store where other developers release game components that anyone can download and use in their own projects. We downloaded an asset called Water2D, which included 2D water mechanics and a very simple-looking pipe, and this helped us get started.
However, the solution provided by this asset was far from what we needed. Over the course of the project, we spent a lot of time building upon the Water2D asset so that it followed the semi-realistic physics of our game, as well as optimising the code to be as efficient as possible. Eventually we reached a point where we had a reasonably good-looking water simulation that used only the water particles from the original Water2D asset.
The second obstacle we faced was adding eye tracking functionality to our game. Tobii provides an SDK on the Unity Asset Store that enables developers to implement eye tracking in their games. This sounded like the perfect solution for us; however, we discovered that the package was designed to work only with 3D games. We could have used several shortcuts to force the package to work with a 2D game, but we decided instead to produce a full, reusable solution ourselves.
The Eye Tracking 2D package takes eye tracking data from the Tobii API and makes it usable in 2D games. We not only used it within our own game to implement eye tracking, but also packaged the solution so that other developers can easily build 2D eye tracking applications by simply dragging and dropping scripts onto their game objects.
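At its heart, any such integration has to map a gaze point reported by the tracker onto objects in the 2D scene. The sketch below is an engine-free illustration of that mapping under assumed conventions (gaze coordinates normalized to 0..1 across the display, a rectangular grid of cells); the real Eye Tracking 2D package instead works with Unity's screen-to-world conversion and game objects, so names and layout here are hypothetical.

```csharp
using System;

// Illustrative mapping from a normalized gaze point to a grid cell.
public static class GazeToGrid
{
    // Returns (column, row), or null when the gaze falls outside the grid.
    public static (int col, int row)? Map(float gazeX, float gazeY,
                                          int cols, int rows)
    {
        // Gaze data is noisy; points outside the unit square are simply ignored.
        if (gazeX < 0f || gazeX >= 1f || gazeY < 0f || gazeY >= 1f) return null;
        return ((int)(gazeX * cols), (int)(gazeY * rows));
    }
}
```

Keeping the mapping in one place like this also makes it easy to swap the gaze source for a mouse position, which is how a game can offer both input methods from the same code path.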
We hope that this easy-to-use package will encourage other developers to create their own eye tracking games for individuals affected by severely disabling conditions.
We have published Of Mice and Messages onto the Microsoft Store and are in the process of releasing the Eye Tracking 2D Package onto the Unity Asset Store.
We aim to continue developing the game with additional features, for example new level modes and storylines, and the ability for users to create and share their own levels. We would love feedback from the community to help shape our development strategy.