Guest post by Lilly Sinek, a Computer Science student at UCL.
As a computer science student at UCL this term, I took part in a hackathon organised alongside Great Ormond Street Hospital to use their Azure FHIR API in an innovative way. I chose to work with augmented reality and create an AR clipboard that can be viewed through a mobile phone.
A lot has been made of AR and VR technologies and their potential to ‘revolutionise’ all sorts of industries, including healthcare – but simply put, we haven’t seen that happen yet. In reality, there are still many barriers to users adopting AR and VR, some of which may be solved by future technological breakthroughs, but many of which have more to do with user attitudes and experience. Putting on a headset makes some people feel motion sick; other users feel silly walking around with their phone held out in front of them to use AR; and because programmers are still unsure how to design for AR and VR, experiences are often created that are difficult to navigate. These are the kinds of problems that anyone designing AR/VR experiences has to contend with.
But amongst the barriers facing these technologies, I’m particularly hopeful about handheld AR applications – applications that use your phone and its camera to augment the world around you. These types of app are getting better and more widespread all the time, and as they become easier to use, more people get used to using (and designing) them. My favourite recent example comes from the London-based AR agency Arcade, whose Roxy the Ranger app for London Sealife lets kids interact with an AR avatar around the London Aquarium (https://arcade.ltd/sea-life-ar-chatbot/), and shows how quickly even very young people can understand and adapt to an AR setting in order to play games and puzzles involving the world around them.
In this context, I thought I would put some of my AR skills to use in the hackathon to see how AR could interact with FHIR data to create a useful experience for clinicians. The key question in my mind was, what could a clinician get from an AR experience that they could not get from a traditional phone app?
My answer, in this context, was a more humanised and reality-based experience. We all know how frustrating it can be to sit in a consultation trying to explain something to a doctor or nurse who is immersed in a screen – it breaks the human-to-human connection that is so important in a patient-clinician interaction. From the clinician’s point of view, facing away from the patient towards a screen, unable to look them in the eye, is not ideal either, and might lead them to miss crucial diagnostic information because they are distracted or frustrated with the technology. For me, this is the true potential advantage of AR: it doesn’t have to draw the user away from the real world – in fact, it should draw them into it.
I went into the hackathon with high-reaching aspirations for apps that could project patient information and observations onto the human body – imagine a clinician conducting an examination of a patient being able to look through their phone and see, superimposed onto the patient’s body, indicators of where they had had previous procedures or incisions, for example.
In reality, this was not something that could be achieved in a two-day, part-time hackathon! So I decided to go for something simpler and design an AR clipboard to display real-time patient data fetched from the FHIR API. In a hospital, a patient’s observations and personal information are typically displayed on a clipboard next to their bed. Not only does this raise privacy concerns, since the data can be viewed by anyone approaching the bed, but the clipboard also has to be updated by hand or by printing new sheets whenever the patient’s condition changes. A handheld mobile application could fetch and update this information in real time, perhaps in future even pulling observation data automatically from IoT-connected medical devices, like heart rate monitors. The addition of AR allows the clinician to view this information superimposed on a real-life clipboard, giving the patient-clinician interaction a more familiar and “analogue” feel.
I chose to use the Unity game engine and its AR Foundation package, a Unity-provided library that allows cross-platform development of AR apps for both iOS ARKit and Android ARCore simultaneously. This makes for convenient and quick prototyping of AR apps that are compatible out of the box with a large number of devices.
Using AR Foundation’s built-in functions, it was relatively easy to create a trigger-image recogniser and a flat plane that would be superimposed onto the trigger image once it was detected in the camera’s field of view. I wrote a separate script to handle connecting to and fetching information from the FHIR API, using the Newtonsoft Json.NET library for parsing. It’s not best practice, but I didn’t have enough time to hand-code a Patient data object from scratch (FHIR patient records are quite large and complex), so I generated one automatically using https://quicktype.io. From there, a few lines of code sufficed to pull the fields I wanted from the data object, and after some fiddling with global text objects I managed to get them displayed on the superimposed plane as text. A UI text input field and button allowed the user to request a different patient’s data by ID number – pressing the button sent another request to the FHIR API, updating the stored data and re-displaying it on the plane. You can see the basic architecture of this simple application below:
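To give a flavour of the fetch-and-parse step, here is a rough sketch in Python rather than the C#/Newtonsoft code from the actual prototype. The `patient_url` helper and the trimmed sample resource are illustrative only – the field names follow the standard FHIR R4 Patient resource, but the values are invented:

```python
import json

# Trimmed example of a FHIR R4 Patient resource, as returned by
# GET {base}/Patient/{id}. The values here are invented.
SAMPLE_PATIENT_JSON = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"use": "official", "family": "Smith", "given": ["Jane"]}],
  "gender": "female",
  "birthDate": "2014-03-07"
}
"""

def patient_url(base: str, patient_id: str) -> str:
    """Build the FHIR REST URL for a single Patient resource."""
    return f"{base.rstrip('/')}/Patient/{patient_id}"

def patient_summary(raw_json: str) -> dict:
    """Extract the handful of fields the AR clipboard displays."""
    patient = json.loads(raw_json)
    name = patient["name"][0]  # take the first recorded name
    return {
        "id": patient["id"],
        "name": " ".join(name.get("given", []) + [name.get("family", "")]).strip(),
        "gender": patient.get("gender", "unknown"),
        "birthDate": patient.get("birthDate", "unknown"),
    }

print(patient_url("https://example-fhir-server/fhir", "example-123"))
print(patient_summary(SAMPLE_PATIENT_JSON))
```

In the Unity version, the role of `patient_summary` is played by the quicktype-generated class plus Newtonsoft’s deserialiser, and the resulting fields are written into the text objects anchored to the superimposed plane.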
I managed to create a simple working prototype in the time available, but I was more excited about the future potential of this project. I created some mockups to show the more detailed kind of information that might be displayed, in future, in a similar AR app.
In all honesty, it felt silly to be developing a healthcare app without talking to actual clinicians about how they would use it. I am very far from being a medical professional, or from understanding the workflow of an average hospital. I did have a few informal conversations with friends and family members who work in healthcare about whether they could see themselves using something like this – the answer was almost always “possibly”, depending hugely on the ease of use and performance of the application. In fast-paced and stressful medical settings, the last thing anyone wants to be doing is fumbling around with an application that is slow, buggy, or counter-intuitive.
For this reason, I feel strongly that more work needs to be done to understand clinicians’ real needs and the gaps in their workflow that AR could fill. Part of that work is developing prototypes to demonstrate what might be possible in future, and I hope that my project in this hackathon can contribute towards that.