I had the pleasure of spending my weekend at the University of Sheffield for the first iteration of HackMed, a hackathon aimed at bringing together medical students (Jannah, Wai Ching and Andrew) and Computer Science students (myself and Dorian).
Before I even sat down, I mentioned some of the capabilities of AR and how well suited the technology is to instant visual feedback. One issue raised was the time wasted searching for reports, patient history and other diagnostic information as a doctor enters the room. Our solution would replace this by simply sticking a tracking image unique to the patient above their bed; this would trigger multiple windows to appear and display important information such as vitals, allergies, current medication, etc.
A bonus would be that regardless of where the doctor is based in the hospital, provided they have access to the tracking image, this information can be pulled down.
We spent some time thinking of the perfect punny name for our AR creation, as this was an obvious priority for us. DocChARt, as suggested by Andrew, was too good not to use.
After spending a solid hour trying to work out why tracking wasn't working (because I hadn't enabled a package), I began development on the app and experienced the joys of working with UI elements and positioning them in real space.
We also created a mini hospital room and a fake patient to demonstrate this.
- The patient’s tracking image was attached to a folded strip of paper, allowing it to be hot-switched out.
- Our patient was a whiteboard marker with a sad face drawn on a rolled strip of paper.
- The bed was a whiteboard rubber with a layer of tissue wrapped around it, then another layer, folded at the top for the ‘cover’.
- Wai Ching did some magic and added a WINDOW made from the head of a transparent plastic spoon.
- Support beams were added around the model using dismembered spoons and forks.
All of this was then placed on five plastic cups to give us a bit of extra height and keep unnecessary background noise out of the image capture.
Our hospital room with our AR-placed elements around the room. Please take note of my hilarious hospital-themed name for Katniss
Diego thankfully managed to create a great-looking UI, so the app would be something other than the solid-coloured backgrounds typical of my attempts at 'design'. He also created an iPhone app with a Touch ID login system, sparing us my password-field system, which would have been quite clunky in a practical environment.
To make sure the demo was as realistic as possible, I implemented some live vital charts for our left panel. In practice this data would be fed from a server, but I wanted to simulate it without using any costly functions, as the code would run every frame. Due to the costly nature of the HoloLens, the app would also need to run on mobile, hence the need for efficiency.
After some input from both Wai Ching and Andrew, I managed to generate realistic-looking patterns for the various kinds of data ECG machines typically provide. I scrapped my original system for something less costly, inspired by a tutorial on repeating backgrounds that we worked through in my first year at university.
To do this, I created a flat 2D image whose waveform started and ended at the same height, dead centre. I then wrapped this around a quad and, almost like magic, I was able to adjust the material offset to simulate the waves, all without using any expensive functions. The randomiser also has min/max and delay options, allowing graphs such as blood pressure to update more slowly.
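In Unity the scroll itself is just a material-offset update each frame; the interesting part is the wrap-around and the randomised speed. Here's a rough sketch of that logic in Python (the class and parameter names are my own for illustration, not the hackathon code):

```python
import random

class ScrollingTrace:
    """Simulates a looping ECG-style trace by advancing a texture's
    horizontal UV offset each frame instead of redrawing the waveform.
    Because the waveform image starts and ends at the same height,
    the offset can wrap from 1.0 back to 0.0 with no visible seam."""

    def __init__(self, min_speed, max_speed, delay):
        self.min_speed = min_speed  # slowest scroll speed (UV units per second)
        self.max_speed = max_speed  # fastest scroll speed
        self.delay = delay          # seconds between random speed re-rolls
        self.offset = 0.0           # current horizontal UV offset, in [0, 1)
        self.speed = random.uniform(min_speed, max_speed)
        self._timer = 0.0

    def update(self, dt):
        """Advance the offset by speed * dt, wrapping at 1.0.
        Called once per frame with the frame's delta time."""
        self._timer += dt
        if self._timer >= self.delay:  # periodically pick a new speed
            self.speed = random.uniform(self.min_speed, self.max_speed)
            self._timer = 0.0
        self.offset = (self.offset + self.speed * dt) % 1.0
        return self.offset

# A slow trace (e.g. blood pressure) next to a fast one (e.g. ECG):
bp = ScrollingTrace(min_speed=0.05, max_speed=0.1, delay=2.0)
ecg = ScrollingTrace(min_speed=0.5, max_speed=1.0, delay=0.5)
for _ in range(60):  # simulate one second at 60 fps
    bp.update(1 / 60)
    ecg.update(1 / 60)
```

The per-frame cost is a couple of multiplies and a modulo, which is why the technique beats regenerating chart geometry every frame on mobile-class hardware.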
And our project wouldn’t be complete without a website. Despite a background mostly in medicine, Jannah amazingly put together a beautiful site for us in less than 24 hours: http://curvograph-larve.dyn1.push2.io/
It was a joy to work with such a talented, diverse team; this has easily been one of my most enjoyable projects to work on to date.
That said, this post wouldn’t be complete without a swag-shot: