In May of 2016, I entered the Seattle Microsoft HoloLens Hackathon.  We had 44 hours to build a project with Unity and the Microsoft HoloLens.  Our project's tech stack was C#, JavaScript, and the OSC protocol.  I had met Ryan James at the previous VR Hackathon and was really interested in his work with medicine, learning, and simulations.  I was happy to join his team, excited to design sounds for the project and work on my Unity chops.

The challenge was to create a simulation that trains medical practitioners to guide a catheter into the right side of a patient's heart, then make a precise puncture.  The target is the septum in the middle of the heart, so that surgeons can gain access to the left side.

Ryan explained that this common procedure spares surgeons from threading a catheter up through the aorta to reach the left chamber.  It's simpler to puncture through the middle.  We wanted to understand how augmented reality could improve this procedure.

Transseptal Puncture and Catheter Integration Simulation


We set out to create a simulation that provides a better view and better practice, using a hologram of the heart.  Dozens of X-rays are usually taken during the procedure, with dyes injected into the tissue for visual feedback.  One advantage of the HoloLens is that it can display a 2D close-up and a 3D view simultaneously; this was an early team UI decision.  We also knew the HoloLens would be great for monitoring vitals and managing patient information privately.

Ahmad Aljadaan and I set out to design the audio cues early on so that our voice actor, Bridget Swirski, could record them.  We quickly realized how important these cues are to the user interface.  The audio cues guide most of the experience: they give the user feedback when they are on and off track in the procedure, and give directions as they guide the catheter and needle through the inside of the heart and arteries.  We chose a female voice, as suggested in a lightning talk by the HoloLens spatial audio design team; the human voice is especially easy to localize in an AR environment.
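As a rough sketch of how that on/off-track feedback could be wired up in Unity (the clip names, waypoint fields, and distance threshold here are hypothetical, not our exact hackathon code):

```csharp
using UnityEngine;

// Hypothetical sketch: speak an "on track" or "off track" cue whenever the
// catheter tip drifts past a distance threshold from the next waypoint.
public class GuidanceCues : MonoBehaviour
{
    public AudioSource voice;            // spatialized AudioSource placed near the heart hologram
    public AudioClip onTrackClip;        // e.g. "Good, keep advancing"
    public AudioClip offTrackClip;       // e.g. "Pull back and re-aim"
    public Transform catheterTip;
    public Transform targetWaypoint;
    public float offTrackDistance = 0.02f;   // meters; made-up threshold

    bool wasOffTrack;

    void Update()
    {
        bool offTrack =
            Vector3.Distance(catheterTip.position, targetWaypoint.position) > offTrackDistance;

        // Only speak when the state flips, so the cues don't repeat every frame.
        if (offTrack != wasOffTrack && !voice.isPlaying)
        {
            voice.clip = offTrack ? offTrackClip : onTrackClip;
            voice.Play();
            wasOffTrack = offTrack;
        }
    }
}
```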

These audio cues helped us refine our approach into the experience storyboard:

We knew the end user would need a granular way to control the catheter and needle.  OSC seemed like a great choice: it's a low-latency protocol (typically sent over UDP) designed for musical instrument control.  It already works with iOS and Android, and I found some Unity packages for us to modify.
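For illustration, here's a minimal sketch of what the Unity side of that bridge can look like, assuming TouchOSC sends a single float on a made-up address like /catheter/advance; at the hackathon we adapted an existing OSC package rather than hand-rolling a parser like this:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Hypothetical sketch: listen for OSC packets over UDP and map one float
// argument to a catheter "advance" value another script can read.
public class OscCatheterInput : MonoBehaviour
{
    public int listenPort = 8000;   // TouchOSC's default outgoing port
    public float advance;           // latest fader value, 0..1

    UdpClient udp;

    void Start()
    {
        udp = new UdpClient(listenPort);
    }

    void Update()
    {
        // Drain any packets that arrived since the last frame.
        while (udp.Available > 0)
        {
            IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
            byte[] data = udp.Receive(ref remote);
            ParseOsc(data);
        }
    }

    void ParseOsc(byte[] data)
    {
        int pos = 0;
        string address = ReadOscString(data, ref pos);   // e.g. "/catheter/advance"
        string typeTags = ReadOscString(data, ref pos);   // e.g. ",f"

        if (address == "/catheter/advance" && typeTags == ",f")
        {
            // OSC floats are big-endian 32-bit IEEE values; swap for a little-endian host.
            byte[] f = { data[pos + 3], data[pos + 2], data[pos + 1], data[pos] };
            advance = BitConverter.ToSingle(f, 0);
        }
    }

    // OSC strings are null-terminated and padded to a 4-byte boundary.
    static string ReadOscString(byte[] data, ref int pos)
    {
        int end = pos;
        while (end < data.Length && data[end] != 0) end++;
        string s = Encoding.ASCII.GetString(data, pos, end - pos);
        pos = (end + 4) & ~3;   // skip null terminator(s) and padding
        return s;
    }

    void OnDestroy()
    {
        udp?.Close();
    }
}
```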

TouchOSC Custom Controller Interface I designed for Single-Hand Operation (accelerometer control was a little squirrelly for surgery)


We crunched and coded right up to the start of our presentation.  The scope of our project was a little big for the weekend, but we got a lot done.  Sometimes a project outgrows its hackathon, and this one certainly did.

Credits

Ryan James: Lead Developer

Ahmad Aljadaan: Developer, UX

Mark Laughery: Scene Design, UX

Jigesh Parekh: OSC Integration, 2D Design, Presentation

Andrew Luck (me): Sound Design, UI, OSC Integration, Presentation Design, Video Editing, 2D Motion