Technology in Medical Education

Like many of you, I watched Apple's much-anticipated reveal of the new iPhone 8 and iPhone X. Sure, they're pretty, they have nice cameras, wireless charging, and Face ID, but what really caught my attention was the new augmented reality platform. iOS 11 will launch this month with ARKit, a set of developer tools that will make it much easier to design apps with augmented reality experiences. I am not a techie or a gamer, and I have only minimal experience with virtual reality (VR) or augmented reality (AR). However, I think these technologies have huge potential for medical education.

VR and AR are already being used in med ed. These technologies can facilitate basic anatomy instruction, procedural training, and simulation of patient interactions. More importantly, learners can view the anatomy and practice their skills without the need for cadavers, pig labs, standardized patients, etc. Don't get me wrong, I loved my anatomy labs. They play an integral role in medical education. But these labs are expensive to operate and you can only dissect each cadaver once. AR and VR provide students with a realistic learning environment and require fewer resources than conventional methods.

Virtual Reality (VR): immerses users in a realistic, fully computer-generated world they can interact with. Typically requires a headset or goggles.

Augmented Reality (AR): blends virtual content with the real world. Users interact with virtual objects in their actual surroundings and can distinguish between the two. May use a headset or goggles, but can also be done with a handheld device.

Several VR medical training platforms are already available. Some focus on creating realistic, emotional patient interactions, while others offer more skills-based applications. Osso VR, for example, provides a realistic surgical training environment, allowing surgeons to go through the steps of performing an operation in an immersive VR environment. Their product is currently geared toward orthopedics, but they plan to expand to other specialties and procedures. The real advantage here is repetition: trainees can perform procedures and operations multiple times in VR before ever performing them on a live patient. In the Emergency Medicine world, this would be especially useful for practicing rare procedures, like cricothyroidotomy and pericardiocentesis.

AR platforms, such as Microsoft HoloLens, also have intriguing implications for med ed. With HoloLens, the wearer sees a holographic image projected in front of them in space, but also sees the room and other people as they are (hence AR, not VR). For example, students can examine the anatomy of the heart in 3D space, rotating it and looking inside the organ. Even more exciting is the way students can interact with the hologram, peeling back layers, watching blood pump, or adding labels. Case Western Reserve has been piloting HoloLens with great success and is now working on creating a full digital anatomy curriculum.

So, where does the iPhone fit in? The new iPhones (8, 8 Plus, and X) all support apps built on Apple's new AR platform, and I suspect the next generation of iPads will as well. While this may not be quite as cool or realistic as an immersive VR environment or an interactive hologram, it is far more accessible. Developers are probably already hard at work building educational apps that will soon be available to anyone with an iPhone. Whereas products like Osso VR and HoloLens are still largely limited to classrooms and simulation centers, any student, resident, or practicing health professional will be able to access the iPhone's AR apps anytime, anywhere (and, probably, at a much lower price). Just check out Insight Heart for a preview of what's ahead. I'm calling it now: this will be a game changer.