Recently, while googling some keywords related to Medical Augmented Reality, I stumbled upon a beautiful augmented reality system built around a tablet PC. And, what was even more exciting: it seemed to be used in a real surgery! Following up on the credits given in the video, I ended up with an interview with Prof. Dr. Hans-Peter Meinzer. Prof. Meinzer supervises a group of developers at his department at the Deutsches Krebsforschungszentrum in Heidelberg. Together with his medical collaborators, he pushed the system out of the lab and into the real world. The interview was held in German on February 16, 2012. Here is the English translation.

Dear Prof. Meinzer, I found your video on YouTube titled “MITK pille – mobile medical augmented reality App for the Apple iPad”. Can you give us a little more background on this project?

The project is now about 18 months old. At the very beginning, we called it “Pille” after Doctor McCoy of the starship Enterprise, whose nickname in the German dub is “Pille”. He had a machine to look inside his patients for diagnostics. At that time, we had no clue what this machine was called; now we know it is the “Tricorder”. However, we changed the name over time. In the US, nobody understands the reference, since in the American original Dr. McCoy's nickname is “Bones”. For this reason, we have now named the project “Bones”. The project is still at a very early, experimental stage. However, I love it, my working group loves it, and we have a lot of fun with it. In fact, we are rather surprised by the success of the project.

Do you use an iPad in your video?
Yes, that is correct. It also works with an iPhone, but then the screen becomes very small. Actually, we started developing with old iPhones, since iPads were not available at that time.

How does your system work?
The software was already there (MITK – the Medical Imaging Interaction Toolkit). The major issue we had to deal with was the link between the tablet PC or iPhone and the patient. Of course, an iPhone is not an X-ray device. The image data has to exist already, e.g. from computed tomography, ultrasound or another modality.

So, how did you link the handheld tablet PC with the patient?
Well, we systematically tested six different tracking systems at my department. There are optical, magnetic, and mechanical tracking systems, and usually two or three comparable products for each approach. But there is yet another tracking technique, which we developed in an earlier project in cooperation with the urologists. There, we stick small needles into the prostate. An endoscope or laparoscope camera then sees the plastic heads of those needles. Using the positions of the plastic needle heads in the video images, we can calculate the pose of the camera with simple trigonometry. We developed this tracking system in order to look into the prostate and to show the nerve bundles, the ureter and the tumor. This kind of tracking approach is called laparoscopic inside-out tracking.
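The “simple trigonometry” behind such marker-based tracking can be sketched with the pinhole camera model. This is an illustrative assumption, not the MITK implementation: given the known physical spacing between two marker heads and their apparent separation in the image, one can estimate the camera-to-marker distance; applying the same relations to more markers yields the full camera pose.

```python
# Minimal pinhole-camera sketch (hypothetical, not the actual MITK code):
# similar triangles relate real-world marker spacing, image spacing, and
# camera distance.

def distance_from_markers(focal_px, real_spacing_mm, pixel_spacing_px):
    """Pinhole model: pixel_spacing / focal = real_spacing / distance."""
    return focal_px * real_spacing_mm / pixel_spacing_px

def project(focal_px, offset_mm, distance_mm):
    """Inverse direction: project a lateral offset (mm) at a given depth
    onto the image plane (px)."""
    return focal_px * offset_mm / distance_mm

# Example: focal length 1000 px, markers 50 mm apart, imaged 125 px apart.
d = distance_from_markers(1000.0, 50.0, 125.0)
print(d)  # 400.0 (mm)
```

The same proportionality, applied to several non-collinear marker heads, constrains both the position and the orientation of the camera relative to the patient.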

Then we had the idea of attaching such markers to the skin in order to look into the patient from the outside. The tracking software was already there from the project I mentioned before. However, the major problem was how to use the camera of the iPhone to detect those markers for inside-out tracking. And here it really becomes a job for hackers. For this reason, I founded a working group, which I call the “Nerds”. Those guys are exceptionally cool. It is quite hard to find somebody who is able to program iPhones, but those guys know how to do it. I'm pretty proud of them. 18 months ago the first one started to work with me, and now there are seven of them.

They found a solution: stream the iPhone camera images to a second PC via WLAN and compute the trigonometry on that external computer. An augmented reality image is then composed and streamed back to the iPhone. This approach is quite unique and hard to reimplement. Although the tracking technology had been around for some time, no one had thought of our iPhone approach before.
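The round-trip architecture described here can be sketched in a few lines. This is a hedged stand-in, not the actual system: the handheld sends a camera frame over the network to a workstation, which computes the overlay and streams the composited result back. Here both ends run in one process over a localhost TCP socket, and the frame and overlay are placeholder byte strings rather than real image data.

```python
# Hypothetical sketch of the handheld <-> workstation streaming loop.
import socket
import threading

def workstation(server_sock):
    """Accept one frame, composite a (placeholder) AR overlay, send it back."""
    conn, _ = server_sock.accept()
    with conn:
        frame = conn.recv(4096)            # receive the camera frame
        overlay = b"AR(" + frame + b")"    # stand-in for tracking + rendering
        conn.sendall(overlay)              # stream the composited image back

server = socket.socket()
server.bind(("127.0.0.1", 0))              # ephemeral port for the demo
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=workstation, args=(server,), daemon=True).start()

handheld = socket.socket()
handheld.connect(("127.0.0.1", port))
handheld.sendall(b"frame-0042")            # the "camera frame"
result = handheld.recv(4096)               # the composited AR image
handheld.close()
print(result)  # b'AR(frame-0042)'
```

In the real system each frame would of course be a compressed camera image, and the latency of this round trip is what bounds the achievable frame rate on the handheld.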

Did you publish your solution already?
Well, only partially. Recently, Prof. Dr. Rassweiler (Urology, Heilbronn) and I published a paper in the Journal of Endourology about the system's application to the kidney.

Oh yes! The story went like this. Some time ago, I met Prof. Dr. Rassweiler at a conference. I told him about our approach and took him into our lab. There, we looked together into the abdomen of a phantom in our experimental setup of the system. He immediately said that he wanted to use it. He then took my people with him to visualize a renal calyx. In this procedure, doctors insert their instruments into the renal calyx to perform laser-based fragmentation of the renal calculi. They do not enter through the urethra; rather, they penetrate the tissue from outside to reach the renal calyx. And that is what he did. Actually, radiologists said that they do not need such a system; they need nothing but their 2D images. In contrast, urologists, surgeons, trauma surgeons and orthopedic surgeons went totally crazy for it!

How did you then use the system during surgery?
In the two surgeries performed with the present augmented reality system, Michael Müller, our main developer, held the tablet PC while the surgeon treated the patient. One problem became obvious: holding the tablet PC for a longer time turned out to be quite uncomfortable, and Michael began to jitter. The jitter also forces a recalculation for every new pose of the tablet PC. For the future, I can imagine a flexible arm construction holding the tablet PC once it has been moved into a suitable position.
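One common software-side complement to a mechanical arm is to smooth the estimated pose over time. The following is purely an illustrative assumption, not the system's actual filter: an exponential moving average per pose parameter averages out small frame-to-frame tremors while still following deliberate motion.

```python
# Illustrative jitter damping (hypothetical, not the MITK pipeline):
# exponentially smooth a sequence of pose vectors.

def smooth_pose(poses, alpha=0.2):
    """Exponential moving average; alpha close to 0 = heavy smoothing,
    alpha close to 1 = trust each new measurement."""
    filtered = [list(poses[0])]
    for pose in poses[1:]:
        prev = filtered[-1]
        filtered.append([alpha * p + (1 - alpha) * q
                         for p, q in zip(pose, prev)])
    return filtered

# Jittery x-translation readings (mm); the filtered track varies far less.
raw = [[100.0], [103.0], [98.0], [102.0], [99.0]]
print(smooth_pose(raw))
```

The trade-off is latency: the heavier the smoothing, the more the rendered overlay lags behind a genuine repositioning of the tablet.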

Where do you and the surgeons see the major advantage of the Augmented Reality System?

One is able to look inside! For example, you can use the tablet PC when you want to insert a needle into an intervertebral disc from the outside. You position the tablet, held by the flexible arm construction, above the spinal column. Now you can palpate the spinal column at the region of interest with your finger, which is also visible in the image on the tablet PC. You can then form a mental link between the haptic information and the visualized vertebrae below your finger. Using this information, the surgeon can identify and mark the exact entry point on the patient's back.

Besides defining the entry point of the needle, are you also thinking about navigating the needle to the right location?
Well, we are progressing quite fast at the moment. After the basic system worked, it became clear that we still needed to track the needle. Michael Müller attached a small plate with a dozen marker dots to such a needle. This set of marker dots can be seen by the camera and tracked by inside-out tracking as well. That way, we know the position of the needle. We then developed a program that helps guide the needle to its destination. This approach had already been proposed by Dr. Lena Maier-Hein for liver puncture. Here, one places the tip of the needle at the entry point on the skin surface. A cross is displayed to guide the tip of the needle, and a second point defines the destination. The entry point and the target point serve as the notch and bead of a gun sight to find the right orientation in which to insert the needle into the body. Then you advance the needle x cm towards the region of interest.
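The notch-and-bead idea reduces to a small piece of vector geometry. The sketch below is illustrative, not Dr. Maier-Hein's published method: the entry and target points define the planned trajectory, and the guidance value is the angle between that line and the tracked needle axis, which the on-screen display would drive toward zero before the needle is advanced.

```python
# Hypothetical guidance computation: angle between the planned
# entry-to-target line and the tracked needle axis.
import math

def angle_to_trajectory(entry, target, needle_tip, needle_tail):
    """Misalignment angle in degrees; 0 means the needle is sighted
    exactly along the planned path."""
    planned = [t - e for t, e in zip(target, entry)]
    axis = [tip - tail for tip, tail in zip(needle_tip, needle_tail)]
    dot = sum(a * b for a, b in zip(planned, axis))
    norm = math.dist(entry, target) * math.dist(needle_tip, needle_tail)
    # Clamp against floating-point overshoot before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Needle pointing straight down the planned path: no misalignment.
err = angle_to_trajectory((0, 0, 0), (0, 0, 100), (0, 0, 10), (0, 0, 0))
print(err)  # 0.0
```

Once the angle is near zero, the remaining guidance is purely the insertion depth along that line.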
