Healthcare from the perspective of a clinician, encompassing both the capture of the clinical viewpoint and the technology to help clinicians capture knowledge at the point of care
The thoughts expressed are my own and do not necessarily represent those of Nuance

I know I am fighting a tide of folks who like to hold on to paper and feel reading a book cannot be done unless you are holding paper printed with ink, but they said similar things about letters, which have been replaced (love it or hate it) by e-mail. As it is, our library system is struggling, and children's school textbooks remain an exercise in frustration: obsolete texts full of markings, answers and missing pages that cost parents money each year. But it's healthcare where we can really "slay the paper dragon".

Healthcare is the real paper dragon to slay, and Americans might even live longer if we acted. The National Institutes of Health and other leading institutions could more effectively distribute medical information to doctors and patients alike, and the sick could use the same machines to monitor treatments and manage their pills, not just track the financial details.

My own parents struggled with drug therapy, creating a spreadsheet (well, actually this was created by my brother and loaded up ready for updating by my mother) to track the multitude of pills, times and dosages necessary to comply with physician directions. Simplifying these instructions and sending them digitally in a form that can be consumed on an iPad-like machine (to be clear, this could even be a PC, but the advantage of the iPad is the instant on, instant connection to the internet and a reasonable compromise between screen size, usability, mobility and portability) would be very attractive to many seniors struggling with their own treatment.
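To make the idea concrete, here is a minimal sketch of the kind of structure that spreadsheet encoded. The regimen below is entirely made up for illustration - the drugs, doses and times are not medical advice:

```python
from dataclasses import dataclass

@dataclass
class Dose:
    drug: str    # drug name
    mg: int      # dosage in milligrams
    times: list  # times of day the dose is due, e.g. "08:00"

# A hypothetical regimen of the kind tracked by hand in the spreadsheet
regimen = [
    Dose("lisinopril", 10, ["08:00"]),
    Dose("metformin", 500, ["08:00", "20:00"]),
    Dose("simvastatin", 20, ["20:00"]),
]

def doses_due(regimen, time_of_day):
    """Return (drug, mg) pairs due at the given time of day."""
    return [(d.drug, d.mg) for d in regimen if time_of_day in d.times]

print(doses_due(regimen, "20:00"))  # the evening pills
```

A real tool would add reminders, refill tracking and physician updates pushed digitally, but even this simple structure beats re-keying a paper list every time a prescription changes.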

The cost of the healthcare paperwork mountain exceeds $1,000 per person in America, so anything that attacks this problem is going to be desirable. But what David Rothman is referring to is not just the technology of presentation but the underlying transportability (semantic interoperability), and he refers to the "magic of web links and facts consolidated via XML-based technologies". Unfortunately, the challenge for the current system is reaching that point of interoperability (Standards and Interoperability) given the history of paper and the wide variation in the representation of diseases, drugs and therapies. They told me when I first went to medical school that learning medicine was equivalent to learning a new language in terms of the added vocabulary required to communicate with my peers; in fact, Latin used to be required for any student wishing to study medicine - you can see some of the Latin terms in medicine here.

Normalizing these terms and extracting the data is the challenge facing healthcare. There is a clear need for narrative in communication - this is how clinicians best communicate clinical information amongst the team, and indeed to the patient (as can be seen here) - but clinical systems and the EHR need data, and data input is difficult. Bringing these two worlds together is the thrust of clinical language understanding, combining Natural Language Processing technology (NLP In Healthcare) with the emerging world of digitized medicine. David wants to

let patients themselves play more of a role in policing our health system, thereby lowering costs while actually taking up less of their time, thanks to the right automation

and they will (and must), but we have some significant steps to take to achieve his vision of dashboards and the easy and rapid sharing of information. At this point any small step is good news (every journey begins with but a small step), and the simple process of e-mailing patients actionable health tips based on the doctor's findings may seem mundane, but it's a start.
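The normalization problem described above can be sketched in a few lines. Real systems map variant terms to standard terminologies such as SNOMED CT or RxNorm; the lookup table here is entirely made up for illustration:

```python
# Illustrative only: these synonym mappings are invented, not drawn
# from any real terminology such as SNOMED CT or RxNorm.
SYNONYMS = {
    "heart attack": "myocardial infarction",
    "mi": "myocardial infarction",
    "high blood pressure": "hypertension",
    "htn": "hypertension",
}

def normalize(term):
    """Map a variant clinical term to a canonical form,
    falling back to the cleaned input if no mapping exists."""
    t = term.strip().lower()
    return SYNONYMS.get(t, t)

print(normalize("Heart attack"))  # -> myocardial infarction
```

The hard part in practice is that the variants are not known in advance - they have to be extracted from free narrative text, which is exactly where Natural Language Processing comes in.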

For now we have to deal with the existing system, navigating the insurance nightmare of costs and denials, all the while trying to keep up on treatment plans, drugs and therapies - and if you are like me, not just for yourself but for multiple family members. For now, in lieu of an iPad:

Get a full copy of your medical record

Get everything in digital form if at all possible, but if not, in printed form - you can always scan and convert to a PDF document with text using optical character recognition (OCR)

Get a copy of your problem list including an explanation for things you don't understand

Get a full listing of your current drugs, as well as ones you have taken in the past and stopped

Get your X-rays, again in digital form; on CD is good, but failing that, get the actual films

Educate yourself on your condition(s) - be an expert

We all face the same challenges but starting with a full set of information helps everyone. Over time doctors will be able to produce clinical records that contain the full story for the patient that includes the narrative and the data to help automate some of this activity. In the meantime you need to be part of the solution that coordinates your care.

Had good experiences or bad? Let me know. Seen your records - what did you think? Was it useful to have your medical record?

Monday, June 21, 2010

Along the lines of Deep Blue, IBM is breaking new ground with its latest research innovation, "Watson", focused on Natural Language Processing, applied in this instance to the well-known television game show Jeopardy!. Take a look at the video that features the supercomputer Watson pitted against contestants in a real game of Jeopardy!. The only accommodation for the "silicon-based" life form was providing the questions as text rather than requiring the additional step of speech recognition.

Certainly impressive, and looking like a real leap forward even with errors occurring. This is of course an enormous task for any computer, but even to achieve success in certain instances is extremely impressive and very exciting. Here we are, 13 years on from Deep Blue's famous feat of beating Garry Kasparov at chess. The New York Times featured this in the magazine over the weekend: Insert Title. As they point out, this is approaching the innovation we have seen on Star Trek

The computer on Star Trek is a question-answering machine, it understands what you’re asking and provides just the right chunk of response that you needed. When is the computer going to get to a point where the computer knows how to talk to you?

Well, it seems we have stepped a lot closer to the Hollywood vision that has been in place since the 1960s. In fact I have been making this point for a number of years: we have been fooled into believing speech recognition achieved much more than recognizing words. Consider Spock's original interaction with the computer in the original series

"Computer, compute to the last digit the value of pi" -- Spock (Wolf in the Fold)

This was asking for much more than just speech recognition: it included comprehension, and then actions based on that comprehension.
Over time we have seen many instances, but the challenge of comprehension is brought home in Star Trek IV: The Voyage Home, when Scotty discovers that speaking to a computer and expecting it to understand was beyond the capabilities of the day:
As we see (even in Hollywood), computers continue to struggle with complexity in language (Direction Unclear):
But with Watson's success in what is a good analogy for the complexity of human language, we are approaching the point of genuine interaction with technology, and as some of the contestants intimated:

Several made references to Skynet, the computer system in the “Terminator” movies that achieves consciousness and decides humanity should be destroyed. “My husband and I talked about what my role in this was,” Samantha Boardman, a graduate student, told me jokingly. “Was I the thing that was going to help the A.I. become aware of itself?”

I think we are still some way from this, but the change in approach is significant: rather than trying to teach computers all the variations of data and linkage, the system is allowed to "learn" by feeding in data and creating algorithms that link data statistically for future inference.

Much like the challenge in medicine, Watson applies extensive knowledge that has been previously analyzed and stored, and importantly applies multiple algorithms to come up with a ranked list of answers. In fact, of all the predictive systems available, those that take multiple predictions from different sources and then select the most frequent answer tend to be the most accurate:

Watson’s speed allows it to try thousands of ways of simultaneously tackling a “Jeopardy!” clue. Most question-answering systems rely on a handful of algorithms, but Ferrucci decided this was why those systems do not work very well: no single algorithm can simulate the human ability to parse language and facts. Instead, Watson uses more than a hundred algorithms at the same time to analyze a question in different ways, generating hundreds of possible solutions. Another set of algorithms ranks these answers according to plausibility; for example, if dozens of algorithms working in different directions all arrive at the same answer, it’s more likely to be the right one. In essence, Watson thinks in probabilities. It produces not one single “right” answer, but an enormous number of possibilities, then ranks them by assessing how likely each one is to answer the question.
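The ensemble ranking Ferrucci describes can be sketched in a few lines. The "algorithms" here are simple stand-ins that emit candidate answers, and the vote counts are invented for illustration - Watson's real scoring is far more sophisticated:

```python
from collections import Counter

def rank_candidates(algorithm_outputs):
    """Rank candidate answers by how many independent algorithms
    proposed them - convergent answers score as more plausible."""
    votes = Counter()
    for answers in algorithm_outputs:
        votes.update(answers)
    total = sum(votes.values())
    # Return (answer, plausibility) pairs, most plausible first
    return [(answer, n / total) for answer, n in votes.most_common()]

# Three hypothetical algorithms analyze the same clue independently
outputs = [
    ["Toronto", "Chicago"],
    ["Chicago"],
    ["Chicago", "Boston"],
]
print(rank_candidates(outputs))  # "Chicago" tops the list
```

The point is the probabilistic stance: no single answer is declared "right"; instead many candidates are generated and the agreement between independent approaches drives the ranking.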

Thinking about this system and its application to medicine, we are stepping increasingly closer to the analysis of multiple inputs of signs and symptoms, and subsequently examination, laboratory testing and imaging. A number of years ago I saw a similar solution in very basic form that analyzed inputs as they arrived and started to produce a shortlist for differential diagnosis. The limitations at the time related to computing power and inputs, and to some degree to the capture of knowledge in a form that could then be used. Watson turns this process on its head, providing a means to input knowledge in large quantities that can then be analyzed, cataloged and applied. There remains the question of what is valid information that can and should be accepted, but even with this problem, automatically processing the rapidly expanding knowledge base provides a means to help clinicians who today do not have the time to process all the moves/adds/changes to the clinical corpus of knowledge:

The problem right now is the procedures, the new procedures, the new medicines, the new capability is being generated faster than physicians can absorb on the front lines and it can be deployed

I don't see call centers being the route of interaction; much more likely is an adjunct tool providing clinicians at the point of care with guidance and shortlists of differential diagnoses, together with the steps (what additional history, examination or investigation) that can help rule out or confirm the various choices. This may not be a patient-level tool, but as an adjunct to clinical knowledge it is likely to offer significant support to clinical care and help improve the diagnosis and treatment of patients.
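A toy version of such a differential diagnosis adjunct might look like this. The disease/finding associations below are made up and grossly simplified - for illustration only, never for clinical use:

```python
# Toy knowledge base: diseases and associated findings.
# Invented and grossly simplified - illustration only, not clinical use.
KNOWLEDGE = {
    "influenza": {"fever", "cough", "myalgia"},
    "pneumonia": {"fever", "cough", "dyspnea", "chest pain"},
    "pulmonary embolism": {"dyspnea", "chest pain", "tachycardia"},
}

def differential(findings):
    """Score each disease by the fraction of its expected findings
    that are present, returning a ranked shortlist."""
    scores = []
    for disease, expected in KNOWLEDGE.items():
        overlap = len(findings & expected) / len(expected)
        if overlap:
            scores.append((disease, round(overlap, 2)))
    return sorted(scores, key=lambda s: s[1], reverse=True)

print(differential({"fever", "cough", "dyspnea"}))
```

A real system would weight findings by specificity and prior probability rather than counting them equally, but even this crude shortlist shows how new inputs can reorder the candidates and suggest what to ask or test next.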

Combine this with a speech recognition tool that accurately renders the clinical data and you have some level of real-time evidence-based medicine that will revolutionize healthcare. DoctorNet will become self-aware... very soon.

Thursday, June 10, 2010

Harvard Business Review blogger Jeff Goldsmith wrote a pretty damning write-up on the healthcare technology sector: "Has the US Health Technology Sector Run Out of Gas?". He covers the lack of recent innovation and development across the board, including pharmaceuticals, medical devices and even the once-promising biotechnology and personalized medicine/gene therapy concepts. But his summary of the healthcare IT sector was withering:

Enterprise clinical information technology seems to have hit a similar flat spot. The major commercial IT platforms for hospitals and health systems are more than a decade old. Some of the older platforms are written in antique computer languages like COBOL and MUMPS, which predate the Internet by 20 years. Despite a societal investment of more than $100 billion, these tools have yet to demonstrate that they can reduce the cost or improve the efficiency of patient care. They remain cumbersome, expensive to install, maintain and operate. The user interfaces feel a lot like Windows 95 in an iPhone era.

Yikes! Is it really that bad? There are probably plenty of clinicians, patients and even some IT vendors who might accept that in some cases it is - I bet many of you can still find a text-based system using some form of terminal emulation in use somewhere in your clinical facility. In fact, asking anyone to use systems with these kinds of interfaces seems wrong; such systems should either be pushed out into the digital graveyard where they belong or, at a pinch, hidden behind layers that shield the user from their idiosyncratic requirements and counter-intuitive user interfaces.

But there are innovations and new uses of technology: in the piece "E-health and Web 2.0: The doctor will tweet you now; Patients can now meet their doctors in 'the cloud'" we can see the adoption of this technology to provide a rapid response more suited to the new age of instant communication and the busy lives we lead today. It might be hard for a physician in the 1950s to understand the need for this speed of communication, but bear in mind the treatment choices in those days were limited. In fact, the father of a friend of mine at school was a physician, and he described to me his experience of a "crash call" or "code blue":

If a patient had a problem the nurses would summon the porter who would be dispatched to my room to wake me up. I would be woken by a knock on the door and informed there was a patient "going off" on Ward xxx. I would get up, get dressed, more often than not the porter would leave a cup of tea outside my door and I would take that and then leave for the ward. By the time I arrived one of two things had happened. The patient had either died or had improved of their own accord. Their syncope, myocardial infarction or whatever event that had taken place was either resolved or resolving or they had succumbed to that critical event. There was little we could do or offer in the early days and rushing to the ward made no sense

Today we live in a technology and information rich society where instant communication is expected and can and does make the difference. In fact:

Jeff Livingston, an obstetrician and gynecologist in Irving, Texas, said his 10-doctor practice has about 600 Facebook fans and more than 1,500 Twitter followers

That's no small following, and I am betting many social networking gurus and experts can only look at those numbers with envy. We don't fully understand how these technologies and communication systems will impact healthcare and the delivery system, but one thing is for sure: rapid innovation and change will be the status quo.

Thursday, June 3, 2010

A report is used to show that Americans could save while maintaining quality, but that may not always hold

I am reminded of the quote from my math teacher: "there are lies, damn lies and statistics". His point at the time was that figures can be manipulated to tell any story, and careful analysis is always warranted. In healthcare this is very evident with the deluge of product sheets and "studies" designed to persuade the medical profession that the latest drug is better than the current cheap generic.
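A classic worked example of my math teacher's point, with invented numbers: a hospital can have better survival than its rival in every patient subgroup yet look worse overall, purely because it takes the sicker patients (Simpson's paradox):

```python
# Invented figures: two hospitals, patients split by severity.
# Each entry is (survived, treated). Hospital A treats far more
# severe cases, so its aggregate rate looks worse even though it
# does better than B within every subgroup.
data = {
    "A": {"mild": (93, 100), "severe": (30, 100)},
    "B": {"mild": (800, 870), "severe": (10, 40)},
}

for hospital, groups in data.items():
    lived = sum(s for s, n in groups.values())
    total = sum(n for s, n in groups.values())
    by_group = {g: round(s / n, 2) for g, (s, n) in groups.items()}
    print(hospital, "overall:", round(lived / total, 2), by_group)
```

Which hospital is "better" depends entirely on whether you read the aggregate or the stratified figures - exactly why headline comparisons deserve careful, case-mix-adjusted analysis.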

Unfortunately, in this case, while there may be some refinement of the data necessary, this should not cloud the essential point hammered home by the study and the investigators. Publishing the data makes institutions aware of variations in standards and cost of care that must be explained, and will serve as the basis of future reductions in the cost of care and nationwide improvements in quality.

Amongst other highlights in the study, the link between higher utilization and failure to improve outcomes is troubling:

"The evidence is that higher utilization does not extend life expectancy, and might be correlated with shorter life expectancy, compared with lower utilization. Therefore, sending people with chronic diseases to higher-efficiency, lower-utilization hospitals for their care could result in both lower spending and increased quality and length of life."

It certainly provides caregivers, as well as patients, with data on which to base their care and treatment decisions.

We should debate the data, but the nature and extent of the variations exposed cannot be explained away, and they point to the issue of incentives in US healthcare today. We incent providers to do stuff: the more procedures, treatments and clinical tests you do, the more you get paid. Coupled with the increasing squeeze on compensation levels, the drive to increased utilization is easily understood.

One of the Six Sigma core principles is that you cannot improve unless you have data and measure. The Dartmouth study and data, while imperfect, are a significant step in the right direction and should be embraced and used as a stepping stone on the long and winding pathway of healthcare reform.

Wednesday, June 2, 2010

Mobility has long been a staple of clinicians' lives. With the increasing penetration of wireless connectivity we have seen a growing ability to access information on the move, but data capture has continued to be a challenge. Now technology is catching up, providing tools that will allow physicians to be effective anywhere, anytime and any place (almost).

As you can see from the video above and the original article posted on iMedicalApps.com (Nuance Medical Transcription iPhone Medical App), the app features both dictation for clinical notes and the ability to voice a medical search and return results from various clinical knowledge providers including Medscape, Epocrates, and Medline. Critical to this will be the SDK component, allowing other vendors to include these tools in their applications, speech-enabling clinicians who need access to information and the ability to capture clinical data while on the move. As Felasfa Wodajo, MD stated in his post:

..seems to have generated a fair bit of interest, judging from other websites and the traffic at the booth. I suspect this is justified as physicians are only too happy to get rid of their dictaphones and not have to sit in front of a computer microphone

This is just part of mobility, and integrating the right tools into an increasingly mobile and distributed clinical care team will need some innovation, especially when it comes to portability and power. Without belaboring the point, Apple seems to have done it again with the iPad, which has addressed the power issue with true all-day usability. The only question now seems to be whether it is portable enough not to weigh too heavily on the clinician carrying the unit. The ergonomic challenges that result from having to hold the device and write on it at the same time remain. This was true when tablet computers first hit the streets: a range of carrying cases appeared to help, but never really solved the challenge of the constant weight applied to a wrist held in a horizontal orientation.

Meanwhile, integrating alternative data capture methods to ease this burden will be important. How has your iPad experience been - is the weight and shape/size workable or not? Have you managed to use the dictation method to capture data, and was it effective?