Thursday, 23 October 2014

Healthcare Interoperability in Canada: Perfection is the Enemy of Good

Yesterday, as co-chair of the ITAC Interoperability and Standards Committee, I presented opening comments for an ITAC Health workshop on Interoperability. Details of the event can be found here. Below is the text of my opening comments.

The Problem

I’m a software developer who got into healthcare about 10 years ago. When I joined healthcare, I was surprised by a number of things I saw. Things like:

Records are stored on paper and exchanged using paper fax.

The software behind the desk looks like it was written in the 1980s or 1990s.

The endless transcribing and repeated oral communication at every encounter is reminiscent of medieval monasteries: in a single week, a patient can repeat their entire medical history to multiple clinicians and dump out their bag of drugs for each and every one of them.

Data exchange, if it happens at all, is often extracted directly from the EMR database (bad practice) and looks like lines of custom pipe-delimited text from my Dad’s generation.

In short: Why hasn’t technology revolutionized healthcare like it has every other industry? It feels like Canadian Healthcare is still stuck back in the last century. Not much has changed in the last 10 years.

Healthcare IT in Canada is behind the rest of the world by most measures. Even the U.S., which is committed to doing everything the hard way, is years ahead of Canada when it comes to Healthcare IT. How did we get here? How can we fix it?

How did we get here?

You can’t blame Canada for lack of trying. We have invested billions of dollars into major eHealth initiatives right across the country. There has been a decade-long project to introduce new healthcare interoperability standards across Canada, organized under a Pan-Canadian EHR Blueprint to get everyone connected into centralized EHR repositories. We were promised that everyone would have a shared electronic health record accessible by all providers by 2015. We’re not going to make it. What happened?

If I were to pick one overarching theme it would be this: Perfection is the enemy of Good.
I’ve seen numerous projects get derailed by intricate Privacy and Security tentacles that grow out of monstrous consent models. Time and time again we have held up perfectly secure and functional eHealth initiatives because we’re pursuing an absolutely comprehensive and airtight privacy and security model around it. These delays cost lives. It’s too easy to indefinitely postpone a project over privacy and security hand waving.

Another issue I’ve seen hold Canada back is our fantasy that each province is a unique flower, requiring completely different infrastructure, software, and its own independent standards committees and EHR programs. Get OVER yourselves. We will all save a heck of a lot of money when the provinces just get together and present Canada as a single market to the international Healthcare vendor community, rather than as a balkanized collection of misfits.

From a software developer’s perspective, I can tell you one issue that contributed to delaying Canada’s eHealth agenda is the quality of our Interoperability Standards. I’ve heard people say, “I don’t care what message standard you use to move your data around—the technology is irrelevant—the interoperability standard isn’t the problem.” To this, I say “hogwash!” I’ve seen good APIs and I’ve seen bad APIs. The “P” in “API” stands for “Programmer.” If you want to know whether a proposed API is any good, you have to ask an experienced programmer. If you take a look at the HL7v3 standard, it looks to me like they skipped this step. If it costs 10 times as much effort to implement one API over another, that’s a sign there’s probably a problem with your API.

I think when the whole Canadian HL7v3 thing started out, there were a number of vendors involved in the process. But one by one they dropped out, and the torch was left to be carried by committees of well-intentioned, but ultimately misguided information modellers.

We in the Canadian vendor community need to take some responsibility for letting this happen.
Smaller vendors didn’t get involved because they couldn’t afford to—many were just struggling to survive in the consolidating landscape. The tragedy here is they will be the ones most affected by lack of interoperability standards.

Larger vendors arguably stand to benefit the most from a Wild West, devoid of easy-to-use interoperability standards where their Walled Fortress can be presented as the only fully interconnected show in town!

But simply falling into the arms of a handful of large vendors will have a cost for all of us in the long run. That cost is innovation. It’s in our best interest to start seriously thinking about supporting a manageable collection of simple, proven interoperability standards.

How can we fix it?

Vendors are the custodians of the most experienced technical minds in Canada. We need to bring these minds together and take on this problem. We can’t afford to continue complaining, wiping our hands of responsibility and expecting government to figure it out for us. We need serious software engineers at the table, rolling up our sleeves, and getting this job done.

Now it’s easy to say that. But what can we practically do to move this forward? I recommend three things.

1. We need something in Canada akin to the IHE working groups they have in the U.S.: a focal point for vendor input on the direction interoperability standards will take in Canada. This needs to happen at the national level.

2. We need to leverage infrastructure that is already deployed, and we need to leverage standards that have already been successfully implemented in other parts of the world. This will mean moving forward with a plurality of standards, such as IHE XDS, CDA, HL7v2 and HL7v3, and potentially even FHIR.

3. We need to strive for simple, clear and unambiguous interoperability standards. It’s not enough to say you broadly support a standard like HL7v2. You need very specific conformance processes to go along with it that ensure my HL7v2 messages have exactly the same Z segments and use exactly the same vocabulary as your HL7v2 messages.

A bit more on the last point. Along with each standard, you need to have, at a minimum, content specifications and vocabulary bindings. And by this I don’t mean a 400-page Word document that system integrators are expected to read through and implement. I mean MACHINE READABLE software artifacts that completely specify the structure of how the data will be represented in bytes over the wire and how field values will be unambiguously interpreted. Representing your specs in a machine-readable format accelerates interoperability tooling by a considerable factor. It’s the difference between building robots and building robots that are able to build other robots.
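To make that concrete, here is a toy sketch of what a machine-readable content specification with vocabulary bindings might look like for a pipe-delimited message. The segment name, field layout, and value sets below are invented for illustration only; they are not taken from any published profile:

```python
# A toy machine-readable "conformance profile" for one pipe-delimited segment.
# The segment name, fields, and value sets are hypothetical, for illustration.
PROFILE = {
    "segment": "ZPD",  # hypothetical Z segment
    "fields": ["segment_id", "patient_id", "sex", "province"],
    "vocabulary": {
        "sex": {"M", "F", "U"},                 # value set bound to field 3
        "province": {"ON", "BC", "QC", "AB"},   # value set bound to field 4
    },
}

def validate(message: str, profile: dict) -> list:
    """Return a list of conformance errors for one pipe-delimited segment."""
    fields = message.split("|")
    errors = []
    if fields[0] != profile["segment"]:
        errors.append(f"expected segment {profile['segment']}, got {fields[0]}")
    if len(fields) != len(profile["fields"]):
        errors.append(f"expected {len(profile['fields'])} fields, got {len(fields)}")
    for name, value in zip(profile["fields"], fields):
        allowed = profile["vocabulary"].get(name)
        if allowed is not None and value not in allowed:
            errors.append(f"{name}: '{value}' is not in the bound value set")
    return errors

print(validate("ZPD|12345|M|ON", PROFILE))  # []  (conformant)
print(validate("ZPD|12345|X|ON", PROFILE))  # one vocabulary error
```

The point is that the profile itself is data: the same artifact that documents the interface can drive validators, code generators, and test harnesses, with no 400-page document in the loop.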

For different standards this means different things.

HL7v2 (machine-readable artifacts: conformance profiles, vocabulary dictionaries): UHN has done some great work here with its machine-readable HAPI conformance profiles.

HL7v3 (machine-readable artifacts: MIFs with vocabulary constraints): although I don’t see much of a future for HL7v3 here in Canada outside of pharmacy, and even there it’s not clear whether it’s going to win in the long run.

CDA (machine-readable artifacts: templates with terminology constraints): I think the jury’s still out for Level 3 CDA. The Lantana Group has made a good start at organizing CDA templates, but this space still has a long way to go; I think it suffers from some of the same challenges that HL7v3 faces.

IHE (machine-readable artifacts: Integration Profiles): Diagnostic Imaging is the poster child for how an initiative like this can be successful. DI is way ahead of other domains in Canada, and we can credit IHE for much of that progress; we need to consider building on the success of this approach in other domains.

FHIR (machine-readable artifacts: resource schemas): I have to say, given how new the FHIR standard is, it’s impressive how many online conformance test sandboxes are already publicly available; that’s a testimony to how committed FHIR is to machine readability, openness and simplicity. Read Intelliware’s assessment of FHIR here.
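As a small illustration of why machine-readable resource schemas matter, here is a sketch of checking a FHIR-style Patient instance against a hand-rolled subset of a schema. The required-field list and the gender value set below are simplified assumptions for illustration, not the normative FHIR Patient definition:

```python
# Illustrative subset of a FHIR-style Patient "schema": required fields plus
# a value set bound to the gender element. This is a simplified assumption,
# not the normative FHIR Patient definition.
PATIENT_SCHEMA = {
    "required": ["resourceType", "gender"],
    "bindings": {"gender": {"male", "female", "other", "unknown"}},
}

def conforms(resource: dict, schema: dict) -> bool:
    """Check required fields and bound value sets on a parsed resource."""
    if any(field not in resource for field in schema["required"]):
        return False
    return all(
        resource[field] in allowed
        for field, allowed in schema["bindings"].items()
        if field in resource
    )

patient = {"resourceType": "Patient", "id": "example", "gender": "male"}
print(conforms(patient, PATIENT_SCHEMA))  # True
print(conforms({"resourceType": "Patient", "gender": "banana"}, PATIENT_SCHEMA))  # False
```

Because the schema is data rather than prose, the same artifact can back the kind of online conformance sandboxes mentioned above.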

In closing, I’m asking the vendors: give us your best engineers, and let’s work together to get serious about establishing some simple, functioning interoperable standards to get our healthcare data moving!


About Me

Ken Stevens, VP Healthcare, Intelliware Development. Intelliware is committed to open-source, standards-based agile software development and system integration.
Ken got his start by writing the software that put the Globe and Mail newspaper on the Internet. He made the transition to healthcare when he joined Intelliware about 10 years ago, where he led a number of software delivery projects for pharmacies, hospitals, and healthcare agencies. His experience integrating pharmacies into jurisdictional Drug Information Systems led him to ITAC, where he supported the Interoperability and Standards Committee’s call for simplified healthcare data mobility in Canada. Now, as co-chair of that committee, Ken works directly with ITAC members to ensure that vendors have a voice with government agencies when it comes to the choice of interoperability standards in government-issued RFPs.
Ken holds a bachelor’s degree in Philosophy and Mathematics from the University of Waterloo, and a PhD in mathematics from the University of Toronto.