Health IT Interoperability Ends 2015 on Transitional Note

“Had we just required that by law — and put together the infrastructure to make it happen — we wouldn’t be having our current problems.”

December 22, 2015 - Industry experts recognize that health IT interoperability won’t be achieved by any single prod or mandate. After all, according to the Office of the National Coordinator for Health IT (ONC), which published its Shared Nationwide Interoperability Roadmap in October, healthcare has just gotten started on a 10-year, milestone-based journey toward securely, efficiently and effectively sharing electronic health information.

In the roadmap document, ONC chief Karen DeSalvo, MD, maintains that we are “closer than ever before” to building a strong health IT foundation and “equipping every person with a long-term, digital picture of their health over their lifespan.” Yet, she acknowledges it will take a culture change to empower Americans to improve their health — and a large part of that effort will entail innovating on a core set of national interoperability standards and policies.

HealthITInteroperability spent the final part of 2015 surveying the landscape of electronic information exchange — with the help of health IT leaders — and assessing where things stand at our collective transition point. Here’s what they had to say about trends and events that significantly shaped interoperability over the past year and where we may be headed in the new year.

An alternate take on document exchange

Tom Giannulli, chief medical information officer at Kareo, starts with the concept that there’s too much flexibility in current health IT standards to effectuate practical information exchange.

“It takes very complex tools and code to actually parse a document and normalize it to what you are using internally or what you are trying to standardize against in terms of nomenclature and how to express things,” he explained. “By the time you figure out all the different nuances, most people don’t understand that when they create the document, you get garbage out.”
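The parsing burden Giannulli describes can be illustrated with a toy sketch. The fragment below is modeled loosely on a C-CDA problem observation (it is not a complete or valid document, and the helper names are hypothetical); even this minimal case forces a consumer to handle HL7 namespaces and to map code-system OIDs to vocabularies before the data means anything.

```python
# Hypothetical illustration only: extracting one coded problem from a
# C-CDA-style XML fragment. Real documents nest templates far deeper.
import xml.etree.ElementTree as ET

# Toy fragment modeled loosely on a C-CDA problem observation.
DOC = """
<observation xmlns="urn:hl7-org:v3">
  <code code="55607006" codeSystem="2.16.840.1.113883.6.96"
        codeSystemName="SNOMED CT" displayName="Problem"/>
  <value code="73211009" codeSystem="2.16.840.1.113883.6.96"
         displayName="Diabetes mellitus"/>
</observation>
"""

NS = {"hl7": "urn:hl7-org:v3"}

# Map of code-system OIDs to names; production systems need whole
# vocabulary services (SNOMED, ICD-9/10, LOINC, RxNorm, ...).
CODE_SYSTEMS = {"2.16.840.1.113883.6.96": "SNOMED CT"}

def extract_problem(xml_text):
    """Pull the coded value out of one problem observation."""
    root = ET.fromstring(xml_text)
    value = root.find("hlvalue".replace("hl", "hl7:"), NS)
    system = CODE_SYSTEMS.get(value.get("codeSystem"), "unknown")
    return {"code": value.get("code"),
            "system": system,
            "display": value.get("displayName")}

print(extract_problem(DOC))
# {'code': '73211009', 'system': 'SNOMED CT', 'display': 'Diabetes mellitus'}
```

If the sender omits the `codeSystem` attribute or uses a local code, the consumer is left guessing — which is Giannulli’s “garbage out” scenario.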

He added that health IT standards shouldn’t try to make everyone happy.

“I’m more a fan of rigid requirements, strict interop standards. That approach has proven itself well when it comes to billing. The transaction set moved from the 1500 form to the 837 transaction set in use now for submitting claims. That’s one data set, one standard. It’s rigid, but it’s encompassing and it works. Everyone has written to it successfully and everyone can interpret it successfully. There’s not a lot of wiggle there. And there are some good examples that are fairly simple where less is more, rather than moving up in complexity like the C-CDA stack.”

Giannulli envisions a standard summary document to be completed upon every patient encounter and then uploaded to a central data repository with a single point of access. Once the data has been exported, other parties could “tap into it in multiple ways, to pull analytics that this infrastructure would create. To me, all the problems would be solved. You could do population health, you could do genetics, drug studies. And if you want to exchange data, that’s easy because the data is already centralized. You wouldn’t have to do point-to-point interfaces.”
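Giannulli’s idea can be sketched in a few lines. Everything below is hypothetical — a fixed summary schema, one central store, one query path — but it shows the design choice he favors: rigid validation at upload time (“one data set, one standard”) in exchange for eliminating point-to-point interfaces.

```python
# Minimal sketch of a central summary-document repository.
# All field names and the class are hypothetical illustrations.
import json

# One fixed schema for every encounter summary.
SUMMARY_FIELDS = {"patient_id", "encounter_date", "problems",
                  "medications", "allergies"}

class CentralRepository:
    """Single point of access for standardized encounter summaries."""

    def __init__(self):
        self._records = []

    def upload(self, summary):
        # Rigid validation: reject anything that deviates from the
        # schema, in the spirit of "one data set, one standard."
        if set(summary) != SUMMARY_FIELDS:
            raise ValueError("summary does not match the fixed schema")
        self._records.append(summary)

    def query(self, patient_id):
        # Any authorized consumer -- analytics, population health,
        # exchange -- taps the same store; no per-pair interfaces.
        return [r for r in self._records if r["patient_id"] == patient_id]

repo = CentralRepository()
repo.upload({"patient_id": "p1", "encounter_date": "2015-12-01",
             "problems": ["E11.9"], "medications": ["metformin"],
             "allergies": []})
print(json.dumps(repo.query("p1"), indent=2))
```

The trade-off is the one Giannulli concedes: this only works if every producer is compelled to write to the same schema, which is exactly the legal mandate he says never happened.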

He admits that the concept is idealistic, but argues, “Had we just required that by law — and put together the infrastructure to make it happen — we wouldn’t be having our current problems.”

“The rubber hit the road in 2015 with new requirements for meaningful use and for the chronic care management [CCM] program,” said Hefner. “Meaningful use said you have to comply with a 10 percent electronic document exchange requirement, while CCM said basically all of your transitions-of-care referrals have to be coordinated electronically in order for you to bill for services. Those two requirements are real drivers in the market right now.”

“In 2015 there was more of an acknowledgement and recognition that health information exchanges as core transport mechanisms were still a challenge because of the integration requirements, the cost and complexity, and the lack of being able to get to all the endpoints. It’s not unlike the initial deployment of Direct messaging. If not enough people are participating, then the network has less value.”

Hefner sees potential solutions in taking a broader view of endpoint connectivity, wherein Direct messaging — a built-in capability of certified health IT — serves as a means of moving data in and out of EHR systems.

“There’s a lot of hope and aspiration around the evolution of interoperability through the use of standardized APIs,” he added. “And while those APIs are great for the superhighways of information flow, you still need endpoint connectivity to pick up a lot of the other folks.”

Injecting technology to identify risks

Anand Shroff, chief technology officer at Health Fidelity, points out the market reality that 80 percent of clinical data is unstructured. That has a foundational impact on interoperability efforts, especially in regard to risk adjustment between payers and providers, but it also opens up prospects for technology-based solutions.

“The fundamental premise here is to accurately define the health of the patient in order to correctly determine the payment,” observed Shroff. “If you don’t know that a patient is diabetic, your cost of care is going to exceed the capitated payment you will receive from the government. So it is in a health plan’s best interest to have a very focused effort to correctly capture all the chronic conditions and the accompanying issues that go with those chronic conditions to correctly peg the risk level that they are taking on each patient.”

Most of that information is found in the unstructured portion of the EHR, according to Shroff, because it has to be documented as part of a face-to-face visit, actively monitored and evaluated on a frequent basis, and assessed for progress in the patient’s treatment plan.

“What we have today is lots of risk adjustment coders in the U.S. and overseas who get these charts from EHRs to find the risk factors. It is a very manual, laborious process, very expensive and time-consuming. And because a human being is going through the process, it is very error-prone.”

Natural language processing (NLP) technology is making a difference, Shroff says, with its ability to look through charts and match up risk factors with appropriate evidence. “This massively speeds up the process, and it avoids errors that could be problematic downstream at audit time,” he adds. “It mirrors the use of NLP on the inpatient/outpatient coding side where now most organizations in the country are using NLP to create their claims.”
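A toy stand-in can make the workflow Shroff describes concrete: scan unstructured note text for risk-factor terms and return each match with its surrounding evidence, the way a coder cites chart support. The mini-lexicon and function below are invented for illustration; production systems use full clinical NLP, not keyword matching.

```python
# Toy illustration of chart review: match risk-factor terms in an
# unstructured note and keep the surrounding text as evidence.
# RISK_TERMS and flag_risk_factors are hypothetical examples.
import re

# Hypothetical mini-lexicon mapping surface terms to condition labels.
RISK_TERMS = {
    r"type 2 diabetes|diabetes mellitus": "Diabetes",
    r"chronic kidney disease|ckd stage \d": "Chronic kidney disease",
    r"\bcopd\b": "COPD",
}

def flag_risk_factors(note):
    """Return each matched condition with a snippet of evidence."""
    findings = []
    for pattern, label in RISK_TERMS.items():
        for m in re.finditer(pattern, note, re.IGNORECASE):
            start = max(0, m.start() - 20)
            findings.append({
                "condition": label,
                # Keep surrounding context, as audit support downstream.
                "evidence": note[start:m.end() + 20].strip(),
            })
    return findings

note = ("Assessment: poorly controlled type 2 diabetes; "
        "CKD stage 3 stable; continue current plan.")
for f in flag_risk_factors(note):
    print(f["condition"], "->", f["evidence"])
```

Even this crude version shows why pairing each flagged condition with evidence matters: Shroff’s point is that unsupported risk codes become problems at audit time.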

Large, advanced health systems and hospitals are starting to realize what they can do with unstructured data, Shroff notes. For example, Children’s Hospital of Philadelphia is working on early identification of deep vein thrombosis (DVT) in pediatric patients so that prophylactic treatment can be administered prior to any adverse event.

“They were having roughly 65-70 percent success by doing human reviews of radiology reports and flagging downstream DVT risks,” explained Shroff. “So this year they deployed an NLP-based solution that takes the radiology reports, runs NLP on them and applies complex criteria to flag patients who have DVT risk so they can administer prophylaxis. The NLP-based approach has been about 95 percent correct. We’re seeing more and more such use cases now that NLP is being applied to very real problems.”

Healing healthcare through enhanced trust

Frank Ingari, CEO of NaviNet, says the connection of clinical and payment entities is “at the heart of everything that’s changing healthcare.”

He explains that payers are discovering that their reimbursement will be dramatically affected not only by the cost but also by the quality of care that’s delivered to the patients they insure — and by the consumer satisfaction of insured patients.

“That has altered the whole relationship in that the payers know that they need to collaborate with and gain the trust of providers in a very different way. Similarly, providers have come to realize that their reimbursement is going to depend on being good stewards of the healthcare dollar...The biggest change in 2015 has been the marketplace acceptance that everybody needs to behave differently. We need to learn how to trust each other in order to heal healthcare.”

Ingari expects increasing exchange of information in support of administrative processes. He sees a transition from emphasis on ensuring payment for services to be performed to “a world in which an authorization is an opportunity for the best evidence-based medicine to provide clinical guidance for the best care. That’s a big change. It’s essential to how we want to do reform going forward, informed by the best guidance available.”

At present, the poor interoperability of EHR systems at the data and process levels remains a significant barrier to progress. “What that suggests is that provider systems are not well equipped with informatics or analytics, and they’re not very well served by HIEs across the country,” noted Ingari.

However, Ingari sees interoperability initiatives such as the Argonaut Project “raising awareness among the so-called ‘walled-garden’ vendors that customers have had enough.”

He concluded, “Customers — whether they are payers or providers — are going to start judging vendors based on how good they are on interoperability, which has happened in every other industry. There’s no way major banks would have tolerated the level of isolation that some of the big EHR vendors have gotten away with for decades in healthcare. I think that era is coming to a close. That’s huge for the industry, and for society.”