HHS has not yet released many details, but the American Society for Radiation Oncology's (ASTRO) Radiation Oncology Alternative Payment Model (RO-APM) is a useful proxy for what this model might look like.


“We need results, American patients need change, and when we need mandatory models to deliver it, mandatory models are going to see a comeback… (CMS will) revisit some of the episodic cardiac models that we pulled back, and are actively exploring new and improved episode-based models in other areas, including radiation oncology,” said HHS Secretary Alex Azar.

Impact on Radiation Oncologists

What will this mean for radiation oncology healthcare providers and the vendors that service them? A foundational and operational re-think will be needed to manage new challenges in documentation, records management, financial analytics and reimbursement.

It is highly probable that the new radiation oncology model will resemble the already ongoing voluntary Oncology Care Model (OCM), but will be based on ASTRO's Radiation Oncology Alternative Payment Model (RO-APM) and, unlike the OCM, will be mandatory. ASTRO's RO-APM incentivizes adherence to clinical guidelines for several cancers, including breast, prostate, lung, colorectal, and head and neck cancers. It also applies to two secondary disease sites: bone metastases and brain metastases. The RO-APM is based on an episodic payment that is triggered by a clinical treatment planning CPT code (rather than by the actual delivery of treatment, as in the current OCM) and concludes 90 days after the last radiation treatment.
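To make the episode mechanics concrete, the trigger-and-window logic described above can be sketched in a few lines. This is an illustration only: the function name and the sample dates are invented, and only the 90-day tail after the last treatment is taken from ASTRO's proposal.

```python
from datetime import date, timedelta

def episode_window(planning_date: date, last_treatment_date: date):
    """Return the (start, end) of an RO-APM payment episode.

    The episode opens when the treatment planning CPT code is billed
    and closes 90 days after the final radiation treatment.
    """
    end = last_treatment_date + timedelta(days=90)
    return planning_date, end

# Hypothetical episode: planning on Jan 7, last treatment on Feb 15
start, end = episode_window(date(2019, 1, 7), date(2019, 2, 15))
```

A billing system built around per-fraction fee-for-service claims has no concept of this window, which is why the episode trigger alone forces workflow changes.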

Regarding payment structure, ASTRO’s RO-APM is very similar to the OCM in that it would also include a monthly fee called a Patient Engagement and Care Coordination Fee (PECC), and retrospective performance-based payment incentives. The activities and goals of the PECC and retrospective measures reflect those of the OCM.

However, one of the most significant differences between the OCM and ASTRO's RO-APM involves the payment of fees. The RO-APM is an episodic model that pays a portion at the beginning of treatment planning and a final payment at the end of treatment. For many practices, this will be a fundamental change from fee-for-service radiation oncology reimbursement and will require an overhaul of billing systems and workflows.
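The split between an upfront and a final payment can be shown with simple arithmetic. The actual RO-APM split percentages had not been finalized at the time of writing, so the 50/50 default below is purely an assumption for illustration.

```python
def split_episode_payment(total: float, upfront_share: float = 0.5):
    """Split an episode payment into an upfront portion (paid at the
    start of treatment planning) and a final portion (paid at episode
    close). The 0.5 default share is an assumed, not official, value."""
    upfront = round(total * upfront_share, 2)
    final = round(total - upfront, 2)
    return upfront, final

# A hypothetical $10,000 episode
split_episode_payment(10_000.00)  # (5000.0, 5000.0)
```

Contrast this with fee-for-service, where each planning, simulation, and treatment-delivery code generates its own claim as services are rendered.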


Conclusion

Secretary Azar’s remarks indicate a complete about-face by the Trump administration on mandatory versus voluntary participation in APM programs, one that may affect not only radiation oncology but all aspects of medicine.

Healthcare IT vendors, including EHR vendors, are not yet ready with a single solution that meets all the requirements of a healthcare provider participating in the OCM. While some vendors can meet various quality measure reporting components, no single vendor today can address all the financial and operational obligations inherent in this type of APM.

More than two dozen IT vendors, including several EHR vendors, have taken the OCM Vendor Pledge this year, which is a step toward a comprehensive IT solution. However, the complexities of the system pose significant barriers that will take several years to overcome.

Want to learn more about the state of the IT industry regarding bundled payments and APMs? Look for our Bundled Payments report coming out in the next couple of weeks (or see our blog for a short primer on the state of the market) and our Payment Integrity report slated for release in mid-2019.

By Brian Murphy and Brian Eastwood

Seeking to liberate the industry from its self-created morass of siloed data and duplicative quality reporting programs, the Department of Health and Human Services (HHS) issued 1,883 pages of proposed changes to Medicare and Medicaid. It renamed the Medicare and Medicaid Electronic Health Record (EHR) Incentive Programs (known by all as Meaningful Use) to Promoting Interoperability Programs (PI).

As widely reported, it would eliminate some measures that acute care hospitals must report and remove redundant measures across the five hospital quality and value-based purchasing programs. It would also reduce the reporting period to 90 days. HHS will be taking comments until June 25, 2018.

HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place.

Certified EHRs as Enablers of Interoperability

HHS believes that requiring hospitals to use 2015 Edition CEHRT in 2019 makes sense because such a large proportion of hospitals are “ready to use” the 2015 Edition. Ready to use is not the same as using. 2015 Edition EHRs may not be as widely deployed as HHS indicates. The following 10-month-old snapshot from ONC shows that hospitals have not aggressively moved to adopt 2015 Edition CEHRT.

Figure 1: Adoption Levels of 2015 CEHRT. Source: Office of the National Coordinator for Health Information Technology, “Certified Health IT Developers and Editions Reported by Hospitals Participating in the Medicare EHR Incentive Program,” Health IT Quick-Stat #29. Available at https://dashboard.healthit.gov/quickstats/pages/FIG-Vendors-of-EHRs-to-Participating-Hospitals.php.

Current adoption levels by HCOs are undoubtedly better, and many vendors have 2015 Edition technology ready to go, but hospitals can only change so fast. The rush to get hospitals on the most current edition has to do with the most relevant difference between the 2014 and 2015 Editions – the API requirement. APIs will be the technical centerpiece of better, more modern interoperability, but adoption levels are still low. APIs, by themselves, offer the promise of better data liquidity. For this promise to become a reality, healthcare stakeholders need more than just a solid set of APIs.

Price Transparency: Easier Said Than Done

HHS is also proposing that hospitals post a list of standard charges and update that list annually.

This is a nice thought, but it will take some heavy lifting to pull this off. For starters, HHS doesn’t even have a definition of “standard charge” and is seeking stakeholder input before the final rule is published. HHS also must determine how to display standard charges to patients, how much detail about out-of-pocket costs to include (for patients covered by public and private insurance), and what noncompliance penalties are appropriate.

Above all, there’s the thorny issue of establishing what a standard charge is in the first place. Charges vary by payer. Can a hospital truly state, without a doubt, the cost of an MRI or a colonoscopy? Most cannot – and technology alone will hardly solve this problem.

Patients (Not) Using Their Data

The existence of APIs will stand in the stead of the old view/download/transmit (VDT) requirement. Regarded as one of meaningful use’s most troublesome and fruitless requirements, this rule has been shed by HHS because of “ongoing concern with measures which require patient action for successful attestation.”

VDT is one of several MU Stage 3 requirements pertaining to patient engagement – along with providing secure messaging or patient-specific educational resources – that HHS has proposed dropping, under the pretense that it is “burdensome” to healthcare providers. While hospitals have struggled to get many patients to participate, the VDT requirement set the bar at one patient out of an entire population. What’s more, dropping the requirements fails to take into account how burdensome it is for patients to try to access their data, communicate with their physicians, and learn about their conditions and treatment options. It is also contrary to CMS Administrator Seema Verma’s remarks, first at HIMSS18 and again this week, indicating that the agency seeks to “put patients first.”

HHS says that third-party developed apps that use APIs will deliver “more flexibility and smoother workflow from various systems than what is often found in many current patient portals.” Whether such apps deliver “smoother workflow” is not a foregone conclusion.

Reporting Burden Reduction

HHS proposes “a new scoring methodology that reduces burden and provides greater flexibility to hospitals while focusing on increased interoperability and patient access.” The proposed scoring methodology uses a 100-point system (explained over 24 pages) in which attaining a score of at least 50 means there will be no Medicare (or Medicaid) payment reduction.
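The threshold logic of the proposed scoring methodology can be sketched as follows. The measure names and point values in the example are hypothetical; only the 100-point cap and the 50-point threshold come from the proposal.

```python
PI_THRESHOLD = 50  # per the proposed rule: >= 50 avoids a payment reduction

def avoids_payment_reduction(measure_scores: dict) -> bool:
    """Sum measure-level points (capped at 100 in the proposal) and
    compare against the 50-point threshold. Measure names and weights
    passed in are illustrative, not CMS's actual measure set."""
    total = sum(measure_scores.values())
    assert total <= 100, "scores are capped at 100 points"
    return total >= PI_THRESHOLD

# Hypothetical hospital scoring 55 of 100 points
avoids_payment_reduction({
    "e-prescribing": 10,
    "health information exchange": 30,
    "provider-to-patient exchange": 15,
})  # True
```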

HHS is also mulling whether to abandon these measures altogether in favor of scores calculated at the objective level.

The TEFCA Angle

The biggest regulatory effort in recent months related to interoperability, other than this proposal, has been ONC’s proposed Trusted Exchange Framework and Common Agreement (TEFCA), required under the 21st Century Cures Act. TEFCA, well along in the planning stages, is a new set of regulations from ONC whose goal is to catalyze better data availability using APIs. In this regulation, HHS wants public comment on whether participation in a TEFCA-compliant network should replace the process measures in the Health Information Exchange objective. Stated another way: Should TEFCA compliance replace 80 percent of the score for PI (75 percent in 2020)?

TEFCA is widely expected to provide a safe harbor from data blocking liability although ONC has been mum on this point. TEFCA then could do double duty: Eliminate the need to meet or report on health information exchange metrics and provide a shield from data blocking enforcement.

How much will it cost providers to comply and can they make money for providing access to their data?

Will TEFCA compliance, as a practical matter, accomplish anything? Will it make it easier for healthcare stakeholders to use each other’s data?

HHS is also considering doing away with the Public Health and Clinical Data Exchange objective. It floated the idea that a provider that supports FHIR APIs for population-level data would not need to report on any of the measures under this objective. Combined with the TEFCA knockout, this would replace 90 percent of the score for PI (85 percent in 2020).

The specific API mentioned, called Flat FHIR and still in development, will probably contribute to part of the complex process of public health and registry reporting. This activity currently requires highly skilled data hunter-gatherers, usually with clinical credentials. In many organizations, these hunter-gatherers manually sift and collate multiple data sources to meet the varied requirements of the recipients of different registries. Flat FHIR, assuming it were production-ready, would certainly help, but it is unlikely that it could provide all, or even most, of the information needed for the range of public health reporting programs.
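For readers unfamiliar with Flat FHIR, the draft Bulk Data Access specification defines a population-level export that begins with a `$export` kickoff request, roughly as sketched below. The base URL and group ID here are placeholders; the `$export` operation and the `Accept`/`Prefer` headers follow the draft specification.

```python
# Placeholder FHIR server base URL, for illustration only
BASE = "https://example-ehr.org/fhir"

def bulk_export_request(group_id: str = None, since: str = None):
    """Build the kickoff URL and headers for a Flat FHIR bulk export.

    Group-level export ($export on a Group) targets a defined cohort;
    with no group, the sketch falls back to a system-wide Patient export.
    The server responds 202 Accepted with a status URL to poll, and the
    finished export is delivered as newline-delimited JSON (NDJSON) files.
    """
    if group_id:
        path = f"{BASE}/Group/{group_id}/$export"
    else:
        path = f"{BASE}/Patient/$export"
    if since:
        path += f"?_since={since}"
    headers = {
        "Accept": "application/fhir+json",
        "Prefer": "respond-async",  # required: kickoff is asynchronous
    }
    return path, headers
```

Even with this plumbing in place, mapping the exported resources onto each registry's submission format remains the hunter-gatherer work described above.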

MIPS and APM Concerns

HHS acknowledges that providers are less than thrilled with aspects of the Quality Payment Program (QPP). HHS wants to know how PI for hospitals can better “align” with the requirements for eligible clinicians under MIPS and Advanced APMs. In particular, it wants ideas about how to reduce the reporting burden for hospital-based MIPS-eligible clinicians. It is undoubtedly looking for market-acceptable ideas to reduce the reporting burden where it is arguably more deeply felt – among non-hospital-based MIPS-eligible clinicians. While reducing or eliminating the reporting burden would help such providers, the big unanswered question, as it is with hospitals, is the burden of getting to 2015 Edition CEHRT.

Mandating Interoperability with Other Regulations

HHS also asks the industry how it could use existing CMS health and safety regulations and standards to further advance electronic exchange of information. It is ready to change Conditions of Participation (CoPs), Conditions for Coverage (CfCs), and Requirements for Participation (RfPs) for Long Term Care Facilities regulations to this effect. It wants to know whether requiring electronic exchange of medically necessary information in these regulations would move the interoperability needle.

Bottom Line

HHS believes that APIs will solve all of the problems that patients and healthcare stakeholders have with data access. HHS also seems prepared to declare that TEFCA compliance and 2015 Edition CEHRT guarantee that those APIs are in place. It roundly ignores the mesh of incentives that makes stakeholders unwilling to share data and leaves patients unable to access it. The industry has cried out for less process reporting and better insight into outcomes for years. This proposal will accomplish the former but set the industry back on the latter if interoperability is declared solved based on technology alone.

Current & Future Trends for the Digital Care Plan: New Report Hits the Streets

Proper care coordination depends on a longitudinal care plan that all members of a care team can view, contribute to, update, and distribute. Such a plan must also be patient-centric, holistic, interdisciplinary, and dynamic. Naturally, it will be digital and likely reside in the cloud.

Most healthcare organizations (HCOs), not surprisingly, are several years away from this type of care plan. It doesn’t help that the Centers for Medicare & Medicaid Services (CMS) and Department of Health and Human Services (HHS) have been quick to emphasize the need for coordinated care but slow to explain how HCOs should implement it or expect to get reimbursed for it, with the exception of chronic care management (CCM) coding.

It is with this need in mind that Chilmark Research releases its latest Insight Report: Longitudinal Care Plans: Delivering on the Promise of Patient-Centered Care. The report identifies the most important elements of a longitudinal care plan, describes the steps that HCOs should take to facilitate coordinated care, examines the methods being used to populate the data within care plans, indicates which data elements should go into a care plan, and provides best practices for using care plans in inpatient, post-acute, and behavioral health settings. In addition, the report evaluates off-the-shelf care plans from evidence-based clinical content vendors as well as three physician specialty societies.

As you may have noticed, the report focuses on the term longitudinal care plan. The word choice is deliberate – Chilmark Research sees this type of care plan playing as vital a role in care coordination as the longitudinal patient record. Both capture information from numerous sources – many beyond the traditional episode of care – to provide a complete picture of a patient’s health and well-being (in the case of the record) and of the path necessary to maintain and improve health and well-being (in the case of the care plan).

ONC Standards and Interoperability Framework for Care Plans

At the moment, a longitudinal care plan exists largely in theory. Individual care plans remain in silos in the offices that created them, disconnected from outside facilities. In the absence of digital connections, transitions of care must be accompanied by paper documents and phone calls. Patient access to the plan is all but absent.

So far, the health systems that Chilmark Research interviewed for this Insight Report have only begun to focus on filling care gaps as they have embarked on their care coordination efforts. While this is a laudable achievement – especially given the effect that even the most rudimentary care coordination can have on clinical workflow – it also shows just how far HCOs must go to achieve a true longitudinal care plan.

Longitudinal care plans represent a possible means to achieve the goal of greater care coordination both across and outside the HCO. This report provides a thorough overview of the steps HCOs need to consider when they begin deploying a longitudinal care plan, the current status of longitudinal care plans across various care settings, and recommendations that HCOs should consider to successfully adopt and utilize longitudinal care plans. A must-read for any organization or vendor looking down the barrel of widespread performance-based contracts.

ONC’s first draft of a nationwide interoperability roadmap is ambitiously vast in scope but ultimately constrained by the past. Its purpose is to initiate a discussion within healthcare about the ways and means to achieve interoperability in 10 years, even though that discussion is already an obsession for many. ONC hopes that this document will launch a process resulting in a national public-private strategy for supporting the kind of interoperable data infrastructure that will enable a learning health system.

The roadmap focuses on several major policy and technical themes: the impact of FFS on vendor and provider attitudes toward data sharing, the potential for private payers and purchasers to incentivize data sharing, and the central importance of standards and of incentivizing compliance. This long, comprehensive, and in places incredibly detailed document is really three documents in one. For those lacking the time to read through its 160+ pages, we summarize:

Part I – Letter from ONC chief Karen DeSalvo and executive summary that lays out a set of questions intended to guide the response to the overall document as well as a series of “calls to action” to galvanize industry participation. (Pages 1-15)

Part II – Exhaustive presentation of the current and potential future state of interoperability, as well as the challenges and opportunities that lie between. (Pages 16-162)

Part III – The final appendix containing 56 “Priority Interoperability Use Cases” which ONC wants to winnow down. (Pages 163-166)

Part I gives a sense of which way ONC is leaning by focusing on select high-level issues: payment policy, data governance, semantic and transactional standards, measurement of results, and probably most important, the priority use cases. The language used within the roadmap is reflective of a sea change in thinking: “send and receive” has been replaced with “send, receive, find, and use” as a way to describe what individuals need from interoperable HIT.

Part II lays out the elements of what it hopes will become the national roadmap for interoperable HIT. The roadmap’s focus on measurement – tracking and gauging the metrics of interoperability – is eerily similar to the EHR Incentive Program’s MU measures right down to using various numerators and denominators. If we learned anything from the EHR Incentive Program it is that the industry liked the incentives but disliked the actual MU objectives. Absent the carrot of incentives and/or the sting of penalties, it is hard to see how ONC can catalyze providers to embrace yet another set of operational metrics.

ONC continues to struggle with patient matching. We know that Congress will not countenance so much as a voluntary (on the part of patients) national patient identifier. Even if it did, the costs to the industry of maintaining a dual system would only add complexity to an overburdened system. Unfortunately, we think that ONC and HHS are powerless to change this extravagantly costly element of the interoperability conundrum.

Who Uses What Data?
Embedded in ONC’s treatment of HIPAA, data governance, and data portability lies an essential, unresolved issue: the rights and responsibilities of the various stakeholders with respect to data governance. How exactly can personal health information (PHI) that is captured by clinical applications be shared within the context of care delivery?

Today, such rules of the road remain unclear, ambiguous, and deeply complex. Existing laws, both state and federal, are a patchwork in which various participants hold back data fearing liability where most often no liability exists. At the same time, various participants claim outright to “own” this data and use this patchwork to consolidate their competitive position.

ONC rightly points out that ATM networks and airline reservation networks provide interoperability for radically simpler use cases than the health system requires (they also lack the regulatory complexity of health data). While true, the data practices of consumer-focused transaction networks are effectively incomprehensible to the average consumer. Why, then, should we assume that data practices can somehow be made comprehensible in healthcare?

Is it reasonable to expect any patient to understand the protections offered in the vastly more complex health data realm? Resoundingly no. ONC could be soliciting input about comprehensive reconceptualization of data rights and responsibilities with respect to patient data. Admittedly, only Congress can act to change the existing regime, and even Congress is limited by legislation enacted at the state level.

This is one aspect of interoperability that will always confound those seeking true interoperability and data exchange in the context of health. It is also why some proponents believe that interoperability will only truly occur once the patient has full control of their data (a health data bank) and defines access to their PHI. But even here, we have a very long way to go before the majority of citizens take on that responsibility.

Standards and Compliance
Most of the ideas embodied in the JASON Report and the subsequent JASON Task Force are offered up for public comment. The roadmap suggests that HL7’s new standard, FHIR, could be effective in the 6-10 year timeframe, considerably longer than the time contemplated by the Argonaut Project. ONC also stops short of saying that element-centric interoperability will or even should replace document-centric exchange. But it talks about electronic sharing of summary care records – not documents – between hospitals, SNFs, and home health agencies.

The roadmap reinforces and effectively doubles down on the centrality of standards in any plan to foster better interoperability. Borrowing liberally if not literally from the Common MU Data Set, ONC wants to know how it can help make data more intelligible inside and between providers. As a companion to the roadmap it also released an “Advisory” on the most common standards in order to get opinions on where the industry’s best practices focus should be. ONC believes that standards-compliance and the elimination of non-conforming data representations will pay dividends.

The counterpoint to this emphasis on standards is the view held by many smaller, often start-up vendors that see standards as a means to preserve the status quo, serving more of a gatekeeping function than an enabling one. Data networks in other industries, while admittedly simpler, do not rely on prescriptive application-to-application data representation standards. Healthcare is the only industry with such an ornate implementation of layer 7 of the OSI stack. Smaller vendors would rather see simpler standards, published APIs, or more of a focus on the results of exchange than on how the result is to be achieved.

The reality is that major HIT vendors and major providers grumble about prescriptive requirements but by and large remain deeply committed to standards and compliance. We think that ONC could have at least offered up the prospect of achieving interoperability goals without specifying the mechanism down to specific data elements. Unfortunately, ONC appears to be continuing upon its questionable path of highly prescriptive guidelines – guidelines that ultimately hinder innovation rather than create opportunities for innovation to flourish.

Read the Priority Use Cases Right Now
Part III contains 56 use cases, obviously culled from users; they are a dog’s breakfast of highly specific and extremely general interoperability complaints. ONC is asking the industry to help it order and prioritize the vast range of technical and policy questions that they raise.

We recommend that if you read nothing else in the entire roadmap, you should read these use cases because they sound more like demands than actual use cases.

For example, Item #8 – certified EHR technology (CEHRT) should be required to provide standardized data export and import capabilities to enable providers to change software vendors. Every HIT vendor believes publishing database schemas is the last stop on the way to mob rule. In case vendors as a class were uncertain about their reputation among at least some providers, this “use case” provides unambiguous feedback.

A large number of the use cases are decidedly patient-centric despite the decidedly provider-centric orientation of the wider healthcare system and significant resistance to any kind of reorientation. A significant number of the use cases are related to payment and administrative uses even though the roadmap’s focus is on clinical data and clinical uses of the data. There are also a large number of use cases related to support for the public health system and clinical research. Both of these constituencies unfortunately take a back seat to immediate patient care and payment priorities.

Issues Remain
The roadmap mentions, without analysis, the fundamental problem of FFS-based disincentives to data sharing. HHS has recently announced new goals for progress on the way to VBR but ONC has little leverage to do much more.

Another important issue that the roadmap does not, and likely cannot, address is the level of investment in IT by healthcare providers. While many yearn for an interoperable infrastructure comparable to what banking or retail enjoy, those industries spend far more, as a percentage of revenue, on IT than healthcare providers do. Progress on EHR adoption was not a result of providers reallocating resources to technology adoption but of federal incentives under the HITECH Act.

Therefore, can we really expect HCOs to increase IT budgets to support interoperability? Probably not. Moreover, ONC and, more broadly, HHS do not have the funding to support interoperability adoption on the scale of EHR adoption via the HITECH Act, absent congressional action. Most HCOs are cash-strapped, struggling with a multitude of changes occurring in the marketplace, and, frankly, have a fairly poor record in the effective adoption, deployment, and use of IT in the context of care delivery. This is a knowledge-intensive industry that has done a pretty lousy job of effectively harnessing that knowledge via IT.

The only leverage ONC and HHS have to improve interoperability is payment incentives via CMS. Recently, HHS announced that it will accelerate the move to VBR. Following closely on that announcement was the formation of the Healthcare Transformation Task Force, an industry association that sees its task as helping the industry migrate to VBR. It is far more likely that organizations such as this in conjunction with payment reform will do far more to achieve interoperability than any prescriptive roadmap.

It may be high time for ONC to step back and let the industry tackle this one on its own, for only the industry has a true vested interest, via payment reform, in sharing PHI in the context of care delivery across a community in support of the Triple Aim and population health management.

For many, the delay of Stage 3 of the Meaningful Use program evoked a collective sigh of relief, providing a much-needed extra year to focus on the challenging requirements for patient engagement and interoperability. As distant as 2017 may seem, however, preparation for Stage 3 is already underway in Washington; the vendor community and providers will soon be scrambling to follow suit.

Barring further delays, the timeline is as follows: this fall, CMS will release the notice of proposed rulemaking (NPRM) for Stage 3 and the corresponding NPRM for the Standards and Certification Criteria. The former is the programmatic framework for what to expect – measures, percentages, reporting requirements, etc. The latter is the set of technical product guidelines that software vendors must follow to receive ONC certification as a Stage 3-compliant solution, one that will enable their customers, if it is properly implemented and used, to collect those sought-after incentive dollars. The final rule is expected to drop sometime in Q1-Q2 of 2015 – just one year away.

But that doesn’t mean there’s a year to put off thinking about it. In a few short weeks, the Health IT Policy Committee (HITPC) is set to deliver an official recommendation on Stage 3’s patient engagement requirements to ONC. From all indications, it appears this super-group of wonks will press for inclusion of patient-generated health data (PGHD – yet another #ONCronym for your Twitter streams) in electronic health record systems. The technical experts have defined PGHD as follows:

“health-related data—including health history, symptoms, biometric data, treatment history, lifestyle choices, and other information—created, recorded, gathered, or inferred by or from patients or their designees (i.e., care partners or those who assist them) to help address a health concern.”

At first glance, this is a no-brainer, as we’ve been hearing the clarion calls for such inputs for the better part of the last decade. 60 percent of US adults claim to track their weight, diet, or exercise routine, according to the Pew Research Center’s data. Evidence for the positive impact of this data on quality, satisfaction, and in some cases cost is thin but growing.

But as we are learning through the first two stages of this program as well as the early headaches of ACA rollout, reams of sophisticated studies floated down from the ivory tower do not effective policies make. Despite the need for PGHD, when it is wonkified, ONCified, and held to the temple of the nation’s delivery system, there may be a small disaster in waiting. Below are three questions Chilmark is keenly tracking throughout the remainder of 2014:

What Constitutes PGHD?

The language used thus far raises much speculation about what exactly this inclusion will mean when it hits the front lines. The definition provides only a general description, leaving a lot of room for interpretation and application down the road. For many, PGHD evokes the notion of datastreams from the vast array of health and wellness devices such as Fitbits and Jawbones, Bluetooth medical devices, and, of course, tracking apps. Yet the definition above makes PGHD seem to carry more of a health risk assessment (HRA)-like utility, where patients fill out a survey and have it sent to their doctors in advance. Yet another angle is the notion of patient-reported outcomes: clinically oriented inputs from patients with regard to their physical and psychosocial health status. Outfits like the ATA, HIMSS, and others are lobbying for full inclusion of patient-monitoring and home-health data.

Each of these use cases brings with it a unique set of programmatic and technical components. A popular example as of late involves biometric data: if a panel of diabetic patients is given Bluetooth glucometers that feed data into their respective EHRs, then what? Will someone monitor each of them? Or are HCOs expected to fit those data into an algorithm that alerts on, and ultimately predicts, any aberrance? This has been referred to as providing doctors with ‘insight’ rather than raw data. That sounds snazzy, but can we realistically mandate the creation of insight?
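As a thought experiment, the “insight rather than raw data” idea might look something like the sketch below: a trivial rule that flags a patient when a run of recent glucometer readings falls outside a target range. The thresholds, window size, and function name are all invented for illustration; a real clinical alerting algorithm would be far more nuanced.

```python
LOW, HIGH = 70, 180  # mg/dL; an assumed target range, not a clinical standard

def flag_aberrant(readings: list, window: int = 3) -> bool:
    """Alert only when every reading in the most recent window is out
    of range, rather than paging a clinician on each stray value."""
    recent = readings[-window:]
    if len(recent) < window:
        return False  # not enough data yet to form an "insight"
    return all(r < LOW or r > HIGH for r in recent)

# Hypothetical patient: three consecutive high readings trigger the flag
flag_aberrant([110, 145, 210, 250, 198])  # True
```

Even this toy rule raises the mandate question in the text: someone must choose the thresholds, own the alert queue, and act on it.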

Collecting data such as patient allergies or side effects appears to be a simpler use case on paper. Yet HITPC appears to be using everyone’s favorite A+ students – IDNs like Geisinger, Kaiser Permanente, and Group Health Cooperative, among others – as the basis for its recommendation. As one example, the report lauds GHC’s eHRA model, which is based on a shared EHR and shared clinical staff for data review. As nicely as that may work, Chilmark is skeptical that it is reproducible in an average clinical setting. Generally, the innovators in the digital engagement space have been the insurers, not the providers. We understand the need to look at innovators in order to prescribe a path for the rest of the country, but in talking to regular folks at urban hospitals, community clinics, and mid-sized IPAs, it is more likely that fluid data is a byproduct of integrated systems, not the other way around.

How Will the Market Respond?

Despite its unpopularity in the C-suite, meaningful use has forced EHR vendors to pull their heads out of the sand and advance their product features. In addition to giving providers a break, part of the reason behind the Stage 3 delay was for vendors' benefit: "[to provide] ample time for developers to create and distribute certified EHR technology…and incorporate lessons learned about usability and customization." The Standards and Certification Criteria 2017 edition will play a big role in the next lurch forward, and one can be sure that those newly mandated features will be all the rage at HIMSS 2015.

Yet at the broadest level, the evolution of EHRs (billing >> administration >> clinical) appears to be stalling. In exploring the patient engagement market and the to-date limited functionality of tethered patient portals, despite Stage 2's requirements, one thing has become clear: EHR vendors will not add new features simply for the sake of their customers (forget about patients). With new PGHD functionality emerging, we expect new companies to step up to the plate and seek modular ONC-ATCB certification.

An example already underway is third-party data integration. Over the last few years, device manufacturers, startups, and other third parties have started seeing the value in injecting their data into EHRs. The emergence of middleware companies that provide integration as a service, such as NantHealth, Corepoint, and Validic, will continue as PGHD requirements develop over the coming months. Similar companies will start (and already are) filling the void for HRA functionality, portal requirements, patient communication, and so on. We expect that this will only exacerbate the headache faced by CIOs and CMIOs confronting a long list of purchasing options. Startups take note: It should also set off a shopping spree by EHR companies and other enterprise vendors looking to buy rather than build. Allscripts' acquisition of Jardogs last year is one such example.
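A rough sketch of what the integration-as-a-service pattern looks like: a device reading gets normalized into a standard resource shape before being handed to an EHR. The Observation structure below is a simplified FHIR-style approximation, and the endpoint URL is hypothetical; no actual vendor's API is depicted.

```python
# Hedged sketch of middleware-style device-to-EHR integration.
# The resource shape is a simplified FHIR-like Observation; the
# endpoint is hypothetical.

import json

def to_observation(patient_id, code, value, unit):
    """Wrap a raw device reading in a simplified FHIR-like Observation."""
    return {
        "resourceType": "Observation",
        "subject": {"reference": f"Patient/{patient_id}"},
        "code": {"text": code},
        "valueQuantity": {"value": value, "unit": unit},
    }

def post_to_ehr(observation, endpoint="https://ehr.example.com/fhir/Observation"):
    # In a real integration this would be an authenticated HTTP POST to
    # the EHR's API; here we just serialize the payload to show what
    # would be sent over the wire.
    return json.dumps(observation)

obs = to_observation("12345", "blood-glucose", 118, "mg/dL")
print(post_to_ehr(obs))
```

The value these middleware vendors sell is precisely everything this sketch omits: authentication, patient matching, retry logic, and per-EHR quirks.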

Will Providers Be Ready?

In a word, no. The inclusion of PGHD brings with it an avalanche of procedural and programmatic preparation: data review and quality assurance, governance models and new workflows, the prickly issue of data ownership, staff time and training, liability concerns, HIPAA extension of coverage, ever-increasing insurer coordination, clinician accountability, and of course, patient consent, onboarding, and marketing. With the last one, keep in mind that we now live in the post-Snowden era…

Of course, without details of the required measures, further hand-wringing is unwarranted at this point. But suffice it to say there's a small storm a-comin'. As the definitions, rules, and standards of patient-generated health data emerge, we look forward to what promises to be a rich commentary and response to the NPRM amidst the broader discussion in the health IT community throughout 2014.

Now that NwHIN has been spun out into the public-private entity Healtheway, one has to wonder exactly what value they can deliver to market that will sustain them as they attempt to wean themselves from the federal spigot. Healtheway has no lack of challenges ahead, but they intend to target one area that presents an interesting opportunity. Question is: Are they too early to market?

During a recent webinar, Healtheway’s interim executive director, Mariann Yeager, outlined the origin of Healtheway, the apparent traction Healtheway is gaining in the market and what their plan is going forward.

Healtheway got its start via funding from a variety of federal sources, all of whom were looking for a solution to address their unique problems. For the Social Security Administration it was the need for a nationwide network to facilitate processing of disability claims. For the VA, and to a lesser extent the DoD, it was the need to enable military personnel to receive care in the public sector and ensure that their records were complete. Health and Human Services led most of the development effort, leading to NHIN CONNECT, a less-than-stellar technology platform built by beltway bandits (who else) that hit the market with a thud.

One thing the feds did get right, though, is a clear and comprehensive policy for data use and sharing across disparate entities. The DURSA (Data Use and Reciprocal Support Agreement) remains one of the key differentiators in Healtheway's portfolio. Healtheway's intent is to leverage the DURSA as the "unifying trust framework" and build upon it with a common set of technical exchange requirements (standards) to facilitate exchange with the eHealth Exchange (which replaces the former NwHIN Exchange). Healtheway has also enlisted CCHIT to test technology vendors' solutions to ensure they comply with the technical exchange requirements that will allow for HIE-to-HIE connectivity.

That last sentence is the kicker. Healtheway and its eHealth Exchange are not intended to be an uber-national HIE but a set of policies and technical specs that will allow HIEs, be they public or private, to share information across institutional boundaries. Therefore, Healtheway will not get into the current rat's nest of trying to on-board the multitude of ambulatory EHRs into an HIE but will sit one level above that, facilitating exchange across HIEs. This is something that many regional and state HIE programs are looking to facilitate, thus it is not surprising to see that a significant proportion of Healtheway members come from such organizations.

There will be a need for this functionality at some future point in time, but not today and likely not tomorrow either. Three key challenges stand in their way:

1) Getting buy-in from healthcare organizations and technology vendors. While membership has indeed grown, Healtheway is offering membership at a discount (likely a loss) to gain traction, and unfortunately they still lack significant traction, as many brand names in healthcare are missing.

2) A tainted history with more than its share of missteps. Slowly coming out from under the wing of federal politics as a pseudo-independent organization (the Board still has plenty of government influence), Healtheway may begin to act more as an independent organization, more like a business. Unfortunately, a likely continual need for government funding will limit that independence.

3) The HIE market, from a technology, policy, and implementation/deployment perspective, is still primitive. The broad market is simply nowhere near the point of needing what Healtheway intends to offer, and will not be for a few years to come, at least as it pertains to the exchange of clinical data. Good idea, too early to market. That being said, there will be value on the transaction side, e.g., SSA and disability claims processing.

Hopefully the future will prove us wrong on this one and Healtheway will indeed prosper and contribute to the maturity of the HIE market. But our advice: don't bet on this horse just yet; give them six months, then take a second look.

Yesterday, I was in Washington DC to attend ONC's Consumer Health IT Summit. While I had high hopes for some breathtaking new developments, I ultimately walked away disappointed as the event devolved into a Blue Button promotional affair. Now I have nothing against some promotion; after all, my background is heavily steeped in marketing. What I do have a problem with, as an analyst, is major hype around any concept, technology, etc. that is not balanced with some serious, thoughtful critique.

There were times when I thought this event felt more like a channeling of a Health 2.0 event, with the clarion call of "Give me my damn data" being chanted. At times like that I had to pinch myself to remember: no, I'm in the grand hall of the Hubert Humphrey Building. Of course, the multiple large portraits of past HHS Secretaries hanging from the walls were also a clear reminder of exactly where I was.

But despite some shortcomings, the event was focused on what may be the government's (VA & CMS) finest contribution to promoting patient engagement: the Blue Button. The Blue Button was first released in 2010 by the VA to allow veterans to gain access to and control of their personal health information (PHI). CMS later released its own version of Blue Button that allowed beneficiaries access to their claims data. The VA thought Blue Button would be a success if 25K vets used this capability. The VA passed that number long ago and now, two short years later, the doors have been blown off that original estimate, with some one million patients now using Blue Button to gain access to and control of their PHI.

That is a phenomenal rate of adoption especially when one considers what they actually have access to.

A Blue Button download does not give one a well-formatted, easy-to-read file of their PHI. No, a Blue Button download is nothing more than a simple ASCII text file, and when you look at such a file dump, it isn't pretty. Thankfully, ASCII has been around since we were hunting the great woolly mammoth during the ice ages, so just about any piece of software (e.g., legacy EHRs and claims databases) can easily create an ASCII file, and developers can likewise take an ASCII file and repurpose that text into something fairly legible.
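To show how straightforward that repurposing can be, here is a minimal sketch of parsing such a dump into sections. The dashed section banners below are illustrative, not the VA's official layout, and the sample content is invented.

```python
# Hedged sketch: the real Blue Button dump is plain ASCII with section
# banners; the exact markers and sample data below are illustrative,
# not the official VA format.

def parse_blue_button(text):
    """Split a Blue Button-style ASCII dump into {section: [lines]}."""
    sections, current = {}, None
    for line in text.splitlines():
        stripped = line.strip()
        # Treat lines like "-------- MEDICATIONS --------" as headers.
        if stripped.startswith("----") and stripped.endswith("----"):
            current = stripped.strip("- ").title()
            sections[current] = []
        elif current and stripped:
            sections[current].append(stripped)
    return sections

dump = """
-------- MEDICATIONS --------
Lisinopril 10 MG daily
-------- ALLERGIES --------
Penicillin
"""
print(parse_blue_button(dump))
# {'Medications': ['Lisinopril 10 MG daily'], 'Allergies': ['Penicillin']}
```

Once the text is in a structure like this, rendering it legibly, as the apps discussed next do, is the easy part.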

One company doing just that is Humetrix, whom I first met at the HDI Forum in June. They were also present at this event, where they gave me a quick demo of the latest version of iBlueButton, a nice piece of mHealth software that takes the ASCII file from a Blue Button download and reformats it into something very easy to read and decipher that a consumer can share with their care team. There is even an iPad version designed specifically for physicians, which gets to my next point.

Whenever I am in the company of physicians, I often ask them how they are coping with the changes taking place, and specifically with the adoption of HIT. I had one such conversation Sunday while doing the charity Jimmy Fund Marathon walk for cancer research. On this walk there are always quite a few oncologists and nurses, and seeing as you're walking for a good many miles, there's plenty of time to talk.

I asked one oncologist about HIT adoption at Dana Farber and meaningful use to which he quickly replied: “Meaningful use is the bane of our existence right now.” So I asked further: What problem could HIT really solve for him? He had a ready answer: “Rather than a new patient showing up with a mound of paper records that I must laboriously review, I want a digital version of a new patient’s record with labs, pathology, images, meds, etc. all readily laid out so I can make a more rapid assessment to define a treatment plan for that patient.”

Now we could wait until all the HIEs are in place and all DURSAs are signed, resulting in frictionless data flows between healthcare institutions. We could wait until every certified EHR for Stage Two is deployed and physicians start using Direct messaging. We could also wait for patients to request under Stage Two that their provider transmit records to another provider (still not sure how complete those records need to be to meet Stage Two). Or we could enable Blue Button, educate the public, and let them take direct control of their PHI and share it with whom they see fit. Plenty of options, but if we really want to change healthcare, the last one is the most impactful and the most viable; unfortunately, like the others, it will take some time, though likely less than getting those DURSAs signed.

Getting back to yesterday’s event and my disappointment, following is what I would like to see in the future:

Honest and frank discussion on giving patients access to their records. The American Hospital Association was in vehement opposition to the Stage Two rules on patient access to their records. Let’s put them on stage to explain why, to give that contrarian viewpoint, to provide balance.

Enlist providers to discuss the benefits and challenges of giving patients access to their records. How does patient access to records change the conversation of care? How does it impact the workflow of a practice? What fears may physicians have and how do we address them?

Fewer panels of talking heads and more real-world perspectives. The event had a wonderful moment when a Vietnam veteran talked about his healthcare challenges and how Blue Button contributed significantly to his self-management. Let's see more of that, e.g., a Medicare patient using Blue Button.

And my biggest disappointment of all had nothing to do with this event – it had to do with Stage Two.

If indeed the feds believe in the Blue Button the same way they believe in Direct, then why the h*ll did they not put it directly into the certification criteria for EHRs? Clearly something went amiss, and it is unfortunate.

Thankfully, many vendors have stated they will support Blue Button in a forthcoming release, including Allscripts, athenahealth, Cerner, Greenway, and many others. Our last HIE report also found that just over 25% of the vendors profiled intend to support Blue Button in 2012. There is momentum here already; now we just need to get physicians to talk to their patients about the value of having access to and control of their PHI. As we move to more capitated models of care, the engaged patient may indeed be the miracle drug that rescues our healthcare system from financial collapse.

Addendum: I have received feedback regarding Stage Two and patient access to records, so let me clarify. Stage Two does indeed grant a patient the ability to access, view, and transmit their records. This is incredibly powerful, especially with the push towards standards and the transmitted file being in a standard CDA format. As Keith Boone so clearly articulates, the content package that is transmitted under Stage Two is a fairly complete summary document of care received and an individual's health status. But Stage Two does not support the ability to transmit a full and complete longitudinal record. It is my understanding that the Blue Button, at least the instance at the VA, allows a patient to download their complete record, which is why I took the argument down the path I did.

In time it is my hope that the Blue Button becomes a symbol, as Keith puts it, "a verb," that all will understand instinctively: click this, get your data, and move on. Other services will take that data dump and transpose it the way you want it for the purposes you intend. The technology and standards behind it will simply become irrelevant to the user. It just works. Getting there will be the task of the S&I Framework workgroups. I wish them Godspeed in accomplishing that task for the benefit of all citizens.

Many in both the private and public sectors are working hard on that vision – keep up the good work!

This week, the well-regarded periodical Health Affairs published its annual issue focused on healthcare IT (HIT). One of the papers published was authored by ONC, with ONC head Dr. David Blumenthal listed as a co-author. The paper, The Benefits of Health Information Technology: A Review of the Recent Literature Shows Predominantly Positive Results (think they could have made the title any longer?), ultimately took a close look at 154 studies conducted between July 2007 and February 2010 on the impact of HIT on a number of critical factors, including quality and efficiency of care delivered and physician satisfaction.

There have been more than a few questions raised over the last couple of years as to the actual contribution HIT provides and whether or not we are on the right track with the substantial investment this country is making via incentives, grants, and various programs to encourage the adoption and use of HIT among physicians and hospitals. Adding to those questions is the current fiscal crisis that this country and virtually all states are facing, leading one to wonder: is this the best use of the taxpayers' precious dollars? It appears that ONC's sponsorship of this exhaustive study, which as the title states found overall positive contributions from HIT adoption, is an attempt to put those arguments to rest. It sure seems that way to this analyst, as there was an unprecedented amount of "media push" coming out of ONC to get the story out, including granting exclusive interviews, of which Chilmark took advantage yesterday in a ~15-minute interview with Blumenthal.

Prior to the interview, ONC requested the list of questions I would ask of Dr. Blumenthal. Thinking that 15 minutes was precious little time, I developed three questions that were open ended, but also did not tread into waters which I knew were verboten, e.g. what will be in Stage 2 MU… The three questions and paraphrased responses based on my notes are provided below:

1) Many of the negative findings appear to involve challenges in adopting CPOE and workflow redesign. How will these findings/revelations influence future policy within ONC and more broadly across HHS?

Blumenthal: We always knew there would be challenges in the adoption of HIT, and for CPOE we significantly lowered the threshold in Stage One MU requirements. I cannot speak to future MU requirements but do believe we are on the right track, and it is important to remember that the intent of this law is not for everyone to meet MU requirements.

2) If indeed the “human element” is critical to successful HIT adoption, how will HHS seek to improve that metric in the adoption process?

Blumenthal: This is where the Regional Extension Centers (RECs) will play an important role in the future. RECs will be sharing best practices across the country amongst one another to ensure that the human element in the adoption of HIT is minimized. Also, over time, as more systems are installed, greater adoption occurs, and physicians become more comfortable with their use, we will see the human element become less of an issue.

3) What did you personally find as the most interesting/insightful finding of this publication review exercise?

Blumenthal: I was pleasantly surprised that the literature review supported positive outcomes as the result of the adoption of HIT across so many dimensions, particularly gains in efficiency.

Going through those questions actually went more quickly than I expected so I tossed in one more:

4) Usability of HIT solutions (EHR) remains an issue and Chuck Friedman of your office presented at HIMSS’11 that ONC, along with NIST were going to dig deeper into this issue. What will be ONC’s role?

Blumenthal: The challenge of usability is very real. I have heard from many physicians, 'I wish the computer worked for me and I didn't work for the computer.' Oftentimes, physicians do not do enough due diligence before buying a solution and do not know fully what they are getting until it is fully installed. EHRs are also very complicated products, so what one might end up with is not always readily apparent in an initial review of a product. We hope to shed some sunshine on true usability. ONC itself will not ultimately be doing the testing; we will look to others (editor's note: likely certifying bodies such as CCHIT, Drummond, Surescripts, etc.).

In closing, I have a ton of respect for Dr. Blumenthal. The job he was given when he joined HHS two years ago was monumental. He has put in a Herculean effort to bring us to where we are today and I hope, I pray, that his successor will be able to carry the baton forward with such skill.