A new bipartisan working group of the Senate health committee will gather on Capitol Hill throughout the coming months to find ways to improve electronic health records, according to committee chairman Lamar Alexander (R-Tenn.) and ranking member Patty Murray (D-Wash.).

The group will work to find five or six ways to “make the failed promise of electronic health records something that physicians and providers look forward to instead of something they endure,” Murray said in an announcement.

All members of the Senate health committee are invited to be a part of the working group. Staff meetings begin this week, with participation from health IT professionals, industry experts and government agencies.

The working group’s goals include the following:

Help providers improve quality of care and patient safety.

Facilitate interoperability between EHR vendors.

Empower patients to engage in their own care through access to their health data.

Protect privacy and security of health information.

The working group isn’t the only way Alexander and Murray are pushing for change when it comes to EHRs.

The Government Accountability Office placed the Veterans Affairs Department’s healthcare system on a list of high-risk programs for 2015, saying at an April 29 Senate Veterans’ Affairs Committee hearing that the agency needs to address inadequate oversight and ambiguous policies.

“Risks to the timeliness, cost-effectiveness, quality and safety of veterans’ healthcare, along with other persistent weaknesses GAO and others have identified in recent years, raised serious concerns about VA’s management and oversight of its healthcare system,” said GAO Healthcare Director Debra Draper at the hearing.

GAO's prepared testimony (pdf) says VA operates one of the largest healthcare delivery systems in the nation, including 150 medical centers and more than 800 community-based outpatient clinics.

Enrollment in the VA healthcare system has grown significantly, increasing from 6.8 to 8.9 million veterans between fiscal years 2002 and 2013, GAO says.

Over the same period, Congress provided steady increases in VA’s healthcare budget, which grew from $23.0 billion to $55.5 billion.

At the hearing, Draper outlined five major areas that put the VA at risk of failing to provide adequate healthcare to veterans: ambiguous policies and inconsistent processes; inadequate oversight and accountability; information technology challenges; inadequate training for VA staff; and unclear resource needs and allocation priorities.

John Daigh, the VA’s assistant inspector general, agreed with Draper’s assessment of the Veterans Health Administration.

“VHA is at risk of not performing its mission as the result of several intersecting factors,” Daigh said. “VHA has several missions, and too often management decisions compromise the most important mission of providing veterans with quality healthcare.”

Daigh focused on the Veterans Integrated Service Networks – regional offices set up to oversee VA medical centers in certain areas – saying the current VISN structure has not worked effectively to support and solve problems facing hospitals.

One role of the VISNs is to make sure medical providers at each facility are doing their job properly with periodic reviews.

Daigh said in prepared testimony (pdf) that a forthcoming VA OIG report found that in hospitals where there are specialty units with small numbers of providers, it is difficult to obtain unbiased peer reviews of clinical cases and assessments of clinical performance by peers.

That lack of data makes it difficult for VISNs to accurately assess medical care providers. But medical centers shouldn’t be shouldering all of the blame, Daigh said.

“The VISN structure has been inconsistently effective in addressing this issue,” he said.

Each VISN has a different internal organization and each medical facility has a different internal structure.

“This lack of standardization makes the dissemination of information and policy to facilities challenging and the acquisition of critical data from facilities more difficult,” Daigh said.


Most hospitals don’t have good ways of measuring the complex costs associated with an individual patient’s stay in the hospital. The VA is one surprising exception.

The success of health reform in the US depends on finding ways to control the growth of costs. Hospital care is expensive. And when patients have to be readmitted unexpectedly after discharge, it can really crank up spending.

As we strive to keep health care costs in line, reducing hospital readmissions is drawing a lot of attention. Reducing preventable readmissions could reduce health care spending and improve quality of care at the same time.

But very little research on readmission costs has been done. An exception is a study that found that one in five elderly Medicare patients is readmitted to the hospital within 30 days of being discharged, at an estimated cost of $17.4 billion in 2004.

There is, however, a hospital system that does a very good job of tracking these costs: the Veterans Health Administration.

Veterans Affairs could provide a blueprint

The Veterans Health Administration (the VA) operates 119 acute care hospitals across the US, and has created an unparalleled comprehensive patient-cost accounting system, its Decision Support System (DSS).

The DSS works from the bottom up, summing the individual resources and costs each patient winds up needing during a hospital stay. Unlike most other hospital accounting systems, the VA's DSS can also separate costs that are fixed regardless of the volume of services provided, such as administrative overhead, from costs that vary with service volume, such as lab tests or imaging. All of this means that the VA can track patients’ costs with greater precision than most hospitals, and can more easily see the cost of readmissions.
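As a rough illustration of that bottom-up, fixed-versus-variable structure, here is a minimal sketch in Python. The class and field names are ours, not the actual DSS schema; the point is only that each line-item cost is tagged as fixed or variable and summed per patient:

```python
from dataclasses import dataclass

@dataclass
class CostItem:
    patient_id: str
    amount: float
    is_fixed: bool   # volume-independent (e.g., overhead) vs. volume-driven (labs, imaging)

def patient_cost_summary(items):
    """Bottom-up accounting: sum each patient's costs, keeping the
    fixed and variable components separate."""
    totals = {}
    for item in items:
        fixed, variable = totals.get(item.patient_id, (0.0, 0.0))
        if item.is_fixed:
            fixed += item.amount
        else:
            variable += item.amount
        totals[item.patient_id] = (fixed, variable)
    return totals

stay = [
    CostItem("patient_1", 500.0, True),    # allocated administrative overhead
    CostItem("patient_1", 120.0, False),   # lab tests
    CostItem("patient_1", 80.0, False),    # imaging
]
summary = patient_cost_summary(stay)       # {'patient_1': (500.0, 200.0)}
```

The variable component, here $200 of the $700 total, is the part a manager could actually avoid by preventing a stay, which is why the distinction matters for readmission costing.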

There are other reasons why the VA is a good setting for studying readmission costs. VA hospitals have a simpler set of incentives around readmitting patients. Under Medicare, hospitals face a trade-off between receiving payments for readmitting Medicare patients and avoiding the payment penalties for excess readmissions under the new ACA regulations.

But in the VA system, budgets are set annually, so there is no financial incentive to readmit patients. It will not increase the amount of money VA hospitals get. And physicians who work in VA hospitals are salaried VA employees. They do not gain financially when they readmit patients, so they have no incentive to provide unnecessary care.

How much money does preventing readmission save?

In a recent study, Theodore Stefos and I used 2011 Decision Support System data to examine the component of cost that varies with a readmission, to provide hospital managers with a more realistic estimate of how much they could save by reducing readmissions.

We found that managers could expect to save $2,140 for the average 30-day readmission prevented. For heart attack, heart failure, and pneumonia patients, expected readmission cost estimates were higher: $3,432, $2,488 and $2,278 respectively.
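Using those figures, a hospital manager could sketch the projected savings from a prevention effort. The per-readmission costs below come from the study quoted above; the function and dictionary names are our own illustration:

```python
# Expected variable-cost savings per prevented 30-day readmission,
# taken from the study's reported estimates (2011 VA DSS data)
SAVINGS_PER_PREVENTED = {
    "average": 2140,
    "heart attack": 3432,
    "heart failure": 2488,
    "pneumonia": 2278,
}

def projected_savings(prevented_by_condition):
    """Total expected savings for a given mix of prevented readmissions."""
    return sum(SAVINGS_PER_PREVENTED[cond] * n
               for cond, n in prevented_by_condition.items())

# e.g., preventing 10 heart failure and 5 pneumonia readmissions:
savings = projected_savings({"heart failure": 10, "pneumonia": 5})  # 36270
```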

We also found that patients’ risk of illness was the main driver of expected readmission cost. This is an important finding for managers. Even though it is a factor they cannot control, they can expect that sicker patients remain at greater risk of costly readmission even after controlling for other factors such as age. Men also were much more likely to be readmitted than women, as were lower-income and unmarried veterans. Understanding this information can help hospital managers better predict which patients are at risk for readmission, and take steps to address this proactively.

While the VA has some processes of care that differ from other health care systems, its experience has important lessons for private sector hospitals, especially for those that treat a high share of chronically ill or low-income patients.

Why it is important to know what readmissions cost

Today hospitals are under increasing pressure to curb readmissions. For instance, in 2013 the Centers for Medicare & Medicaid Services (CMS) started to financially penalize hospitals for 30-day readmission rates that exceed national averages for heart attack, heart failure and pneumonia. As of October 2014, chronic obstructive pulmonary disease and elective knee and hip replacements are also targeted, and the penalty has increased to up to 3% of a hospital's total Medicare reimbursement.
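CMS's actual methodology is based on excess readmission ratios and is considerably more involved, but the arithmetic of the cap is simple. A deliberately simplified sketch, with illustrative numbers:

```python
def readmission_penalty(total_medicare_reimbursement, penalty_rate, cap=0.03):
    """Simplified illustration only: whatever rate a hospital's excess
    readmissions imply, the reduction is capped at 3% of its total
    Medicare reimbursement (the FY2015 maximum)."""
    return total_medicare_reimbursement * min(penalty_rate, cap)

# A hospital with $50M in Medicare reimbursement at or above the cap
# would forgo $1.5M:
penalty = readmission_penalty(50_000_000, 0.05)  # 1500000.0
```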

Hospital managers would like to know what actual cost savings are when a readmission is avoided, so they can understand how readmissions affect their overall budgets.

The Centers for Medicare & Medicaid Services has paid out nearly $30 billion in meaningful use incentives for hospitals and physicians to adopt EHRs. But some members of Congress, the body that approved those funds, are about as frustrated with EHRs as doctors and nurses.

“The evidence suggests these goals haven’t been reached,” said Senator Lamar Alexander, R-Tennessee, at a lengthy EHR hearing covered by Erin McCann, Healthcare IT News managing editor.

Robert Wergin, MD, president of the American Academy of Family Physicians, said that family physicians are having a difficult time with the Stage 2 meaningful use requirements. The “time, expense and effort it takes makes it not worthwhile,” said Wergin. Indeed, some 55 percent of physicians surveyed plan on skipping Stage 2 altogether.

“The issue of interoperability between electronic health records represents one of the most complex challenges facing the healthcare community,” said Wergin. The government “must step up efforts to require interoperability.”

A central problem, as McCann wrote, is that “Vendors have no incentive to share data and create more interoperable systems. There’s the question of data ownership here. There’s the question of competition. And there’s the question of standards, or lack thereof.”

“The vendors are siloed,” as Wergin said. “And you’re held somewhat hostage by the vendor you have.”

Deprivation has a way of making you feel excessively thankful for even the most meager offering. Yoni Maisel, a reflective patient and patient advocate with a rare genetic primary immune deficiency disorder, conveyed just this sense of disproportionate gratitude in an exuberant recent piece describing the impact of technology on his life.

Inspired by Eric Topol’s new book, The Patient Will See You Now, highlighting the power of the smartphone (my WSJ review here), Maisel went out and bought one. He then received (on his smartphone) an email from a doctor who had read about the symptoms Maisel had previously described on his blog related to a second, extremely rare disease he has (Sweet’s Syndrome), and thought she had a patient with a similar condition. After viewing photos of skin lesions Maisel took (with his smartphone) and shared with her (with his smartphone), the doctor was reportedly convinced her patient had Sweet’s Syndrome as well.

Maisel tweeted enthusiastically that Topol’s book (and, implicitly, the technology he champions) “Just Played Part in Dx of 1in 1Million #RareDisease.”

A somewhat different reaction came from Rick Valencia, head of Qualcomm Life, who commented via Twitter: “Shocking that buying a smartphone and sending a pic considered a tech breakthrough. #onlyinhealthcare”

On the one hand, of course, Maisel’s story obviously represents a terrific outcome for the newly-diagnosed patient, and – precisely as Maisel and Topol emphasize – highlights one way smartphone technology can improve medical care.

At the same time, Maisel’s joy, paradoxically, also reminds us of a deep flaw in the system, as Valencia’s comment begins to suggest. At issue: poor data sharing, a medical tragedy of underappreciated dimension. Valuable, even vital information often remains uncaptured, unanalyzed, and, especially, unshared.

The human consequences associated with poor data sharing were poignantly described by Seth Mnookin in his New Yorker article last year profiling a family whose son, Bertrand, was born with a mysterious disease that eluded rapid identification. The family (like an estimated 25% of patients with unknown genetic disorders) was able to obtain a diagnosis by exome sequencing, yet struggled to locate others with a similar condition. It wasn’t until the father, Matt Might, blogged about it – and had the story picked up by Reddit and others – that he was able to locate others with the disease.

The key point is that the networks afforded by Reddit were fundamentally richer than any medical dataset. If someone – the father in the Mnookin story, the doctor in Maisel’s story – wants to find others who have similar genetics and phenotypes, they need to rely on public, non-medically-specific networks because these networks, while not purpose-built, are nevertheless far denser, and often, it seems, the best option available. The issue this speaks to is what I’ve heard referred to as Matticalfe’s Law, named by physician and informaticist John Mattison as a variant of the familiar Metcalfe’s Law. (Disclosure: while I’ve no business relationship with Mattison, he is co-chair of the Global Alliance for Genomics and Health eHealth working group, on which I serve. Also, to offer my usual reminder/disclosure, I am CMO at DNAnexus, a company that makes a cloud-based platform for genomic data management and collaboration.)

Metcalfe’s Law is the idea that the value of a network is proportional to the square of the number of participants – i.e. adding more people to a network increases value not linearly, but quadratically. It’s a key principle underlying the concept (and power) of networks (though not without its critics – see here).

Matticalfe’s Law, as Mattison explains it, is that “the value of data silos is very limited, but when deployed in aggregate yields a law of accelerating returns rather than a law of diminishing returns, similar to the network effect of Metcalfe’s law.” Mattison adds he “hybridized the eponym to distinguish it from the classical network effect, hence Matticalfe’s Law.”

One implication here is that if every cancer center, every medical center, every rare disease center shared their data fully, then as a whole, these data would be profoundly more valuable and useful. The chances that a patient with an unusual mutation and phenotype would have someone like them, somewhere in the world, would be so much higher.
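The silo penalty implied by Metcalfe's simple n² model is easy to make concrete: splitting n participants into k disconnected silos divides the aggregate value by k, since k·(n/k)² = n²/k. A small illustrative sketch:

```python
def metcalfe_value(n):
    """Metcalfe's Law: value grows with the square of the number of participants."""
    return n ** 2

def siloed_value(n, k):
    """The same n participants split into k equal, non-communicating silos:
    k * (n/k)^2 = n^2 / k, a factor of k less than the pooled network."""
    return k * metcalfe_value(n / k)

# 1,000,000 patient records in one shared pool vs. split across 100 centers:
pooled = metcalfe_value(1_000_000)       # 10**12
siloed = siloed_value(1_000_000, 100)    # 10**10, i.e. 100x less aggregate value
```

However rough the n² assumption, the direction of the effect is what matters: the more finely the data are siloed, the more value is lost.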

So why isn’t this done?

For starters, most hospitals – even leading centers – are struggling to meaningfully organize the genetic and phenotypic data of their own patients in a fashion that can truly inform clinical decision making, as I discussed late last year; thus, you can argue that it’s hard to share with others what you can barely grasp yourself.

A second factor, of course, is privacy; medical centers typically emphasize the special nature of medical data, and express concern about the fate of rich information in a shared dataset.

Yet, many experts are skeptical that this represents the true (or only) explanation; as Mnookin writes:

“‘If you want to be charitable, you can say there’s just a lack of awareness’ about what kind of sharing is permissible, Kohane said. ‘If you want to be uncharitable, you can say that researchers use that concern about privacy as a shield by which they can actually hide their more selfish motivations.’”

In other words, even if top centers were able to collect and usefully organize phenotypic and genetic data on patients, would they share most of this information or silo it?

I’m not sure I know anyone who would bet against “silo.”

Whether consciously recognized or not, these data are perceived as representing a competitive advantage for the institutions and individuals who generated them (and notably, in this context, the “generating individual” is understood to be the researcher, not the patient!).

Leading cancer centers (for example) have more data than most other hospitals and practices – even though their total share of cancer patients is relatively small, as something like 85% of cancer care occurs in the community. In a world without rich data sharing, today’s top cancer centers enjoy a distinct competitive advantage; their datasets (and more broadly, their experience sets), while individually small in the absolute sense, are large compared to most community hospitals and practices. However, in a world with richer data sharing, these leading centers would arguably lose much of their competitive advantage – even though the global quality of cancer care would likely go up, driven by the knowledge the richer dataset would provide. Thus, it’s perhaps not surprising that most leading cancer centers talk up data sharing far more than they engage in it – at least at anything like the rich level that would be ideal to advance medical science. (Of course, there are encouraging exceptions to this generalization.)

The need for rich data sharing to accelerate what Andy Grove calls “knowledge turns” (link here – ironically but not surprisingly, preview only; JAMA has not made this open access) has both frustrated and motivated patient advocates such as Chordoma Foundation co-founder Josh Sommer, who has worked tirelessly to change the system (see here, also here).

Nevertheless, both in the context of scientific research and in the context of patient care, the unfortunate truth is that while it’s fashionable to profess commitment to data sharing, many hospitals, and many researchers, are reluctant to part with data.

“What if you owned a business and one of your competitors said: ‘I would like a list of all your customers, as well as information on their demographics and health history.’ You would likely say, there is no way I’m giving you a list of my customers.

Well in the case of healthcare, customers = patients.”

Instead, the idea of the moment seems to be “federated” datasets – the idea that everyone can keep their own datasets, but query engines could specifically extract the exact, relatively limited data they need, affording, it’s suggested, many of the benefits of data pooling but without incurring many of the risks. There’s a conspicuous “assume a can opener” quality to this strategy, but it’s worth watching because some very smart people (and organizations) are working intensively on this — and because it might be the best we can hope for.
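A minimal sketch of what such a federated query might look like, assuming cooperating sites that answer only aggregate queries (all names and data here are hypothetical):

```python
# Each site runs the query against its own records and returns only an
# aggregate count; raw patient data never leaves the site.

def local_count(records, predicate):
    """Executed inside the site's own environment."""
    return sum(1 for r in records if predicate(r))

def federated_count(sites, predicate):
    """The coordinator sees only per-site counts, never records."""
    return sum(local_count(records, predicate) for records in sites.values())

sites = {
    "center_a": [{"variant": "BRAF V600E"}, {"variant": "KRAS G12D"}],
    "center_b": [{"variant": "BRAF V600E"}],
}
matches = federated_count(sites, lambda r: r["variant"] == "BRAF V600E")  # 2
```

The hard parts, of course, are everything this sketch assumes away: common data models across sites, consent, and preventing small counts from re-identifying patients.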

One alternative to this idea is that patients could contribute their own data into datasets, which could be used for the common good. This is obviously attractive conceptually, but the challenge is more pragmatic: while some patients are both motivated and technologically adept, most patients struggle exhaustively just to get a handle on all of their own medical records, and most are unlikely to have the time, inclination, and ability to share – beneficial as this would be.

An idea I’ve been thinking about (see here, here) is the notion of the data-inhaling clinic: medical centers built around the premise of rich data collection and sharing, and offering genuine interoperability. Patients choosing to seek care there would explicitly want their data shared, and in turn would benefit from the data sharing of others. Consent to share data (which could always be withdrawn) would be a foundational condition of care at these centers, and a reason enlightened patients would seek treatment there (in addition to the empathetic care, which as always remains elemental). Institutions subscribing to this philosophy would not need to have the same owner, nor even the same EMR – just the same commitment to rich and complete data sharing among participating institutions. (Sharing data only among participants seems necessary, at least initially, to avoid a free-rider problem; the point is that any organization willing to share appropriately-consented data in substantial fashion could belong to the network.)

While some patients might not like this approach, those in favor would vote with their feet, and I can imagine that the rich, consented dataset the subscribing, data-inhaling clinics would build would rapidly exceed those available elsewhere in the world. Perhaps at this point, holdout institutions – which I imagine would include top academic medical centers – would finally relent and join as well.

The aspiration would be that in a world of rich data sharing, making diagnoses based on the combination of unusual symptoms and unusual genetics wouldn’t be exceptional, or even tweet-worthy; rather it would be — and should be — the expectation.

U.S. Surgeon General Vice Admiral Vivek H. Murthy meets with Dr. Jonathan Woodson, assistant secretary of Defense for Health Affairs, at the Pentagon on March 11, 2015, to discuss the Military Health System’s critical role in support of the National Health Strategy.

Leaders of the Military Health System met with the newly confirmed U.S. Surgeon General at the Pentagon on March 11. Dr. Jonathan Woodson, assistant secretary of Defense for Health Affairs, and Air Force Lt. Gen. Douglas Robb, director of the Defense Health Agency, discussed military health and the MHS’ critical role in support of the National Health Strategy in their first meeting with Vice Admiral Vivek H. Murthy since his confirmation by the U.S. Senate in December 2014.

“Our partnership with the Public Health Service has been instrumental in helping the Department and the Military Health System achieve its mission,” said Woodson. “Public Health Service officers have worked side-by-side with us in our military hospitals and clinics, in our laboratories, in support of our global health mission, and as part of the medical team serving all of our beneficiaries. One of the most prominent areas where we have collaborated is on implementation of health prevention and wellness initiatives. I look forward to continuing to work in close partnership with Admiral Murthy to promote health and healthy behaviors for our force and all of our beneficiaries.”

The meeting gave the leaders the important opportunity to discuss the Military Health System as a strategic asset in support of national security objectives, and the important role DoD plays in supporting the National Health Strategy, especially in areas such as reducing obesity and tobacco use.

“Our military medical personnel are all members of the larger federal team focused on improving the health and wellness of the entire country,” said Robb. “I am privileged to have Public Health Service officers working with me in the Defense Health Agency on a number of critical health matters. We’re one team engaged in one fight. It was a great opportunity to show Admiral Murthy everything we have to offer, and to express our appreciation for the talented people the Public Health Service shares with us.”