Fourteen US and Canadian cancer institutes will use International Business Machines Corp.’s Watson computer system to choose therapies based on a tumor’s genetic fingerprints, the company said on Tuesday, the latest step toward bringing personalized cancer treatments to more patients.

Oncology is the first specialty where matching therapy to DNA has improved outcomes for some patients, inspiring the “precision medicine initiative” President Barack Obama announced in January.

But it can take weeks to identify drugs targeting cancer-causing mutations. Watson can do it in minutes and has in its database the findings of scientific papers and clinical trials on particular cancers and potential therapies.

Faced with such a data deluge, “the solution is going to be Watson or something like it,” said oncologist Norman Sharpless of the University of North Carolina Lineberger Cancer Center. “Humans alone can’t do it.”

IBM is positioning Watson for exactly this task: an area of medicine where humans can see the vast potential, but can’t begin to wrangle the data needed to achieve it. “Genomics is the secret to unlocking personalized medicine,” said Steve Gold, a vice president of the IBM Watson Group, at a press conference on Tuesday.

Yet it is unclear how many patients will be helped by such a “big data” approach. For one thing, in many common cancers old-line chemotherapy and radiation will remain the standard of care, and genomic analysis may not make a difference.

The scientists directly involved with Watson aren’t making any promises, but they’re hopeful they can slowly begin to make a difference in the world of cancer treatment, which today leaves a great number of patients without many good options.

“Traditional cancer treatments are moderately effective, associated with moderate toxicity, and many patients still succumb to the disease,” said Lukas Wartman, assistant director of Cancer Genomics at Washington University and a leukemia survivor, at Tuesday’s press conference. “There’s been a lot of pessimism among those [fighting] cancer, and Watson offers an opportunity to fight back against that pessimism.”

Cloud-based Watson will be used at the centers – including Cleveland Clinic, Fred & Pamela Buffett Cancer Center in Omaha and Yale Cancer Center – by late 2015, said Steve Harvey, vice president of IBM Watson Health. The centers pay a subscription fee, which IBM did not disclose.

Oncologists will upload the DNA fingerprint of a patient’s tumor, which indicates which genes are mutated and possibly driving the malignancy. Watson, recognized broadly for beating two champions of the game show Jeopardy! in 2011, will sift through thousands of mutations and try to identify which is driving the tumor, and therefore what a drug must target.

Distinguishing driver mutations from others is a huge challenge. IBM spent more than a year developing a scoring system so Watson can do that, since targeting non-driver mutations would not help.

“Watson will look for actionable targets,” Harvey said, matching them to approved and experimental cancer drugs and even non-cancer drugs (if Watson decides the latter interfere with a biological pathway driving a malignancy).
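The workflow Harvey describes, score each mutation, flag likely drivers, then match them to therapies, can be sketched in miniature. Everything below is a hypothetical illustration, not Watson's actual data or scoring algorithm; the gene names, scores, and drug table are invented:

```python
# Toy sketch of a driver-mutation scoring and drug-matching pipeline.
# All genes, scores, and drug mappings are invented for illustration.

# Hypothetical evidence scores (0-1) for how likely each mutated gene
# is to be driving the tumor, e.g. derived from literature frequency.
driver_scores = {"GENE_A": 0.92, "GENE_B": 0.15, "GENE_C": 0.64}

# Hypothetical mapping from actionable targets to candidate drugs.
drug_table = {"GENE_A": ["drug-1"], "GENE_C": ["drug-2", "drug-3"]}

def actionable_targets(scores, drugs, threshold=0.5):
    """Keep likely drivers (score above threshold) that also have a
    matching therapy, ordered from strongest evidence down."""
    return {
        gene: drugs[gene]
        for gene, score in sorted(scores.items(), key=lambda kv: -kv[1])
        if score >= threshold and gene in drugs
    }

print(actionable_targets(driver_scores, drug_table))
# e.g. {'GENE_A': ['drug-1'], 'GENE_C': ['drug-2', 'drug-3']}
```

The threshold step mirrors the point Harvey makes next: a case is "actionable" only when a likely driver and a matching therapy both exist, which is why so many sequenced cases come back empty-handed.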

But Watson has trouble identifying actionable targets in cancers with many mutations. Although genetic profiling is standard in melanoma and some lung cancers, where drugs such as Zelboraf from the Genentech unit of Roche Holding AG target the driver mutation, in most common tumors traditional chemotherapy and radiation remain the standard of care.

“When institutions do genetic sequencing, only about half the cases come back with something actionable,” Harvey said, often because it is impossible to identify the driver mutation or no targeted therapy exists.

The other collaborating centers are Ann & Robert H. Lurie Children’s Hospital of Chicago; BC Cancer Agency in British Columbia; City of Hope in Duarte, California; Duke Cancer Institute in North Carolina; McDonnell Genome Institute at Washington University in St. Louis; New York Genome Center; Sanford Health in South Dakota; University of Kansas Cancer Center; University of Southern California Norris Comprehensive Cancer Center; and University of Washington Medical Center.

Last year, the fallout from a string of breaches at major retailers like Target and Home Depot had consumers on edge. But 2015 is shaping up to be the year consumers should be taking a closer look at who is guarding their health information.

Data about more than 120 million people has been compromised in more than 1,100 separate breaches at organizations handling protected health data since 2009, according to Department of Health and Human Services data reviewed by The Washington Post.

“That’s a third of the U.S. population — this really should be a wake-up call,” said Deborah Peel, the executive director of Patient Privacy Rights.

The data may double-count some individuals if they had their information compromised in more than one incident, but it still reflects a staggering number of times Americans have been affected by breaches at organizations trusted with sensitive health information. And the data does not yet reflect the hack of Premera, which announced this week that hackers may have accessed information, including medical data, on up to 11 million people.

Most breaches of data from health organizations are small and don’t involve hackers breaking into a company’s computer system. Some involve a stolen laptop or the inappropriate disposal of paper records, for example — and not all necessarily involve medical information. But hacking-related incidents disclosed this year have dramatically driven up the number of people exposed by breaches in this sector.

When Anthem, the nation’s second-largest health insurer, announced in February that hackers broke into a database containing nearly 80 million records of consumers’ personal information, that one incident more than doubled the number of people affected by breaches in the health industry since HHS started publicly reporting on the issue in 2009.

“We are certainly seeing a rise in the number of individuals affected by hacking/IT incidents,” Rachel Seeger, a spokesperson for HHS’s Office for Civil Rights, said in a statement. “These incidents have the potential to affect very large numbers of health care consumers, as evidenced by the recent Anthem and Premera breaches.”

And some cybersecurity experts warn this may only be the beginning. “We’re probably going to see a lot more of these happening in the coming few months,” said Dave Kennedy, the chief executive of TrustedSEC.

Health organizations are targets because they maintain troves of data with significant resale value on black markets, Kennedy said, and their security practices are often less sophisticated than those of other industries. Now that some major players in the market have come forward as victims of cyberattacks, other organizations are likely to take a close look at their own networks – potentially uncovering other compromises, he said.

“The information that companies like Anthem and Premera had is more valuable than just payment card information held by retailers or financial institutions,” said Scott Vernick, who heads up the data security and privacy practice at law firm Fox Rothschild. Credit card information has a relatively short shelf life, with new cards issued on a regular basis, he explained. But health organizations often have complete profiles of people, including Social Security numbers and medical information, which is much more difficult, if not impossible, to change.

Some of the data can be used to pursue traditional financial crimes — like setting up fraudulent lines of credit, Kennedy said. But it can also be used for medical insurance fraud, like purchasing medical equipment for resale or obtaining pricey medical care for another person.

This type of scheme is often not caught as quickly as financial fraud, experts said, and could have a lasting effect if it results in a person’s medical history containing false information. “In theory you could end up in an emergency situation, and if your records are contaminated by someone else’s information that could cause serious problems — like medical professionals believing you have a different blood type,” said Peel.

If a hacker is able to obtain information about a person’s medical condition, as it appears may have happened in the Premera breach but not the Anthem breach, there are additional risks. Information about mental health or HIV treatments could be made public, and there’s no way to truly make the information private again. “There’s almost no way to remedy this; there’s no recourse,” said Peel.

Health care providers already have to comply with government rules on protecting patient privacy, including HIPAA, which are enforced by HHS.

“Health care organizations need to make data security central to how they manage their information systems and to be vigilant in assessing and addressing the risks to data on a regular basis,” said Seeger, the HHS official. “In addition, organizations need to ensure they are able to identify and respond appropriately to security incidents when they do happen to mitigate harm to affected individuals and prevent future similar incidents from occurring.”

State-level officials are also increasingly involved in enforcement in this area, said Vernick, and consumers may have additional legal avenues depending on state laws.

But privacy and cybersecurity advocates say the industry and the government still aren’t doing enough to protect consumers.

“HIPAA required that security be addressed, but it didn’t spell out exactly how, so there was no culture of using ironclad security,” said Peel. “We have systems that are engineered as though this data is not sensitive and valuable.”

Health organizations sometimes rely on legacy systems, and some have not invested in cybersecurity at a rate that matches the urgency of the threats they face, Kennedy said. “The medical industry is years and years behind other industries when it comes to security.”

Even before the Anthem breach, major health insurers had become aware of the rising risk of cyberattacks. Aetna and United Health Group both cited the risks of hackers and breaches in their respective 2013 financial reports.

And the industry is already taking steps to coordinate how it responds to such incidents through groups designed to share information about digital threats — like the National Health Information Sharing and Analysis Center, or NHISAC. The organization is one of several critical-infrastructure efforts that work with the Department of Homeland Security to share data about current threats, such as what sort of tactics are used and forensic information about attackers.

Members are able to share details about security incidents in “machine time” using an automated system, according to NHISAC executive director Deborah Kobza, and the group sends out daily threat updates. When a major cyberattack is disclosed, NHISAC erupts into a flurry of activity — trying to find out as much as possible so its members have information that can make it easier to see if they’ve been the victims of a similar attack.

And 2015 has already kept NHISAC busy: “We just caught our breath from the Anthem hack, and here we go again,” said Kobza about responding to the Premera breach.

The Centers for Medicare & Medicaid Services has paid out nearly $30 billion in meaningful use incentives for hospitals and physicians to adopt EHRs. But some members of Congress, the body that approved those funds, are about as frustrated with EHRs as doctors and nurses.

“The evidence suggests these goals haven’t been reached,” said Senator Lamar Alexander, R-Tennessee, during a lengthy EHR hearing covered by Erin McCann, Healthcare IT News managing editor.

Robert Wergin, MD, president of the American Academy of Family Physicians, said that family physicians are having a difficult time with the Stage 2 meaningful use requirements. The “time, expense and effort it takes makes it not worthwhile,” said Wergin. Indeed, some 55 percent of physicians surveyed plan on skipping Stage 2 altogether.

“The issue of interoperability between electronic health records represents one of the most complex challenges facing the healthcare community,” said Wergin. The government “must step up efforts to require interoperability.”

A central problem, as McCann wrote, is that “Vendors have no incentive to share data and create more interoperable systems. There’s the question of data ownership here. There’s the question of competition. And there’s the question of standards, or lack thereof.”

“The vendors are siloed,” as Wergin said. “And you’re held somewhat hostage by the vendor you have.”

President Barack Obama’s Precision Medicine initiative hinges on gathering data from millions of individuals, but there are challenges the healthcare industry will face when it comes to collecting that information, says Niam Yaraghi, a fellow in the Brookings Institution’s Center for Technology Innovation.

Interoperability and security are two issues plaguing the industry that will also play a role in Obama’s initiative, which aims to increase the use of personalized information in healthcare.

“To succeed, the Precision Medicine initiative has to either overcome the lack of interoperability problem in the nation’s health IT system or to find a way around it,” he writes.

For now, those working on the project should get access to what medical records they can and then work with providers and vendors on gaining access to future records, Yaraghi says.

When it comes to privacy for precision medicine, problems the administration will face include setting up secure technologies and privacy regulations surrounding the project so that information cannot be accessed by malicious actors. In addition, when researchers begin to analyze data, they may uncover information patients do not want shared or known.

To ensure information privacy, audits should be conducted by third parties, Yaraghi says. And if a data breach occurs, a patient’s participation in the study should be canceled and the patient should receive financial compensation.

“The Precision Medicine initiative is a ‘Big Hairy Audacious Goal’ with exciting promises and priceless implications,” he writes. “[H]owever, to believe in its future success, it should first propose a plan to resolve the above mentioned patient privacy concerns and health IT challenges.”

Deprivation has a way of making you feel excessively thankful for even the most meager offering. Yoni Maisel, a reflective patient and patient advocate with a rare genetic primary immune deficiency disorder, conveyed just this sense of disproportionate gratitude in an exuberant recent piece describing the impact of technology on his life.

Inspired by Eric Topol’s new book, The Patient Will See You Now, highlighting the power of the smartphone (my WSJ review here), Maisel went out and bought one. He then received (on his smartphone) an email from a doctor who had read about the symptoms Maisel had previously described on his blog related to a second, extremely rare disease he has (Sweet’s Syndrome), and thought she had a patient with a similar condition. After viewing photos of skin lesions Maisel took (with his smartphone) and shared with her (with his smartphone), the doctor was reportedly convinced her patient had Sweet’s Syndrome as well.

Maisel tweeted enthusiastically that Topol’s book (and, implicitly, the technology he champions) “Just Played Part in Dx of 1in 1Million #RareDisease.”

A somewhat different reaction was shared by Rick Valencia, head of Qualcomm Life, who commented via Twitter: “Shocking that buying a smartphone and sending a pic considered a tech breakthrough. #onlyinhealthcare”

On the one hand, of course, Maisel’s story obviously represents a terrific outcome for the newly-diagnosed patient, and – precisely as Maisel and Topol emphasize – highlights one way smartphone technology can improve medical care.

At the same time, Maisel’s joy, paradoxically, also reminds us of a deep flaw in the system, as Valencia’s comment begins to suggest. At issue: poor data sharing, a medical tragedy of underappreciated dimension. Valuable, even vital information often remains uncaptured, unanalyzed, and, especially, unshared.

The human consequences associated with poor data sharing were poignantly described by Seth Mnookin in his New Yorker article last year profiling a family whose son, Bertrand, was born with a mysterious disease that eluded rapid identification. The family (like an estimated 25% of patients with unknown genetic disorders) was able to obtain a diagnosis by exome sequencing, yet struggled to locate others with a similar condition. It wasn’t until the father, Matt Might, blogged about it – and had the story picked up by Reddit and others – that he was able to locate others with the disease.

The key point is that the networks afforded by Reddit were fundamentally richer than any medical dataset. If someone – the father in the Mnookin story, the doctor in Maisel’s story – wants to find others who have similar genetics and phenotypes, they need to rely on public, non-medically-specific networks because these networks, while not purpose-built, are nevertheless far denser, and often, it seems, the best option available. The issue this speaks to is what I’ve heard referred to as Matticalfe’s Law, named by physician and informaticist John Mattison to suggest a variant of the familiar Metcalfe’s Law. (Disclosure: while I’ve no business relationship with Mattison, he is co-chair of the Global Alliance for Genetics and Health eHealth working group, on which I serve. Also, to offer my usual reminder/disclosure, I am CMO at DNAnexus, a company that makes a cloud-based platform for genomic data management and collaboration).

Metcalfe’s Law is the idea that the value of a network is proportional to the square of the number of participants – i.e., adding more people to a network increases its value not linearly, but quadratically. It’s a key principle underlying the concept (and power) of networks (though not without its critics – see here).

Matticalfe’s Law, as Mattison explains it, is that “the value of data silos is very limited, but when deployed in aggregate yields a law of accelerating returns rather than a law of diminishing returns, similar to the network effect of Metcalfe’s law.” Mattison adds he “hybridized the eponym to distinguish it from the classical network effect, hence Matticalfe’s Law.”

One implication here is that if every cancer center, every medical center, every rare disease center shared their data fully, then as a whole, these data would be profoundly more valuable and useful. The chances that a patient with an unusual mutation and phenotype would have someone like them, somewhere in the world, would be so much higher.
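The gap between siloed and pooled data is easy to see numerically under Metcalfe-style accounting. A toy calculation (the per-center record counts are invented, and treating each record as a network node is a deliberate simplification):

```python
# Toy illustration of Matticalfe's Law: value of siloed vs pooled data,
# using a Metcalfe-style value of n**2 for a dataset of n records.

silos = [10_000, 8_000, 5_000, 2_000]  # hypothetical record counts per center

siloed_value = sum(n**2 for n in silos)  # each center valued alone, then summed
pooled_value = sum(silos)**2             # one fully shared dataset

print(siloed_value)  # 193000000
print(pooled_value)  # 625000000
print(pooled_value / siloed_value)       # pooling is ~3.2x more valuable
```

The exact multiplier depends on the invented numbers, but the shape of the result does not: squaring the combined total always beats summing the squares, and the more fragmented the silos, the larger the gap.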

So why isn’t this done?

For starters, most hospitals – even leading centers — are struggling to meaningfully organize the genetic and phenotypic data of their own patients in a fashion that can truly inform clinical decision making, as I discussed late last year; thus, you can argue that it’s hard to share with others what you can barely grasp yourself.

A second factor, of course, is privacy; medical centers typically emphasize the special nature of medical data, and express concern about the fate of rich information in a shared dataset.

Yet, many experts are skeptical that this represents the true (or only) explanation; as Mnookin writes,

‘If you want to be charitable, you can say there’s just a lack of awareness’ about what kind of sharing is permissible, Kohane said. ‘If you want to be uncharitable, you can say that researchers use that concern about privacy as a shield by which they can actually hide their more selfish motivations.’

In other words, even if top centers were able to collect and usefully organize phenotypic and genetic data on patients, would they share most of this information or silo it?

I’m not sure I know anyone who would bet against “silo.”

Whether consciously recognized or not, these data are perceived as representing a competitive advantage for the institutions and individuals who generated them (and notably, in this context, the “generating individual” is understood to be the researcher, not the patient!).

Leading cancer centers (for example) have more data than most other hospitals and practices – even though their total share of cancer patients is relatively small, as something like 85% of cancer care occurs in the community. In a world without rich data sharing, today’s top cancer centers enjoy a distinct competitive advantage; their datasets (and more broadly, their experience sets), while individually small in the absolute sense, are large compared to most community hospitals and practices. However, in a world with richer data sharing, these leading centers would arguably lose much of their competitive advantage – even though the global quality of cancer care would likely go up, driven by the knowledge the richer dataset would provide. Thus, it’s perhaps not surprising that most leading cancer centers talk up data sharing far more than they engage in it – at least at anything like the rich level that would be ideal to advance medical science. (Of course, there are encouraging exceptions to this generalization.)

The need for rich data sharing to accelerate what Andy Grove calls “knowledge turns” (link here – ironically but not surprisingly, preview only; JAMA has not made this open access) has both frustrated and motivated patient advocates such as Chordoma Foundation co-founder Josh Sommer, who has worked tirelessly to change the system (see here, also here).

Nevertheless, both in the context of scientific research and in the context of patient care, the unfortunate truth is that while it’s fashionable to profess commitment to data sharing, many hospitals, and many researchers, are reluctant to part with data.

“What if you owned a business and one of your competitors said: ‘I would like a list of all your customers, as well as information on their demographics and health history.’ You would likely say, there is no way I’m giving you a list of my customers.

Well, in the case of healthcare, customers = patients.”

Instead, the idea of the moment seems to be “federated” datasets – the idea that everyone can keep their own datasets, but query engines could specifically extract the exact, relatively limited data they need, affording, it’s suggested, many of the benefits of data pooling but without incurring many of the risks. There’s a conspicuous “assume a can opener” quality to this strategy, but it’s worth watching because some very smart people (and organizations) are working intensively on this — and because it might be the best we can hope for.
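The federated idea can be sketched minimally: each institution keeps its records, and a coordinator sends a narrow query and pools only the aggregate answers. The site names and records below are invented, and a real system would layer on consent checks, privacy safeguards, and standardized schemas:

```python
# Minimal sketch of a federated query: no raw records leave a silo;
# each site answers a narrow question and only counts are pooled.

# Hypothetical per-institution patient records (never shared directly).
site_a = [{"mutation": "GENE_A", "age": 54}, {"mutation": "GENE_B", "age": 61}]
site_b = [{"mutation": "GENE_A", "age": 47}]

def local_count(records, mutation):
    """Runs inside each institution; returns only an aggregate count."""
    return sum(1 for r in records if r["mutation"] == mutation)

def federated_count(sites, mutation):
    """Coordinator pools per-site aggregates, never raw rows."""
    return sum(local_count(records, mutation) for records in sites)

print(federated_count([site_a, site_b], "GENE_A"))  # 2
```

The appeal is visible even in this sketch: the coordinator learns that two matching patients exist somewhere without any site handing over its patient list, which is exactly the trade the federated approach promises.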

One alternative to this idea is that patients could contribute their own data into datasets, which could be used for the common good. This is obviously attractive conceptually, but the challenge is more pragmatic: while some patients are both motivated and technologically adept, most patients struggle simply to get a handle on all of their own medical records, and most are unlikely to have the time, inclination, and ability to share – beneficial as this would be.

An idea I’ve been thinking about (see here, here) is the notion of the data-inhaling clinic, medical centers built around the premise of rich data collection and sharing, and offering genuine interoperability. Patients choosing to seek care here would explicitly want their data shared, and in turn would benefit from the data sharing of others. Consent to share data (which could always be withdrawn) would be a foundational condition of care at these centers, and a reason enlightened patients would seek treatment there (in addition to the empathetic care, which as always remains elemental). Institutions subscribing to this philosophy would not need to have the same owner, nor even the same EMR – just the same commitment to rich and complete data sharing among participating institutions. (Sharing data only among participants seems necessary, at least initially, to avoid a free-rider problem; the point is that any organizations willing to share appropriately-consented data in substantial fashion could belong to the network.)

While some patients might not like this approach, those in favor would vote with their feet, and I can imagine that the rich, consented dataset the subscribing, data-inhaling clinics would build would rapidly exceed those available elsewhere in the world. Perhaps at this point, holdout institutions – which I imagine would include top academic medical centers – would finally relent and join as well.

The aspiration would be that in a world of rich data sharing, making diagnoses based on the combination of unusual symptoms and unusual genetics wouldn’t be exceptional, or even tweet-worthy; rather it would be — and should be — the expectation.

The Patient-Centered Outcomes Research Institute (PCORI) is operating in accordance with requirements of the Affordable Care Act, but the lack of a standard data model poses a hindrance to plans to develop a “network of networks” to further comparative effectiveness research (CER), according to a report from the Government Accountability Office.

Yet while the planned National Patient-Centered Clinical Research Network (PCORnet) is believed to have the potential to significantly improve the ability to conduct CER, it faces various challenges.

Plans call for PCORnet to combine the resources of 11 Clinical Data Research Networks (CDRNs), such as the National Pediatric Learning Health System from the Children’s Hospital of Philadelphia, and 18 Patient-Powered Research Networks (PPRNs) formed by patient groups, such as the Epilepsy Foundation’s Collaborative Patient-Centered Rare Epilepsy Network, FierceHealthIT previously reported.

PCORI officials and other stakeholders, however, “expect that the process of mapping data to the common data model will be slow and resource intensive because of the lack of standardization among existing data maintained by CDRNs and PPRNs, such as data from electronic health records,” according to the report.

A common data model and the requirement that networks hire the expertise to address standardization are expected to work out this issue, the report says. However, uncertainty about costs and sustainability of the network remains an issue, and each CDRN and PPRN is expected to be required to provide a sustainability plan.

The completeness of data is another issue. PCORI officials said that they are collaborating with other relevant organizations, including state Medicaid offices and private health insurers, to identify how CDRNs could link their EHR data to claims data, which would improve data completeness.

While some early results are expected to be announced in 2017, full evaluation is not expected until around 2020, the report says. Research findings will be publicly available 90 days after researchers submit their final reports, as required by law.

The Office of the National Coordinator for Health IT has released for public comment its shared nationwide roadmap for interoperability.

The roadmap’s goal is to provide steps to be taken in both the private and public sectors to create an interoperable health IT ecosystem over the next 10 years, according to ONC.

One of the main focuses of the roadmap is to enable “a majority of individuals and providers across the care continuum to send, receive, find and use a common set of electronic clinical information at the nationwide level by the end of 2017.”

“HHS is working to achieve a better healthcare system with healthier patients, but to do that, we need to ensure that information is available both to consumers and their doctors,” HHS Secretary Sylvia M. Burwell said in the announcement. “Great progress has been made to digitize the care experience, and now it’s time to free up this data so patients and providers can securely access their health information when and where they need it.”

Along with the roadmap, ONC also released a draft 2015 Interoperability Standards Advisory, which “represents ONC’s assessment of the best available standards and implementation specifications for clinical health information interoperability as of December 2014.”

The roadmap is garnering praise from industry leaders, including the College of Healthcare Information Management Executives. CHIME said in an announcement that it “welcomes” the Interoperability Standards Advisory as part of the roadmap.

“This is a much-needed playbook for each and every health IT professional,” CHIME President and CEO Russell P. Branzell said in the announcement. “Now, healthcare providers and health IT developers have a single source of truth, with an extensible process to align clinical standards towards improved interoperability, efficiency and patient safety. While we have made great strides as a nation to improve EHR adoption, we must pivot towards true interoperability based on clear, defined and enforceable standards.”