At a recent meeting of the National Health IT Policy Committee, the CEO of a large electronic health records (EHR) corporation said technology for “data segmentation”—which ensures patients control who sees and uses sensitive data—is something “vendors don’t know how to do.” But that simply isn’t true. Vendors do know how to build that kind of technology; in fact, it already exists.

At the same meeting, the National Coordinator for Health IT recognized the Department of Veterans Affairs and the Substance Abuse and Mental Health Services Administration for their “demonstration of technology developed for data segmentation and tagging for patient consent management,” but he seemed to forget that millions of people receiving mental health and addiction treatment have been using EHRs with consent and data segmentation technologies for over 12 years. Again, the technology already exists.

Facts:

- Technology is NOT the problem: consent and data segmentation technologies are not too hard or too expensive to build or use.

- Data segmentation and consent technologies exist; the oldest example is the EHRs that have held millions of mental health and addiction treatment records for the past 12 years.

- All EHRs must be able to “segment” erroneous data to keep it from being disclosed and harming patients; that same technology can be used to “segment” sensitive health data.

- Starting in 2001, HIPAA required EHRs to keep “psychotherapy notes” segmented from other health data. “Psychotherapy notes” can ONLY be disclosed with patient permission.

- The 2013 amendments to HIPAA require EHRs to support other situations where data must be segmented and consent is required. For example:

  - If you pay out-of-pocket for treatment or for a prescription in order to keep your sensitive information private, technology systems must prevent your data from being disclosed to other parties.

  - After the first time hospital fundraisers who saw your health data contact you, you can opt out and block them from future access to your EHR.
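In software terms, the segmentation these facts describe is nothing exotic: records carry sensitivity tags, and disclosure is filtered against the patient's consent directives. Here is a minimal sketch in Python (all class names, tag values, and recipient names are hypothetical illustrations, not taken from any actual EHR product):

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    content: str
    tag: str  # sensitivity tag, e.g. "general", "mental_health"

@dataclass
class Patient:
    records: list
    # Consent directives: recipient -> set of sensitive tags they may see.
    consents: dict = field(default_factory=dict)

    def disclose(self, recipient):
        """Return only the records this recipient is allowed to see."""
        allowed = self.consents.get(recipient, set())
        # Non-sensitive data flows; tagged data requires explicit consent.
        return [r for r in self.records
                if r.tag == "general" or r.tag in allowed]

patient = Patient(records=[
    Record("annual physical", "general"),
    Record("therapy note", "mental_health"),
])
patient.consents["primary_care"] = {"mental_health"}

print(len(patient.disclose("primary_care")))  # 2: consent granted
print(len(patient.disclose("insurer")))       # 1: therapy note withheld
```

Real systems layer on standards, audit trails, and far richer consent rules, but the core mechanism is this simple tag-and-filter pattern, which is exactly why "vendors don't know how to do it" rings hollow.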

The real problem is current technology systems and data exchanges are not built to work the way the public expects them to—they violate Americans’ ethical and legal rights to health information privacy.

An article at ModernHealthcare.com about our new Privacy Trust Framework explains how the framework came into being and what its major principles are.

Key quote from the article:

“‘This comes from what the American public wants and was devised by Microsoft and PricewaterhouseCoopers,’ Peel said. ‘Some of the bigger corporations see the future as the public controlling things. Microsoft wanted to distinguish itself from Google Health (its one-time rival as a developer of PHR platforms) and wanted HealthVault to be the privacy place and wanted to compete in that way.’ PricewaterhouseCoopers saw a future auditing opportunity, she said. ‘We’re now moving with the Blue Button where patients can access their information and control it. The ultimate consumer is the patient.'”

HealthDataManagement.com recently posted this article about Patient Privacy Rights’ Privacy Trust Framework. The article tells HealthDataManagement readers “The Framework is designed to help measure and test whether health information systems and research projects comply with best privacy practices in such areas as whether patients have control over their protected health information, an organization obtains meaningful consent before disclosing data and obtains new consent before secondary data use occurs, patients have the ability to selectively share data, and the organization uses servers housed in the United States, among other factors.”

The key principles for our Privacy Trust Framework:

* Patients can easily find, review and understand the privacy policy.

* The privacy policy fully discloses how personal health information will and will not be used by the organization. Patients’ information is never shared or sold without patients’ explicit permission.

* Patients decide if they want to participate.

* Patients are clearly warned before any outside organization that does not fully comply with the privacy policy can access their information.

* Patients decide and actively indicate if they want to be profiled, tracked or targeted.

* Patients decide how and if their sensitive information is shared.

* Patients are able to change any information that they input themselves.

* Patients decide who can access their information.

* Patients with disabilities are able to manage their information while maintaining privacy.

* Patients can easily find out who has accessed or used their information.

* Patients are notified promptly if their information is lost, stolen or improperly accessed.

* Patients can easily report concerns and get answers.

* Patients can expect the organization to punish any employee or contractor who misuses patient information.

* Patients can expect their data to be secure.

* Patients can expect to receive a copy of all disclosures of their information.

The MOST “incomplete” US privacy law is HIPAA, which eliminated Americans’ rights to control the collection, use, disclosure and sale of their health data in 2001.

The new Omnibus Privacy Rule did not fix this disaster. It made things worse by explicitly permitting health data sales for virtually any purpose without patients’ consent or knowledge. These new regulations violate Congress’ intent to ban the sale of health data in the 2009 stimulus bill.

In addition to not being able to control personal health information, Americans have no ‘chain of custody’ for their health data, so there is no way to know who is using or selling it.

(Reuters) – When nationwide pharmacy chain CVS Caremark Corp announced last week that its employees must submit to a medical exam or pay a $600 annual fine, some critics raised privacy concerns…

Under the CVS exam, which is free, tests will measure an employee’s weight, body fat, blood pressure, glucose levels and other health indicators. Workers who smoke must enroll in an addiction program by next year.

“They draw blood, that’s data collection. You have to go through a screening, that’s data collection. You have to call WebMD’s center, that’s data collection. People’s sensitive health data is being used for commercial purposes,” said Dr. Deborah Peel, founder of the advocacy organization Patient Privacy Rights.

This is an amazing article by Rebecca Skloot, author of ‘The Immortal Life of Henrietta Lacks’, calling for consent and trust.

Rebecca is right—the only way Americans will trust researchers is when they are treated with respect and their rights of consent for the use of genomes and genetic information are restored.

The public does not yet realize that they have no control over ANY sensitive health information in electronic systems. We have NO idea how many hundreds of data-mining and research corporations are collecting and using our blood and body parts. We ALSO have no control over our sensitive health information in electronic systems, a violation of hundreds of years of privacy rights.

This week, the many stories about CVS showed that employers can force employees to take blood tests, undergo health screenings, and enroll in “wellness” programs, all of which REQUIRE the collection of sensitive health information that employees cannot control.

We have NO map of who collects and uses personal health data. Henrietta Lacks’ family was NEVER asked for consent to use her genome.

Steve Lohr likens today’s Big Data issues to the introduction of the mainframe computer in the 1960s. Even then, new technology threatened the “common notions of privacy”.

A few key quotes from the article:

“…the latest leaps in data collection are raising new concern about infringements on privacy — an issue so crucial that it could trump all others and upset the Big Data bandwagon. Dr. Pentland is a champion of the Big Data vision and believes the future will be a data-driven society. Yet the surveillance possibilities of the technology, he acknowledges, could leave George Orwell in the dust.”

“The World Economic Forum published a report late last month that offered one path — one that leans heavily on technology to protect privacy. The report grew out of a series of workshops on privacy held over the last year, sponsored by the forum and attended by government officials and privacy advocates, as well as business executives. The corporate members, more than others, shaped the final document.”

“Patient Privacy Rights Founder Deborah Peel, MD calls a new CVS employee policy that charges employees who decline obesity checks $50 per month “incredibly coercive and invasive.” CVS covers the cost of an assessment of height, weight, body fat, blood pressure, and serum glucose and lipid levels, but also reserves the right to send the results to a health management firm even though CVS management won’t have access to the results directly. Peel says a lack of chain of custody requirements means that CVS could review the information and use it to make personnel decisions.”

“This is an incredibly coercive and invasive thing to ask employees to do,” Patient Privacy Rights founder Deborah Peel told the Boston Herald, noting that such policies are becoming more prevalent as health costs increase.

“Rising health care costs are killing the economy, and businesses are terrified,” she continued to the Herald. “Now, we’re all in this terrible situation where employers are desperate to get rid of workers who have costly health conditions, like obesity and diabetes.”