What Story Is Your Health Data Telling?

On any given day, you might go to the supermarket for the week’s grocery shopping and swipe your supermarket membership card to receive all the store-promised discounts. Then, you might go home and sit down to your favorite show on the cooking channel. Maybe on the way home, you remembered to call in that prescription the doctor ordered the day before.

Could a simple morning like this one lead your next health insurance company to deny you coverage? Or a potential employer to turn down your application?

We leave data trails wherever we go. The supermarket savers card you swipe without a second thought collects data about your eating habits. Did you buy an economy size chip assortment pack with a 3-liter bottle of Pepsi that was on sale?

And your television rats on you too. Does your cable subscription of over 700 channels plus add-on movies and ten additional sports channels reveal something about your lifestyle? Some might assume you don’t get out enough.

What about that prescription for glipizide? When it’s time for new insurance because you changed jobs, will your new insurer charge you higher premiums because your daily habits data forms the profile of an overweight, inactive diabetic?

Our data profiles are assembled from numerous sources, not the least of which are your internet browsing history, search terms, location, and the Facebook pages you've liked. That's how marketers know what to send you: by your data history. And all of this data somehow forms a health picture of you as active, sedentary, or chronically ill.

Health Profiling and You

Health profiling is a frightening thought.

But who's collecting this data and who's reading it? That's the scary part. Your personal buying habits, television viewing habits, Twitter posts, magazine subscriptions, and even health care history are sold to data miners, who in turn sell your data to advertisers, pharmaceutical companies, or anyone else who could benefit from the information. And you can't opt out.

Think about how much data Apple's HealthKit (or any other health-tracking platform) picks up in a 24-hour cycle: your exercise, heart rate, steps, sleep cycle, food intake, stairs climbed, and more. Much of who you are and what you do is contained in a wrist-sized device.

Now wait a minute. Health records are protected from unwanted eyes under the law, right? Under the Health Insurance Portability and Accountability Act (HIPAA), your glucose monitoring app data sent from your iPhone to your doctor is private. Ultimately, your health care provider bears the burden of protecting that information.

But Apple is not a healthcare provider under the Act and bears no such burden. In fact, the majority of mobile health apps are under no obligation to protect your information.

Granted, Apple, or any other health app maker that coordinates data with a health care provider, can't freely distribute your information to third parties, particularly if it promised privacy when marketing the app. But no law dictates what a tech company or developer may otherwise do with the data it collects.

So, if you send your glucose readings from your app to your doctor, who then puts the data in your medical records, the records are protected under HIPAA, but the app itself is not covered by the Act. That means the same data is both protected and unprotected, depending on who holds it.

What's more, anonymized health care records (those stripped of name, Social Security number, and other identifiers) are not privacy protected. Nor are the doctors' names on those records. Wouldn't a pharmaceutical company be wise to target, for marketing purposes, a doctor who frequently prescribes a certain drug?
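To see what "anonymized" actually means here, consider a minimal sketch (illustrative only, not a compliance tool) of stripping direct patient identifiers from a record. The field names are hypothetical; note that the prescribing doctor's name survives the process, which is exactly what makes physician-level marketing possible.

```python
# Hypothetical record fields; this is a sketch of identifier
# stripping, not real HIPAA de-identification tooling.
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone", "email"}

def deidentify(record):
    """Return a copy of the record with direct patient identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "diagnosis": "E11.9",       # a diabetes diagnosis code
    "prescriber": "Dr. Smith",  # the doctor's name is NOT stripped
}

print(deidentify(record))
```

The "anonymous" output still links a diagnosis to a prescriber, so a data buyer can profile doctors even without knowing any patient's name.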

The good intentions behind HIPAA were to prevent discrimination based on an individual’s health records. An employer considering an applicant can’t request medical records from the potential employee’s doctors. But what if the employer can get that information through the back door, say, through data miners?

And companies like IBM, LexisNexis, and QuintilesIMS gather this data for purposes of scientific study. Tanner claims everyone makes money on that data. Everyone, that is, except the patient:

If you are a major drug store and you’re selling anonymized patient information, you may get tens of millions of dollars a year from those sales to the data miners. That’s lucrative for you. The data miners are very happy to have that information. The drug companies are very happy to buy that information. Everyone in this trade is happy, except for the patients, who don’t really know about it.

The greater harm, Tanner says, is the possibility that patients who learn about these practices may be less forthcoming about their health issues for fear the information will come back to haunt them.

Patients for Public Disclosure

The flip side of the coin is that all that health data could be put to good use. If health records were public, think about the data analytics that could yield valuable information about which doctors to choose and which insurance companies pay poorly or play fairly.

Insurance companies notoriously code their billing statements (ICD-10 alone contains some 68,000 diagnosis codes) so that the average insured has no idea what a statement means.
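A toy illustration of why those statements are opaque: without the code dictionary, the insured sees only a string like "E11.9". The lookup table below is a tiny hypothetical sample, though the three codes shown are real ICD-10-CM codes.

```python
# Tiny illustrative sample of a diagnosis-code dictionary; the real
# ICD-10-CM index contains on the order of 68,000 entries.
ICD10_SAMPLE = {
    "E11.9": "Type 2 diabetes mellitus without complications",
    "I10": "Essential (primary) hypertension",
    "Z23": "Encounter for immunization",
}

def decode(code):
    """Translate a diagnosis code, if it appears in our sample table."""
    return ICD10_SAMPLE.get(code, "Unknown (consult the full ICD-10 index)")

print(decode("E11.9"))  # what that line on the billing statement means
```

The point is not the lookup itself but the asymmetry: insurers and data miners hold the full dictionary, while the patient holds only the code.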

Exposing these practices in public records could force more transparency in billing, and it would lay bare the health care trails forged through disease, treatment, and medicine: a veritable gold mine of research data. In fact, a growing number of states are passing legislation demanding more transparency in health care billing.

Moreover, the public could benefit from the available data in public health records to learn about various treatments out there for their chronic illness or cancer, or even the cost of a hip replacement or other procedure.

And think of the additional mHealth apps that could be developed if developers weren't hampered by privacy worries and FTC enforcement actions over unfulfilled privacy promises. Most health care professionals, let alone developers, don't fully understand HIPAA and hesitate to run afoul of its restrictions. So they hold back.

Your Data, Your Story

Your data is the story of competing interests: the promise of future scientific discoveries, such as new disease cures and innovative treatments, weighed against the potential for profiteering, abuse, discrimination, and impaired freedom of patient choice. As Tanner and others argue about public health records and data mining, there needs to be more transparency and public discussion.