I see what you're saying; it's complicated. I guess what I meant was that actual diagnoses, test results, etc. are secure. But the data metrics from wearables, voice data mining, etc.- this could be nightmarish. I may have mentioned this on the forum before, but a colleague of mine said she was talking to her friend about her migraine headache; well, about an hour later she was hit with multiple ads for migraine medicine. This colleague had no idea what data mining and privacy were until that moment.

With that said- hospital and pharmacy prescription records are secure, but HIPAA can't protect what we don't keep private ourselves. The medical field is so laden with regulation that it's hard to get previous records on patients even when they are desperately needed.

Scientifically speaking, no person can paint a medical picture from data mining. It can't be proved in court; no credible professional expert witness would testify to it- they would lose their license. There are also laws that protect people from insurance exclusion based on previous diagnoses.

There is concern about the unknown, but actual real medical information will not be leaked without consequences.

The medical field is very slow to adopt technology, so when AI is used in medicine it will be for the treatment of disease. The ethical and federal standards create firewalls that are very hard to break down, even with new discoveries.

> Scientifically speaking, no person can paint a medical picture from data mining. It can't be proved in court; no credible professional expert witness would testify to it- they would lose their license.

While this is true, nothing has to be proven in court. The AI's conclusions about your medical state are consumed and acted on behind the scenes, without you ever being aware of it. An employer who passes you over because of a higher-than-usual health risk score will simply reject you with some other story.

> An employer who passes you over because of a higher-than-usual health risk score will simply reject you with some other story.

I see what you're saying now: discrimination at its worst! The SF Bay Area has some pretty fierce employment lawyers and boards of supervisors… but now I think I've been living in a bubble that's going to pop one day…

So with location tracking and wearables, it can be known whether people are actually working out at the gym or just visiting with people!
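To make that concrete, here is a minimal sketch of how such an inference could work. Everything in it is an assumption for illustration: the gym coordinates, the geofence radius, the heart-rate threshold, and the one-ping-per-minute data format are all invented, but the pattern (geofence dwell time combined with a wearable signal) is the kind of thing such systems can do.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical gym location (lat, lon) and a simple geofence radius in meters.
GYM = (37.7749, -122.4194)
GEOFENCE_M = 75

def haversine_m(a, b):
    """Great-circle distance between two (lat, lon) points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))

def looks_like_a_workout(pings, heart_rates, min_minutes=30, hr_threshold=110):
    """Guess whether someone exercised: enough one-minute pings inside the
    geofence AND a sustained elevated heart rate. Thresholds are illustrative."""
    minutes_at_gym = sum(1 for p in pings if haversine_m(p, GYM) <= GEOFENCE_M)
    elevated = sum(1 for hr in heart_rates if hr >= hr_threshold)
    return minutes_at_gym >= min_minutes and elevated >= min_minutes

# Made-up sample: 45 minutes at the gym, heart rate elevated for 40 of them.
pings = [(37.77491, -122.41938)] * 45
hrs = [128] * 40 + [95] * 5
print(looks_like_a_workout(pings, hrs))  # True; a socializer would fail the HR check
```

The point is how little it takes: two commodity data streams, joined by timestamp, distinguish "worked out" from "stood around chatting" without anyone's consent.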

Thanks for welcoming my advice, which probably just sounded grumpy. I am up to my eyeballs in pro bono and “low bono” work—extremely low pay or volunteer work to make the world a better freakin’ place. Saving the local farmers market. Writing and placing articles to influence local and regional policymakers. Lots of arts, community, and kids-oriented work.

What I’d suggest, if folks here are serious about all this, is that you hire a firm to make a truly kick-ass communications and/or brand platform, before wasting any more time and energy with the scattershot method. I don’t have the bandwidth to do it pro bono. This is worth spending real money on. Plazm, for example, would probably give you something solid to start with at a highly reduced nonprofit rate—for a smallish investment, you should be able to get a whole lot of strategic & comms work done.

My guess is that multiple sub-platforms might be necessary, based on personal development. You need a whole different package and approach to reach Joe Americana, compared to the approach that would reach a Silicon Valley indentured servant to technology, or a Liberal Bubble dweller.

These forums seem likely to help the latter two formulate messaging and spread it among their own communities. That’s definitely a good start! But for the larger-scale stuff, just hire some professionals and do the thing right. Saves so much money and time in the long run…

I was finally able to explain this to my friend (and have her believe me) when it comes to data mining. I started with a "Did you know that…" example about credit scores. I also explained how that could affect her getting a job, etc. She said it sounded like a conspiracy theory, but that she understands. I also showed her Privacy Badger and all the different trackers it blocked. We have a breakthrough, everyone!

She did, however, ask why it mattered that her data could be used via a credit or health score if she was not doing anything wrong on the Internet.

@Siddhi It's a start! I definitely have those moments with my colleagues.

Just a thought: for your project you could have a sample dialogue or interview explicitly about someone inquiring about data mining; this is important to understand. Or a comic or graphic-novel page. Ever heard of "Valley Girl"? You could use the Valley Girl language to the extreme on "why do I care about my personal data?"

> She did, however, ask why it mattered that her data could be used via a credit or health score if she was not doing anything wrong on the Internet.

That is it. Besides automated systems making wrong or biased decisions, it is unknowable whether a given decision is right or wrong. Someone else is now the decider, based on their own criteria. And they will judge you by it, now and forever: your physical, mental, and emotional state.

Bossy in communications? —> You are not a teamplayer —> Sorry, job not for you
Online at night, typing quickly, making mistakes —> You can’t handle stress, update life expectancy —> Sorry, you are not admitted to this insurance plan
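The arrows above describe a scoring pipeline whose criteria the subject never sees. A deliberately naive sketch of what one could look like is below; the feature names, weights, and cutoff are all invented for illustration, not taken from any real system.

```python
# A naive "risk scorer": invented behavioral features, invented weights.
# The point is the opacity: the subject never sees any of these criteria.
WEIGHTS = {
    "late_night_activity": 0.4,   # share of activity after midnight (0..1)
    "typo_rate": 0.3,             # normalized typos per 100 words (0..1)
    "assertive_messages": 0.3,    # share of messages flagged "bossy" (0..1)
}
THRESHOLD = 0.5  # arbitrary cutoff

def risk_score(signals):
    """Weighted sum of normalized (0..1) behavioral signals."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

def decide(signals):
    score = risk_score(signals)
    # The rejection reason given to the subject need not mention the score.
    return ("rejected", score) if score >= THRESHOLD else ("accepted", score)

print(decide({"late_night_activity": 0.9, "typo_rate": 0.7, "assertive_messages": 0.2}))
```

Note that the decision comes back as "rejected" with no obligation to disclose why: exactly the "made-up different story" scenario described above.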

And when you encounter real mental illness, or it is interpreted that you have one, then you get permanently labeled: bipolar, depressed, sociopath, etc.

This is the scary gray area confronting those of us who understand the data mining monster's potential. Only this monster under the bed is actually real!

@aschrijver made this point in another post about healthcare. Yes indeed, we are protected by HIPAA in the US, but what about the info from wearables and map locators? Insurance companies and potential employers could paint a picture of you, and you could face discrimination before they even meet you- a lost opportunity.

Companies are using wearables to help employees get fit and save on health care, all while getting access to troves of individual data.

Many consumers are under the mistaken belief that all health data they share is required by law to be kept private under a federal law called HIPAA, the Health Insurance Portability and Accountability Act. The law prohibits doctors, hospitals and insurance companies from disclosing personal health information.

But if an employee voluntarily gives health data to an employer or a company such as Fitbit or Apple — entities that are not covered by HIPAA's rules — those restrictions on disclosure don't apply, said Joe Jerome, a policy lawyer at the Center for Democracy & Technology, a nonprofit in Washington. The center is urging federal policymakers to tighten up the rules.

> Many consumers are under the mistaken belief that all health data they share is required by law to be kept private under a federal law called HIPAA, the Health Insurance Portability and Accountability Act. The law prohibits doctors, hospitals and insurance companies from disclosing personal health information.

Totally agree on this- once medical information is in the medical record it becomes secure (that is, if there is no data breach). But any data collected by any other company is fair game, even for insurance companies. There is no privacy guaranteed for information gathered outside of organized healthcare like clinics, hospitals, or pharmacies. Inside these institutions, HIPAA covers it.