Video: For years, Ashkan Soltani has warned of Facebook's privacy-eroding tendencies.

If there's one person outside of government who has stood against Facebook's crashing wave, it's Ashkan Soltani.

Late last year, the independent privacy researcher was suddenly called to speak before the UK Parliament about Facebook's privacy practices, simply because he happened to be in London and, in his own words, "was just a dick on Twitter."

Soltani wasn't just some random Internet troll: he understood the company's technical practices in a way that few did, and, better yet, he could explain them in a way that most civilians would understand.

Months earlier, Soltani had given similar testimony before a US Senate subcommittee, where he unequivocally said: "No other single company has done more to erode consumer privacy than Facebook."

Earlier in 2018, Soltani also helped author the new California Consumer Privacy Act, which was signed into law last June. That work came just a few years after his stint as chief technologist at the Federal Trade Commission.

Years ago, the Californian began his career researching undeletable browser cookies. Over time, he's come to a stark realization.

"We have very little privacy protection in the US," he explained at our most recent gathering of Ars Technica Live, our monthly event (second Wednesday of the month) at a local bar in Oakland, Eli's Mile High Club.

"We have very few privacy laws," Soltani said. "What we have is FTC, Section 5, which governs unfair and deceptive trade practices."

In short, one of the FTC's primary jobs is to simply enforce whether companies are abiding by the lengthy, verbose, and legalistic terms that customers agree to.

On January 18, The Washington Post reported that the FTC is homing in on imposing a potential fine on Facebook over the recent Cambridge Analytica debacle that erupted in March 2018. That now-defunct British data-analytics company was revealed to have retained data on 50 million Facebook users despite claiming to have deleted it. That's on top of myriad other breaches, bug disclosures, and more.

"When the incentive is to grow at any cost, these violations don't seem like violations—they almost seem intentional," Soltani said.

Or, as he told the Senate committee: "'Growth at any cost' is the new 'unsafe at any speed' and must be treated as such."

A losing battle

Soltani quickly zeroed in on two primary reasons why it has been difficult for American regulators to wrap their arms around companies like Facebook. First, the California tech giant does provide a useful service: politicians want to reach voters, and they want to communicate with family members, too.

But the second issue is tougher, Soltani explained.

"Historically, if you wanted to govern airbags, there was a model year, you would recall a certain version, and you would hold a company liable," he said. "Software, particularly Web apps and cloud-based software, is constantly changing. It's not the same for you or [me]. You might be in a test group that I'm not in. To know this version versus that version and [how] the law should affect it this way is incredibly difficult."

It's impossible, he explained, for most people to keep track of the ins and outs of APIs and other data-sharing practices.

As an example, Soltani pointed to his June 2018 testing—done at the request of The New York Times—which revealed how much data a Facebook app on a BlackBerry phone could access. Facebook has maintained "data-sharing partnerships" with numerous device makers, which allowed access to friends' data even after the company said it would no longer permit such access.

"When you're talking about these kinds of violations and these kinds of lies that the companies are telling, it feels like, as a consumer, it's hard for us to make good decisions," he said, pointing out how, when we buy food at the grocery store, all the choices must abide by labeling requirements.

"None of [the foods in the grocery store] can include arsenic, [but] we're not required to test our products," he added. "That's kind of the online regime that we have for digital safety and digital security."

The researcher has repeatedly called for federal regulation to help turn the tide.

Interested in coming to our next Ars Live event? It will be held at 7pm on Wednesday, February 13, 2019, at Eli's Mile High Club, at 3629 Martin Luther King Jr. Way in Oakland.

Cyrus Farivar
Cyrus is a Senior Tech Policy Reporter at Ars Technica, and is also a radio producer and author. His latest book, Habeas Data, about the legal cases over the last 50 years that have had an outsized impact on surveillance and privacy law in America, is out now from Melville House. He is based in Oakland, California. Email: cyrus.farivar@arstechnica.com // Twitter: @cfarivar

137 Reader Comments

Bringing this shit to light and demanding government oversight because of the proven recalcitrance of these companies to affect meaningful privacy rights sure would make a good Democrat platform, that these companies don't give a shit about your rights and the Republicans aren't going to help you, and further drive people to the polls.

Megacorps throughout the ages have been pursuing "growth at any cost" by all sorts of underhanded, unethical, and sometimes downright evil methods. Even centuries ago the East India Company flooded China with highly addictive opium to boost trade revenue in an otherwise self-sufficient country.

I've always likened megacorps to zombies. They lose any sense of humanity and long-sightedness and mindlessly chase brains (profit) even at the cost of their own long-term viability and customer base.

But these companies are all run by Democrats.

If you think upper management of any of these companies identify with us proles in any meaningful way I don't know what to say other than... read more?

This is yet another example of the Kassandra Effect - someone with clear insight sounds the warning bell and we ignore it because it fits our personal narrative to do so, only realizing after the fact that we should have listened. Ever since reading the Iliad and the Odyssey, Kassandra keeps cropping up everywhere I look....

Ultimately, unless Congress can craft - and put into effect and enforce - privacy regulations with actual teeth (e.g., significant financial penalties a la GDPR), there is little incentive for firms to change their data mining and selling. It's big business, and piddly penalties are just "the cost of doing business" (i.e., if I earn $1.1M for selling your data and get fined $10K for it, who cares?).

And I hate to sound cynical, but I question whether the "into effect and enforce" part will ever happen.

How many of you realize that the Dodd-Frank law to preclude another 2008 financial crisis originally included the requirement that loan companies/banks retain a 5% interest in the loan? And how many of you realize that this part of the law was immediately challenged in court - and struck down?

To be honest - writing a decent privacy law in the US will be relatively easy *compared to* enacting and enforcing it; that's where the rubber hits the road.

Privacy is not concerned with software versions but with behavior, a very telling mistake in the analysis. It is the behavior of collecting and disseminating user information that needs to be tightly regulated or stopped. This is fundamentally not a software problem but an ethical problem. Unethical people will do anything that is legal regardless of its morality. To stop them, the behavior has to be made illegal, with some serious punishment meted out for violations.

Another point to make is that the one-sided EULAs need to be more thoroughly scrutinized to see if they pass legal muster. It is not that there shouldn't be a contract (which is what a EULA actually is), but whether the contract is fair for the user.

The analogy to an air bag is false because it is quite possible the problem was a design flaw that only showed up when enough were in service. So a redesign and recall makes sense for this type of problem.

I don't know where he gets the idea that none of the food can contain arsenic. Every single piece of food does contain arsenic; it's just that the levels are low. If you eat brown rice every day, you are at the max daily intake of arsenic.

I should just kill myself and get it over with.

Or just stop shitposting for the sake of showing how cool you think you are.

The right to privacy, which is not explicitly stated in the US Constitution, is part of the Roe v Wade decision. So there is US case law from SCOTUS saying there is a right to privacy. All it needs is some fleshing out.

The right to privacy has a much longer lineage than Roe, although it was cited in that case. It certainly extends back to the 19th century, and is grounded in Amendments 1, 3, 4, and 5.

The problem is that not enough of the general population knows or cares. If enough people complained, maybe things would happen. But people keep using garbage like Facebook instead of saying enough is enough.

I don't see this changing anytime soon.

I like the story about the Hudson Bay Company--when the 'mountain men' started trapping fur animals to compete with the Hudson Bay trade, the executives at Hudson Bay ordered that all fur-bearing animals in the Oregon territories be slaughtered immediately. Nothing like calling for a 'mass extinction' to thwart a business rival.

I think the current situation has a lot to do with the income inequality that exists.

A person who makes a million remembers what it was like to do without. A person who makes 10 million per year quickly enters into a world of private schools and private parties and private jets and private meetings at the private golf course.

I have some contact with that world -- it is completely unlike the world I actually live in.

I think those people forget -- and/or they inherited and never understood in the first place...

To me, the Kassandra Effect is even more rattling as it pertains to Richard Stallman. The man has been called a lunatic for over 30 years now. Even I, having been a member of the open source community for decades and generally admiring Stallman, used to consider him a bit radical.

But it turns out he was absolutely right about everything. Closed source is dangerous to our privacy, and maybe our democracy.

If you think my current government representatives identify with us proles in any meaningful way, I don't know what to say... wake up?

Trusting people who want to do business with me is easier for me than trusting those who are in government. It has to do with my life experience.

You seem like a very important person and I value your opinion on these matters.

I think it's a legitimate question to ask: do we trust 'these' people in power or 'those' people in power? Personally, I'm reluctant to put too much trust in either group of people, and so I vote with my wallet as well as with my ballot and hope it makes a difference.

I think we can all agree that statements of the form "X does not contain Y" are almost never true in a literal sense but should be interpreted as "X does not contain Y at any level that is significant in the context of the current discussion". We can also all agree that we'd come across as even more pedantic than we already do if we always (or even ever) used the second form...

"Growth for the sake of growth is the ideology of the cancer cell." - Edward Abbey circa ~ 1968

Abbey had the right idea, but couldn't make the appropriate mark on the "us" of the time. If we'd all, regardless of politics, listen to the message and then act, we could make a difference. Regardless, this growth for the sake of growth, and at any cost, is one of the debilitating factors in our current installation of capitalism. Short-sighted "quarter by quarter... must have growth" thinking clouds, in my opinion, the more important picture: proper growth, solid foundations, and the health of the public and its good.

The fourth and fifth amendments explicitly mention privacy, though in different contexts. Roe v. Wade was decided on the 14th amendment - equal protection under the law, not privacy.

Between the fourth and fifth amendments, the "right to privacy" is implicit in the meaning. Companies have abridged such rights in the past (such as placing video cameras in locker rooms to catch thieves) and were found to have unlawfully invaded the privacy of their employees. So there's more than enough precedent to define limits on the behaviors of companies without citing a specific constitutional mandate.

The fact is, the Constitution's provisions only apply to GOVERNMENT. Laws are what constrain companies, since they are under no obligation to observe the requirements of the Constitution. This is why there are many laws that delineate rights - worker rights, human rights, etc. They restrain companies from violating certain generally accepted "universal rights".

So bringing up the constitution with respect to company behaviors is a red herring argument to begin with.

In that respect, the following quote flummoxed me in its glaring lack of insight:

Quote:

"Historically, if you wanted to govern airbags, there was a model year, you would recall a certain version, and you would hold a company liable," he said. "Software, particularly Web apps and cloud-based software, is constantly changing. It's not the same for you or [me]. You might be in a test group that I'm not in. To know this version versus that version and [how] the law should affect it this way is incredibly difficult."

Just pass a law saying that if you do something of a nature that generates data from an individual without their explicit consent to do so, you go to jail.

Let the companies figure out how to implement it. It shouldn't be up to the individual to figure out what the APIs are. The companies should be dealing with limits placed on THEM and abiding by them or getting fined in proportion to the infraction (meaning they lose all profit from it, and get hit with punitive fines to boot - or better still, the people who made the decisions to violate their constraints go to jail).

It's really not that complicated if one looks at it from a more macroscopic viewpoint. We want certain behaviors to stop. Behavior #1 we want stopped is gathering data about people without their explicit consent. If that means those services aren't offered to those who don't consent, that's fine. If that means only limited services (because they still make money from ads) that's fine, too. Make it all opt-in in stages (you can gather the data, but you can't sell, trade or give it to anyone else without explicit consent on a per case basis) for example.

But I don't see how this is at all tricky, because it's curbing a specific behavior on the part of bad actors. The real problem that I see is that such laws rarely have enough teeth to promote deterrence. The fines should include full loss of profit from the behavior and a fine on top of that, and/or holding those responsible for allowing that behavior to happen in the first place personally liable for the company's actions.

I don't really think we need that many new laws, to be honest. I think we need MUCH more painful penalties for violating them. My preference is to hold the corporate members personally liable. That way, the company itself can continue to function, but those who allowed the bad behavior are punished and the executives can't personally profit from that bad behavior.

Boycotts (much less so-called "vote with your wallet" nonsense) almost never work. When they do work, it's often for the WRONG reasons: for one to succeed requires the spread of information AND motivation to commit to inconvenient action, which generally functions best over emotionally charged issues... which just as easily involve misinformation as good information (see anti-vaxxers). And it never stops some other company from quietly engaging in the same practices and getting away with it. None of this is new with regard to Facebook; what's new is the now-open information confirming it happened. LinkedIn engaged in horrible practices for years involving spamming contacts pulled from app permissions, and it is not only doing fine but flourishing, etc.

People do what's personally convenient at the time, based on their knowledge and their own personal rubric of convenience vs quality vs cost.

You cannot reliably effect change on these types of issues via the market, particularly not with any entrenched business. Only legislation has a chance of solving this in any meaningful way. If you want this stopped, your only real hope is better consumer rights and privacy regulation.

If Facebook dies down in the meantime, it's not going to have been over this (plenty of social network services rise and fall primarily and nearly only over teen fads, for example), and whoever takes their place will easily engage in similar practices, because that's the way to make money in this segment without directly charging consumers for service, and no one wants to pay for a similar service when there are free ones that their entire social network is on.

Classic race to the bottom, which is what markets naturally do - not the opposite, as the ultra laissez-faire so-called "free market" loonies either naively believe or lie about believing. You can't win on more information in marketing; there's a reason commercials are designed the way they are rather than as a careful list of raw facts. Some consumers want to be informed, but even among them, most are swayed into a final purchase by combinations of biases (including the availability heuristic) and primary cost vs. quality (and primarily their direct interests in the purchase, not indirect ones) vs. convenience concerns.

The current court majority is more likely to gut it.

Elections matter.

are you under the impression the Supreme Court can just change laws at the stroke of a pen whenever it feels like it? 'cos that's not how it works. there are a lot of hurdles to clear before you can get something in front of the SC. in order for anything significant to happen relevant to Roe v. Wade, first a case would have to be brought in the lower courts by someone with standing (i.e. someone directly affected by the law.) Then it has to work its way up through appeals and the circuit courts. Then you have to hope your case is one of the roughly 1-2% of petitions they grant cert to and agree to hear. and that's a big hurdle, since IIRC the majority of cases they agree to hear are the result of a "split" in the lower courts, either between federal circuits or state supreme courts and the federal courts.

and the last hurdle is that the SC has historically been loath to reverse itself. and even then, they don't necessarily invalidate an entire law, just certain aspects of it. For example, the Heller decision did not "ban gun control." It did two things, 1) declared that the right to keep and bear arms was an individual right, and 2) struck down a specific law in Washington D.C. as infringing on that right. IIRC the majority opinion even specifically said gun control measures were not per se unconstitutional, just the specific D.C. law they ruled on.

This is yet another example of the Kassandra Effect - someone with clear insight sounds the warning bell and we ignore it because it fits our personal narrative to do so, only realizing after the fact that we should have listened. Ever since reading the Iliad and the Odyssey, Kassandra keeps cropping up everywhere I look....

Ultimately, unless the Congress can craft - and have go into effect and enforce - privacy regulations with actual teeth (e.g., significant financial penalties ala GDPR, etc.), there is so little incentive for firms to change their data mining and selling. It's big business, and piddly penalties are just "the cost of doing business" (i.e., if I earn $1.1M for selling your data - and for selling your data I get fined $10K, who cares?).

And I hate to sound cynical, but I question whether the "into effect and enforce" part will ever happen.

How many of you realize that the Dodd-Frank law to preclude another 2008 financial crises originally included the requirement that loan companies/banks retain a 5% interest in the loan? And how many of you realize that this part of the law was immediately challenged in court - and overthrown?

To be honest - writing a decent privacy law in the US will be relatively easy *compared to* enacting and enforcing it; that's where the rubber hits the road.

The right to privacy, which is not explicitly stated in the US Constitution, is part of the Roe v Wade decision. So there is US case law from SCOTUS saying there is a right to privacy. All it needs is some fleshing out.

The fourth and fifth amendments explicitly mention privacy , though through different contexts. Roe V Wade was decided on the 14th amendment - equal protections under the law, not privacy.

Between the fourth and fifth amendments, the "right to privacy" is implicit in the meaning. Companies have abridged such rights in the past (such as placing video cameras in locker rooms to catch thieves) and were found to have unlawfully invaded the privacy of their employees. So there's more than enough precedent to define limits on the behaviors of companies without citing a specific constitutional mandate.

The fact is, the constitution's provisions only apply to GOVERNMENT. Laws are what constrain companies, since they are under no obligation to observe the requirements of the Constitution. This is why there are many laws that delineate rights - worker rights, human rights, etc. They restrain companies from violating certain generally accepted "universal rights".

So bringing up the constitution with respect to company behaviors is a red herring argument to begin with.

In that respect, the following quote flummoxed me with its glaring lack of insight:

Quote:

"Historically, if you wanted to govern airbags, there was a model year, you would recall a certain version, and you would hold a company liable," he said. "Software, particularly Web apps and cloud-based software, is constantly changing. It's not the same for you or [me]. You might be in a test group that I'm not in. To know this version versus that version and [how] the law should affect in this way is incredibly difficult."

Just pass a law saying that if you gather data from an individual without their explicit consent to do so, you go to jail.

Let the companies figure out how to implement it. It shouldn't be up to the individual to figure out what the APIs are. The companies should be dealing with limits placed on THEM and abiding by them or getting fined in proportion to the infraction (meaning they lose all profit from it, and get hit with punitive fines to boot - or better still, the people who made the decisions to violate their constraints go to jail).

It's really not that complicated if one looks at it from a more macroscopic viewpoint. We want certain behaviors to stop. Behavior #1 we want stopped is gathering data about people without their explicit consent. If that means those services aren't offered to those who don't consent, that's fine. If that means only limited services (because they still make money from ads) that's fine, too. Make it all opt-in in stages (you can gather the data, but you can't sell, trade or give it to anyone else without explicit consent on a per case basis) for example.
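The staged opt-in idea above could be modeled along these lines. The names and structure are purely illustrative, not a real consent API:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Illustrative tiered opt-in: nothing is permitted by default,
    and sharing requires a separate per-recipient grant."""
    collect: bool = False                         # may gather data at all
    share_with: set = field(default_factory=set)  # explicit per-case grants

    def may_share(self, recipient: str) -> bool:
        # Selling/trading/giving data requires both the collection
        # opt-in and an explicit grant for this specific recipient.
        return self.collect and recipient in self.share_with

c = ConsentRecord()
assert not c.may_share("ad-broker")  # default: everything denied
c.collect = True                     # stage 1: user opts in to collection
assert not c.may_share("ad-broker")  # still no sharing without a grant
c.share_with.add("ad-broker")        # stage 2: per-case sharing grant
assert c.may_share("ad-broker")
```

The design point is simply that each stage defaults to "no" and must be granted separately, which is the opt-in-in-stages behavior the comment describes.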

But I don't see how this is at all tricky, because it's curbing a specific behavior on the part of bad actors. The real problem I see is that such laws rarely have enough teeth to act as a deterrent. The fines should include full loss of profit from the behavior and a fine on top of that, and/or holding those responsible for allowing that behavior in the first place personally liable for the company's actions.

I don't really think we need that many new laws, to be honest. I think we need MUCH more painful penalties for violating them. My preference is to hold the corporate members personally liable. That way, the company itself can continue to function, but those who allowed the bad behavior are punished and the executives can't personally profit from that bad behavior.

The companies already have that covered. If you want to use the product or software (yes, licensing is here for physical products as well as software), you must agree to let the company sell you and your information. Your only option is to not purchase or use the product or software. If you pay for it rather than get it "for free," you might have options to limit peripheral annoyances (like some ads), but not the fundamental data collection and sale. There's nothing in the Constitution that can be used to justify a law preventing that, because it's being done as a private contract. Yes, it's remotely possible that contract law might be revised to rein things in somewhat, since many of those contracts are obviously abusive. But considering that the "people" the government is of, by, and for is the ownership class of society, the chances of that happening are about the same as one of us jumping on a starship and heading "out" somewhere later today.

This is yet another example of the Kassandra Effect - someone with clear insight sounds the warning bell and we ignore it because it fits our personal narrative to do so, only realizing after the fact that we should have listened. Ever since reading the Iliad and the Odyssey, Kassandra keeps cropping up everywhere I look....

Quote:

"The right to privacy, which is not explicitly stated in the US Constitution, is part of the Roe v Wade decision. So there is US case law from SCOTUS saying there is a right to privacy. All it needs is some fleshing out."

The current court majority is more likely to gut it.

Elections matter.

are you under the impression the Supreme Court can just change laws at the stroke of a pen whenever it feels like it?

Bringing this shit to light and demanding government oversight, given the proven recalcitrance of these companies to effect meaningful privacy rights, sure would make a good Democrat platform: these companies don't give a shit about your rights, and the Republicans aren't going to help you. That would further drive people to the polls.

...Just sayin'...

But these companies are all run by Democrats.

If you think upper management of any of these companies identify with us proles in any meaningful way I don't know what to say other than... read more?

It’s not uncommon for people to identify themselves politically as something that seems at odds with their interests and personal behaviour. Plenty of very wealthy people call themselves socialists while not really doing or supporting anything that could meaningfully be described as socialist. A close family friend of mine was always very well paid, he had significant personal wealth and did everything in his power to pay as little tax as possible, while also being able to tell you with a straight face that he was a communist!

The upper echelons of political parties tend to be far more focused on doing well for themselves and their families than they are aligned with the needs or interests of their voters. This makes them gravitate towards wealthy business leaders, especially in countries like the US where lobbying and fundraising are so important.

I worked in politics and found that party affiliation mattered far less than individual character and principles when it came to deciding whether a politician was likely to act in the interests of the common man.

Quote:

"If you think upper management of any of these companies identify with us proles in any meaningful way I don't know what to say other than... read more?"

I doubt they (liberal tech executives) identify with any of us plebes or the Democratic base, just as oil executives don't identify with the Republican base. He was commenting on the OP that called it a good Democratic platform, and that is a fair point.

Quote:

"are you under the impression the Supreme Court can just change laws at the stroke of a pen whenever it feels like it?"

Change laws? No. Gut laws?

Absolutely.

what is the difference? "gutting" a law is changing it, no?

and I expanded my post above. the SC is an appellate court. they can't just up and decide to "gut" (change) a law on their own. a challenge to the law has to work its way up through the court system by someone with standing to challenge it.

The problem is not enough of the general population knows or cares. If enough people complained maybe things would happen. But people keep using garbage like Facebook instead of saying enough is enough.

I think part of the problem is that the bad things Facebook have been doing (or allowed to happen) haven’t tended to impact individuals on a personal level. They hear about Facebook being a malign influence, but the threat seems abstract and distant, while the benefits they enjoy from using social media are relatively personal and obvious.

The problem with FB is that the arguments against it frequently confuse several different issues:

1. What level of personal data privacy should govern all companies? EU has a very different view to the USA.

2. Does FB have a duty to provide impartial news (IMO no).

3. Does FB have such market dominance that it is now acting in ways which breach anti-trust law (the US framing) or anti-competition law (the EU framing)?

It is only the data privacy which is a new issue, by which I mean an issue which, prior to the internet age, was never considered because it was not practical.

IMO the EU has approached data privacy in a much better way than the US. That does not mean the EU has it completely right - but the general principles are definitely a massive improvement on the US "anything goes" approach.

Quote:

"IMO the EU has approached data privacy in a much better way than the US."

I wonder if the outlook would be the same if these were European companies, or whether the reverse would be true in the US.

You have to be ignorant to ever use Facebook. Or you have to like to engage in risky behavior. That was obvious many years ago. All of a sudden people demand "privacy" while standing naked at an open window. If you need to share your thoughts with friends, you can just call them on the phone. Then only the NSA is listening.

The FTC laws were made for product cycles measured in years. It took many years for the first air bags to be put in cars, and even then there have been problems with the implementation. To expect a constant barrage of new "features" and at the same time expect that any software platform will be secure is ridiculous. It can't be done.

The proliferation of different browsers, apps, and OSes means that comprehensive regression testing is impossible. The game has changed. Stay off social media if you value your privacy.