Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would “like” and interact with. This provided unparalleled information about which companies each individual wanted to engage with most. By interacting with brands on Facebook, people gave corporate marketing departments more information than they could ever have dreamed of buying, and here it was offered up free.

This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.

Facebook's ability to figure out the "people we might know" is sometimes eerie. Many a Facebook user has been creeped out when a one-time Tinder date or an ex-boss from 10 years ago suddenly pops up as a friend recommendation. How does the big blue giant know?

While some of these incredibly accurate friend suggestions are amusing, others are alarming, such as this story from Lisa*, a psychiatrist who is an infrequent Facebook user, mostly signing in to RSVP for events. Last summer, she noticed that the social network had started recommending her patients as friends—and she had no idea why.

"I haven't shared my email or phone contacts with Facebook," she told me over the phone.

The next week, things got weirder.

Most of her patients are senior citizens or people with serious health or developmental issues, but she has one outlier: a 30-something snowboarder. Usually, Facebook would recommend he friend people his own age, who snowboard and jump out of planes. But Lisa told me that he had started seeing older and infirm people, such as a 70-year-old

Europe has pulled ahead of the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its own shopping site, and investigations remain active over similar conduct on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify in federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals. (AP Photo/Rick Bowmer)

Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled to take effect in May 2018, requires companies to obtain European web users’ affirmative consent before collecting their data, with steep penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as barred from sale to third parties. Platforms cannot condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.

When Facebook first came to Cambodia, many hoped it would help to usher in a new period of free speech, amplifying voices that countered the narrative of the government-friendly traditional press. Instead, the opposite has happened. Prime Minister Hun Sen is now using the platform to promote his message while jailing his critics, and his staff is doing its best to exploit Facebook’s own rules to shut down criticism — all through a direct relationship with the company’s staff.

In Cambodia, Prime Minister Hun Sen has held power since 1998, a reign characterized by systematic looting, political patronage and violent suppression of human rights; when opposition parties used Facebook to organize a strong showing in the 2013 elections, Hun Sen turned to the tool to consolidate his slipping hold on power.

In this he was greatly aided by Fresh News, a Facebook-based political tabloid analogous to far-right partisan US outlets like Breitbart, which acts as a virtual stenographer for Hun Sen, transcribing his remarks in "scoops" that vilify opposition figures and dissidents without evidence. Hun Sen and Fresh News successfully forced an opposition leader into exile in France, and mined Facebook for the identities of political opponents, who were then targeted for raids and arrests.

The Cambodian government has cultivated a deep expertise in Facebook's baroque acceptable-conduct rules, and it uses this expertise to paint opposition speech as violating Facebook's policies, turning the company's anti-abuse systems into a tool for purging its rivals from the platform.

Offline, the government has targeted the independent press with raids and arrests, shutting down most of the media it does not control, making Facebook -- where the government is able to silence people with its rules-lawyering -- the only place for independent analysis and criticism of the state.

Then, last October, Facebook used Cambodia in an experiment to de-emphasize news sources in people's feeds -- a change it will now roll out worldwide -- hiding the remaining independent reporters from the nation's view.

Opposition figures have worked with independent researchers to show that the government is buying Facebook likes from clickfarms in the Philippines and India, racking up thousands of likes for Khmer-language posts in territories where Khmer isn't spoken. They reported these abuses to Facebook, hoping to get government posts downranked, but Facebook executives gave them the runaround or refused to talk to them. No action was taken on these violations of Facebook's rules.

Among other things, the situation in Cambodia is a cautionary tale about the risks of "anti-abuse" policies, which are often disproportionately useful to trolls: people who devote long hours and careful study to staying on the right side of the lines that companies draw, and who scour those systems for victims they can taunt into rule violations, getting the platforms to terminate them.

When ordinary Facebook users find a post objectionable, they click a link on the post to report it. Then a Facebook employee judges whether it violates the platform’s rules and should be taken down. In practice, it’s a clunky process that involves no direct communication or chance for appeal, and the decisions made by Facebook can seem mysterious and arbitrary.

But for the Cambodian government, that process has been streamlined by Facebook.

Duong said every couple of months, his team would email an employee they work with at Facebook to request a set of accounts be taken down, either based on language they used or because their accounts did not appear to be registered to their real names, a practice Facebook’s rules forbid. Facebook often complies, he said.

Clare Wareing, a spokesperson for Facebook, said the company removes “credible threats, hate speech, and impersonation profiles when we’re made aware of them.” Facebook says it only takes down material that violates its policies.

Since the earliest days of Facebook, social scientists have warned that the ability to maintain separate "contexts" -- revealing different aspects of yourself to different people -- is key to creating and maintaining meaningful relationships. Mark Zuckerberg ignored this advice, insisting that everyone be identified only by their real name and present a single identity to everyone in their lives, because anything else was "two-faced."

Zuck was following in the footsteps of other social network entrepreneurs who attempted to impose their own theories of social interaction on mass audiences -- danah boyd has written and presented extensively on the user rebellions on Friendster by people who wanted to form interest-based affinity groups and use pseudonymous identities for different activities, demands Friendster rejected out of a mix of commercial concerns (it wanted users to arrange their social affairs in ways that made them easier to monetize) and fringe theories of social interaction.

But while all the other social networks collapsed, Facebook thrived, and imposed the Zuckerberg model of "one identity, one context" on billions of users, who, research consistently finds, are made unhappy and angry by their use of the service, but are nevertheless psychologically compelled to continue using it, creating a vicious feedback loop that even Zuck has acknowledged as a risk to his business.

In 2008, I found myself speaking with the big boss himself, Facebook CEO Mark Zuckerberg. I was in the second year of my Ph.D. research on Facebook at Curtin University. And I had questions.

Why did Facebook make everyone be the same for all of their contacts? Was Facebook going to add features that would make managing this easier?

To my surprise, Zuckerberg told me that he had designed the site to be that way on purpose. And, he added, it was "lying" to behave differently in different social situations.

Up until this point, I had assumed Facebook's socially awkward design was unintentional. It was simply the result of computer nerds designing for the rest of humanity, without realising it was not how people actually want to interact.

The realisation that Facebook's context collapse was intentional not only changed the whole direction of my research but provides the key to understanding why Facebook may not be so great for your mental health.

The secret history of Facebook depression, Dr Kate Raynes-Goldie / Phys.org »

Dismissing Facebook’s change as a mere strategy credit is perhaps to give short shrift to Zuckerberg’s genuine desire to leverage Facebook’s power to make the world a better place. Zuckerberg argued in his 2017 manifesto, Building Global Community:

Progress now requires humanity coming together not just as cities or nations, but also as a global community. This is especially important right now. Facebook stands for bringing us closer together and building a global community. When we began, this idea was not controversial. Every year, the world got more connected and this was seen as a positive trend. Yet now, across the world there are people left behind by globalization, and movements for withdrawing from global connection. There are questions about whether we can make a global community that works for everyone, and whether the path ahead is to connect more or reverse course.

Our job at Facebook is to help people make the greatest positive impact while mitigating areas where technology and social media can contribute to divisiveness and isolation. Facebook is a work in progress, and we are dedicated to learning and improving. We take our responsibility seriously.

That, though, leaves the question I raised in response to that manifesto:

Even if Zuckerberg is right, is there anyone who believes that a private company run by an unaccountable all-powerful person that tracks your every move for the purpose of selling advertising is the best possible form said global governance should take?

My deep-rooted suspicion of Zuckerberg’s manifesto has nothing to do with Facebook or Zuckerberg; I suspect that we agree on more political goals than not. Rather, my discomfort arises from my strong belief that centralized power is both inefficient and dangerous: no one person, or company, can figure out optimal solutions for everyone on their own, and history is riddled with examples of central planners ostensibly acting with the best of intentions — at least in their own minds — resulting in the most horrific of consequences; those consequences sometimes take the form of overt costs, both economic and humanitarian, and sometimes those costs are foregone opportunities and innovations. Usually it’s both.

Facebook’s stated reasoning for this change only heightens these contradictions: if indeed Facebook as-is harms some users, fixing that is a good thing. And yet the same criticism becomes even more urgent: should the personal welfare of 2 billion people be Mark Zuckerberg’s personal responsibility?

Here’s how this golden age of speech actually works: In the 21st century, the capacity to spread ideas and reach an audience is no longer limited by access to expensive, centralized broadcasting infrastructure. It’s limited instead by one’s ability to garner and distribute attention. And right now, the flow of the world’s attention is structured, to a vast and overwhelming degree, by just a few digital platforms: Facebook, Google (which owns YouTube), and, to a lesser extent, Twitter.

These companies—which love to hold themselves up as monuments of free expression—have attained a scale unlike anything the world has ever seen; they’ve come to dominate media

Not to put too fine a point on it, but all of this invalidates much of what we think about free speech—conceptually, legally, and ethically.

The most effective forms of censorship today involve meddling with trust and attention, not muzzling speech itself.

What’s more, all this online speech is no longer public in any traditional sense. Sure, Facebook and Twitter sometimes feel like places where masses of people experience things together simultaneously. But in reality, posts are targeted and delivered privately, screen by screen by screen.

CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting … and seeing … what content gets more comments, more likes, more reshares.”