Member's Off-site Blogs


As the world becomes more complex and governments everywhere struggle, trust in the internet is more important today than ever.

The internet is our shared space. It helps us connect. It spreads opportunity. It enables us to learn. It gives us a voice. It makes us stronger and safer together.

To keep the internet strong, we need to keep it secure. That's why at Facebook we spend a lot of our energy making our services and the whole internet safer and more secure. We encrypt communications, we use secure protocols for traffic, we encourage people to use multiple factors for authentication and we go out of our way to help fix issues we find in other people's services.

The internet works because most people and companies do the same. We work together to create this secure environment and make our shared space even better for the world.

This is why I've been so confused and frustrated by the repeated reports of the behavior of the US government. When our engineers work tirelessly to improve security, we imagine we're protecting you against criminals, not our own government.

The US government should be the champion for the internet, not a threat. They need to be much more transparent about what they're doing, or otherwise people will believe the worst.

I've called President Obama to express my frustration over the damage the government is creating for all of our future. Unfortunately, it seems like it will take a very long time for true full reform.

So it's up to us -- all of us -- to build the internet we want. Together, we can build a space that is greater and a more important part of the world than anything we have today, but is also safe and secure. I'm committed to seeing this happen, and you can count on Facebook to do our part.

Facebook has received criticism on a wide range of issues, including its treatment of its users, online privacy, child safety, hate speech, and the inability to terminate accounts without first manually deleting the content. In 2008, many companies removed their advertising from the site because it was being displayed on the pages of individuals and groups they found controversial. The content of some user pages, groups, blogs, and forums has been criticized for promoting or dwelling upon controversial and often divisive topics (e.g., politics, religion, sex, etc.). There have been several censorship issues, both on and off the site.

In the lifespan of its service, Facebook has made many changes that directly impact its users, and these changes have often resulted in criticism. Of particular note are the new user interface format launched in 2008 and the changes in Facebook's Terms of Use, which removed the clause detailing automatic expiry of deleted content. Facebook has also been sued several times.[1]

On August 19, 2013, it was reported that Khalil Shreateh, a Facebook user from the Palestinian territories, had found a bug that allowed him to post material to other users' Facebook Walls. Users are not supposed to be able to post to another user's Wall unless they are that user's approved friend. To prove the bug was real, Shreateh posted material to the Wall of Sarah Goodin, a friend of Facebook CEO Mark Zuckerberg, and then contacted Facebook's security team with the proof, explaining in detail what was going on. Facebook runs a bounty program that pays a fee of $500 or more to people who report bugs rather than exploiting them or selling them on the black market.

However, instead of fixing the bug and paying Shreateh, Facebook reportedly told him that "this was not a bug" and dismissed him. Shreateh tried a second time to inform Facebook and was dismissed again. On the third try, Shreateh used the bug to post a message to Mark Zuckerberg's own Wall, stating "Sorry for breaking your privacy ... but a couple of days ago, I found a serious Facebook exploit" and that Facebook's security team was not taking him seriously. Within minutes, a security engineer contacted Shreateh, questioned him about how he had done it, and ultimately acknowledged that it was a bug in the system. Facebook temporarily suspended Shreateh's account and fixed the bug after several days.

However, in a move that was met with much public criticism and disapproval, Facebook refused to pay Shreateh the $500+ fee; it responded that by posting to Zuckerberg's account, Shreateh had violated its terms of service and therefore "could not be paid." The Facebook team also strongly censured Shreateh over his manner of resolving the matter, while asking that he continue to help them find bugs.

United States president Barack Obama has met with bosses from Facebook, Google and other internet giants to discuss plans to overhaul the surveillance practices of America's spy agencies.

Those attending included Google's executive chairman Eric Schmidt and Facebook founder Mark Zuckerberg, who says he called the president personally last week to express frustration with the vast online intelligence dragnets.

A White House official says the meeting is part of Mr Obama's continuing dialogue on the issues of privacy, technology and intelligence.

"The president reiterated his administration's commitment to taking steps that can give people greater confidence that their rights are being protected, while preserving important tools that keep us safe," the White House said.

But Mr Zuckerberg, a public critic of government data gathering practices, says more needs to be done.

"While the US government has taken helpful steps to reform its surveillance practices, these are simply not enough," he said through a spokesperson.

"People around the globe deserve to know that their information is secure and Facebook will keep urging the US government to be more transparent about its practices and more protective of civil liberties," he said.

Some of the largest US technology companies, including Google, its rival Yahoo, social networking site Twitter and others, have been pushing for more transparency, oversight and restrictions on the US government's gathering of intelligence.

IBM is spending more than a billion dollars to build data centers overseas to reassure foreign customers that their information is safe from prying eyes in the United States government.

And tech companies abroad, from Europe to South America, say they are gaining customers who are shunning United States providers, made suspicious by the revelations by Edward J. Snowden that tied these providers to the National Security Agency’s vast surveillance program.

Even as Washington grapples with the diplomatic and political fallout of Mr. Snowden’s leaks, the more urgent issue, companies and analysts say, is economic. Technology executives, including Mark Zuckerberg of Facebook, raised the issue when they went to the White House on Friday for a meeting with President Obama.

It is not yet possible to see the full economic ramifications of the spying disclosures, in part because most companies are locked in multiyear contracts, but the pieces are beginning to add up as businesses question the trustworthiness of American technology products.

In a global rollout from today, Facebook will start removing the message function from its mobile app for iOS and Android and instead require users to install its standalone Messenger app, which, it says, is "fast and reliable."

Facebook is about to eliminate the message feature of its mobile app, pushing its users to install the company’s standalone app Messenger instead, TechCrunch reports.

The company has begun sending out notifications to users in Europe saying that the message service will disappear from Facebook’s main mobile app for iOS and Android in about two weeks.

“We have built a fast and reliable messaging experience through Messenger and now it makes sense for us to focus all our energy and resources on that experience,” the company said in a statement Wednesday, Reuters reports.

Users in a handful of European countries, including England and France, will be the first forced to download the Messenger app, but eventually users in all countries will see the message service in the main app disappear, spokesman Derick Mains told Reuters.

There are two interesting lessons to be drawn from the row about Facebook's "emotional contagion" study. The first is what it tells us about Facebook's users. The second is what it tells us about corporations such as Facebook.

In case you missed it, here's the gist of the story. The first thing users of Facebook see when they log in is their news feed, a list of status updates, messages and photographs posted by friends. The list that is displayed to each individual user is not comprehensive (it doesn't include all the possibly relevant information from all of that person's friends). But nor is it random: Facebook's proprietary algorithms choose which items to display in a process that is sometimes called "curation". Nobody knows the criteria used by the algorithms – that's as much of a trade secret as those used by Google's page-ranking algorithm. All we know is that an algorithm decides what Facebook users see in their news feeds.
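The "curation" step described above can be sketched in miniature. The following is purely illustrative: the scoring function is invented, since, as noted, Facebook's real ranking criteria are a trade secret. The sketch only shows the general shape of the process: score every candidate story, then display the top few.

```python
# Illustrative sketch only: Facebook's actual ranking criteria are secret.
# "Curation" here means: from all candidate stories a user's friends have
# posted, keep only the highest-scoring few for the news feed.

def curate_feed(candidate_stories, score, limit=10):
    """Return the `limit` highest-scoring stories (the 'curated' feed)."""
    return sorted(candidate_stories, key=score, reverse=True)[:limit]

# A toy scoring function -- entirely made up for illustration.
def toy_score(story):
    return 2.0 * story["friend_closeness"] + story["likes"]

stories = [
    {"id": 1, "friend_closeness": 0.9, "likes": 3},
    {"id": 2, "friend_closeness": 0.1, "likes": 50},
    {"id": 3, "friend_closeness": 0.5, "likes": 1},
]

# Only two of the three candidate stories make it into the feed.
feed = curate_feed(stories, toy_score, limit=2)
```

The point of the sketch is that whoever controls the scoring function controls what users see, which is precisely the lever at the heart of the controversy.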

So far so obvious. What triggered the controversy was the discovery, via the publication of a research paper in the prestigious Proceedings of the National Academy of Sciences, that for one week in January 2012, Facebook researchers deliberately skewed what 689,003 Facebook users saw when they logged in. Some people saw content with a preponderance of positive and happy words, while others were shown content with more negative or sadder sentiments. The study showed that, when the experimental week was over, the unwitting guinea-pigs were more likely to post status updates and messages that were (respectively) positive or negative in tone.

Statistically, the effect on users was relatively small, but the implications were obvious: Facebook had shown that it could manipulate people's emotions! And at this point the ordure hit the fan. Shock! Horror! Words such as "spooky" and "terrifying" were bandied about. There were arguments about whether the experiment was unethical and/or illegal, in the sense of violating the terms and conditions that Facebook's hapless users have to accept. The answers, respectively, are yes and no because corporations don't do ethics and Facebook's T&Cs require users to accept that their data may be used for "data analysis, testing, research".

Facebook's spin-doctors seem to have been caught off-guard, causing the company's chief operating officer, Sheryl Sandberg, to fume that the problem with the study was that it had been "poorly communicated". She was doubtless referring to the company's claim that the experiment had been conducted "to improve our services and to make the content people see on Facebook as relevant and engaging as possible".

A tribunal is to hear a legal challenge by civil liberty groups against the alleged use of mass surveillance programmes by UK intelligence services.

Privacy International and Liberty are among those to challenge the legality of alleged "interception, collection and use of communications" by agencies.

It follows revelations by the former US intelligence analyst Edward Snowden about UK and US surveillance practices.

The UK government says interception is subject to strict controls.

The case - also brought by Amnesty International, the American Civil Liberties Union and other groups - centres on the alleged use by UK intelligence and security agencies of a mass surveillance operation called Tempora.

The UK government has neither confirmed nor denied the existence of the operation.

But documents leaked by whistleblower Mr Snowden and published in the Guardian newspaper claimed the existence of Tempora, which the paper said allowed access to the recordings of phone calls, the content of email messages and entries on Facebook.

Facebook, the world’s top social media platform, is reportedly seeking to hire hundreds of employees who hold US national security clearances.

The purported aim is to weed out “fake news” and “foreign meddling” in elections.

If that plan, reported by Bloomberg, sounds sinister, that’s because it is. For what it means is that people who share the same worldview as US intelligence agencies, the agencies that produce classified information, will have a direct bearing on what millions of consumers on Facebook are permitted to access.

It’s as close to outright US government censorship on the internet as one can dare to imagine, and this on a nominally independent global communication network. Your fun-loving place “where friends meet.”

Welcome to Facespook!

As Bloomberg reports: “Workers with such [national security] clearances can access information classified by the US government. Facebook plans to use these people – and their ability to receive government information about potential threats – in the company’s attempt to search more proactively for questionable social media campaigns ahead of elections.”

A Facebook spokesman declined to comment, but the report sounds credible, especially given the context of anti-Russia hysteria.

Over the past year, since the election of Donald Trump as US president, the political discourse has been dominated by “Russia-gate” – the notion that somehow Kremlin-controlled hackers and news media meddled in the election. The media angst in the US is comparable to the Red Scare paranoia of the 1950s during the Cold War.

Facebook and other US internet companies have been hauled in front of Congressional committees to declare what they know about alleged “Russian influence campaigns.” Chief executives of Facebook, Google and Twitter are due to be questioned again next month by the same panels.

Palihapitiya’s comments last month were made one day after Facebook’s founding president, Sean Parker, criticized the way that the company “exploit[s] a vulnerability in human psychology” by creating a “social-validation feedback loop” during an interview at an Axios event.

Parker had said that he was “something of a conscientious objector” to using social media, a stance echoed by Palihapitiya, who said that he was now hoping to use the money he made at Facebook to do good in the world.

“I can’t control them,” Palihapitiya said of his former employer. “I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”

He also called on his audience to “soul search” about their own relationship to social media. “Your behaviors, you don’t realize it, but you are being programmed,” he said. “It was unintentional, but now you gotta decide how much you’re going to give up, how much of your intellectual independence.”

SAN FRANCISCO — Facebook Inc. warned Monday it could offer no assurance that social media was on balance good for democracy, but the company said it was doing what it could to stop alleged meddling in elections by Russia or anyone else.

The sharing of false or misleading headlines on social media has become a global issue, after accusations that Russia tried to influence votes in the United States, Britain and France. Moscow denies the allegations.

Facebook, the largest social network with more than 2 billion users, addressed social media’s role in democracy in blog posts from a Harvard University professor, Cass Sunstein, and from an employee working on the subject.

“I wish I could guarantee that the positives are destined to outweigh the negatives, but I can‘t,” Samidh Chakrabarti, a Facebook product manager, wrote in his post.

Facebook, he added, has a “moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible.”

Contrite Facebook executives were already fanning out across Europe this week to address the company’s slow response to abuses on its platform, such as hate speech and foreign influence campaigns.

The data was reportedly collected by University of Cambridge psychology academic Aleksandr Kogan via a survey app on Facebook years ago, and then passed on to Cambridge Analytica, which used it to target people with political advertising during the 2016 US election campaign.

The proximate cause is the Cambridge Analytica controversy. In violation of Facebook’s rules, the Trump-linked political consultancy schemed to get access to the data of 87 million users. This has made Facebook a scapegoat for Trump’s victory on par with the Russians and James Comey (at least before the FBI director got fired and became a Trump adversary).

In 2012, Barack Obama’s re-election campaign did a less-underhanded version of the same thing as Cambridge. The great chronicler of the Obama digital operation, Sasha Issenberg, wrote of how its “‘targeted sharing’ protocols mined an Obama backer’s Facebook network in search of friends the campaign wanted to register, mobilize, or persuade.”

It’s not Zuckerberg’s fault that he has suddenly been deemed on the wrong side of history, but the Cambridge Analytica blow-up is bringing a useful spotlight on the most sanctimoniously self-regarding large company in America.

Facebook can’t bear to admit that it has garnered the largest collection of data known to man to sell ads against and line the pockets of its founder and investors.

The problem isn’t that Zuckerberg is a businessman, and an exceptionally gifted one, but that he pretends to have stumbled out of the lyrics of John Lennon’s song “Imagine.” To listen to him, Facebook is all about connectivity and openness — he just happens to have made roughly $63 billion as the T-shirt-wearing champion of “the global community,” whatever that means.

It’s this pose that makes him and other Facebook officials sound so shifty. In a rocky interview with Savannah Guthrie of “The Today Show” last week, Sheryl Sandberg was asked what product Facebook sells. “We’re selling the opportunity to connect with people,” she said, according to The Washington Post, before catching herself: “but it’s not for sale.”

Something or other must be for sale, or Facebook is the first company to rocket to the top ranks of corporate America based on having no product or profit motive. Guthrie, persisting, stated that Facebook sweeps up data for the use of advertisers. Sandberg objected, “We are not sweeping data. People are inputting data.”

Uh, yeah. That’s the genius of it. In a reported exchange with a friend while he was a student at Harvard, Zuckerberg boasted of having data on thousands of students because “people just submitted it.”