
Facebook feels the heat

Facebook is becoming more willing to address its effects on politics and society — and to look at changes in its policies.

Months after CEO Mark Zuckerberg said Facebook was not a media company and did not influence the 2016 election in a substantial way, the company is taking new steps to prevent "fake news" from showing up in its feeds.

It is also addressing how terrorists and other criminals have used the social media network, and what to do with the online identities of users who have died.

The company on Thursday said it would start a new series of posts to explore these "hard questions."

“The decisions we make at Facebook affect the way people find out about the world and communicate with their loved ones,” Elliot Schrage, Facebook’s vice president of public policy and communications, wrote in the initial post.


“It goes far beyond us. As more and more of our lives extend online, and digital technologies transform how we live, we all face challenging new questions — everything from how best to safeguard personal privacy online to the meaning of free expression to the future of journalism worldwide.”

The first step came later that day, when the company detailed the strategies it employs to find and delete terrorist content on its site. Those efforts include using artificial intelligence and a team of 150 workers.

Facebook’s public reckoning with these issues comes as its extraordinary growth has resulted in pressure from policymakers around the world to address its problems.

The new campaign has been a long time coming. Facebook staffers say privately that the company has been grappling with how to address its massive growth and changes in how users get and use information. Internally, some have admitted the need for change, but insist the social media platform is a net force for good.

Facebook didn’t explain what prompted the decision to publicly address many of these issues now, but a spokesperson said the company had been thinking about improving transparency.

“We’ve thought about how to continue to provide a high level of transparency as Facebook grows, and new issues emerge at increasing volume and speed,” the spokesperson said in an email. “We see this blog as an opportunity to communicate more about our decision-making process, and begin an open dialogue on issues as they emerge, regardless of how challenging."

Others in the tech community say Facebook had no choice but to act and put its own practices under the microscope.

“They're trying to grow as fast as possible, and I think that once you reach a certain point globally, you have to take a certain level of responsibility and try to tackle these issues,” said Dipayan Ghosh, who has worked at Facebook and the Obama White House as a policy adviser.

“If they can get those things right, they're set on a course for even more expansion and growth,” he added. “Now, they will need to maintain people's trust, and certainly gain people's trust, in order to expand, but that's in the company's interest to do."

The stakes are high for the company. If users begin to question the quality of the content they are receiving, many could abandon the platform, damaging the company's bottom line.

And there is growing pressure from governments. Many of the hard questions Facebook wants to tackle have already been raised by politicians and regulators, especially in Europe.

Earlier this month, after a series of terrorist attacks in her country, UK Prime Minister Theresa May accused Facebook and other tech companies of providing a “safe space” for terrorists to gather and organize. After the attack near London Bridge, May went so far as to float regulating big internet companies like Facebook.

And the company is also facing scrutiny in Europe over antitrust and privacy issues.

The E.U. fined Facebook the equivalent of $122 million last month over misleading statements about its ability to link user data from its messaging subsidiary WhatsApp with the rest of the company.

Dutch and French privacy regulators have reprimanded Facebook for sharing user data with third parties without consent or notification. Germany has threatened to fine the company for not doing enough to crack down on hate speech and has opened an antitrust investigation into the collection of personal data.

In the U.S., Facebook has faced intense scrutiny over whether it allowed hoax news stories to spread on its platform during the presidential campaign.

Immediately after the election, Zuckerberg dismissed criticisms of his platform’s role in the outcome.

“Personally I think the idea that fake news on Facebook — of which it’s a very small amount of the content — influenced the election in any way is a pretty crazy idea,” Zuckerberg said at a conference in November.

But in recent months, he has acknowledged that the social media giant has a responsibility to promote reliable information on its platform and to demote links to misleading or fake stories.

The company has been adding fact-checking tools for users and has sought out journalists to help combat misinformation.

Facebook has also faced troubling questions over a number of incidents where people have used the site to broadcast violent acts.

A BuzzFeed investigation found that at least 45 violent incidents — including murders, rapes, assaults and suicides — have been broadcast on Facebook Live since the product launched in December 2015.

Zuckerberg has also been taking a more public role, which many see as a nod to the company's growing influence. He's been touring the U.S. to “learn about people's hopes and challenges, and how they're thinking about their work and communities,” leading many to speculate that he’s laying the groundwork for a future presidential campaign. Zuckerberg has said that he is not running for public office.

Regardless, Facebook’s size and growth are forcing Zuckerberg to take steps to broaden its appeal and reassure the public that the company is taking their concerns seriously.

“I think Facebook is growing up and going through growing pains and getting there, and part of that process is what's happened over the past couple of days,” Ghosh said.

Some critics are skeptical of Facebook's public reckoning. They argue that the problems Facebook faces are inherent to its business model and doubt the platform will be willing to make any real changes.

Dumas argued that during the election Facebook did little to curb the spread of hyperpartisan stories from both sides proliferating across its platform.

“Part of the reason I don’t have high expectations is that their business model is built on engagement. The way they do that is by giving people content that is emotionally charged and reinforces their biases,” he added.