Norway Is Right to Be Pissed at Facebook Over Photo Censorship

The front cover of Norway's Aftenposten, seen at a newsstand in Oslo on September 9, 2016.

Cornelius Poppe/Norsk Telegrambyra AS/Scanpix/REUTERS

Norway is really pissed at Facebook.

This week, the world's largest social network banned an iconic photo from a napalm attack during the Vietnam War because it includes a naked nine-year-old girl. Facebook claimed the photo violated its ban on nudity, especially child nudity. When Norway's largest newspaper, Aftenposten, reported the ban and included the photograph in its story, Facebook banned the story too. So the paper published an open letter to Mark Zuckerberg accusing him of abusing his power. And then the Norwegian prime minister got involved. She tried to post the photo to Facebook, accusing the company of censorship and of curbing freedom of expression, and Facebook deleted that too.

But after all the hullabaloo, Facebook backpedaled, saying today that it would reinstate the photo. It's a happy ending. Except that this will happen again. This kind of thing happens all the time.

Facebook juggles images and text from over 1.7 billion people across the globe. And this means that if it's going to police pornography and other stuff most people don't want to see—like beheadings—it is going to make mistakes. That is just the nature of the beast. Computer algorithms aren't good enough to handle this stuff in the way it should always be handled. And when humans do it, they make errors of judgement. But what we will say is this: the world needs to better understand how this stuff is handled. Right now, the world doesn't understand it at all. And it deserves to.

The Twitter Conundrum

Mark Zuckerberg and company are in a no-win situation. Some people say: "Why don't you just let me see everything my friends post?" After all, that's pretty much what services like Twitter and Reddit do. But Twitter and Reddit have received so much criticism from other people because they don't police content. The critics complain that they promote harassment and hate speech. And, well, both of them have struggled to grow the way Facebook has.

Facebook must find the right middle ground. That's kind of what it is trying to do, and this means it's going to get things wrong, sometimes very wrong. On some level, that's fine. The trouble is that we have so little insight into how the system works and how it breaks down. In that front-page story, Espen Egil Hansen, the editor of Norway's largest newspaper, called Mark Zuckerberg "the world's most powerful editor" and accused him of abusing his power. But who knows what's really going on here? The decision to censor that photograph was made not by Zuckerberg but by some mix of technology and bureaucracy buried deep inside the organization.

Facebook likely relies on a combination of algorithms and human labor—much of it provided by contractors—to block pornography, videos of beheadings, and other unsavory things from the site. But what those algorithms look for, and what policies its human moderators follow, remains a mystery. Yes, Facebook has a community standards page that explains that hate speech, nudity, and graphic violence are banned. But who exactly decides whether, say, photos of dead soldiers count as graphic violence, or whether an iconic photo violates the nudity ban? What criteria do they use to make that decision? How much of the work is human and how much is tech? What role do the algorithms really play?

Traditional publishers have to make judgement calls like this all the time. The difference is that if a traditional publisher—be it Gawker or The New York Times—decides to run or withhold a particular photo, it's clear who made the decision. An editor or group of editors can say exactly why the decision was made. You might not always agree with their reasoning, or even believe the answers they give, but you know who to hold accountable.

A Technology Company Run By Humans

At an event in Rome last month, Zuckerberg described Facebook as a technology platform rather than a news company. The trouble with that logic is that while Facebook itself might not be reporting the news, it has a tremendous influence over what news people actually see. And that requires editorial judgement, as the recent trending topics controversy demonstrated. Plus, Facebook is not driven solely by technology. It's driven by a mix of humans and technology. And, well, humans build the technology. They decide how it works.

Gizmodo reported earlier this year that Facebook employees were deliberately suppressing stories from conservative websites in its trending topics results. Facebook denied the claims, and in an apparent effort to shield itself from accusations of editorial bias, fired the trending topics editorial staff and automated the entire process. Shortly after relaunching, the trending stories section promoted a fake news story about Fox News anchor Megyn Kelly, demonstrating why the company had editors screening the results in the first place.

Artificial intelligence won't solve this problem. Facebook relies heavily on a branch of artificial intelligence called deep learning, which is noted for producing inscrutable algorithms. Yes, humans write the algorithms, and influence them by training them on different sets of data. But in many cases AI developers don't actually understand how their algorithms work, only that they do work. That will inevitably end up leading to unintended consequences that require human editorial judgement to resolve. You just can't take people out of the publishing loop. Of course, Facebook has long understood this. That's why its moderation system relies on a combination of automation and human review and will do so for the foreseeable future.

While Zuckerberg himself probably isn't sitting around his office dictating which breastfeeding photos to ban, he still bears ultimate responsibility for Facebook's policies. Facebook might not be a news organization, but it's definitely an editorial organization, and an extremely powerful one at that. And as Facebook's influence over what we see online grows, so too does the need to hold it accountable.