
Why Facebook's 'Trending Topics' controversy matters

Facebook hates conservatives. That's the message you might have taken away if you had only casually followed the latest media freakout over the industry's relationship with the social network. Facebook, of course, does not hate conservatives. It loves everyone, and just wants us all to connect.

It all began last week when a Gizmodo report featured anonymous interviews with contractors who expressed grievances about working on Facebook's "Trending Topics" section, which is a relatively small box on the upper right corner of Facebook's desktop version. Days later, Gizmodo followed up with one person who claimed that other editors had downplayed conservative media outlets in the section.

The implications were big, but the actual impact remained relatively small.

Numerous people in the media industry who spoke on background with Mashable about "Trending Topics" came to the same conclusion: We don't really know how much traffic that section drives, but it's just not something anybody cares about all that much. Conservative media outlets that spoke with Mashable also did not seem terribly concerned about the issue.

So why, then, is everyone so concerned?

After the media firestorm, the launch of a Senate inquiry and repeated statements from Facebook, the company ultimately released its guidelines for the "Trending Topics" section.

It offered a sudden look into the inner workings of a company that is incredibly powerful and famously opaque. The conversation quickly shifted away from the allegations of conservative bias to something bigger: Has Facebook been lying to us all along?

Humans clearly had a far bigger part in "Trending" than we had thought. If this part of Facebook was not an unbiased, data-driven news source, could the same be true of its far more influential news feed algorithm?

Facebook Elections signs stand in the media area at Quicken Loans Arena in Cleveland, Thursday, Aug. 6, 2015, before the first Republican presidential debate.

Image: John Minchillo/AP Photo

Questions about Facebook's media neutrality come at a particularly sensitive time. Facebook is asking that media companies trust it with the future of their businesses through on-platform features like Instant Articles, bots and most notably live video. As former Chartbeat CEO Tony Haile put it recently, Facebook now hosts, curates, distributes and monetizes whatever the media produces. The media is left to the actual production of editorial content.

That wouldn't be a huge deal if not for the fact that Facebook boasts 1.6 billion users and has not been particularly shy about changing what type of content they see, be it pictures or links or videos. The media had few options other than to trust Facebook to use technology to create a meritocracy for the content itself, the industry's last foothold.

That trust is now in question. And that's a big deal.

A human touch

Facebook's news feed is supposedly run by an algorithm that figures out what to show you based on your interests and what other people are enjoying. "Trending Topics" was also based on an algorithm, but with a bit of humanity on top.

There's a very subtle wrinkle here about what we all seemed to think was going on. "Trending" editors, we thought, were just there to make sure things like "pizza rolls" didn't end up being passed along as news.

Facebook thinks pizza rolls are not newsworthy which should be all the evidence you need that Facebook has no taste pic.twitter.com/r7gApFu33i

Based on what Facebook has finally revealed, it looks like it was the other way around. The algorithm fed topics to editors, who were given leeway to pick among them and to judge the importance of news based on other media outlets' coverage. The editors could even insert topics they saw percolating elsewhere.

That's far from some diabolical plot to scuttle conservative media. If anything, it sounds like the same way a modern newsroom operates.

"Did Facebook lie? In a 'the sky is green' way, no. But in the polished, expertly-hedged, truth-with-a-spin way that corporate communications departments are so good at, yes," wrote Motherboard's Sarah Emerson.

It's not the operation itself that is the issue. Facebook's "Trending" operation looks pretty reasonable. If anything, Facebook's willingness to bring in editors might provide a bit of hope to journalists. As one editor of a conservative news operation told me recently, his concern with Facebook was not about left vs. right but rather the mix between news and entertainment.

Facebook's curation guidelines are actually surprisingly detailed, and probably more thought-out than many newsrooms' procedures.

I want to believe

It's fair to say the media overreacted to this particular piece of news, but the larger concerns are valid.

Facebook is the game. Plenty of other websites command big audiences, but they don't purport to offer the kind of meritocratic payoff of Facebook. Win or lose, we all expected that Facebook's users — not Facebook — were the deciding factor. That was essential considering how powerful Facebook had become.

Which is what led to the collective media freakout. That's what happens when a dominant company rolls out products without providing much of any transparency to the public or to journalists. What Facebook asked for was faith that it would keep its promise of technologically guaranteed fair play. Now that promise has been broken, perhaps accidentally.

Believing in that promise in the first place was pretty naive. As my colleague Seth Fiegerman pointed out, "algorithm" is just a fancy word for a math equation written by people. The reality of imperfect human involvement creates a certain amount of responsibility from Facebook and many other companies to provide a better understanding of what they're doing.

What we do know is that Facebook has been pushing live video in the news feed — and paying major media companies to provide the content (including Mashable). That combination would seem to be a far more obvious and top-down editorial strategy than the "Trending" fiasco. Facebook changed the game and then began picking partners.

Which is a far too long way of saying: It's not what Facebook did; it's how it represented what it was doing. "Trending" wasn't just an extension of the news feed. It was a mini-newsroom. Just telling us that from the jump would have sufficed.

Now, it's tougher to trust Facebook. And with a company this big and powerful, it would be criminally negligent to simply give it the benefit of the doubt.

Does Facebook really need our trust? Based on the company's own reaction, it would seem so. Mark Zuckerberg posted to his personal page on the issue.

"The reason I care so much about this is that it gets to the core of everything Facebook is and everything I want it to be. Every tool we build is designed to give more people a voice and bring our global community together," he wrote.
