Even the most casual observer of the media would find it difficult to avoid the furor surrounding this week's allegations of bias against conservative viewpoints in Facebook's Trending section.

Facebook CEO Mark Zuckerberg, a noted social progressive (admirably so, in my opinion), recently took to his personal page to fervently deny any bias in the Trending section. He announced he would "be inviting leading conservatives and people from across the political spectrum to talk with me about this and share their points of view."

While these aren't the first allegations of Trending section censorship levied against Facebook, hopefully Zuck's rhetorical dedication to inclusiveness will satisfy his company's detractors and further prove that, as one Mashable writer didn't feel at all embarrassed to gush, Facebook "does not hate conservatives. It loves everyone, and just wants us all to connect."

I have little doubt that Facebook will devise a way to create a more transparent system in which human Trending editors can more accurately represent the spectrum of its users. But should that really be the goal?

The truth of the matter is that any system that includes humans in the mix can never be perfect. As long as humans—we petty, fallible, emotional, illogical things—are involved, there will always be room for subjectivity, error, or outright malpractice (i.e. the human element).

In fact, the technology to replace Facebook's human editors with cold, unshakable algorithms probably already exists. According to a post by Facebook's VP of Search, Tom Stocky, the Trending section is currently a collaboration between humans and algorithms.

"Popular topics are first surfaced by an algorithm...then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers," he writes.

I was actually taken aback to learn that there even was a "team" of human editors behind Facebook's Trending section. It seems strange that in 2016, an age when computers are parsing the nuances of language at a breathtaking rate, some Homo sapiens (let alone a team of them) would still be involved in this highly regimented process.

Rival Twitter's influential "Trends" section is (nearly) entirely automated. "Trends are determined by an algorithm and, by default, are tailored for you based on who you follow and your location," its FAQ section states. "This algorithm identifies topics that are popular now, rather than topics that have been popular for a while or on a daily basis, to help you discover the hottest emerging topics of discussion on Twitter that matter most to you."
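Twitter doesn't publish its algorithm, but the behavior its FAQ describes, favoring topics that are popular now over topics that are popular all the time, is the classic shape of burst detection: compare a topic's mention rate in the current window against its recent baseline. A minimal sketch of that idea (the function name and thresholds here are hypothetical illustrations, not Twitter's actual method):

```python
from collections import Counter

def emerging_trends(current_window, baseline_window, min_count=3, min_ratio=2.0):
    """Rank topics whose recent mention rate spikes above their baseline.

    A topic counts as 'emerging' when it appears at least min_count times
    in the current window AND its count is at least min_ratio times its
    smoothed baseline count. Perennially popular topics score low because
    their baseline is already high.
    """
    now = Counter(current_window)
    before = Counter(baseline_window)
    scored = []
    for topic, count in now.items():
        ratio = count / (before[topic] + 1)  # +1 smoothing avoids div-by-zero
        if count >= min_count and ratio >= min_ratio:
            scored.append((topic, ratio))
    # Highest spike first
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

With this toy scoring, a hashtag mentioned six times out of nowhere outranks one mentioned five times that was already being mentioned constantly, which is exactly the "emerging, not evergreen" behavior the FAQ describes.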

Of course, even Twitter Trends are not completely free of tampering by sentient beings. Human minders at Twitter have the power to remove offensive trending topics when they arise. But, for the most part, this system stands in stark contrast to the heavily curated operation of Facebook Trending, and I find it to be a valuable tool for news discovery. To be sure, it's populated with a number of pop culture and meme hashtags that are of little interest to me, but when relevant news breaks, that has often been where I first find it.

Software > Souls

I liken the use of algorithms in news and information discovery to those used in music discovery. Apple argues that one of the biggest selling points for Apple Music is its human-curated playlists. This was never a selling point for me. I love music. And I can say without any hesitation that the majority of my treasured musical finds from recent years have come courtesy of the automated playlists on services like Pandora and Spotify. No external "experts" necessary.

I have little doubt that the higher-ups at Facebook at the very least considered the prospect of a purely algorithm-curated Trending section. They evidently concluded that human editors created a better experience than algorithms could, though this could change in the future.

The question of whether Facebook intentionally manipulated its Trending section is almost entirely beside the point. The fact that humans are behind any operation leaves open the possibility of bias (or of paranoia about bias, depending on where you stand).

The human factor is sloppy and unfair. And, for better or worse, technology is slowly removing it from the world. On one hand, that means we, as individuals and as a species, are becoming less necessary; on the other, it can facilitate a society less riddled with all our messy human hang-ups and uncertainty. Glass half full, people.

Now if we can just figure out the part about how we're all going to support ourselves, we'll be golden.

Evan Dashevsky is a features editor with PCMag and host of our live interview series The Convo. He can usually be found listening to blisteringly loud noises on his headphones while exploring the nexus between tech, culture, and politics. Follow his thought sneezes over on the Twitter (@haldash) and slightly more in-depth diatribin' over on the Facebook.