After denying some of the allegations in the investigation, Mark Zuckerberg said Facebook would create an “independent body” to help make decisions on content moderation. Zuckerberg and his company have to solve a genuine paradox: calibrating the right amount of censorship on a platform touted as a digital space of free expression and communication. One way to solve this dilemma would be to take a different approach to content moderation, one that may even cut against the company’s bottom line.

While tech companies assert that their social media platforms are open and neutral, the economic incentives to get views, likes and shares often blind them to how the negative aspects of society get amplified and spread on their platforms. Digital spaces can be bastions of free speech, a place for friends and family to keep in touch, but they can also be gamed to spread misinformation and political propaganda.

The independent bodies created in Silicon Valley often focus too much on code, and not enough on effective moderation. What Facebook needs is fewer engineers, designers and lawyers, and more thinkers who can spot how social dynamics play out in the digital world. It needs people who study race, gender, linguistics, social media culture, historical movements and other fields in the humanities and social sciences.


Facebook should put a range of academics on this independent body.

How could Facebook combat fake profiles if it had people who understood speech patterns? How could it have foreseen Russia exploiting our society’s racism and misogyny if people with African American Studies or Gender Studies degrees were part of the moderation team? How could the company have tried to mitigate the tribal sorting and outrage amplification of social media if it consulted those who studied group identity in chatroom or video game cultures?

Facebook should talk to scholars like Tarleton Gillespie, who studies how content moderation shapes social media. Safiya Noble would show them how societal inequalities and biases get encoded into digital tools. Ramesh Srinivasan or Renata Avila could explain how the cultural biases of Silicon Valley affect the world. Whitney Phillips could give them a sense of how “trolling” changed the culture of the internet, and even our politics. Joy Buolamwini’s research could help the company cultivate a culture of “algorithmic accountability.” And though it would be a bitter medicine to swallow, consulting critics like Jaron Lanier or Siva Vaidhyanathan could be a huge step toward changing the toxic feedback loop its platform incentivizes. There are many more names that could be added to this list, but the point is that Facebook needs to look to areas outside of computer science in order to tackle its pressing issues.

There are critiques of the idea that academics would improve Facebook, mainly because of its business model: increase engagement and connect users to advertisers. To borrow an analogy from Lanier and former Google design ethicist Tristan Harris, we can think of Facebook like a casino. Yes, casino owners create a sociable atmosphere (bright colors, live entertainment, food and drinks, comfortable furniture, ornate decoration, an array of games and machines, etc.) where individuals or parties can come to have a good time. However, we shouldn’t forget that casinos are designed to facilitate one goal: getting people to gamble away their money. Though Zuckerberg asserts that he is “driven by a sense of purpose to connect people and bring us closer together,” Facebook’s goal as a business is to keep you on Facebook. There is a possibility that hiring academics wouldn’t change anything.

While these are valid critiques which deserve further nuance, Facebook has an estimated 2.2 billion people who use its platform. It’s larger than any single country on the planet, yet crucial moderation decisions are being outsourced. Decisions about what users do or do not see are being made by people who likely lack the comprehensive, culturally specific knowledge and analytical training these choices require.

Facebook is a business and therefore closer to autocratic in its governing structure, but making content moderation more democratic would help Zuckerberg make more informed decisions, by the same logic that a president doesn’t fill every cabinet position with a general. And whether or not Facebook or any other social media platform is a “neutral platform” misses the point, since the world we live in is anything but neutral. Having an independent body structured as a collaboration between experts from different backgrounds would help Facebook deal with the dissonance arising from the clash between its founder’s purported values and the reality of how the platform is used in a complex and chaotic world.

Scholarship isn’t a panacea for how social media inflames political divisions and turmoil, but it would help tech innovators make better decisions about how to design their platforms in ways that discourage or prevent misuse. The company needs people with the knowledge and skills to spot individuals and governments seeking to misuse its platform. The problems sparked by technology can’t be solved by technologists alone. Academics have the knowledge and training to spot social patterns that AI, algorithms or outsourced workers cannot.