The phrase “so stupid only an intellectual could believe it” was coined for situations like this, in which a company’s internal practices and politically slanted algorithms are presented as apolitical. Facebook, which has on several occasions overtly manipulated its platform for partisan purposes, including burying content favourable to Mitt Romney and artificially boosting BLM, is now facing backlash from its own employees.

A group of roughly 100 employees within the company, calling itself “FBs for Political Diversity,” has lashed out against Facebook, describing it as an intolerant liberal monoculture (“liberal” in the American sense, meaning left-leaning). Brian Amerige, a senior engineer at Facebook, wrote: “We are a political monoculture that’s intolerant of different views. We claim to welcome all perspectives, but are quick to attack — often in mobs — anyone who presents a view that appears to be in opposition to left-leaning ideology.”

Though many believe this trend is pervasive throughout Silicon Valley, from Menlo Park to Palo Alto, and the tech industry at large, accusations of political tribalism and biased algorithms could prove particularly harmful to Facebook.

Facebook is not legally liable for actions taken on its platform, such as the sharing or replication of copyrighted material, because it is not treated as a content publisher or curator. By labelling itself a communication platform, it is freed from many of the legal burdens that publishers face for the content shared on their platforms.

Facebook maintains that it is not in fact a publisher, but because it has been caught promoting and suppressing certain content on political grounds, that neutral-platform label could be thrown out the window. Facebook would then be liable for all the content published on its platform, which could prove ruinous.

As it currently stands, Facebook would be wise to take a politically neutral approach to content, deleting or banning only extreme or violent material. At the moment, it has instead taken on the role of moral arbiter, deciding on its users’ behalf which content and ideas are best for them, while ignoring entirely that it was those users who chose to see that content in the first place.