Racist Border Patrol groups show Facebook still has work to do on hate speech

Facebook keeps saying it’s getting better at filtering out racist, sexist, and other hateful content. But if it’s struggling to do so with public posts, the company may have an even harder time in private posts behind tightly closed doors. Case in point: the multiple secret Facebook groups of US Customs and Border Protection agents that reportedly share that type of content.

But this has also put the spotlight, yet again, on Facebook’s moderation policies. Facebook’s automated systems and artificial intelligence can detect some of the content that violates its policies, even when it’s posted in secret groups, but those technologies are not perfect. The company often relies on users to flag potentially policy-violating content, but in the case of secret groups, users outside the group can’t see it.

Secret Facebook groups are invisible to you unless you’re invited — but not to Facebook

A secret Facebook group is, as the name suggests, secret. People can join one only by invitation, and only current and former members can see the group’s name or description. Secret groups aren’t like closed groups, which you can request to join to see what’s in them. They don’t show up in search results, so there’s no way for you to know they exist unless someone invites you in.

But just because non-members can’t see what’s in secret groups, or that they exist, doesn’t mean Facebook can’t. It uses artificial intelligence, human moderators, and other mechanisms to try to filter out content that violates its policies, even when it’s in a secret group. But that has its limitations. According to data from Facebook, it caught about 65 percent of content that violated its hate speech policies in the first quarter of the year before users reported it, or about 4 million hate speech posts. (It’s much better at catching nudity, terrorist content, and fake accounts automatically.)

That means that on hate speech — like what’s being posted in the secret CBP groups — Facebook would need to rely more on users reporting violating content. But of course members of those groups probably aren’t going to be reporting their peers and people they agree with. And because secret groups are such a black box, there’s really no way for outside entities to monitor how much moderation Facebook is really doing.

House Oversight Committee Chair Elijah Cummings (D-MD) sent a letter to Facebook CEO Mark Zuckerberg following the ProPublica report asking Facebook to preserve all documents, communications, and data from the “I’m 10-15” group and provide postings from the group to the committee. He also said the posts reported appear to violate Facebook’s Community Standards on hate speech.

A Facebook spokesperson in an emailed statement said the company’s policies apply across Facebook, including secret groups. “While the general public can’t see content within these groups, our detection systems can. Using a combination of technology and human review, we routinely remove many types of violating content before anyone reports it,” the spokesperson said. “There is still more we can do, and we continue to improve our technology to detect violating content.”

The spokesperson declined to comment on whether the company would comply with Cummings’s request or had shut down the 10-15 group, citing the federal investigation into the group. Facebook said it has removed “several pieces of content” from the Real CBP Nation group.

Facebook banned the conspiracy theorist media star Alex Jones last year, but private groups with thousands of members continued to promote his work. Groups dedicated to the pro-Trump conspiracy theory QAnon thrive on Facebook, as do communities that oppose vaccination. In 2017, the military launched an investigation of a secret Facebook group composed of Marines who had shared naked photos of fellow female service members. Private and public groups also played a significant role in helping white supremacists organize their march in Charlottesville in 2017.

And ProPublica reports that civil rights groups have been flagging hateful posts in secret groups to Facebook for years, and the company hasn’t really responded much.

According to CNN, the “I’m 10-15” group changed its name to “America First” after the ProPublica report but has now archived the page, meaning there can’t be more posts or comments. A spokesman for the CBP did not return a request for comment on its investigation.

Facebook is moving more in the direction of private groups, not less

Facebook has good reason to try to improve how it polices content in private groups: Closed communities are the direction the company is headed.

In March, Zuckerberg said in a post that Facebook intends to build out a “privacy-focused messaging and social networking platform” moving forward. He said the company would focus on private interactions, encryption, and reducing permanence (meaning it won’t keep messages or posts longer than necessary). The decision makes sense, given that Facebook already owns private messaging app WhatsApp and has its own messaging service, Messenger, and Facebook is reportedly planning to integrate those messaging services as well as Instagram.

And Facebook has said it wants to focus on groups. In a post on the first day of its F8 developers conference in April, Facebook said it wants to put groups first and make it “easy for people to go from public spaces to more private ones.” It claims that more than 400 million people on Facebook belong to a “meaningful” group, though it does not provide a breakdown of how many of those groups are public, closed, or secret.

To be sure, secret groups aren’t, by default, bad. Facebook pointed out that they can be a place for people to share intimate and important details about their lives with only specific communities, such as victims of domestic abuse, people who suffer from addiction, or people with illnesses or medical diagnoses.

But the controversy over the CBP groups demonstrates the challenges Facebook still faces in how it deals with the content on its platform and how bad actors might use the tools it’s created, including secret groups, for nefarious purposes. Its technologies may catch two-thirds of hate speech on the platform, but even so, the company apparently missed the 10-15 group’s hateful posts for years.