Facebook told user Janet (a name given to the user by the BBC to protect her identity) that it had removed hateful anti-Muslim posts when they actually remained live on the social network.

After reporting the posts, she received a message saying: “We removed both the group and all its posts, including the one you reported.” But this was not the case.

Facebook told the BBC that it is looking into a possible glitch in its content-moderation system, which reportedly tells users that content they have reported has been taken down when Facebook’s moderators have in fact deemed it permissible to stay online.

“We are investigating this issue, and will share more information as soon as we can,” Facebook said. Business Insider contacted Facebook to ask whether the glitch had been fixed, what had caused it, and how many users it may have affected.

Janet shared examples of posts that had stayed up after she was told they had been removed, including posts from a group with more than 54,000 members named “LARGEST GROUP EVER! We need 10000000 members to Make America Great Again!” Janet had reported the group for anti-Muslim and anti-immigrant rhetoric.

“[Facebook] has been promoting themselves in my Newsfeed saying they are trying to keep our democracy safe by eliminating content that is false and divisive,” Janet said.

“If they are sending me notices they removed the content and offensive groups but in reality are not, doesn’t this go against what they say in public or to Congress?”