European Commission Launches “One-Hour Rule” for Hate Content Removal

The European Commission has demanded that social media platforms such as Google, Facebook and Twitter remove offensive and incendiary content from their platforms within one hour, or face penalties.

The EU executive branch released a new set of recommendations on Thursday, stating that web companies must take action to delete terrorist propaganda, hate speech, child pornography and websites selling stolen or counterfeit goods within sixty minutes of their being uploaded.

“Online platforms are becoming people’s main gateway to information, so they have a responsibility to provide a secure environment for their users. What is illegal offline is also illegal online,” EU’s Vice-President for the Digital Single Market Andrus Ansip said in a statement. “While several platforms have been removing more illegal content than ever before – showing that self-regulation can work – we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights.”

While most web giants already have strong guidelines in place to address hate speech and other incendiary or illegal content, they typically only commit to removing objectionable posts within 24 hours of their appearance online.

The shift to the EU’s stricter regulation was met with perplexity and concern, with one trade association protesting that the new guidelines leave too little time to take action.

“Such a tight time limit does not take due account of all actual constraints linked to content removal and will strongly incentivise hosting services providers to simply take down all reported content,” the Computer & Communications Industry Association said in a statement.