One of the biggest challenges in addressing online hate is figuring out what it is.

Under the Canadian Criminal Code, it’s an offence to wilfully promote hatred against any identifiable group or to advocate genocide.

Chris Tenove, a postgraduate researcher at the University of British Columbia, said distinguishing between speech that promotes hatred and opinions that are ugly or unpopular is vital.

“If you’re too loose with what you’re saying counts as hate speech, and then you start talking about the need for taking criminal action on it, some people interpret that as a very concerning attempt by government to criminalize quite a lot of speech,” said Tenove, who is studying political theory with a focus on global governance and digital politics.

Tenove points out that even political leaders are using language online that, by some standards, might be considered hate speech.

“Even amongst politicians and political elites, there seems to be more language that is skating closer to hateful or harmful speech than in the past,” he said.

Last month, Motherboard reported a Twitter employee’s comments that algorithms aimed at automatically detecting and deleting white supremacist content could also delete posts by several Republican politicians. Society would not likely accept that kind of censorship, the employee suggested.

The risk is that unpopular or ugly — but legal — views could be censored as a result of efforts to stop hate speech.

2. Be transparent.

In 2017, Germany passed the Network Enforcement Act, an anti-hate speech law often referred to as the Facebook Act. It set out fines of up to $75 million for social media companies that failed to take down obviously illegal posts that promoted hate within 24 hours of receiving a complaint.

The law has been controversial, with critics arguing companies could choose to err on the side of caution, resulting in reduced free speech.

But both sides agreed the law should require the companies to report how many posts they removed, allowing the public — and researchers — to analyze the effectiveness and impact of the policy.

Heidi Tworek, a UBC history professor who studies the impact and politics of mass media, said any policies need to ensure companies are open about their activities in dealing with complaints of hate speech.

“We can’t as researchers do research unless we have the evidence,” she said.

Tworek and Tenove, who collaborated on a submission to the Public Policy Forum on regulating online hate speech, say there’s not enough data on the frequency of online hate or how it affects marginalized groups.

“We don’t have hate speech or the infractions of the terms of service to tell us what’s happening in Canada,” said Tenove.

Without that base research, it’s hard to know whether a policy is working, he said.

“How do we measure is a really important question,” explained Tworek. “How do we actually quantify what is hate speech, and how do we know which measures are having an effect?”

3. Consider approaches beyond fining the companies.

Germany’s law made headlines for forcing companies to take down hate speech or suffer heavy fines. But fines could create a false sense that a policy is working when in fact companies are censoring legitimate debate to avoid any risk.

Tworek said her research found the German law resulted in the removal of more than 28,000 items from Twitter last year. Facebook removed only 362, but its complaint page was hard to find, demonstrating one way companies could avoid fines.

Tworek and Tenove recommend using other tools to limit the spread of hate speech, including amending the Canadian Human Rights Act to restore parts of Section 13. The clause, repealed by the Harper government in 2013, had made it an offence to communicate in ways likely to expose “a person or persons to hatred or contempt.”

Section 13 was controversial for its perceived infringement on freedom of speech. But when it was repealed, complainants lost a quicker, more accessible alternative to criminal complaints in responding to hate speech online. Some experts recommend revisiting the clause instead of changing criminal law.

“Having a human rights framework sets a larger ambit for what constitutes hate speech, but doesn’t include potential jail time,” said Tenove.

4. Bring civil society groups to the table.

The process was rushed and secretive, according to some groups that withdrew from the talks.

Without their contributions, Tworek said, it’s hard to measure how marginalized groups are affected by online hate speech and whether it’s discouraged them from engaging online.

“Bringing civil society organizations to the table is extremely important in creating buy-in,” she explained.

She and Tenove advocate creating a moderation standards council to ensure a co-ordinated, informed response to online hate speech. The council would bring together companies, governments and non-profits to develop a consistent approach across platforms.

Tenove said policymakers should avoid making online hate speech a partisan issue when considering measures like amending human rights law.

“It would be a shame if it were immediately polarized into a partisan issue, and I think there’s a real possibility of that,” he said.

5. Avoid policies that can be twisted to censor free speech.

Tworek said certain measures might seem like a good idea now but could be used to censor free speech in the long run.

“Something that might seem fine in France under [President Emmanuel] Macron does not look good in France under [right-winger Marine] Le Pen,” she explains. “These rules need to be designed for future governments as well as the governments of today.”

For example, she notes some of the countries with the most aggressive laws against fake news are authoritarian states less interested in fighting online disinformation than having the power to censor critical views.

“One of the questions we need to be asking ourselves is, ‘Are you designing laws and regulations that are going to be preserving freedom of speech as far as you possibly can, and not give tools to an authoritarian regime?’” she said.
