Tag: Facebook

On Tuesday 11th September, Lucy Powell MP introduced the Online Forums Bill to Parliament. It was a ‘Ten Minute Rule Bill’, a mechanism by which opposition and backbench members of parliament can introduce legislation. The text of Ms Powell’s speech may be found in Hansard and there is a video on Parliament.tv.

The speech makes some challenging points. How is it that Facebook groups can grow to tens of thousands of people in secret, with no oversight or scrutiny? One such group, which discussed autism, recommended that parents give their kids ‘bleach enemas’ to cure the condition.

Powell also points out that members of these groups often feel too intimidated to speak out against the most vocal and radical members of the group. This shifts the dynamics of such groups to ever more extreme positions, and is a very particular free speech issue in itself.

The news that conspiracy theorist and inciter-to-violence Alex Jones had been simultaneously banned from several social media platforms sparked several days of debate and comment – on both mainstream and social media. At stake were questions about the wisdom and efficacy of such a ban, and the acceptable limits of free speech.

Last week I posted a quote from Dr Alex Mills of University College London, on Facebook’s woefully inadequate Terms & Conditions that related to defamation. That was drawn from a panel discussion I participated in on 22 March 2018 hosted by UCL’s Institute of Advanced Studies, entitled ‘Defamation – A Roundtable on Lies and the Law’.

The propaganda website InfoWars has been banned from Facebook, the Apple iTunes podcasting platform, and Spotify. Most people have welcomed the fact that these technology companies have finally acted to enforce their own terms and conditions, though others (including, obviously, InfoWars itself) say that this is an infringement of free speech.

Back in March, I participated in a round-table discussion hosted by University College London’s Institute of Advanced Studies, on the subject of defamation. I will post my remarks at some point, but for now (primarily because of a media appearance I made today) I wanted to share a remark made by Dr Alex Mills about the state of Facebook’s Terms & Conditions.

What you have when you look at Facebook’s community standards is a defamation law that you would write on a postcard if you were trying to explain a sort of version of American defamation law to someone who wasn’t a lawyer.

Following the revelations about the harvesting of personal data by Cambridge Analytica and the ongoing worries about abuse and threats on social media, the UK House of Lords Select Committee on Communications last week began a new inquiry entitled ‘Is It Time To Regulate The Internet?’. At the witness sessions so far, peers have opened by asking each expert to comment on whether they favour self-regulation, co-regulation, or state-regulation.

The instinct to regulate is not limited to the U.K. Late last year Senator Dianne Feinstein (D-CA) said:

You’ve created these platforms, and now they’re being misused, and you have to be the ones to do something about it… Or we will.

With the reader’s indulgence, these developments remind me of a point I made a few years ago at ORGcon2013, when I was speaking on a panel alongside Facebook VP for Public Policy EMEA, Richard Allan:

If we as the liberal free speech advocates don’t come up with alternative ways of solving things like the brutal hate speech against women, the hideous environment for comments that we see online, then other people are going to fix it for us. And they’re going to fix it in a draconian, legislative way. So if we want to stop that happening, we need to come up with alternative ways of making people be nicer!

Following my short appearance in a BBC news report yesterday, I had hoped to publish a companion blog post here, making all the free speechy points that were edited out of my contribution. Instead, I strayed off piste and ended up with this litany of complaints about Facebook. A useful aide-mémoire for the future, with a couple of useful insights, maybe.

When it comes to free speech, even the most hardened advocates tend to draw the line at incitement to violence. “Your right to swing your arms ends just where the other man’s nose begins” wrote Zechariah Chafee. Freedom of expression is not absolute, and when people publish text or video that is likely to provoke violence, it is legitimate to censor that content.

Inciting violence and hate is what the Britain First group appears to have been doing, so the Facebook decision to ban their page feels righteous. Good riddance? Nothing to see here? Move along?

The racist far right group Britain First have been banned from Facebook. BBC South East reported the story and interviewed yrstrly for English PEN. Here’s what I said:

We abhor what Britain First stands for, but nevertheless there are some unintended consequences with this move. Shutting down speech you don’t like is deeply problematic. It means that countries around the world can use it as an excuse to shut down speech they don’t like. And it also alienates certain sections of the British population, [with whom] we really need to have a dialogue…

Obviously this is just a small excerpt from a longer interview I gave to the news team. There is a lot more to say about this issue, in particular about how we appear to have ceded most of our political discourse to private companies running social media platforms. There is also a real issue surrounding the efficacy of counter-speech, and what both social media and the traditional broadcasters might do in order to give better, bigger platforms to the kind of options that can counter and neutralise the far right threat. I will post more on this soon.

I was quoted very briefly in the Mail on Sunday this weekend, in an article about a new police strategy for cracking down on Twitter abuse and threats.

It is feared that this will lead to large numbers of comments being reported to social media providers or police as inappropriate, even if they were only meant jokingly or had no malicious intent. Robert Sharp, of the anti-censorship group English PEN, said: ‘Threats of violence must of course be investigated and prosecuted, but the police need to tread carefully.’

When I posted this to Facebook just now, I was going to add the abbreviation ‘NSFW’, Not Safe For Work. But that prompts two thoughts. The first is that my work actually involves looking at links and images like those displayed here! I often wonder if I have inadvertently shocked my colleagues who have accidentally wandered past my screen while I was reading some link about porn or violence or racism or something.

Second, it’s surely a problem that our culture, as reflected in Facebook’s image usage policies, deems images such as mastectomies, nude drawings, and breastfeeding as “NSFW” regardless of context. Why shouldn’t these images, undeniably in the public interest, be viewed at work?

I reckon we should start labelling images and GIFs from sporting events as ‘NSFW’, because surely that’s the number one content that should not be viewed at work, damaging as it is to productivity.