Social giants face regulatory backlash

Social media companies risk tough regulation by frustrated governments across the world if they do not eradicate hate content on their platforms, according to top media, privacy and technology lawyers.

The Morrison government is expected to introduce new legislation to parliament this week which would impose three-year jail terms on social media executives and fine companies hundreds of millions of dollars if their platforms are used to broadcast “abhorrent violence” and removal is not swift.

"They are operating under this social licence where they have been allowed to do this and publish without impunity until they’re notified things are going wrong," Norton Rose Fulbright global head of technology and innovation Nick Abrahams said.


"I think they are straining the goodwill in that licence. That licence could be revoked."

Mr Abrahams, who co-authored Big Data, Big Responsibilities, said the suggestion of jail time was clearly an example of the government's frustration with social media companies not doing enough to reflect community expectations.

"If they don’t start to engage proactively with governments around the world, governments could get frustrated and over-regulate in a way that could really impact the revenue model," he said.

"At a legal level, that’s complex, in the sense the publisher is actually the offshore entity, theoretically the Australian entity has nothing to do with that, other than an arm’s length arrangement with the home office.

"If you are profiting from that, it might be seen as the Australian entity, then those folks sitting on the ground here in Australia could well face liability. It’s convoluted, but it's not impossible."

Mr Abrahams said he expected Facebook to make some concessions on live-streaming, a very complex issue, such as introducing some sort of delay and moderation to prevent horrific acts going up and spreading.

Holding Redlich partner Angela Flannery said the government first needs to understand the technological options Facebook, Google, Twitter and others have available to remove or stop content that is considered a problem.


"If platforms do not use the technology that is available to them to remove/stop content then it makes it much easier to justify taking action against those platforms," Ms Flannery said.

"Any argument that the platforms make that they can’t act quickly to detect and remove such content would need to be considered very carefully.

"If Google and Facebook can target ads in the way they say they can, ie, based on where you are at a given point in time and based on your known interests, then those platforms should have the answers for removing content in close to real time as well."

Ms Flannery said there are also complex issues of what content is considered a problem, beyond the obvious terrible events such as the Christchurch attack, and how that is covered by law.


One of the issues the government is grappling with is how to categorise social media companies; if they are considered publishers by law, that affects the regulation, she said.

"If a platform is a publisher, then it seems that ultimately jail terms might be appropriate – a publisher of extremist material should be subject to the threat of jail. However, as you know, digital platforms don’t consider themselves to be publishers.

"Platforms are really a hybrid – they do facilitate users of their platforms to publish material online and the platforms do have the power to remove material," Ms Flannery said.


"So there is an argument that, in the most egregious cases, for example, if a platform has been asked to remove material that falls within a prohibited category and does not do so even where it has the capacity to – that jail terms could be considered.

"But this really should be reserved for the most egregious cases."

'Civil penalties make a lot more sense'

Ms Flannery said the question of censorship is an area where care needs to be taken.

"There is obviously content that shouldn’t be available. However, where exactly is it that the line is drawn? At the moment, digital platforms exercise significant discretion as to what they allow and what they remove from their platforms and we need to have public debate and discussion on this if the Government is going to impose regulation in this area," she said.

"I don’t think we should have a law, for example, that requires the removal of content that is 'contrary to Australian values', given many Australians quite legitimately see this as encompassing very different things."

Mills Oakley partner Kathryn Edghill said the devil will be in the detail and regulation will have to account for many different factors.

"What’s not fast enough? What practically can be done? What do you do when you take it down and it pops up in another area? I think it would be difficult to enforce that unless there was some degree of wilful negligence," she said.

"Anybody who’s got appropriate systems in place, those systems are followed, policies are known to everybody … I think it would be very hard to have criminal liability for that. Civil liability might be a bit different.

"Civil penalties make a lot more sense. It would be feasible to tie in to the length of time it’s up there, or time it took an organisation to take it down, having regard to a whole myriad of factors."