Ever searched your name on Google? This man did, and discovered that everyone had begun to think he was a crime lord controlling an Australian gang. The truth is, he was innocent...


Plugging your name into a Google search bar just to see what comes up is something that we all do every once in a while. There’s a certain buzz that comes with seeing your name up in those bold blue lights, even when it inevitably leads to that self-pitying feeling you get when you realise that you have five namesakes who are all more famous than you are. For Milorad Trkulja however, an Aussie entertainment promoter, this simple act has been hitting a slightly sinister note for many years.


He was shot in 2004 by a balaclava-clad man in a crime that was never solved. But this didn’t stop the internet from having its own theories. A selection of websites decided that the shooting was an attempted professional hit, and Mr Trkulja was, in fact, a major crime figure.

This information had no basis: the police report explicitly rejected the very theory these sites were presenting as truth. But that did not matter. Once published online, the claims began appearing in searches for Milorad Trkulja’s name. Through Google Images, he was now connected to crime giant Tony Mokbel, and the search engine also linked him to the now-defunct site ‘Melbourne Crime’, which monitored the city’s gang-related incidents.


As you can imagine, for Mr Trkulja, also known as Michael, these were extremely disconcerting times. According to the man himself, members of the public started deliberately avoiding him, and a couple even refused to sit next to him at a wedding because of what had been published.

In 2009, his lawyers contacted the internet giants over the content, requesting that Google take action. They didn’t. This lit a fire in Michael’s belly. Fed up with his predicament, he brought a landmark court case against the company, which would eventually end earlier this month with him 200,000 Australian dollars (equivalent of $208,000 or £130,000) richer, as Google were ordered to pay out for publishing defamatory material.

Google’s defence during the case was that they did not publish the material at all. The search results were automated, said the company, claiming it was the innocent disseminator of the information. This was generally accepted by the jury, but the fact that the firm had been notified by Mr Trkulja’s lawyers about the harmful material previously, and done nothing, was enough for the court to decide that they should cough up.

Mr Trkulja said he felt ‘vindicated’ by the result, after he also won a similar case against Yahoo earlier in the year over the same shooting incident. “I've lived in Australia 41 years”, he said. “This case is not about the money, it's about protecting my family, my children and my reputation.”

Now I know what you’re thinking: this kind of money is less than pocket change for Google. And indeed, you’re right. You can be fairly confident that Larry Page and his pals won’t have any trouble making rent this month. This case, however, does set a dangerous precedent for the company, as it opens the doors to similar cases being brought against them, with the possibility of a lot more money, as well as a reputation built up over many years, going down the drain.

The question to ask, I suppose, is whether the principle behind this verdict is sound. Should Google be responsible for the websites it includes in its index? Both possible answers seem sensible at first glance. On the one hand, it feels like very basic business responsibility, as a search engine, to have some sort of handle on the kind of websites that you include in your search results. To do otherwise would be, in some sense, to shirk a duty you have to the people who use your service.

Conversely, the amount of time taken to trawl through every murky corner of the net and censor out sites that are, in some way, undesirable is unimaginable, and practically speaking it is probably an impossible task. In addition, variety, however unsavoury it may taste, is the spice of the online world: I’m sure most people would agree that removing sites that distribute false information from search engines would harm the online experience. It inevitably leads, people would say, to a slippery slope on which censorship, along with gravity, is the master.

I suppose the response to this view is that the slope is not all that slippery: you can draw a clear, unmoving line at personal defamation (as has happened in this case) and leave it at that. But then the issue of time still remains. You can solve this by saying that search engines must only respond to cases of which they are transparently made aware. This will leave you in a similar position to, and with the same conclusion as, the court who decided this case. Once notified of defamatory material appearing in their search results, it is reasonable to suggest that Google acted irresponsibly in taking a hands-off approach, and so the punishment handed to them was just.

Search engines have come under increasing pressure recently to exercise more quality control. Google was this year ordered to disable part of its auto-complete function after it linked a Japanese man to crimes he did not commit, and the firm also faces legal action from Bettina Wulff, the wife of a former German President, over claims that the same function suggested words such as “prostitute” and “red light district” alongside her name in the Google search bar.

While it’s not clear what exactly the future implications of this case will be, what’s certain is that search engines are no longer simply the bearers of information that they once were. The message from the legal system is clear: the ‘don’t shoot the messenger’ defence will no longer do, and companies like Google will have to keep a firm grip on their indexes or pay a hefty price.