Italian Court Autocompletes Google Case With “You Lose!”

Imagine the difficulties that Google must face compared to many other Internet companies. Along with the great success and truckloads of cash comes the reality that different laws in different lands are going to create many different headaches, especially in the legal realm. Italy has been particularly rough on Google, and now it has ruled against the search giant in a case involving the content of search results from Google’s autocomplete function.

Google has lost a case in Italy over the defamatory nature of autocomplete suggestions, according to a lawyer for the complainant.
On Tuesday, lead counsel Carlo Piana wrote on his blog that the Court of Milan has upheld its earlier decision to order Google to filter out libellous search suggestions. These are the suggestions that pop up in Google’s search input bar, proposing what the user might want to search for.

Not that I am upset for Google. This kind of headache is just part of playing the game at the scale at which they play it.

As for the actual ruling? It’s difficult to justify from a censorship and free-speech standpoint, but there is also plenty of room for discussion. As a searcher, I would want to be warned of any wrongdoing that the party I am researching has been up to. As a person in the Internet age, though, I am well aware of the ability of Internet-savvy people to smear someone’s name in the SERPs. There is also the trouble of guilt by association: if someone is searching for a person or business with a common name, they may see negative results that have nothing to do with the searcher’s actual target.

Unfortunately, the Internet is far from perfect and cases like this show how these imperfections play out in various countries and cultures. It’s a variable that marketers have to be aware of for their own efforts as well.

The article continued:

Google lost its bid to claim the protection of the E-Commerce Directive’s safe harbour provisions, which partly shield hosting providers and ISPs from liability for content held on or transmitted over their systems. The court instead viewed the autocomplete suggestions as being produced by Google itself.

“Google argued that it could not be held liable because it is a hosting provider, but we showed that this is content produced by them (and by the way, they do filter out certain content, including terms that are known to be used to distribute copyright-infringing material), although through automated means,” Piana wrote.

The lawyer said the suit is “by no means an endorsement to censorship”, as the allegations had been fully discussed with Google before the court action was even considered and only two phrases were put forward to be filtered out of autocomplete.
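As a rough sketch of what filtering a couple of specific phrases out of autocomplete could look like (purely illustrative; the phrase list and function names here are hypothetical, and this is not Google’s actual system), a suggestion service could simply drop any completion containing a blocked phrase:

```python
# Hypothetical sketch: dropping court-ordered phrases from a list of
# autocomplete suggestions. All names and data are illustrative only.

BLOCKED_PHRASES = {"acme fraud", "acme con"}  # stand-ins for the two phrases at issue

def filter_suggestions(suggestions):
    """Return only the suggestions that contain no blocked phrase (case-insensitive)."""
    return [
        s for s in suggestions
        if not any(phrase in s.lower() for phrase in BLOCKED_PHRASES)
    ]

suggestions = ["acme bank", "acme fraud", "acme careers", "acme con artist"]
print(filter_suggestions(suggestions))  # ['acme bank', 'acme careers']
```

The point is how narrow such an order can be: a small blocklist applied at the output stage, rather than any broad suppression of search results.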

Apparently, the unnamed ‘victim’ was in the world of finance, so I immediately got a much clearer picture of what might be happening here. Hmmmm, finances, claims of fraud and cons. Seems to fit. While that is terribly stereotyped, it helps to know that this happened in an industry where such claims are much more common and are very often made loudly online.

So how did Google respond to the verdict?

“We believe that Google should not be held liable for terms that appear in autocomplete as these are predicted by computer algorithms based on searches from previous users, not by Google itself,” the company said. “We are currently reviewing our options.”

I found that response pretty curious because it seems like Google is talking out of both sides of its servers. That statement acts as if an algorithm were some separate entity that does its own thing. Didn’t someone (as in a Google employee) program that algorithm to do what it does? This is the kind of creepy talk that Google needs to avoid at all costs, because the more they advance with technology and engineering, the less human they sound.

Go to the light. Enter the Goog. We’ll take care of everything for you (cue the evil laugh and maniacal rubbing together of hands).

J.G. Howard

Programming the algorithm to do what it does is absolutely human. There’s nothing creepy about making tools available that give us an expanded view of our world, and ourselves. I know I would certainly want to know if past users were searching for fraud activity of a particular company. It doesn’t mean I’m going to make a judgment based on search results, but it does mean I get different perspectives. Italy has a long history of surveillance, censorship, and media manipulation. That should be a bit more alarming than Google defending its right to present predicted search terms to its users.