Extent of Google's renewed battle against online child abuse revealed

Google has tweaked its algorithm to limit search
results for suspected child sexual abuse queries, implemented
Microsoft's digital fingerprint technology to tag illegal images
and is developing a similar system to detect and delete film
footage showing illegal activities. These measures in the
technology giant's renewed battle against online child sexual abuse
were announced by Google executive chairman Eric Schmidt today in
the Daily Mail, hours ahead of an "internet safety" summit
taking place at Downing Street.

"Internet companies like Google and Microsoft have
been working with law enforcement for years to stop
paedophiles sharing illegal pictures on the web," Schmidt wrote in the article. "We actively remove child sexual abuse
imagery from our services and immediately report abuse to the
authorities… But as David Cameron said in a speech this summer,
there's always more that can be done."


Schmidt
goes on to explain that in the last three months 200 Google
employees were put to work to use "state-of-the-art technology to
tackle the problem". He is directly answering the calls for action
that were so publicly featured in the Daily Mail itself
over the past year.

The newly modified algorithm (which Google constantly
tweaks anyway) means 100,000 child sexual abuse queries come up
short -- instead warnings will appear from Google and charities at
the top of search results for more than 13,000 of these queries,
directing those individuals to places they can get help. For now
this focusses on English search results globally, but will expand
to cover 158 languages in the next six months.

Google has also implemented Microsoft's digital
fingerprint system -- every time an employee identifies an image as
illegal, it is given a unique marker that enables the algorithm to
hunt for, tag and remove similar images across the web. Just
days before the Downing Street summit, Microsoft had also announced
it was opening a Cybercrime Centre in Washington state which would
tackle, among other things, "technology-facilitated child
exploitation" with its PhotoDNA software. A kind of PhotoDNA for
video is also under development at Google, with YouTube engineers
now testing the technology which will be made available to other
internet companies next year.
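Microsoft has never published PhotoDNA's exact algorithm, and the code below makes no claim to reproduce it. It is only a minimal sketch of the general idea behind image fingerprinting, using a much simpler technique known as a "difference hash": an image is reduced to a compact bit string, and two images whose bit strings differ in only a few positions are treated as likely near-duplicates, even after resizing or recompression. All names and the toy pixel data are illustrative.

```python
# Illustrative sketch only -- NOT PhotoDNA, whose algorithm is proprietary.
# A "difference hash" records, for each pixel in a tiny grayscale grid,
# whether it is brighter than its right-hand neighbour. The resulting bit
# string is a compact fingerprint that survives small changes to the image.

def dhash(pixels, width, height):
    """Compute a difference-hash fingerprint from a flat list of grayscale values."""
    bits = []
    for row in range(height):
        for col in range(width - 1):
            left = pixels[row * width + col]
            right = pixels[row * width + col + 1]
            bits.append(1 if left > right else 0)
    # Pack the bits into a single integer fingerprint.
    return sum(bit << i for i, bit in enumerate(bits))

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Toy 4x4 "images": the second is the first with slight brightness noise,
# as might result from recompression.
original = [10, 200, 30, 180, 90, 40, 220, 15, 60, 130, 70, 25, 240, 5, 100, 160]
altered  = [12, 198, 33, 177, 88, 42, 221, 14, 61, 128, 72, 24, 238, 7, 101, 158]

fp1 = dhash(original, 4, 4)
fp2 = dhash(altered, 4, 4)
print(hamming_distance(fp1, fp2))  # -> 0: identical fingerprints, a match
```

In a real system the fingerprints of known illegal images would be stored in a database, and every newly encountered image hashed and compared against it; a distance below some threshold flags a probable match without anyone needing to view the image itself.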

Google has, for its part, publicly conceded to the
pressure that has been squarely directed at it from the start of
the campaign. It was Google that was so often cited by politicians as a leading player that needed
to take responsibility for its part in the illegal online child
abuse trade. And that campaign ensured search engines soon became
targets rather than mere references: if they refused to take
action, they would be aligned with the perpetrators of those crimes.

The political campaign was largely prompted by two
terrible crimes perpetrated against children that came to light in
the past year. During the trials of Mark Bridger, found guilty of
killing five-year-old April Jones, and Stuart Hazell, who murdered
12-year-old Tia Sharp, it was revealed both had searched for
disturbing images of child sexual abuse and rape online in the days
prior to their crimes. This content is of course illegal, and
companies like Google had already been working with the Internet
Watch Foundation for years to take down and report any identified
instances to authorities.

However, in light of the Bridger and Hazell cases,
the very act of looking at illegal material before committing a
crime became central to the debate. Rather than being treated
separately as illegal material supplied by criminal networks of
child abusers that needed to be targeted, the material came to be
identified as a cause of the horrific crimes perpetrated by Bridger
and Hazell. Hence, internet search giants became the number one
players whose actions would now be held to the highest standards:
how they chose to move forward would show the public whether they
were complicit with or against these crimes.

John Carr, secretary of the Children's Charities'
Coalition on Internet Safety, told
BBC Radio 5 live earlier this year: "There is enough
evidence to suggest that if we can put more barriers towards guys
getting to child abuse images, fewer of them will do it and more
children will be safe". Bold statements such as these compounded
the idea that the act of seeing the material instigates the crime,
and that anyone enabling that perusal is, by default,
complicit.

"After viewing child pornography and a rape scene on
TV at home, Bridger decided to abduct April, kickstarting the
biggest missing manhunt in police history," writes the Daily
Mail today. It suggests that it was only after viewing the
material that Bridger decided to abduct April Jones. Of course,
it's impossible to know when and for how long the former abattoir
worker had been thinking about it. But the comment shows how
ingrained this assumption of cause and effect has become -- a fact
that means internet search engines cannot escape the debate.

The Daily Mail also referred to Google's
measures as a "stunning U-turn". They are not.

Google has been working closely with the IWF for
years and Microsoft has been developing its PhotoDNA software for
years. Google can take down illegal material and help look for it,
something it has been doing for years, but it is now visibly
stepping up that approach. What Google had been against was the
blanket filtering of all pornographic material, something that now
looks likely to go ahead through ISP-level filters.

Google chairman Schmidt had in the past warned of the
"slippery slope" that could come of using technology to implement a
blanket blockade, and he urged that precise legislation be put in
place before any action was taken. This would be designed to
prevent inadvertent blocks of legitimate material. However,
today Schmidt revealed "these changes have cleaned up the
results for over 100,000 queries that might be related to the
sexual abuse of kids". The inclusion of the word "might" is
telling. Google has chosen what parameters to use, and it's
possible legitimate queries could be included. Those parameters are
not public knowledge. Prime Minister David Cameron had, however,
already passed on a list of "unambiguous" terms drawn up by the
Child Exploitation and Online Protection Centre (CEOP) to internet
search companies, challenging them to block those terms.

Earlier this year Wired.co.uk spoke
with Sonia Livingstone, professor of social psychology in the
Department of Media and Communications at the London School of
Economics, about the proposed internet curbs. She commented: "I am particularly
concerned that we leave such measures to industry, which deals with
pornography according to its proprietary and largely unaccountable
processes that manage 'customer care' and 'terms and conditions'.
This gives no right of redress, no transparency in what is blocked
or why, and no analysis of overblocking."

This sentiment was today shared
by Joe McNamee, executive director
at European Digital Rights. "The UK
government has focussed on getting private companies to take
unproven technologies to address problems that have never been
subject to any credible amount of independent analysis," McNamee
told Wired.co.uk. "Is it responsible and defensible to replace real
action against crime with superficial actions against the symptoms
of the crime? Is it responsible and defensible to adopt any policy
on such an important issue with so little proper analysis? Is it
reasonable and defensible to use unproven -- and possibly
counterproductive -- technologies to do this?

"No, it is not. But if the press can be relied on to
run the story about Google and not the story about the police cuts,
can we expect a different outcome?"

McNamee is referring to the cuts at senior levels
made to the CEOP, reported in PC Pro this September. The article referred to the
cuts in the context of very public -- and apparently inaccurate --
comments made by David Cameron and Claire Perry that CEOP staffing
levels had been increased by 50 percent in the aftermath of the
online child sexual abuse campaign. The September article revealed
that in fact the most experienced officers were being axed and
funds redirected to boost staff numbers at more junior
levels.

It is not a question of one measure or the other,
though, as McNamee implies. The government has not passed on
responsibility to internet search companies and given up on "real
action" as implemented by the likes of the CEOP. A government
statement asserts: "Britain and the US have teamed up to target
child abuse online with a new UK-US taskforce, set up between the US
Assistant Attorney General and the UK government, that will
identify cross-Atlantic targeting of criminals who think they are
hidden from the law, including those operating on the 'dark web'.
Ex-Google and Facebook chief Joanna Shields will lead an industry
group of technical experts to explore what more can be done." It
also points to a £25m, three-year parent awareness campaign being
funded by ISPs.

"The UK government has been
misdirecting public attention away from efforts to take real action
against the real crimes via a public relations campaign to get
money from private companies," adds McNamee, referring
directly to Google's £1m donation to the IWF in June. Google already
donated money to the IWF every year, however, along with companies
including Facebook, BT and Telefonica. Those amounts are usually in
the range of £20,000 a year, and since Google's £1m donation is to
be spread over four years it's not all that different.

New to the table, however, is Google's plan to send
its computer engineers to the IWF in the UK and to the US National
Centre for Missing and Exploited Children to help in tracking
material. It will also fund engineer internships at the IWF and
NCMEC, and will work with Microsoft, the IWF and the National Crime
Agency to tackle peer-to-peer networks used for illegal activities.
Tackling those networks strikes at the real core of the crime, and
could potentially do far more to prevent the production and
proliferation of illegal material.

"We will now work with the National Crime Agency and
others to monitor the effectiveness of the new technology
introduced by Google and Microsoft," Cameron told the Daily
Mail, so there is a degree of accountability and analysis
going into the implementation of these measures, though the details
remain unclear.

In a statement ahead of the summit Cameron said legislation forcing
internet search giants to step up is not out of the picture yet:
"If the search engines are unable to deliver on their commitment to
prevent child abuse material being returned from search terms used
by paedophiles, I will bring forward legislation that will ensure
it happens. There are some terms that are so shocking and
unambiguous that I believe they should return nothing at all. It's
not an infringement of free speech, it's responsible business
practice."

"With the progress that has been made in four
months, I believe we are heading in the right direction but no-one
should be in doubt that there is a red line: if more isn't done to
stop illegal content or pathways being found when someone uses a
child abuse search term, we will do what is necessary to protect
our children."