Google has been slammed for 'fixing' its racist image recognition algorithm by simply removing the word 'gorilla' from its auto-tag tool.

The
software outraged many users back in 2015 after it tagged images of a
computer programmer and his friend as primates.

Now,
nearly three years later, it has been revealed the company has 'fixed'
the issue by blocking identification of gorillas, chimpanzees and
monkeys.

Twitter users have criticised the company for not working to develop a diverse model for its algorithm and instead simply banning identification of gorillas and black people.

In
2015 Jacky Alcine, from New York, spotted photographs of him and a
female friend had been labelled as gorillas by Google Photos image
recognition software. Google has admitted its image labelling
technology is 'nowhere near perfect' but instead of fixing it the
company has simply banned the term 'gorilla'

WHAT
HAPPENED?

Google
launched its standalone Photos app in May 2015, announcing a number
of features such as automatically creating collections of people and
objects like food or landscapes.

However
shortly after, Jacky Alcine, from Brooklyn, New York, spotted photos
of him and a female friend posing for the camera had been grouped
into a collection tagged 'gorilla'.

In
a series of Tweets to Google back in 2015, Mr Alcine said: 'Google
Photos, y'all f***** up. My friend's not a gorilla.

'The
only thing under this tag is my friend and I being tagged as a
gorilla.

'What
kind of sample image data you collected that would result in this
son?

'And
it's only photos I have with her it's doing this with.

'I
understand how this happens, the problem is more so on the why. This
is how you determine someone's target market.'


The
internet giant's Google Photos application uses an auto-tagging feature
to help organise images uploaded to the service and make searching
easier.

However
shortly after its launch, Jacky Alcine, from Brooklyn, New York, spotted
photos of him and a female friend posing for the camera had been grouped
into a collection tagged 'gorilla'.

The
incident caused outrage and Google said that it was 'appalled' and
'genuinely sorry' for the mistake.

At the end of last year, reporters from Wired tested Google Photos using 40,000 images - many of which contained animals.

It could identify many animals such as pandas and poodles. It could also identify baboons, gibbons, orangutans and marmosets.

However,
it could not identify gorillas, chimpanzees or monkeys.

A
spokesperson confirmed that 'gorilla' was censored after the 2015
incident and 'chimp,' 'chimpanzee,' and 'monkey' have since been blocked
too.

'Image
labelling technology is still early and unfortunately it's nowhere near
perfect,' the spokesperson said.
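Google has not said how the block works under the hood, but the behaviour Wired observed is consistent with a simple blocklist applied to the classifier's output labels rather than any change to the model itself. A minimal sketch of that approach - with all names and confidence scores invented for illustration, not Google's actual code:

```python
# Hypothetical sketch of suppressing sensitive labels after classification.
# The function names and scores are invented for illustration only.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_tags(predictions):
    """Drop any predicted tag whose label is on the blocklist."""
    return [(label, score) for label, score in predictions
            if label.lower() not in BLOCKED_LABELS]

# The model may still produce the label internally; it simply
# never reaches the user-facing search and auto-tag features.
raw_predictions = [("gorilla", 0.92), ("primate", 0.88), ("outdoors", 0.61)]
print(filter_tags(raw_predictions))  # [('primate', 0.88), ('outdoors', 0.61)]
```

A filter like this hides the symptom rather than fixing the underlying model's bias - which is precisely the criticism levelled at the company on Twitter.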


'How
about making the effort to develop a diverse (eg not all white) model
base for the algorithm?' tweeted Agustin Fuentes, a
US-based Professor of Anthropology.

'Google
has "fixed" its image recognition algorithm which misidentified black
people as gorillas. The algorithm no longer identifies gorillas, or
black people', tweeted John Overholt, a curator of Early Modern Books
& Manuscripts at Harvard University.



Mr Alcine's tweets triggered a response from Yonatan Zunger, chief architect of social at Google, who said programmers were working on a fix to the problem.

He
said: 'Thank you for telling us so quickly. Sheesh. High on my list of
bugs you *never* want to see happen. Shudder.'

Jacky
Alcine's tweet about the problem triggered a horrified response from
Google's chief architect of social Yonatan Zunger, who said
engineers were working on a variety of fixes to prevent similar issues
in the future

Mr
Zunger later said that Google had turned off the ability for photographs
to be grouped under that label to stop the problem.

However, he said the error could still occur in photographs where the image recognition software failed to detect a face at all, adding that a fix for that was being worked on.

He added: 'We're also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself, eg better recognition of dark skinned faces.'