Mr Zunger said Google had already taken steps to prevent other users from experiencing a similar mistake.

Image caption: Mr Alcine said the error had affected several photos in his collection

He added that Google was “also working on longer-term fixes around both linguistics – words to be careful about in photos of people – and image recognition itself – eg better recognition of dark-skinned faces”.
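Google has not published how either fix works, but the short-term “linguistics” measure can be pictured as a blocklist applied to a classifier’s output before labels are shown to the user. The sketch below is a loose illustration in Python: the blocklist contents, the safe_labels function and the (label, confidence) output format are all assumptions, not Google’s actual code.

```python
# Illustrative sketch only, not Google's implementation: suppressing
# "words to be careful about" when a photo is known to contain people.

SENSITIVE_LABELS = {"gorilla", "chimpanzee", "monkey"}  # hypothetical list


def safe_labels(predictions, contains_person):
    """Filter sensitive labels out of classifier output for photos of people.

    predictions: list of (label, confidence) pairs from an image classifier.
    contains_person: boolean from a separate person detector (assumed).
    """
    if not contains_person:
        return predictions
    return [(label, score) for label, score in predictions
            if label.lower() not in SENSITIVE_LABELS]


# Example: a photo of a person that the classifier has also scored
# highly for a sensitive label.
print(safe_labels([("person", 0.61), ("gorilla", 0.57)], contains_person=True))
# -> [('person', 0.61)]
```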

This is not the first time Google Photos has mislabelled one species as another.

Users are able to remove badly identified photo classifications within the app; that feedback should help the service improve its accuracy over time, a process known as machine learning.
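The article does not say how those removals feed back into the system, but the usual machine-learning pattern is to treat each removal as a corrected training example and periodically retrain on the accumulated feedback. The following is a minimal sketch under that assumption, using a scikit-learn classifier and made-up feature vectors; record_removal and retrain_on_feedback are hypothetical names, not part of Google Photos.

```python
# Minimal sketch of a correction-driven learning loop; the model,
# feature vectors and function names are assumptions for illustration.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")  # stand-in for a real image model
corrections = []  # (feature_vector, corrected_label) pairs from users


def record_removal(features, corrected_label):
    """A user removed a bad classification; keep the corrected example."""
    corrections.append((features, corrected_label))


def retrain_on_feedback():
    """Refit the model on all corrections gathered so far."""
    labels = {label for _, label in corrections}
    if len(labels) < 2:
        return  # the classifier needs at least two distinct classes
    X = np.array([features for features, _ in corrections])
    y = np.array([label for _, label in corrections])
    model.fit(X, y)


# Example: two corrections with made-up 4-dimensional features.
record_removal(np.array([0.1, 0.9, 0.3, 0.2]), "person")
record_removal(np.array([0.8, 0.1, 0.5, 0.7]), "dog")
retrain_on_feedback()
```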

Image caption: Google has faced criticism since the error was made public

However, Google has acknowledged the sensitivity of the latest mistake.

“We’re appalled and genuinely sorry that this happened,” a spokeswoman told the BBC.

“We are taking immediate action to prevent this type of result from appearing.

“There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

But Mr Alcine told the BBC that he still had concerns.

“I do have a few questions, like what kind of images and people were used in their initial priming that led to results like these,” he said.

“[Google has] mentioned a more intensified search into getting person of colour candidates through the door, but only time will tell if that’ll happen and help correct the image Silicon Valley companies have with intersectional diversity – the act of unifying multiple fronts of disadvantaged people so that their voices are heard and not muted.”
