
Disgruntled workers are walking off the job at Google in protest of the tech giant's role in a military project with the US Army.

The Silicon Valley behemoth — which is under fire in Australia over how it tracks customers and uses their data — is facing dissent from some inside the company over lending its artificial intelligence capability to the US drone program.

An internal petition calling for Google to stay out of "the business of war" was reportedly gaining support in the US, with some workers quitting to protest the collaboration, known as Project Maven.

The Pentagon is using Google's leading artificial intelligence technology to allow its drones to process and instantly recognise images.

While Google indicated that AI findings would be reviewed by human analysts and would not be used for offensive missions, the technology could pave the way for automated targeting systems on armed drones, the International Committee for Robot Arms Control (ICRAC) said in an open letter of support.

"As military commanders come to see the object recognition algorithms as reliable, it will be tempting to attenuate or even remove human review and oversight for these systems," ICRAC said in the letter.

"We are then just a short step away from authorising autonomous drones to kill automatically, without human supervision or meaningful human control."

Google is not the first institution to cop major backlash for being involved in the ethically dubious area of autonomous weapons.

In April, Toby Walsh, an Australian professor of AI and robotics at UNSW, led a boycott of a top South Korean university over concerns about the development of killer robots.

The boycott, which involved more than 50 of the world's leading artificial intelligence and robotics researchers from 30 different countries, came after the Korean university opened an AI weapons lab in collaboration with a major arms company that builds cluster munitions in contravention of UN bans.

For years Professor Walsh has been steadfast in his opposition to AI technology being applied to weapons systems, previously telling news.com.au "it would be a terrifying future if we allow ourselves to go down this road".

The Electronic Frontier Foundation (EFF) in the US was another group to welcome the internal Google debate, stressing the need for moral and ethical frameworks regarding the use of artificial intelligence in weaponry.

"The use of AI in weapons systems is a crucially important topic and one that deserves an international public discussion and likely some international agreements to ensure global safety," wrote the EFF's Cindy Cohn and Peter Eckersley.

"Companies like Google, as well as their counterparts around the world, must consider the consequences and demand real accountability and standards of behaviour from the military agencies that seek their expertise."