A report by Gizmodo reveals that Google is partnering with the United States military to develop techniques of what it calls Algorithmic Warfare, under the code name Project Maven. Google's role in the project is to provide the Pentagon with artificial intelligence modules that use machine learning to help military drones identify targets automatically, so that people don't need to.

Google has claimed that Project Maven is "for non-offensive uses only", but that assertion is contradicted by Marine Corps Colonel Drew Cukor's statement last year that the purpose of the project is to "increase the ability of weapon systems to detect objects".

Cukor ought to know what Project Maven is for: he leads the Algorithmic Warfare team at the Pentagon.

When weapons systems identify targets, it isn’t to deliver advertisements to them, or to send them coupons.

Besides, the people at Google are certainly intelligent enough to understand that once the military learns how to program its flying killer robots to identify targets automatically, the technology can easily be applied to all other forms of weapons systems.

Google's protestations of innocence are akin to a gun manufacturer's claim that nobody needs to worry about its products because they're designed for hunting deer and rabbits.

Google once promised that it would do no evil, but I can’t think of anything more evil than engineering robotic systems to automatically identify people to be killed.