A lot of people hear the term “Killer Robots” and think of the Terminator movies. This is, however, fairly inaccurate.

Killer robots are more akin to remote-control aircraft or the drones of the present, which have clearly already been weaponized and used to kill people. The controller is a human being. The word “robot” originates from a term for serf labor, or, since there are no serfs today, slaves. Robots are slaves to the controller.

AI Human Killers would be orders of magnitude worse. This is the Terminator. It makes decisions autonomously and has a consciousness that makes the determination: kill this person or not.

This has been explored in many science fiction stories. Star Trek: Voyager had an episode featuring intelligent AI bombs that were still active and ready to kill long after the civilization that created them had perished. Isaac Asimov explored this concept with the Three Laws of Robotics.

The first law of the Three Laws of Robotics is:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

He headed off the problem not only by declining to set the precedent that robots and AI could kill people; he made it an explicit, permanent feature of their programming that they must not cause harm to humans.

He chose wisely. Will we choose wisely? Or will the dystopian future of the Terminator series be the rule?

Based on the world’s leaders and the symbolism of their actions, I find it hard to believe that they will resist creating AI Human Killer (HK) devices to further their own political objectives.

There is some hope, however: we have created rules against the use of biological weapons, chemical weapons, and nuclear weapons in space. Not everyone participates in these restrictions. Some governments have used chemical weapons against their own people.

The problem with the precedent set by banning biological weapons, chemical weapons, and nuclear weapons in space is that all of these weapons are indiscriminate in nature. AI Human Killers would be discriminating weapons. Even with today’s facial recognition technology, it would be easy to write code that targets people with certain facial features. Imagine dumping a platoon of AI HK units into a city to cleanse it of whatever racial phenotype you don’t like.

There is a major question, however, in determining the point at which a mechanism stops being a Killer Robot and becomes an AI HK.

Finally, if there were a battle between Killer Robots and AI HKs, as implied by the article title, there would be no question of who would win: the AI HKs. The reason is reason. A killer robot could be as simple as a machine gun attached to a motion sensor that shoots whatever moves. An AI HK would figure out how many machine guns attached to motion sensors are needed, and in what configuration, to secure a building.