On its website, CSER said: "Many scientists are concerned that developments in human technology may soon pose new, extinction-level risks to our species as a whole. Such dangers have been suggested from progress in AI, from developments in biotechnology and artificial life, from nanotechnology, and from possible extreme effects of anthropogenic climate change.

"The seriousness of these risks is difficult to assess, but that in itself seems a cause for concern, given how much is at stake."

CSER was co-founded by the eminent scientist Sir Martin Rees, former Astronomer Royal and former Master of Trinity College, who wrote in the Guardian that "new hazards are emerging that could be so catastrophic that even a tiny probability is disquieting".


"[We] begin with the conviction that these issues require a great deal more scientific investigation than they presently receive. Our aim is to establish within the University of Cambridge a multidisciplinary research centre dedicated to the study and mitigation of risks of this kind.

"We are convinced that there is nowhere on the planet better suited to house such a centre. Our goal is to steer a small fraction of Cambridge's great intellectual resources, and of the reputation built on its past and present scientific pre-eminence, to the task of ensuring that our own species has a long-term future. (In the process, we hope to make it a little more certain that we humans will be around to celebrate the University's own millennium, now less than two centuries hence.)"

Such machines do not yet exist, and robots used on the battlefield still require human intervention before using lethal force. However, HRW warned that militaries are closer than ever to deploying automatic killing machines, and argued that such weapons should be pre-emptively banned.