Obviously it's good to have safety measures around these things; even the most automated factories have emergency shut-down systems and humans supervising them.

But in regards to AI... we are decades, if not centuries, away from anything even remotely resembling a self-aware, "conscious" program. It may not even be a matter of simply lacking the raw computing power; it could be something else entirely.

I don't think you understand how quickly technology has evolved in the past decade alone. Ten years ago, if you told someone that in 2016 they'd have a casual tablet more powerful than anything they owned in 2005, they'd probably think you were crazy. I mean, my Fire HDX 7" is more powerful than the Pentium 4 Dell I owned back in 2006, and this tablet is only from 2013! Take one from today and you'd more than likely see something on par with a Core 2 chip, if not more powerful yet, and I'm speaking strictly of ARM CPUs here, not the ULV Intel stuff that's competing.

A.I. is something that'll likely happen within the next decade or two at MOST, assuming we don't nuke ourselves back into the Stone Age.

I may be wrong, and I hope I'm wrong, but it seems to me that creating a truly autonomous AI will be the same as creating a black hole.

We - humankind - are all very curious to see what one looks like, so our competitive nature says we won't stop until we get there, and then it's over, because it escapes our control.

The black hole is a non-issue. Anything we could feasibly create would evaporate in a few nanoseconds, if that, thanks to Hawking radiation. AI is also a non-issue, assuming you don't give the thing unrestricted access to the Internet or, like, robotic arms and machine guns.

As with most electronic devices, AI can be defeated quite easily by pulling the plug. If that somehow fails, we've always got nukes. The EMP would utterly destroy any modern electronic device (ironically, the older ones are more likely to survive, though the shockwave should destroy any vacuum tubes in use).


I know about that first part; as far as current state-of-the-art knowledge goes, humankind doesn't have the means to create a black hole posing any real danger. But I was more drawing a parallel with AI, and that one is pretty feasible. You're even more or less explicitly acknowledging that it's a much more realistic scenario when you qualify it with assumptions about things humankind won't do.