And that kinda leads to a thread I posted about a month ago about the "reconstructing and reviving" of the dead and whether they'd be the same person or even truly considered "alive."

If there is no soul, are we not just organic machines? And if so, does that really make us drastically different from intelligent forms made of metal and plastic? This then leads to the question of what "life" truly is.

I believe they should be given rights, but not the same rights we have (and that's not to say I think they deserve less, if such an AI existed). An AI will not have the same limitations or issues humans have, so placing human rights on an AI would simply not suit the form of life that has been created. It's a totally unique form of life, and it needs its own unique set of rights.

It would naturally share many of the rights we currently have, but some would have to be created to suit an AI's life more specifically, or else they're left in gray areas where things don't quite fit.

No. They're not human. Even if they can be programmed to simulate emotions and human tendencies, that still does not make them human. And I have a feeling that the people who think they should would change their minds if the AI were presented in an ugly metal slab of a "body".

The OP isn't talking about simulation; he is talking about true, self-aware, emotion-based AI. Basically, everything we are, just created instead of born.

IF we ever develop an AI that exists in a mechanical body and is on par with humans in terms of sentience, we cannot treat them as lesser beings. If we don't ever want to give mechanical beings the same rights we have, we will do well not to create life in any way but the usual way.

Personally, I would welcome the idea of an AI. And to the people heralding the robot apocalypse: watch those movies again and look at the robots' motives. It's people like you who subjugated them in the first place, even after knowingly creating an AI capable of causing said apocalypse.

If humans ever create an AI, and treat it right, there's no reason for them to try to destroy organics.

If they don't have rights, neither should we, since we are also sophisticated molecular machines that developed sentience. What really makes us different from machines when you get down to the scale of molecules?

If in the future we have to make sentient AIs, certain laws would be made regarding the situation. It might be that you're just never allowed to make things human-like enough that they would need the same set of rights as humans. There's no *poof*, here's a robot that thinks and feels like a human and we need to give it rights ASAP. It's a slow process, and society will work out how to go about it.

I find it funny that people think so highly of our brain (it is a masterpiece, yes, but not something that can't be recreated or surpassed). People like to call it 'being human' or having a 'soul'. You do know that self-awareness, emotions, consciousness and all those things are the results of electrical impulses and signals in our brains, yes? In the future, when computers have more calculating power, it may be possible to surpass the capabilities of an entire brain (given knowledge of the entire brain structure and that kind of stuff). If your brain were placed into a robot, with all the neural connections attached to the whole body, you would still be yourself, just in a different body. Of course, this is still sci-fi, because we do not have the technology yet.

Our brain is nothing more than an organic supercomputer. Once computers are capable enough, and once we know all the connections that are present, we could actually simulate our brains.

Regarding the OP's question, I would only answer yes once the AI reaches the same level as our own brain, because then we are basically equally intelligent and self-aware, and it therefore deserves to be treated equally. Once you have AI at such a high level, destroying one would be the same as shooting a human, because the emotional impact and everything else would be the same.

When an AI becomes fully self-aware, the only differences between it and a human are their anatomy and creation process, no? Take the androids in Blade Runner, for example. They were created by humans and are as such completely artificial, including their minds, but they are fully sentient, with their own thoughts, emotions and desires. What reason could there possibly be for them not having the same rights as humans, if everything else about them is the same?