Really, if we ever create a true strong AI with emotions etc., we will HAVE to give them the same rights. People would rebel if they didn't have such rights, and that's happened plenty of times already. Such an AI would do the same. The problem is, an AI could connect to the networks we use and screw us up. They wouldn't be some stupid rebels with guns like in the movies; they'd simply take over every kind of electronics to enslave us. Also, being AIs, they'd probably be able to keep advancing themselves on their OWN. They wouldn't be dumb enough to wish to eradicate humans or something. I think they'd enslave us and teach us how to live. Anyway, I'd rather be governed by a real AI than by corrupt politicians always arguing over money. Think about it: an AI would have emotions but would be advanced enough to understand that the systems we have now are bad. They could improve our lives in a lot of ways.

But, to be honest, if humans were able to create actual artificial intelligence, with the ability to imitate feelings and individual personalities, then we would have no option and would have to give them rights, maybe not "human" rights, but "sentient" rights.

On the other hand, almost EVERY book, story, plot or piece of art written about us creating advanced AIs ends in war, destruction or slavery. I doubt humans would take that risk.

Sadly, you live in a world where not every human has "human rights", and where many people openly desire to see said rights revoked from others for racial or religious reasons. So what chance does a highly logical computer with the ability to feel have, in a world where people get murdered daily for not believing in fairies?

Seriously though, no, AI beings, no matter their intellect level, should never have human rights, because if they did, we wouldn't be able to control them, chaos would break out on earth, and the zombie robot apocalypse would end us all!

This is actually a good point. If we were to develop an artificial intelligence worthy of rights, those rights would no longer be reserved for humans; they'd be reserved for anyone who is our equal in intellect.

I would say that the entire Mass Effect series is about organics vs. synthetics. The Quarian/Geth conflict, though, illustrates well what happens when machines develop sapience but are still treated like slaves by their creators. IIRC, the entire conflict starts when a Geth asks a Quarian, "Does this unit have a soul?".

Yep, and in response the Quarians attempted to do as has been suggested by others in this thread: destroy them. That worked out well for the Quarians, didn't it.

9 out of 10 people agree that in a room full of 10 people one person will always disagree with the other 9.

So I got to thinking: if AI ever got to a level where it could think, feel, perceive, and learn as humans do, if AI were capable of true sentience, would they then deserve human rights? Like, if I could build C-3PO, would he be right in demanding that he not be treated like a second-class citizen, that he get the right to vote or the right to due process? Should these rights be denied to mechanical beings simply because they are mechanical?

I'd say no, because tools don't deserve rights. It's like asking if a hammer or a gun deserves rights. The worst thing a human can do is give a tool the ability to think.

No, because they could never FULLY prove that they are self-aware and sentient, and not just programmed to seem that way. Say we had a robot president: what if someone were simply behind the scenes programming his actions, then did crazy shit once they had power? There are tons of good reasons why they should never be given the same rights as humans.

In fact, I could go further and say we should make it illegal to create a sentient AI.

Well, humans would have built those machines in the first place. Also, many humans have accomplished great things.

However, it's hard to say how great we are when we have nothing to compare ourselves to.

You are asking me why we should not be actively killing humans who are useless? In most countries, killing another human being is illegal.

edit: Also, any government killing its own citizens, or allowing that to happen, would be condemned.

These machines would have been created by us to be used as tools for us; their being self-aware would be a potential threat.

We're not talking about legality. He's asking why being an AI with real sentience should get you fewer rights than a human would have. You replied that the answer is that humans are humans. He asked you why SIMPLY being human would make you untouchable, and you replied that the reason is that it's illegal... That's irrelevant.

I don't think a robot should get them; after all, it's us who made the robot. It's not born in a natural way. Besides, what's the use of a robot, even a self-aware one, if people can reprogram it?

There are a lot of things humans do that aren't natural. Should we not give rights to humans born via artificial insemination because they weren't created in the usual fashion? And what's the use of humans if people can reprogram them?

If true sentient AI is ever developed (I doubt it), you would have to give them rights equal to humans'. Otherwise they will feel mistreated and eventually start a revolution against humanity. You cannot treat sentient beings like shit; they will oppose you sooner or later. Limited "AI", like machinery that doesn't understand deeper concepts, is another story, though.