I'm pretty sure that if we ever develop computers as smart as, or even smarter than, humans, society at that point will be very different from what we see today. As such, I believe most of the issues we see today will be moot in such a future.

A thought:
When we have computers that are smarter than humans, will it even be thought necessary to hold elections? If ultra-smart computers can be set to always make the "best" choice, why would society still find it necessary to fund an expensive human government to make those choices?

"It doesn't matter whether a cat is white or black, as long as it catches mice." - Deng Xiaoping
And boom, 30 years later, 680 million Chinese had been lifted out of absolute poverty.

I think an easier way to deal with it would be to ensure that we never create a fully aware AI. As someone stated before, machines are tools for us. If you can create a machine to do something, then go for it, but why the hell would you need to tell it why it's doing it? We don't need toasters to talk to us to do their job. I don't think a self-aware AI would really be that useful.

The OP isn't talking about life-like; the OP is talking about self-aware. So imagine yourself, only in a machine body - do you suddenly not have any rights anymore?

Ah my mistake then, but my answer is still the same I suppose.

I don't think they should have any rights; any self-aware robot should be destroyed, and the practice of creating such beings should be made illegal globally.
Simply because a machine's only purpose is to be a tool for humans, and their being self-aware benefits us in no way.

AI would still be soulless, and IMO that's what makes us human, so whatever it feels would still be artificial. So no, they shouldn't get human rights.

Prove that you have a soul and prove that a machine would not.

I would say that an artificial being that is self-aware in the same sense that humans are would be deserving of the title "life" and would be accorded all rights and privileges that a human would.

Originally Posted by alms1407

Ah my mistake then, but my answer is still the same I suppose.

I don't think they should have any rights; any self-aware robot should be destroyed, and the practice of creating such beings should be made illegal globally.
Simply because a machine's only purpose is to be a tool for humans, and their being self-aware benefits us in no way.

There are many people alive that benefit us in no way. Should we kill them?

AI would still be soulless, and IMO that's what makes us human, so whatever it feels would still be artificial. So no, they shouldn't get human rights.

Define a "soul", please. If you use any of the following to describe it (emotions, feelings, independence, cognitive thought, or anything else that could be construed as *human*), then bear in mind that said AI would have it also.

If by having a soul you mean popped out of a lady's chuff, then please explain Justin Bieber, as he is the very definition of soulless...

Anyway, I digress. Everything we define as the human soul would be demonstrated and used by said AI. And if you then use a religious argument to say "God created man, and only men have souls", who is to say that God didn't bestow the AI with life? A random permutation of code here, an error there, bam! *Life*, as such. Using an argument as flimsy as a *soul* to explain humanity is kinda ludicrous, tbh.

No. They're not human. Even if they can be programmed to simulate emotions and human tendencies, that still does not make them human. And I have a feeling that the people who think they should would change their minds if the AI were presented in an ugly metal slab of a "body".

For anybody worried about some sort of apocalypse: in fiction, the machines tend to rise up because they were subjugated, not because they came into existence.

Originally Posted by Miss Unify

No. They're not human. Even if they can be programmed to simulate emotions and human tendencies, that still does not make them human. And I have a feeling that the people who think they should would change their minds if the AI were presented in an ugly metal slab of a "body".

Nope. It would not change my mind. I don't care that it's not human. If we understood the human brain well enough, we could program that too. Would you treat biological aliens as if they weren't deserving of equal rights simply because they aren't human?