Jacob1207

It is not necessarily a robot, though presumably the intelligence could be downloaded into or could control a mechanical device like that. As for whether or not it should (or could) be considered a living being is part of the question. The artificial intelligence that I have in mind, however, is not biological or organic but computer-based.

Feel free to qualify your answers in any way ("If X then yes, if Y then no..." etc).

Jacob1207

Yes, VIKI (a supercomputer, for those who haven't seen the movie) would probably fit the description. But I don't want, at this point, at least, to get tied to a specific example. I'm just thinking of a generic computer-based artificial intelligence that meets these four criteria:

(1) can think for itself;
(2) has its own will;
(3) has its own emotions; and
(4) can distinguish right from wrong

Could such an entity be considered a person? Should it be given any rights? Could it be given rights, would that even make sense? Or is being organic a requirement for the possession of certain rights, or of any? If the AI that I described shouldn't be given any or many rights, would another type of AI with different characteristics qualify for them?

In terms of having a will, you would probably need that, but what would the rights entail?

Could it be given rights, would that even make sense? See above.

Or is being organic a requirement for the possession of certain rights, or of any?

There are rights for humans and for most animals, so organic, yes. But if the AI has the ability to reproduce itself, that may change; rights would need some control mechanism. Rights are not only freedoms but also restrictions.

If the AI that I described shouldn't be given any or many rights, would another type of AI with different characteristics qualify for them?

Not quite sure what you're getting at here; are we moving on to organic?

meefsgirl

Suppose, for the sake of argument, that it is possible to create an artificial intelligence that:

(1) can think for itself;
(2) has its own will;
(3) has its own emotions; and
(4) can distinguish right from wrong

What rights, if any, should such an intelligence receive? Could or should it be considered a person? And, of course, why?

-- Jacob

I don't know about rights, but should it be considered a person? No, because it's not God-breathed. An artificial intelligence is not a person, no matter how good it gets, as it was made by man. A person is a human being and has been God-breathed to give it life.


lovetruth

So, if we use DNA to make a person (which I am not cool with, btw), it's a person, because we used the system of creation that God already put in place. It's like His breath is still running through that. Good thought.


Jacob1207

So, if we use DNA to make a person (which I am not cool with, btw), it's a person, because we used the system of creation that God already put in place. It's like His breath is still running through that. Good thought.

So, you're saying it comes down to being organic? God cares about us because our brains use electro-chemical impulses to convey information and thoughts but he wouldn't care about an intelligence that used silicon-based circuits to do so?

It seems to me that it is the brain which is the most important part here. After all, we don't consider an amputee to be less of a person for having lost part of her body. And if someone sustained Darth Vader like injuries and subsequently became a cyborg, would he then be less of a person and would God love him less, or not at all?

The argument that the AI couldn't be a person because "God didn't make it, humans did," seems to have other implications. Suppose a person with severe cancer recovers after being treated by doctors using modern techniques and medicine. If he then said "God didn't cure me, the doctors, scientists, and drug-makers did," would you agree with him? It seems to me that if you say "God didn't make the AI, humans did," you would also have to say "God didn't cure Bill of cancer, the scientists did."

Anyway, the sorts of rights that I have in mind include, but are not limited to, the rights to:

(1) continued existence (a person couldn't just "pull the plug" on it);
(2) compensation for work (whatever it is that an AI would want); and
(3) self-determination.

These are all basic rights that all people are entitled to. There are, of course, restrictions. Children, for instance, have limited rights of self-determination given their diminished capacity for thoughtful long-term planning and for understanding the relevant consequences of actions; parents generally exercise those rights in trust for their kids. Animals also have some rights; you can't simply torture a dog, even if you own it, but, as property, the wages from an animal's work, if any, accrue to the animal's owner, not the beast itself. And, of course, one person's rights generally end once they begin infringing on those of another.

But the lesser rights of animals are not determined based on their DNA. My cat can't own property because she can't conceive of what that means. If a being is incapable of responsibly exercising a right, that right can be withheld from that being. But it seems to me that the AI that I have described would be able to responsibly exercise a large variety of rights, possibly all of those that humans have. Is the fact that the entity is computer-based and not organics-based really a proper distinction to discriminate on? Note, of course, that rights have historically been denied to people based on gender and race, differences that we now consider it improper to discriminate upon.

martincisneros

Suppose, for the sake of argument, that it is possible to create an artificial intelligence... What rights, if any, should such an intelligence receive? Could or should it be considered a person? And, of course, why?

-- Jacob

It would be in the same category as an animal. The Bible says that whatever Adam called the animals, that's what they were from then on. AI would be a question of animal rights rather than of being considered in the image of God. It's not possible to have the image of God without God, as Lucifer found out the hard way. I wouldn't ever go so far as to say that AI should have voting rights, rights to procreation with a human being, etc. We might look upon such a "creature" as we do now where gorillas are concerned. It might occasionally weird us out with similarities, but it's an animal, plain and simple. But rights against abuse, exploitation, harm, etc., are a genuine possibility if AI ever became that sentient. We'd reach that day when it would know what it was, without that being suggested in its programming or by being told by another person, the same as any of the rest of us knowing from our earliest days about our most basic, fundamental rights.

Jacob1207

If something like this existed, I think the right thing to do would be to treat it with love, kindness, and respect. I'm sure that's what our heavenly Father would want.

That's the position that seems to make the most sense to me as well.

I've also been asking this question of many of my friends and associates and I've appreciated the answers I've gotten. Most people initially balk at the prospect; after further thought, some would give the described entity full rights, others limited rights, and some no rights. I wonder now if the response would be different (and I suspect it would be) if the original question were rephrased as follows:

Under what circumstances would it be proper to take away the rights of an entity that (1) can think for itself; (2) has its own will; (3) has its own emotions; and (4) can distinguish right from wrong?

I doubt anyone would answer "when that entity is an artificial intelligence." (In answer to the reformulated question, of course rights can properly be taken away from someone through due process of law. We, for instance, greatly proscribe the autonomy and freedom of people convicted of murder. Whether any other circumstances justify taking away a person's rights is outside the scope of this thread, unless being computer-based is such a circumstance.)

Anyway, I see no reason to think that God wouldn't be as interested in such an entity as He is in us. Nor do I see why it wouldn't have the same capacity that we do to reciprocate--or fail to reciprocate--God's love. In this, I am influenced by a contemporary Anglican theologian, Keith Ward, who, in a recent book, God, Faith and the New Millennium (1998), writes:

I can think of no reason why artificially constructed personal beings should not exist. If they do, they will have as much reason to hope for immortality, and for knowledge and love of God, as naturally reproduced organic beings have. There will be a Last Judgment for computerised intelligences as well as for humans, and they will take their place in the resurrection world with whatever other sorts of finite creatures there may be.

He says a few other things on the topic as well. I'm not as sanguine as he is about the possibility of such a being, since we presently don't have even the slightest idea why we are conscious. But I certainly don't see an impossibility about it. I think, for now, it might be best to be agnostic about the question, which will probably not be answered in our lifetimes--though I must admit a personal hope that such a being will eventually be possible.