Honestly, no. I think consciousness is a faculty of the soul, and I think the soul was placed by some higher power (God, if you please). Regardless of what technology we produce, I don't think we can get to the point where we can create a soul or consciousness. I do, however, think there is a point we could get to that is an exceptional simulation of consciousness.

Just to add to my previous post: for example, if any of you have ever tried those new 20 Questions games, those things are scary, and apparently thinking and reading your mind. I do not know how it does it, but it's pretty convincing. Just to note, it asks you 20 questions and then it tells you what you're thinking of. It guessed spider monkey, not just monkey, spider monkey! It's unreal.

I'm gonna have to say no to this one as well.
I don't know if man will ever be able to create a fully "thinking" computer or not. In order to do that I think we'd have to figure out a lot more about how we humans think in the first place. So I don't believe any computer will ever become conscious. Then again, I'm not sure how we're defining consciousness here.

As for a computer having a soul, nope. My personal opinion is that a soul is something implanted into each individual by God himself. If computers were going to have souls, God would have to give them souls, not man.

I think Searle's Chinese room thought experiment is a good enough argument to convince me that computers do not think:

Imagine a person in a room, with one slot on one side of the room and another slot on the other side. We'll call one slot "slot A" and the other "slot B." In the room with the person is a dictionary that has no English, but is a guide that tells the person what to write when they see a certain Chinese symbol. From slot A comes a Chinese symbol. The person takes the Chinese symbol, looks it up in the dictionary, writes the corresponding Chinese symbol, and puts it through slot B.

The information from slot A is the input, the person in the room looking up the symbols is the manipulation of the data, and the giving of the information to slot B is the output. This corresponds to computers 'thinking.' A computer takes in some sort of input, it then manipulates data with a computer program, and then sends some sort of output.
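That input/manipulation/output loop can be sketched as a plain lookup table. The symbols and the "rulebook" below are invented for illustration; the point is that the program maps shapes to shapes with no idea what either shape means:

```python
# Toy sketch of the Chinese room: a rulebook that is pure syntax.
# The "person" (this function) matches the shape of the input symbol
# and copies out the listed response, understanding neither one.

RULEBOOK = {
    "你好": "你好吗",   # happens to mean hello -> how are you
    "谢谢": "不客气",   # happens to mean thank you -> you're welcome
}

def chinese_room(symbol_from_slot_a):
    # Look up the symbol, write the corresponding symbol, push it out slot B.
    return RULEBOOK.get(symbol_from_slot_a, "")

print(chinese_room("你好"))  # prints 你好吗
```

Whatever a fluent speaker would read into the exchange, nothing in the table or the lookup has access to it.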

The question is whether the person manipulating the symbols knows what the symbols mean. Of course, unless they speak Chinese, they don't know.

Searle makes the distinction between knowing syntax and knowing semantics. He thinks that computers are programmed to know syntax, but they have no semantic knowledge; thus, they are like the person in the room in the Chinese room experiment. It's also prudent to point out that this doesn't rule out 'weak AI', that is, it doesn't rule out that computers do a sort of thinking (if you consider the manipulation of symbols as thinking). It does, however, rule out 'strong AI', that is, the idea that computers do the kind of thinking that humans do.

Bk2Kant wrote:Just to add to my previous post: for example, if any of you have ever tried those new 20 Questions games, those things are scary, and apparently thinking and reading your mind. I do not know how it does it, but it's pretty convincing. Just to note, it asks you 20 questions and then it tells you what you're thinking of. It guessed spider monkey, not just monkey, spider monkey! It's unreal.

I love those 20Q ball things. They're pretty nifty, I have to say. It's pretty sad that one of my proudest moments with a 20Q ball is when I stumped it while thinking of "rifle" and its first guess was "shotgun", ha ha. That's right, you question-asking ball, you can't step to my level!

p.s. It did guess rifle on its second guess though. So I only really defeated it temporarily.

philoreaderguy wrote:Do you think a man-made computer could ever become conscious? Can it have a soul? Why or why not?

Some people have honestly debated whether computers have souls, but I don't think this can be possible. A computer is a machine constructed from all sorts of hardware, and it is certainly an amazing invention, but I don't think it has the capacity to think and feel the way a human does. It is, after all, a product of technology, and therefore it cannot reason and feel like a human being does. That's not to say that computers can't accomplish great things. Like I said before, it is a marvelous invention, but it is an impersonal one. It spits out information, but does not care about how we respond to that information.

It's not what you know that makes you smart, it's knowing what you don't know.

Quantum computers are just as schizophrenic as our minds are. With some proper modeling, and by "raising" them to recognize commonality in anomalies that occur by comparison to past stored memory data, there may be a possibility that these types of computers could take on a human-style consciousness, one that believes it thinks independently and can know what it doesn't really know.
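The "compare new input to past stored memory" part of this is at least concrete, whatever one makes of the quantum angle (which isn't modeled here). A minimal classical sketch, assuming a simple z-score test against remembered values:

```python
# Loose sketch of recognizing an anomaly by comparison to past
# stored memory data, using a z-score. The numbers are invented.

import statistics

def is_anomaly(memory, new_value, threshold=3.0):
    """Flag new_value if it sits far outside the stored past data."""
    mean = statistics.mean(memory)
    stdev = statistics.stdev(memory)
    if stdev == 0:
        return new_value != mean
    return abs(new_value - mean) / stdev > threshold

past = [10.1, 9.9, 10.0, 10.2, 9.8]
print(is_anomaly(past, 10.1))  # False: fits past memory
print(is_anomaly(past, 42.0))  # True: an anomaly worth noticing
```

Recognizing commonality *among* anomalies, as the post suggests, would be a further step: clustering the flagged values rather than just flagging them.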

selfless wrote:Quantum computers are just as schizophrenic as our minds are. With some proper modeling, and by "raising" them to recognize commonality in anomalies that occur by comparison to past stored memory data, there may be a possibility that these types of computers could take on a human-style consciousness, one that believes it thinks independently and can know what it doesn't really know.

It would take many years of studying to make this happen, though. Even then, I don't think it could rival the rational thoughts and feelings that a human possesses.


I think common sense interferes with this question, throwing garbage concepts into the middle. Three things: tool, made by man, and material.

TOOL

Common sense says a tool can't develop consciousness, since tools are meant to be manipulated by an external consciousness. Tools are consciousness-dead. We even look at soldiers as if they don't have a consciousness. Tools do what they are manipulated to do, period.

So let's ask the question again without thinking of the computer as a tool. What if computers weren't meant to be used?

MADE BY MAN

Common sense says no again, since we make cakes and they don't develop consciousness and start running around. We do make kids, but we don't manually cook them into self-aware beings. That just happens.

So we should ask the question again, forgetting that we make computers by hand. What if computers just happened?

MATERIAL

Common sense says no again, since computers are made of plastic and metal. The fact is, we don't know any form of consciousness built from these materials.

So we should look at the question forgetting about materials. What if computers weren't made of metal and plastic, but of flesh, bones, and blood?

CLOSURE

The more human-centric definitions we produce, the less we understand the universe. If you think you have to be HUMAN to develop consciousness, then animals don't have consciousness, and then nothing but humans can have one, computers included, right? End of the stupid questions, right?

But if you ask seriously, a lot of border questions appear: Is an insect aware of itself? Is a body cell aware of itself? Is a plant aware of itself? What is consciousness? If a cell is aware of itself, where is that awareness located? If it's not, how can it be part of our consciousness? Do we need a soul to have consciousness?

...

Can anything become self-aware?

We do, for certain. I also think animals do, and in some ways I think everything in the universe has some self-awareness.

Computers? Can computers become self-aware?

Somewhere in the middle of "I don't know" and "I'm just watching," I can say yes, as long as they are not tools, they are not made-by-man in the way we know, and they are not dead. Without letting these concepts go, it's like asking if forks and guitars can become aware of themselves.

I do think we will be able to factory-build babies some day, playing with Lego DNA modules. I think the question of whether these babies have consciousness will be gone as soon as the first assembled kid says "hi dad" to someone.

Well, in my humble opinion, I do not foresee computers or machinery gaining their own consciousness. Technology is ultimately created and deliberately programmed by us, mankind, so there are limitations in computers, of course. They will never be volatile or unpredictable like human nature. They will just react according to their programmed pathways.

If you are talking about computers taking over us humans or gaining power over us, I guess you could say that we are overly reliant on computers, to the point of being unable to function without them. That is highly possible.

Conscious or not, we are probably not far from a point where machines acquire such compellingly accurate impressions of consciousness that we will find ourselves assuming it to be the case, whether it is truly consciousness or otherwise.