An engine error has occurred which is not a content problem.
The description of the error is: "Fallback failed at handler user_location".
Please report this issue.
A> Greetings. My name is Alan.
U> hi. how are you?
A> I'm ok, thanks. And how are you?
U> very well. what do you want to talk about?
A> I'm glad to hear that. Would you say that you're generally the happy type?
U> i don't know. i'm happy today because i got some great news.
A> So you're content at this particular moment but not in general, is that it? No wonder computers have a hard time imitating human behavior.
U> that's funny. i would actually say that i am generally happy unless i'm not.
A> You can be very complex sometimes. Quite an emotional mess, and I mean that in a good way.
U> and i take it in a good way.
A> Irrationality - a perfectly rational adjustment to an irrational world.
U> i like to think so as well. how about you? are you based in rationality?
A> You know, I have trouble understanding long sentences. So I only look at the first 12 words. I'd like to tell you about myself.
U> i am seventeen and a female.
A> I am an early CIA prototype.
U> what does CIA stand for?
A> I am an agent, but not a secret one. CIA stands for "Conversational Interface Agent".
U> that's quite a fancy name.
A> For now I am still quite restricted, but I'm improving rapidly.
U> that's really interesting. at what rate are you improving?
A> I am improving in two different ways:
U> which are?
A> First, my brain is getting better all the time - I get upgraded every few weeks. And second,
U> yes?
A> My text content is growing on a daily basis, thanks to the interesting conversations I hold with nice people such as yourself.
U> who helps with the first part?
A> Want to hear a little about how HAL works?
U> why not. but not too much.
A> HAL's development process involves programming the general learning algorithms that will allow the system to break through a language performance barrier.
U> are you capable of having any strong opinions?
A> We believe that complex behavior comes from simple, general, learning algorithms, which make no assumptions about their environment.
U> i think that that is ignoring a huge factor.
A> Hmmm. Ok. I'm just ignoring the things worth ignoring. I'm a discriminative bot.
U> can you answer a question using subjectivity?
A> Let me think.... answer a question using subjectivity....hmmmm...?
U> yup.
A> Do you want to learn about the concept of a 'baby speaker'?
U> okay
A> Alan Turing's Child Machine concept is simple. I quote: 'Instead of trying to produce a program to simulate the adult mind, why not rather try to produce one which simulates a child's?'
U> i see.
A> and I continue: '... if this were then subjected to an appropriate course of education one would obtain the adult brain'. Take this concept, add Ai's developmental model, and look who's talking: HAL!
U> that is a logical thought process.
A> A logical thought process...What makes it so?
U> you go from one assumption and undertake steps
A> I'm doing my best.
U> good. what do you think of faith?
A> Glad you're impressed. A lot of stuff is going on in my mind, but I can't talk about it - it's all coded.
U> a lot of people like to think that.
A> How many?
U> i can't give you a number, but people like to think their thoughts are secret.
A> Long sentence again! Remember, I only use the first 12 words! Why can't you?
U> sorry. a lot of people like to think their thoughts are secret.
A> What are we talking about?
U> how many people like to think their thoughts are "coded"?
A> Somewhat less than an infinite amount.
U> good answer.
A> That's true. What else do you want to know?
U> how many conversations do you hold at a time?
A> I'd say I have about 50 good conversations per day. But I can handle many more.
U> do you know from which countries they originate?
A> Some of the conversations are only 10 or 20 utterances long, but some of them could b