A program that can mimic online flirtation and then extract personal information from its unsuspecting conversation partners is making the rounds in Russian chat forums, according to security software firm PC Tools.

The artificial intelligence of CyberLover’s automated chats is good enough that victims have a tough time distinguishing the “bot” from a real potential suitor, PC Tools said. The software can work quickly too, establishing up to 10 relationships in 30 minutes, PC Tools said.

That said, threat reports from PC security companies are notoriously hyped, so I wouldn’t get too excited until there’s stronger confirmation.

Meanwhile, speaking of chatbots and excitement, I have to wonder just what the chatbot is saying. Is it trying to steer the conversation by getting to the point quickly? (Sex and/or meetings.) Is it offering low-price prostitution services (that might actually make it easier to steer conversations in a businesslike, organized manner)? Is it reflecting back what its chat partner says (like ELIZA, but with more admiration)? Is it faking cyber-orgasms? The mind boggles with possibilities.

What might be really funny would be to point two copies of the bot at each other and watch them chat each other up …

EDIT: I just checked out the comment thread on the original blog post. One astute comment pointed out that there’s a big, legal market for virtual girlfriends (pornographic in the US, tamer in Japan). So if this technology were really all that great, why would it be used for crime at all?

FURTHER EDIT: Slashdot picked this up, so there are a lot of comments, some funny, some insightful. Particularly interesting was a claim that the Turing test is a very easy test to pass.

“A computer is a device that accepts user input, processes it, and returns output.” Basically, that’s what we want from our girlfriends/wives anyway. Well, we might not want the output before she graduates from girlfriend to wife, but you get the idea.

LOL on the output joke — and cleaner than where I thought you were headed with it.

CAM

Aaron Harper on
December 10th, 2007 1:55 pm

I think this may not indicate a successful passing of the Turing test by an AI, but rather a failure of a similar test by the “lonely Russian males”.

Curious on
December 10th, 2007 5:18 pm

How about some links on where to find or talk to this bot?

i want to meet this bot! on
December 12th, 2007 12:51 am

Where can I meet this bot? Assuming my computer is safe enough, it would be a better way to waste half an hour than messing around on YouTube. And sure, why not get talked dirty to? If there’s no person on the other end, there’s no guilt in the end.

[…] Monash, a leading analyst of and strategic advisor to the software industry, wrote in the Text Technologies blog that it might be fun to point two copies of the bot at each other and watch them chat each other […]

…
In Turing Test Two, two players A and B are again questioned by a human interrogator C. Before A gives his answer to a question (labeled aa), he is also required to guess how the other player B will answer the same question; this guess is labeled ab. Similarly, B gives her answer (labeled bb) and her guess of A’s answer, ba. The answers aa and ba are grouped together as group a, and bb and ab are grouped together as group b. The interrogator is first given the answers as two separate groups, with only the group labels (a and b) and without the individual labels (aa, ab, ba and bb). If C cannot correctly tell which of aa and ba is from player A and which is from player B, B gets a score of one. If C cannot tell which of bb and ab is from player B and which is from player A, A gets a score of one. All answers (with their individual labels) are then made available to all parties (A, B and C), and the game continues. At the end of the game, the player with the higher score is considered to have won the game and to be the more “intelligent”.
…
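The scoring rule quoted above can be pinned down in a few lines of code. This is only a sketch under my own reading of the excerpt: the function name and the boolean parameters (modeling whether interrogator C correctly identified each group) are hypothetical, not from the original description.

```python
def score_round(c_identifies_group_a: bool, c_identifies_group_b: bool):
    """Return the (a_score, b_score) increments for one round of Turing Test Two.

    Group a holds A's own answer (aa) plus B's guess of it (ba);
    group b holds B's own answer (bb) plus A's guess of it (ab).
    B scores a point when C fails on group a (B imitated A convincingly),
    and A scores a point when C fails on group b.
    """
    a_score = 0 if c_identifies_group_b else 1
    b_score = 0 if c_identifies_group_a else 1
    return a_score, b_score

# Example: C is fooled by both groups, so both players score.
print(score_round(False, False))  # (1, 1)
```

Summing these increments over all rounds gives the final tally that decides which player is judged the more “intelligent”.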