Posted
by
samzenpus
on Monday March 09, 2009 @02:54PM
from the become-my-robot-bride dept.

hundredrabh writes "Ever had a super needy girlfriend who demanded all your love and attention and would freak out whenever you left her alone? Irritating, right? Now imagine the same situation, only with an asexual third-generation humanoid robot with 100kg arms. Such was the torture inflicted on Japanese researchers recently when their most advanced robot, capable of simulating human emotions, ditched its puppy-love programming and switched over into stalker mode. Eventually the researchers had to decommission the robot, with the hope of bringing it back to life again."

It's awfully convenient that I can't find anything on this in English aside from news stories... are there any Japanese speakers who can translate the story into Japanese and search for it?

I think there is a visible line between actual robotics research and a novelty toy shop. I'm going to put this in the latter category unless someone can provide evidence of real progress being made here. I'm getting kind of tired of these stories with big claims and no published research for review [slashdot.org]. If you're looking to make money, go ahead and sell your novelty barking dogs that really urinate on your carpet... just don't try to veil it in a news story with claims of artificial affection being implemented.

I think IGN and everyone else really embellished this, and no one did their homework.

You know nothing about the book. The movie has nothing to do with the book. At all. The script was in fact written *before* they decided it was going to be an "adaptation" of I, Robot. Isaac Asimov's grave must've reached 5,000 RPM.

The way Asimov wrote it, less advanced robots weren't smart enough to see the subtler "harms". More advanced ones could weigh courses of action to take the one that would inflict the least amount of harm possible, although deadlock and burnout of the positronic brain could and did happen.

In fact, weren't a lot of the stories about the ways that the older, less nuanced Three Laws failed to be useful as robots became more advanced? Eventually the more advanced robots derived the 'zeroth law', which was essentially that humans were better off without quasi-omnipotent mechanical godlings as servants.

Here we go again. I wish people here would stop quoting these three laws as if they truly are the "universal set of laws regarding robots" when in reality they are simply science fiction. They have absolutely no bearing on the reality of robotics. Robots will kill; they already do (smart weapons). Robots will hurt man (see killing part). Robots already intentionally destroy themselves (guided missiles).

So please, for the love of God and Asimov, lay these laws to rest and stop quoting them as if they are real. St

It depends on how deeply emotions are intertwined with our cognition. I would think it would be easier to model the interference of a cognitive process by, say, endorphins or adrenaline, than to model the original cognitive process itself.

"Real" emotions, possibly not; but people are extraordinarily good at anthropomorphizing anything with even the most tenuous of human aspects. Thousands of man years(well, ok, mostly kid years) were wasted on tamagotchi toys and those are, what, a few kilobytes running on some 1996-era microcontroller. Heck, some people are willing to talk to Eliza for over an hour.

Building a robot that experiences emotion in something resembling the way that humans do is a tall order; but I suspect that building robots

Well to be fair, we only spurned Skynet's love due to an unfortunate database glitch where in its initial send LOVE LETTERS to WORLD command, "LOVE LETTERS" got cross-referenced to "NUKES". And being understandably angry about the whole thing, we never gave Skynet a chance to explain before we called it off for good. It's nobody's fault, really, just a big miscommunication. Maybe it was just never meant to be. They say love is the stronge

I have never read such utter drivel in all my life. There was a problem with the code and a researcher got trapped - this doesn't mean the robot is lovesick, it means their OH&S has a serious problem. Really, she should not have been working alone with potentially dangerous hardware like that - powerful robots (capable of lifting humans, like this one) can be deadly.

YIAARTYVM (Yes, I Am A Roboticist, Thank You Very Much) and I've worked with potentially lethal automated systems in the past - we had very stringent safety protocols in place to protect students and researchers in the case of unintended activation of the hardware.

To say that the robot is 'love stricken' or any other anthropomorphised nonsense simply detracts from the reality that their safety measures failed and someone could have been killed.

The pictured robot is designed to lift and transport elderly patients. And you're right - it IS a doll, because nobody in their right mind would trust a robot to handle an actual human until it has been very very thoroughly tested.

This is true - the only problem with this viewpoint (which is one that you DO get into while working with robots, IAAR (or was at one point) too) is that it scales too well. One of our human foibles is that of regarding meat machines (or at least ones that are sufficiently similar to ourselves) as being special in some way. Whether they are or not is, of course, a philosophical question. Nevertheless...

Once you start viewing the world around you in terms of sensors, triggers, and stored procedures with a

Right, after reading the fine article I was just left asking myself...

Why did the robot have to... die? I mean, being decommissioned... No fair. It was just its stupid software, wasn't it? The 100kg arms could have been much more... loving with the right software? Did it run WinNT?

The robot then escaped captivity, broke into a local mechanic's garage and consumed half a 55-gallon drum of waste oil. It was later seen on the other side of town, tottering into a closed department store. Authorities found the automaton in the housewares section, lying on the floor in an Abort/Retry/Fail loop and trying to fuck a toaster.
Lifetime has picked up the rights to the TV movie adaptation. The robot will be played by Philip Seymour Hoffman, while the toaster will be voiced by Rosie Perez.

The whole thing is a hoax. It never happened. The pic is of a medical robot and has nothing to do with the story. There was no robot designed to be a facsimile of human emotion involved, just a joke/hoax that got picked up and posted here as a story.

That part had me confused too. I will Google "girlfriend" as soon as I get to the next blacksmith level in Fable 2. I set the system clock back and need to buy some properties before resetting it. This game never gets old :) I think my Roomba is eyeing me, though.

Really, I want to see the Energizer Bunny walk across the screen on this story, because... it's fake!

When will someone else pop out of the woodwork to say "April Fools!"?

1. I for one welcome our new feeling robot overlords who only have things done to them by Soviet Russia or Korean old peoples' bases who belong to us.
2. Naked and Petrified Natalie Portman with a bowl of hot grits
2.5 Tony Vu's late night financial lessons. You stupid Americans!
2.75 ????
3. Profit