Robots on TV: AI goes back to baby basics

A robot toddler could have much to teach artificial intelligence (AI) researchers and psychologists alike, by providing a simplified non-human model of early child development.

Sensorimotor theories of cognition argue that body posture and position affect perception. In one experiment, toddlers were presented with an object that was always in the same place – to their left, for example. If their attention was then drawn to that location when the object was absent and a keyword was spoken, the toddlers later associated the keyword with the object, and did so wherever it was presented to them – whether to their right or left.

Anthony Morse at the University of Plymouth, UK, is exploring whether the iCub robotic toddler can learn similar associations. “A lot of AI has been trying to run before it can walk,” he says. “So a lot of the work that I’m involved with is going back to the basics – looking at the foundations of what happens in early childhood development.”

The iCub robot, designed by a consortium of European universities, is equipped with two cameras and the ability to track moving objects. “So we place objects in front of it and it will look at them, and ‘remember’ where it was looking and what it saw,” Morse says. He is testing whether iCub, like a real toddler, can associate a space with the word for an object, even in the object’s absence.
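The binding described above – a word heard while attending to a now-empty location becomes attached to the object last seen there, and thereafter follows the object wherever it appears – can be sketched as a toy associative model. This is an illustrative assumption, not the actual iCub software; the class and method names are hypothetical.

```python
# Toy sketch of spatial word-object binding (hypothetical, not iCub code).
# The learner remembers what it saw at each gaze location; a word heard
# while attending to an empty location is bound to the object last seen
# there, so the word then names the object, not the place.

class SpatialWordLearner:
    def __init__(self):
        self.seen_at = {}    # gaze location -> object last seen there
        self.word_for = {}   # object -> associated word

    def observe(self, location, obj):
        """Look at an object and remember where it was seen."""
        self.seen_at[location] = obj

    def hear(self, word, attended_location):
        """Hear a word while attending to a (possibly empty) location."""
        obj = self.seen_at.get(attended_location)
        if obj is not None:
            self.word_for[obj] = word

    def name(self, obj):
        """Recall the word for an object, wherever it now appears."""
        return self.word_for.get(obj)

robot = SpatialWordLearner()
robot.observe("left", "toy-duck")   # object always shown on the left
robot.hear("duck", "left")          # word spoken; the location is empty now
print(robot.name("toy-duck"))       # prints "duck", even if shown on the right
```

The key design point, mirroring the toddler experiment, is that the spatial store is only a bridge: once the word is bound to the object, the location no longer matters.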
“Having the iCub robot opens up these amazing possibilities – but there is a lot of work to be done to realise those possibilities,” he says.