
Shanshan Zhou had a longtime childhood fantasy: she used to pretend her otherwise static belongings were alive, and she dreamt they would suddenly begin to play with her. So when it came time to do a project for her Physical Computing class at Victoria University of Wellington, she took the opportunity to turn an inanimate object into “living art.” Zhou gave character to an object that, despite its lack of human features, could now connect with people.

Her joint project with an industrial design student, Adam Ben-Dror, and a conceptual artist, Joss Dogget, resulted in a playful moving desk lamp the team calls "Pinokio." When turned on, the lamp is programmed to look around for human faces using a camera embedded where the bulb would normally be. Once it finds a face, the lamp can follow it back and forth; if the face hides behind hands or a notebook, the lamp will continue to look around curiously, trying to peek over the top to see the person again. If the lamp's gaze wanders, a couple of handclaps will draw its attention.
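The article doesn't describe how the handclap detection works, but a common approach is to watch the microphone's amplitude for two loud peaks spaced a plausible interval apart. Here is a minimal guess at that logic in plain Java (the language underlying Processing, which the team used); the class, thresholds, and timings are all invented for illustration.

```java
// Hypothetical double-clap detector: a clap is a loud amplitude peak,
// and two peaks within a short window count as a "clap clap" signal.
// All names and numbers here are assumptions, not the project's code.
public class ClapDetector {
    static final double THRESHOLD = 0.6;   // loudness a clap must exceed (0..1)
    static final int MIN_GAP_MS = 100;     // peaks closer than this are one clap
    static final int MAX_GAP_MS = 500;     // peaks further apart than this don't pair

    long lastClapMs = -1_000_000;          // timestamp of the previous clap

    // Feed one audio amplitude sample; returns true when a double clap is heard.
    boolean feed(double amplitude, long timeMs) {
        if (amplitude < THRESHOLD) return false;
        long gap = timeMs - lastClapMs;
        lastClapMs = timeMs;
        return gap >= MIN_GAP_MS && gap <= MAX_GAP_MS;
    }

    public static void main(String[] args) {
        ClapDetector detector = new ClapDetector();
        System.out.println(detector.feed(0.9, 0));     // first clap: no trigger yet
        System.out.println(detector.feed(0.2, 150));   // quiet sample: ignored
        System.out.println(detector.feed(0.9, 300));   // second clap in window: trigger
    }
}
```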

Video: Pinokio | Lamp

The closest cultural reference for Pinokio is probably the only famous anthropomorphic desk lamp in history: Luxo Jr., from the eponymous Pixar short released in 1986 (it lives on in that company’s title card). While Pinokio can’t hop or play with a ball, the way it glances around and cranes its neck does show similarities to Luxo Jr. and its parent lamp, Luxo.

While the Pinokio team acknowledges the similarities to Pixar’s animated lamps, they didn’t notice them until the project was nearly complete. Ben-Dror notes they could have chosen a different object to bring to life, “but the anglepoise lamp by George Carwardine just lends itself so nicely to being animated.” To create the parts for Pinokio, Ben-Dror used four digital manufacturing processes: laser cutting, CNC lathing, 3D printing, and CNC water-jet cutting.

A sketch of the algorithm Zhou used to program the lamp.

“I think Pinokio down in its soul was a naughty baby or dog,” Zhou says when I ask about the lamp's inspiration. In the demonstration video, the narrative text notes that Pinokio is “naive,” easily fooled during a game of peek-a-boo. Dogget notes that because of the way Pinokio moves and responds to human faces and interactions, the experience “compares so greatly to interacting with a real personality, interacting with a real animal, rather than a semi-intelligent toy.”

Zhou tells Ars that Pinokio's most animalistic motion is when the lamp is looking around, searching for a human face with which to connect. Six servos built into the lamp are used to move it, and the code that dictates its movement is procedural rather than prescriptive. With each glance, the lamp chooses a new random direction to look in. Zhou used the open source language Processing to program the lamp and the hardware prototyping system Arduino to communicate between the code and the servos. To make the lamp track human faces with its camera, the team used the OpenCV library for Processing.
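The article says the glancing behavior is procedural: each glance is a newly chosen random direction rather than a replayed animation. A minimal sketch of that idea in plain Java (the language underlying Processing) might ease the servo angles toward a random target and pick a new one on arrival; the class name, angle limits, and easing factor below are assumptions, not the team's code.

```java
import java.util.Random;

// Hypothetical sketch of Pinokio's procedural "glance" behavior:
// pick a random direction, ease toward it, then pick another.
// Names, limits, and the easing factor are invented for illustration.
public class GlanceSketch {
    static final double PAN_MIN = -90, PAN_MAX = 90;   // degrees
    static final double TILT_MIN = -30, TILT_MAX = 60;
    static final double EASING = 0.1;   // fraction of remaining distance per step

    double pan = 0, tilt = 0;           // current servo angles
    double targetPan, targetTilt;       // where the lamp will "look" next
    final Random rng = new Random();

    void pickNewGlance() {
        targetPan  = PAN_MIN  + rng.nextDouble() * (PAN_MAX - PAN_MIN);
        targetTilt = TILT_MIN + rng.nextDouble() * (TILT_MAX - TILT_MIN);
    }

    // One update step: ease toward the target; choose a new one on arrival.
    void update() {
        pan  += (targetPan  - pan)  * EASING;
        tilt += (targetTilt - tilt) * EASING;
        if (Math.abs(targetPan - pan) < 1 && Math.abs(targetTilt - tilt) < 1) {
            pickNewGlance();
        }
    }

    public static void main(String[] args) {
        GlanceSketch lamp = new GlanceSketch();
        lamp.pickNewGlance();
        for (int i = 0; i < 200; i++) {
            lamp.update();
            // In the real lamp, pan/tilt would be sent over serial to the
            // Arduino driving the servos at this point.
        }
        System.out.println("pan within limits: "
                + (lamp.pan >= PAN_MIN && lamp.pan <= PAN_MAX));
    }
}
```

Because each step moves only a fraction of the remaining distance, the motion decelerates as it arrives, which reads as an organic, animal-like glance rather than a mechanical sweep.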

One function in the code, flickSwitch, performs a behavior shown in the video: if a person attempts to turn the lamp off, the lamp will bend itself down and flip its own power switch back to the "on" position. The lamp also has “introvert” and “extrovert” moods, so when it’s feeling social, it will stretch to its full length, getting close to the human in front of it as if to try to see what the person is doing. (In introvert mood, the lamp keeps to itself accordingly.) The team notes on its website that it initially intended to use a notebook (like the one in the video above) to trigger the lamp’s curious extroverted mode, but the servo controlling the full extension of the lamp’s body was broken and the code used to control the reaction to the book was not efficient enough.
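The two behaviors above can be pictured as a tiny state machine: a switch handler that undoes the user's "off," and a mood that sets how far the arm eases out. This plain-Java sketch is a guess at the structure only; the method names, mood values, and numbers are invented, not the actual flickSwitch code.

```java
// Hypothetical sketch of two behaviors described in the article.
// All names and numbers are invented for illustration.
public class MoodyLamp {
    enum Mood { INTROVERT, EXTROVERT }

    Mood mood = Mood.INTROVERT;
    double extension = 0.2;       // 0 = folded up, 1 = stretched to full length
    boolean powerSwitch = true;

    // flickSwitch stand-in: if someone flips the lamp off, it bends down
    // and flips its own switch back to "on".
    void onSwitchFlipped(boolean on) {
        powerSwitch = on;
        if (!powerSwitch) {
            // (real lamp: run the bend-down servo animation here)
            powerSwitch = true;
        }
    }

    // Each frame, ease the arm toward the extension its current mood calls for.
    void update() {
        double target = (mood == Mood.EXTROVERT) ? 1.0 : 0.2;
        extension += (target - extension) * 0.1;
    }

    public static void main(String[] args) {
        MoodyLamp lamp = new MoodyLamp();
        lamp.onSwitchFlipped(false);          // user tries to turn it off...
        System.out.println("still on: " + lamp.powerSwitch);

        lamp.mood = Mood.EXTROVERT;           // feeling social: stretch out
        for (int i = 0; i < 100; i++) lamp.update();
        System.out.println("stretched: " + (lamp.extension > 0.9));
    }
}
```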

Ben-Dror notes that the lamp could have serious applications; for instance, it could “[follow] your hands around on the page, to shine light wherever you are working.” But the broader aim of Pinokio isn’t just to create something that moves but something that emotionally engages people. As Zhou put it on the project website, “I do believe with future robots or human-machine-interaction, we should look into our natural interaction with something that is alive, such as animals and children.”


Casey Johnston
Casey Johnston is the former Culture Editor at Ars Technica and now does the occasional freelance story. She graduated from Columbia University with a degree in Applied Physics. Twitter: @caseyjohnston