Controlling Creatures with Linux

How embedded Linux satisfies the various real-time needs of The Jim Henson Company's animatronics and 3-D computer graphic puppets.

The Jim Henson Company is well known for
creating characters. Low-tech characters like the Muppets don't
need much technology, but animatronics, from gerbils to dinosaurs,
do, as do our 3-D computer graphic puppets.
Performing live, in real time, so they can interact with human
actors and be captured on film, these characters have a curious set
of needs from a technology perspective.

One of Jim Henson's original performance goals was that one
person should be in command of each character, bringing a
spontaneity and personality harder to achieve in a “performance by
committee” (where several people perform a puppet together). The
fascinating thing about a creature that achieves this goal is that
people forget who or what is controlling it and simply interact
with it. Actors and audience alike start conversing with a dog or a
frog or a snowman as if it were human.

With the proliferation of servo motor technology in
animatronic puppets in the early 1980s, managing increasing numbers
of servos became a challenge, so computerized control systems were
designed. During the last 15 years, several generations of control
systems have been developed at the Jim Henson Creature Shop,
including a version that won a Technical Achievement Academy Award
in 1992. The latest Henson Performance Control System (HPCS)
encompasses the best features of previous systems while adding new
technology available only in today's hardware and computing
environments, such as Linux.

Schematic Diagram of PCS/HDPS
System

This system was begun under the guidance of
Computer/Electronics Supervisor Jeff Forbes in early 1998. We had a
vision that one system on a standard architecture could service all
the company's needs. Steve Rosenbluth joined the project at that
point as the control system designer, and Michael Babcock as the
multimedia programmer. Our needs turned out to be rather expansive,
and Linux seemed to be the only thing that could do it all without
flinching.

The system has to support two back ends: one animatronic and
the other computer graphic. So our puppets are either real-world
robots or virtual puppets made of polygons and pixels. We can
handle either separately or both at the same time.
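
A rough sketch of what that fan-out might look like in C++ follows.
The class names are invented for illustration; this is not the
actual system code, just one plausible shape for dispatching a
frame of resolved actuator values to either back end, or both:

    // Hypothetical sketch of the dual back-end fan-out, not Henson
    // source code: one frame of resolved actuator values is handed
    // to whichever back ends are active.
    #include <vector>

    struct ActuatorFrame {
        std::vector<float> positions;  // one position per actuator
    };

    class BackEnd {
    public:
        virtual ~BackEnd() {}
        virtual void send(const ActuatorFrame &frame) = 0;
    };

    class AnimatronicBackEnd : public BackEnd {
    public:
        void send(const ActuatorFrame &frame) {
            // would write servo setpoints to the puppet's motor
            // controllers
        }
    };

    class ComputerGraphicBackEnd : public BackEnd {
    public:
        void send(const ActuatorFrame &frame) {
            // would drive "virtual servos" (mesh deformations) in
            // the renderer
        }
    };

    int main() {
        AnimatronicBackEnd robot;
        ComputerGraphicBackEnd cg;
        std::vector<BackEnd *> active;
        active.push_back(&robot);   // either back end alone...
        active.push_back(&cg);      // ...or both at the same time

        ActuatorFrame frame;
        frame.positions.assign(8, 0.5f);
        for (size_t i = 0; i < active.size(); ++i)
            active[i]->send(frame);
        return 0;
    }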

Once the software “setup” of a puppet is in the system,
even puppeteers new to the technology can jump in and perform well
within hours. Using the input devices is akin to performing a
musical instrument. At a certain point, the puppeteer, like the
musician, no longer has to think about what he's doing—he just
performs.

PCS setup for animatronic puppets, showing input
controls and Control System laptop displaying the
GUI.

Henson input devices are not motion capture technologies.
Motion capture is both directly analogous to the performer and
largely nonprogrammable. In motion capture, a performer's arm
simply corresponds to the creature arm, a knee corresponds to a
knee, etc. The performer cannot enhance or reprogram these
relationships. The Henson input scheme is both non-analogous and
user-programmable. Our input devices are abstractions. For example,
a puppeteer's index finger might proportionally control the
sincerity or sarcasm of a creature's entire face. And a puppeteer
can reprogram puppet movement easily between and even during
performances. A person in a motion capture suit would be hard
pressed to perform an octopus. A person operating our control
system could take it in stride.
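
To make the idea concrete, here is a hypothetical C++ sketch of
such a programmable routing table. The channel numbers, names and
gains are all invented, but it shows how one finger could
proportionally drive several actuators at once and be remapped on
the fly:

    // Hypothetical illustration, not the Henson code itself: a
    // routing table maps one input channel (say, an index-finger
    // transducer) proportionally onto several face actuators, and
    // rewriting the table reprograms the puppet.
    #include <iostream>
    #include <vector>

    struct Route {
        int inputChannel;  // which transducer to read
        int actuator;      // which puppet actuator to drive
        float scale;       // proportional gain; negative inverts
    };

    int main() {
        // one finger shapes the whole face: brows, lids, mouth
        std::vector<Route> routes;
        Route brows = {0, 3, 1.0f};  routes.push_back(brows);
        Route lids  = {0, 4, -0.5f}; routes.push_back(lids);
        Route mouth = {0, 7, 0.8f};  routes.push_back(mouth);

        float inputs[1] = {0.6f};  // live finger transducer value
        float actuators[8] = {0};  // resolved actuator positions

        for (size_t i = 0; i < routes.size(); ++i)
            actuators[routes[i].actuator] +=
                inputs[routes[i].inputChannel] * routes[i].scale;

        // reprogramming mid-performance is just editing the table
        routes[1].scale = -1.0f;  // deepen the eyelid response

        std::cout << "mouth actuator: " << actuators[7]  // 0.48
                  << std::endl;
        return 0;
    }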

The Control Computer and Motion Engine

At the core of the system is a Control Computer, running
RTLinux, which processes and distributes motion data to puppets.
The process that runs motion-mixing algorithms on the Control
Computer is called the Motion Engine. Steve Rosenbluth wrote it in
C++. Performer movement, coming through input transducers from the
outside world, passes through the Motion Engine on its way to
networked puppets. Inside the Motion Engine, various algorithms are
performed on the live data, resolving final actuator positions. An
actuator is like a muscle of the puppet; it could be an
electromechanical or hydraulic servo in an animatronic or a
“virtual servo” (a mesh deformation) in a computer graphic
puppet. Motion-mixing relationships are configurable in software to
be one-to-many or many-to-one, and compound mixes can be performed
on top of those. Physics, which add effects like weight or
smoothing, can be applied to performance data as it passes through
the Motion Engine.
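
One way to picture the mixing and physics stages is the C++ sketch
below. The function names and the exponential-smoothing filter are
our illustration, not the Motion Engine's actual internals: a gain
matrix expresses one-to-many and many-to-one relationships, and a
first-order low-pass filter stands in for the smoothing physics.

    #include <iostream>
    #include <vector>

    // mix[a][i] is the contribution of input channel i to
    // actuator a
    std::vector<float> mixFrame(
            const std::vector< std::vector<float> > &mix,
            const std::vector<float> &inputs) {
        std::vector<float> out(mix.size(), 0.0f);
        for (size_t a = 0; a < mix.size(); ++a)
            for (size_t i = 0; i < inputs.size(); ++i)
                out[a] += mix[a][i] * inputs[i];  // many-to-one
        return out;
    }

    // exponential smoothing: alpha near 1 tracks the performer
    // tightly, alpha near 0 gives the sluggish feel of weight
    void smoothFrame(std::vector<float> &state,
                     const std::vector<float> &target, float alpha) {
        for (size_t a = 0; a < state.size(); ++a)
            state[a] += alpha * (target[a] - state[a]);
    }

    int main() {
        // two inputs, three actuators: input 0 fans out
        // one-to-many; actuator 2 blends both inputs many-to-one
        std::vector< std::vector<float> >
            mix(3, std::vector<float>(2, 0.0f));
        mix[0][0] = 1.0f;
        mix[1][0] = 0.7f;
        mix[2][0] = 0.5f;  mix[2][1] = 0.5f;

        std::vector<float> actuators(3, 0.0f);
        std::vector<float> inputs(2);
        inputs[0] = 0.9f;  inputs[1] = 0.2f;  // live transducer data

        for (int frame = 0; frame < 3; ++frame) {
            smoothFrame(actuators, mixFrame(mix, inputs), 0.3f);
            std::cout << "frame " << frame << ": actuator 2 = "
                      << actuators[2] << std::endl;
        }
        return 0;
    }

Run per frame, the smoothed actuator eases toward its mixed target
(0.165, then 0.28, then 0.36 of the way to 0.55 here) rather than
snapping to it, which is the kind of weighty feel the physics layer
is meant to add.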