Marionette Zoo

When we were accepted into the Leap Motion Developer Program in early 2013, it
allowed us to prototype applications and gain insight into this unique new way
to interact with computers.

The Leap Motion controller is a small USB peripheral device which is designed
to be placed in front of a computer screen, facing upwards. Using two infrared
cameras and three infrared LEDs to capture depth information, the device
observes a roughly hemispherical area, to a distance of about 1 meter.

The device streams frames of data (with a maximum of almost 300 frames per
second) to the Leap Motion controller software, where it is analyzed and
forwarded to client applications. The data accessible to applications mainly
consists of hand, fingertip and pointer tool position and orientation
information; the device does not provide a depth image or point cloud data.

Knowing that the Leap Motion tracks hands, fingers and pointer tools in 3D
space quickly and accurately, we decided to build an experimental application
that had not been possible before, for lack of a suitable input sensor. We came
up with the idea of a puppeteering application, in which one can control
marionette puppets with hand gestures.

Artwork and animations

First we made a couple of puppet sketches. We tried to keep everything as
simple as possible, so we chose a bird as our first puppet to experiment with,
since it has a fairly straightforward skeleton compared to other animals.

We also designed the background stage for the bird: a forest with hills. The
stage was cut into layers, similar to how scenes are built in a traditional
puppet theatre. These layers were arranged in parallel, with gaps between them.
Between the two main layers lies the game area, in which the puppets move.

When we had the basic idea of the puppets and background scenes, we came up
with visual gags: small stories we envisioned being played out in the
application. One of them was turned into a short animation.

Puppets and physics

The 3D world was done in Blender. We had already used Blender in previous
projects, so modelling, texturing and rigging were fairly straightforward. For
the physics simulation we chose to work with the Bullet Physics library. Since
Blender also includes Bullet, we planned the physics there as well, in addition
to the usual modelling tasks.

Creating the stages was not complicated. The images drawn by the graphic
designer were cut into layers, vectorized and imported into Blender in SVG
format. The layers were then UV mapped and extruded to achieve their final look.

All scene layers were arranged to face a fixed camera, which was used
throughout the whole gameplay.

The puppet workflow consisted of building the textured 3D mesh and rigging it.
Once we had the initial puppet mesh, a low-poly bounding shape model was
created. The rigid bodies of the crossbar and the puppet parts were connected
with soft body ropes. The physics constraints, the mass of the body parts and
the other parameters were set up in the Blender Game Engine, which let us
fine-tune the simulation and the behaviour of the 3D model.

Moving the puppets

We found that the most reliable way to control the puppets is to place the
crossbar in the palm of the user, taking only the palm position and palm
direction into account. The puppet crossbar, a kinematic object in Bullet
Physics, is transformed using this information. The physics simulation then
moves the bounding shape structure, which represents the puppet in the
simulation. Each bounding shape is connected to a bone in the puppet's
skeleton, so from the transformations of the bounding shapes the skeleton is
posed and the 3D model is skinned.

Our initial idea of tying strings to the fingers and controlling a marionette
with them was more or less realized this way, but our solution has a
limitation: the puppets cannot be moved entirely freely. Since the crossbar is
tied to the palm, one part of a puppet cannot be controlled separately without
accidentally moving the whole puppet. To make the gameplay more versatile, we
implemented an action system: certain actions can be triggered with finger
gestures such as finger taps. The bird can tweet or flap her wings, and the
t-rex can do a pirouette.

Here we faced a challenge: it was not clear how to mix physics simulation with
3D animation. All movements were controlled by physics, which cannot simply be
blended with manual skeleton animation. To work around this, we created a
physics animation system that can apply forces and torques to bones at given
times, with a specified magnitude and position.

The following snippet is a simplified XML description of the t-rex pirouette.
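(The exact markup is not reproduced here; the sketch below uses invented element and attribute names and made-up values to illustrate the idea of timed force and torque events on bones.)

```xml
<!-- Hypothetical sketch: elements, attributes and values are invented
     to illustrate timed force/torque events applied to bones. -->
<physicsanimation name="trex_pirouette">
  <event time="0.0" bone="spine" torque="0 40 0"/>
  <event time="0.2" bone="tail"  force="0 5 0" position="0 0 -1"/>
  <event time="1.0" bone="spine" torque="0 -40 0"/>
</physicsanimation>
```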

User interface

After several attempts at using finger tap gestures to navigate the user
interface, we realized it was not user friendly enough. It was difficult to
switch between puppets and stages, especially for children. Finger taps are
convenient for quick selection, but they require some training to perform
correctly, so we added hover selection: holding the cursor over an icon for a
certain time is enough to select it.

At the end of the user interface iterations we settled on the following puppet
icon designs, which worked well with the planned interactions.

Sound design and programming

With future expandability in mind, we decided early on to use libpd as the main
sound engine for Marionette Zoo. Libpd, which has been gaining popularity in
recent years, turns Pure Data, an open source visual programming language, into
an embeddable library. This decision enabled the sound designer to run sound
synthesis and effects directly within the application, and also to take over
most of the sound programming duties and thus be involved in pretty much the
whole design process.

Each soundtrack was composed with the aim of capturing a mood reminiscent of
its background scene, while hunting for non-obvious interpretations.

When it came to sound design, we aspired to give all puppet actions rather
natural sounds (even the fictional ones, such as the aforementioned t-rex
pirouette), in a playful way that would put a smile on the user's face.