19.7 Data-driven Imagery

Scene from More Bells and Whistles

Wayne Lytle began his graphics career as a visualization staff member at the Cornell Theory Center. He received a Master’s degree from Cornell in 1989 with his thesis titled “A modular testbed for realistic image synthesis”. His first full multi-instrument music animation, More Bells and Whistles, premiered in the Electronic Theater at SIGGRAPH 1990 and has since won awards and been shown in various contexts worldwide. In 1991 Lytle received an award from IBM for his early work in music animation.

Lytle also contributed to the ongoing debate about standards for the visual representation of numerical simulations. This was illustrated by a Cornell Theory Center animation by Lytle, The Dangers of Glitziness and Other Visualization Faux Pas, featuring fictitious software named Viz-o-Matic. The video, shown in the Electronic Theater at SIGGRAPH 1993, documented the enhancement and subsequent “glitz buffer overload” of a sparsely data-driven visualization trying to masquerade as a data-driven, thoughtfully rendered presentation.

Scene from Pipe Dreams

In 1995, Lytle formed Animusic, a content creation company. Two of its best-known animations are Stick Figures and Pipe Dreams, shown at SIGGRAPH 2000 and 2001, respectively. The principal focus of Animusic is the production of 3D computer graphics music animation, using proprietary motion-generation software called MIDImotion™. Without this software, animating the instruments with traditional “keyframing” techniques would be prohibitively time-consuming and inaccurate. By combining motion generated by approximately 12 algorithms (each with 10 to 50 parameters), the instrument animation is generated automatically with sub-frame accuracy; if the music is changed, the animation is simply regenerated.

Scene from Stick Figures

The technique differs significantly from reactive sound visualization technology, as made popular by music player plug-ins. Rather than reacting to sound with undulating shapes, the animation is correlated to the music at a note-for-note granularity, based on a non-real-time analysis pre-process. Animusic instruments generally appear to generate the music heard, rather than respond to it.

At any given instant, the motion algorithms take into account not only the notes currently being played but also notes recently played and those coming up soon. These factors are combined to derive “intelligent”, natural-moving, self-playing instruments. And although the original instruments created for the “video album” are reminiscent of real instruments, the motion algorithms can be applied to arbitrary graphics models, including non-instrumental objects and abstract shapes.
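Animusic has never published MIDImotion’s internals, but the general idea described above — deriving motion from note-onset times known in advance, rather than reacting to audio — can be illustrated with a toy sketch. The following hypothetical Python function (all names and parameters are invented for illustration) animates a percussion mallet that winds up before each note and makes contact exactly at the note onset:

```python
# Toy illustration of note-driven (not audio-reactive) animation.
# This is a hypothetical sketch, NOT Animusic's proprietary MIDImotion.
# Because note times are known from a pre-process, the mallet can begin
# moving BEFORE a note sounds and land exactly on the onset, with
# sub-frame accuracy (time is continuous, not snapped to frames).

def mallet_height(t, note_times, windup=0.30, recover=0.25, rest=1.0):
    """Height of a self-playing mallet at time t (seconds).

    rest -- resting height above the drum head
    0.0  -- contact with the drum, reached exactly at each note onset
    """
    h = rest
    for nt in note_times:
        if nt - windup <= t <= nt:
            # anticipation: descend so contact lands exactly on the onset
            h = min(h, rest * (nt - t) / windup)
        elif nt < t <= nt + recover:
            # follow-through: rise back to rest after the hit
            h = min(h, rest * (t - nt) / recover)
    return h

# Sample the motion curve at 30 fps for a short two-note phrase.
notes = [1.0, 1.5]
frames = [round(mallet_height(f / 30.0, notes), 3) for f in range(60)]
```

Because the note list is analyzed ahead of time, the mallet starts descending before each note sounds, so the instrument appears to cause the music — the distinction drawn above between Animusic’s approach and audio-reactive player plug-ins.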

In 1997, Steve May, Kirk Bowers, and Mark Fontana produced an animation titled Butterflies in the Rain, which tells the story of a butterfly exploring a piano that is mysteriously being played by water droplets falling from above. The piece is accompanied by, and algorithmically synchronized to, MIDI music data transcribed from the reproducing piano roll Butterflies in the Rain, a piece from the 1930s composed by Sherman Myers and played by Frank Milne.

Scenes from “Butterflies in the Rain”

The animation makes extensive use of procedural techniques. All modeling and animation was done using AL, an animation system developed by Steve May. PhotoRealistic RenderMan® (Pixar) was used for rendering, and Houdini (Side Effects) was used for compositing. All work was performed on Silicon Graphics workstations.

Movie 19.22 Butterflies in the Rain

https://www.youtube.com/watch?v=lVXmtckavDQ

Based on a concept by Brad Winemiller, this film was produced by Kirk Bowers, Mark Fontana and Steve May at The Ohio State University’s Advanced Computing Center for the Arts and Design (ACCAD). Additional modeling support (piano harp and strings) was provided by Phil Massimi. Software used included emacs (text editor) for modeling, Steve May’s Scheme-based Animation Language (“AL”) for animation, GIMP for texture maps, Pixar’s RenderMan for rendering, and Side Effects’ Houdini for compositing.