What I do - Research

"If we knew what it was we were doing, it would not be called research,
would it?" -- Albert Einstein

Learning Two-Person Interaction Models

We address the problem of creating believable animations for virtual humans
and humanoid robots that need to react to the body movements of a human
interaction partner in real time. Our data-driven approach uses prerecorded
motion capture data of two interacting persons and performs motion adaptation
during the live human-agent interaction. Extending the interaction mesh
approach, our main contribution is a new scheme for efficient identification
of motions in the prerecorded animation data that are similar to the live
interaction. A global low-dimensional posture space serves to select the
most similar interaction example, while local, more detail-rich posture
spaces are used to identify poses closely matching the human motion. Using
the interaction mesh of the selected motion example, an animation can then
be synthesized that takes into account both spatial and temporal similarities
between the prerecorded and live interactions.
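The selection step above can be sketched in a few lines: project all prerecorded poses into a global low-dimensional posture space (here via PCA) and pick the example closest to the live pose. All names, dimensions, and the use of plain PCA are illustrative assumptions, not the actual implementation.

```python
import numpy as np

# Illustrative sketch of posture-space lookup (assumed details, not the
# authors' implementation). Poses are flattened joint-angle vectors.
rng = np.random.default_rng(0)

n_frames, n_dof, n_components = 500, 60, 8
poses = rng.standard_normal((n_frames, n_dof))  # stand-in mocap data

# Build a global low-dimensional posture space with PCA (via SVD).
mean = poses.mean(axis=0)
_, _, vt = np.linalg.svd(poses - mean, full_matrices=False)
basis = vt[:n_components]  # principal axes of the posture space

def project(pose):
    """Map a full pose into the low-dimensional posture space."""
    return (pose - mean) @ basis.T

low_dim = project(poses)  # project all examples once, offline

def most_similar_frame(live_pose):
    """Select the prerecorded frame closest to the live pose."""
    dists = np.linalg.norm(low_dim - project(live_pose), axis=1)
    return int(np.argmin(dists))

# A live pose close to frame 42 should retrieve frame 42.
live = poses[42] + 0.01 * rng.standard_normal(n_dof)
idx = most_similar_frame(live)
print(idx)  # → 42
```

In the full approach this global lookup only narrows the search; more detail-rich local posture spaces then refine the match, and the selected example's interaction mesh drives the synthesized response.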

HRI & HCI Demo Videos

Blender Conf 2015 Talk

At Blender Conf 2015 I was honored to talk about our human-character interaction
framework. The presentation mainly focuses on the extensions we made to Blender,
including motion capture add-ons for optical A.R.T. tracking systems and the
Kinect, as well as MATLAB bindings to compute character responses.