The virtual characters of Zerrin Yumak

Zerrin Yumak is assistant professor at the Department of Information and Computing Sciences. She does research in the Motion Capture laboratory, where she analyzes social and emotional behaviors and turns them into computational models.
Who are you and what do you do?
"My name is Zerrin Yumak. I am an Assistant Professor at the Department of Information and Computing Sciences. I am doing research on socially interactive virtual characters and robots. These are characters that are able to naturally interact with people using facial expressions, gestures and gaze. The ultimate goal is to make them understand and respond to emotional and social cues from people.

"My research has potential applications in training and healthcare. For example, these characters can be used as personal assistants promoting a healthy lifestyle for elderly people, or as playful companions for young children. They can be used in games to train professionals in communication and negotiation skills, or in Virtual Reality applications as an aid for the treatment of social anxiety."

Where is this place?
"That is the “Motion Capture laboratory”, or “MoCap lab” for short. It is used to capture the motion of people and objects in a precise manner. Here, 14 infrared cameras track retroreflective markers on the subject. The process starts with the calibration of the cameras and continues with the reconstruction of 3D space from the 2D images. Finally, a skeleton is fitted to the captured markers by labelling the markers with the names of the joints.
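The reconstruction step described here — recovering a marker's 3D position from its 2D projections in several calibrated cameras — can be illustrated with a standard linear triangulation (DLT). This is a generic sketch of the technique, not the lab's actual software; the camera matrices below are toy values:

```python
import numpy as np

def triangulate(projections, points_2d):
    """Linear (DLT) triangulation of one marker.
    projections: list of 3x4 camera projection matrices,
    points_2d: matching list of (u, v) image observations."""
    rows = []
    for P, (u, v) in zip(projections, points_2d):
        # Each camera view contributes two linear constraints on X = (x, y, z, 1).
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.vstack(rows)
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point with camera matrix P (toy pinhole model)."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras observing a marker at (1, 2, 5).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # shifted along x
X_true = np.array([1.0, 2.0, 5.0])

obs = [project(P1, X_true), project(P2, X_true)]
X_hat = triangulate([P1, P2], obs)
print(np.allclose(X_hat, X_true))  # → True
```

With noisy real-camera data the SVD gives the least-squares marker position rather than an exact intersection, which is why calibration quality matters so much in practice.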

"The captured motions are then exported to be used for animating the virtual humans. That is similar to the process used in Hollywood movies such as Avatar. My team and I also use this place for interactive demonstrations. One example is the Virtual Receptionist Sara, who is capable of interacting with groups of people in crowded places. She analyses the position, orientation and posture of people using a Kinect depth camera and decides which user is willing to interact with her. We also have a game for supporting elderly people during the rehabilitation phase after hip replacement surgery. It consists of a cycling game and five mini games, mainly focused on balance exercises."
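The decision Sara makes — picking which tracked user is most likely to want to interact — can be sketched as an engagement score over the position and orientation features a depth camera provides. This is a hypothetical heuristic for illustration only; the field names and weights are assumptions, not the actual system:

```python
import math

def engagement_score(user, max_range=3.0):
    """Toy engagement heuristic: users who are close and facing the
    character score highest.  `user` holds the distance to the camera
    in metres and the angle (radians) between the user's body
    orientation and the direction towards the virtual character."""
    if user["distance"] > max_range:
        return 0.0                                   # out of interaction range
    proximity = 1.0 - user["distance"] / max_range   # closer is better
    facing = max(0.0, math.cos(user["facing_angle"]))  # 1 when facing head-on
    return proximity * facing

# Two tracked users: A stands close and faces the screen, B is far and turned away.
users = [
    {"id": "A", "distance": 1.2, "facing_angle": 0.1},
    {"id": "B", "distance": 2.5, "facing_angle": 1.4},
]
target = max(users, key=engagement_score)
print(target["id"])  # → A
```

A real system would smooth these scores over time and add posture cues, but the core idea of ranking candidates by tracked features carries over.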

What is special about this place or your research?
"Believability of the virtual human motions is very important to engage people and to create a simulation of reality. However, it is not easy to create believable characters. Motion capture is an effective way to generate real-life motions. My research is about analyzing social and emotional behaviors and turning them into computational models.

"For this, we first need to analyze the behaviors of people in real contexts. For example, we capture and analyze how people behave during group social interactions: how often they look away or at the other person, and which non-verbal behaviors they display. By learning patterns from data using machine learning algorithms, we get insights into real-life behaviors. Then we use this knowledge and motion capture data to generate the animations. Starting with the captured motions as a basis, we generate new animations by modifying and combining the existing ones. That requires motion signal processing and appropriate graph search algorithms."
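The last step — combining captured clips into new animations via graph search — is commonly realized as a motion graph: frames from different clips are connected wherever the poses are similar enough to blend, and a search then finds a playable path through the graph. This is a minimal sketch of that general technique under simplified assumptions (poses as flat vectors, Euclidean pose distance), not the lab's specific algorithm:

```python
from collections import deque
import numpy as np

def transition_cost(pose_a, pose_b):
    """Distance between two poses, each a flat vector of joint values."""
    return float(np.linalg.norm(pose_a - pose_b))

def build_motion_graph(clips, threshold=0.5):
    """Connect frame (clip, i) to its successor within the clip, and to
    frames of other clips whose pose is close enough to blend into."""
    frames = [(c, i) for c, clip in enumerate(clips) for i in range(len(clip))]
    edges = {}
    for c, i in frames:
        nbrs = []
        if i + 1 < len(clips[c]):
            nbrs.append((c, i + 1))            # keep playing the same clip
        for c2, j in frames:
            if c2 != c and transition_cost(clips[c][i], clips[c2][j]) < threshold:
                nbrs.append((c2, j))           # cross-clip blend point
        edges[(c, i)] = nbrs
    return edges

def find_path(edges, start, goal):
    """Breadth-first search for a playable frame sequence."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Two toy one-joint clips; the end of clip 0 nearly matches the start of clip 1.
clips = [np.array([[0.0], [1.0], [2.0]]), np.array([[2.1], [3.0], [4.0]])]
graph = build_motion_graph(clips, threshold=0.5)
print(find_path(graph, (0, 0), (1, 2)))
# → [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```

In a production system the pose metric would account for joint weights and velocities, and the transition frames would be blended rather than cut, but the graph-plus-search structure is the same.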