Soon, Your Avatar Can Reflect Your Emotion in Real Time

The graphics of today’s video games have improved so much that I can’t help but be mesmerized, even though I’m usually a mere observer. I am thinking of games such as Skyrim, where the level of detail is simply astounding. Even with all the advances made, though, there is still room for improvement.

If you think about it, video game avatars do not have truly lifelike facial expressions. Their expressions do change, but there is still some way to go in this regard.

This is what Thibaut Weise of EPFL’s Computer Graphics and Geometry Laboratory has been working on. His idea is very simple: enable avatars to reflect the facial expressions of their real-life users in real time.

The goal is simple to state, but the implementation is a different story. Weise seems to have succeeded, though: he has founded a startup that has developed software to do exactly this. Now based in Zurich, he has a working system that he hopes to deliver to animation and video game designers.

The system requires a camera with motion and depth sensors, such as the Kinect. Weise’s avatar software will work with that.

Quite understandably, the software needs a short training phase before it can track accurately. It takes about 10 minutes for the software to become familiar with the user’s face, during which the user is asked to make a series of facial gestures and expressions.
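The article doesn’t say how Weise’s software works under the hood, but a common approach in real-time facial animation is blendshape fitting: each frame of the user’s face is approximated as a neutral face plus a weighted mix of expression offsets (smile, brow raise, and so on), and those weights are what drive the avatar. Here is a deliberately tiny, hypothetical sketch of that idea — the faces, blendshape names, and numbers are all made up for illustration:

```python
# Toy blendshape-fitting sketch (NOT Weise's actual algorithm).
# Each "face" is a short vector of tracked feature positions; a real
# system would use thousands of 3D points from a depth camera.
import numpy as np

neutral = np.array([0.0, 0.0, 0.0, 0.0])
blendshapes = np.array([
    [1.0, 0.5, 0.0, 0.0],   # hypothetical "smile" offset
    [0.0, 0.0, 1.0, -0.5],  # hypothetical "brow raise" offset
])

def solve_expression_weights(observed, neutral, blendshapes):
    """Least-squares fit of weights so that
    neutral + weights @ blendshapes best matches the observed face."""
    offsets = observed - neutral
    weights, *_ = np.linalg.lstsq(blendshapes.T, offsets, rcond=None)
    # Blendshape weights are conventionally kept in [0, 1].
    return np.clip(weights, 0.0, 1.0)

# Simulate one frame where the user smiles at 80% intensity.
observed = neutral + 0.8 * blendshapes[0]
weights = solve_expression_weights(observed, neutral, blendshapes)
print(weights)  # → roughly [0.8, 0.0]
```

The 10-minute calibration step the article mentions would, in a scheme like this, amount to building a personalized set of blendshapes from the user’s own recorded expressions, so the per-frame fit above has the right basis to work with.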

Right now, Weise is still in talks with people in the video game industry, so I don’t think you’ll be seeing your avatar mimic your every expression just yet. Still, the prospect of playing Fight Night and seeing your boxer flinch and twist his facial muscles just like you do is pretty cool, isn’t it?