Breakthrough in accurate in-game face modelling

Are facial expressions mapped to in-game avatars the future?


Game developers have experimented with face-modelling in games before, but they generally hit the 'uncanny valley' problem: your in-game avatar ends up looking like a slightly creepy, robotic version of your own face, adding little enjoyment or immersion to your gaming experience.

However, that may all soon change, if a new research project proves to be successful.

Forget about those clumsy, cartoonish attempts by Microsoft or Nintendo to customise your Xbox Live or Mii avatar, because a group of French researchers - Abdul Sattar, Nicolas Stoiber, Renaud Seguier and Gaspard Breton - have recently published their findings on "Gamer's Facial Cloning for Online Interactive Games."

Pain, anger, joy, shock!

The researchers propose using two cameras that are installed on the edges of your TV or monitor to pick up real-time images of you playing a game.

"A multi-objective active appearance model is then used, which produces data to be used in a synthetic face database," reports Scitechbits.com.

"Very briefly, in the 2D frontal view of the gamer, several landmarks are established, and the x coordinates of the landmarks in the profile view are combined with them to make a 3D shape model. The 2D texture of only the frontal view is mapped onto the 3D shape, which is thus called a 2.5D AAM (active appearance model)."
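The idea in that quote can be sketched in a few lines of Python. This is a minimal illustration, not the researchers' actual pipeline: the landmark positions and the number of points are invented for the example, and a real AAM would fit dozens of landmarks automatically. The frontal camera supplies each landmark's (x, y) position, while the profile camera's horizontal coordinate stands in for that landmark's depth (z):

```python
# Hypothetical landmark coordinates (illustrative values only).
# Frontal view: each landmark's (x, y) position in image coordinates.
frontal_xy = [(120.0, 80.0), (100.0, 60.0), (140.0, 60.0)]

# Profile view: the horizontal coordinate of the same landmarks
# approximates their depth (z) relative to the face plane.
profile_z = [35.0, 10.0, 10.0]

# Combining both views yields a sparse 3D shape; the frontal 2D texture
# is then mapped onto it, giving the "2.5D" representation.
shape_3d = [(x, y, z) for (x, y), z in zip(frontal_xy, profile_z)]

print(shape_3d[0])  # (120.0, 80.0, 35.0)
```

The "2.5D" label reflects exactly this asymmetry: full 3D geometry for the landmark shape, but texture captured from only one viewpoint.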

In practice, this means that your facial expressions – of hurt, anger, joy, shock and so on – will be instantly mapped onto your in-game avatar, drawing on two databases.

One of these databases "has a collection of the gamer's (human) facial expressions", while the second is "an optimal database of synthetic facial expressions."
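One plausible way such a lookup could work is a nearest-neighbour match: the parameters describing the captured expression are compared against each entry in the synthetic database, and the closest one drives the avatar. The sketch below is an assumption for illustration only; the expression names, the parameter vectors and the plain Euclidean metric are all invented, not taken from the paper:

```python
import math

# Hypothetical synthetic-expression database: each entry maps an
# expression label to a small vector of model parameters
# (values are illustrative only).
synthetic_db = {
    "neutral": (0.0, 0.0, 0.0),
    "joy":     (0.9, 0.2, 0.1),
    "anger":   (-0.7, 0.8, 0.3),
    "shock":   (0.1, -0.6, 0.9),
}

def closest_expression(captured, database):
    """Return the label whose parameters lie nearest (Euclidean
    distance) to the captured expression's parameters."""
    return min(database, key=lambda name: math.dist(captured, database[name]))

# A captured frame whose parameters sit closest to "joy".
print(closest_expression((0.8, 0.3, 0.0), synthetic_db))  # joy
```

Keeping the synthetic database small and "optimal", as the researchers describe it, would make this per-frame lookup cheap enough to run in real time.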

The researchers claim that the accuracy and efficiency of the face-mapping have been very high, and they want to see their tech used in applications beyond gaming, such as online video conferencing.