A GTC 2013 trailer from NVIDIA shows off "Face Works," some new technology they have created to render more realistic human faces. Word is: "NVIDIA Co-founder and CEO Jen-Hsun Huang shows a demo of Face Works technology running on the GeForce GTX TITAN GPU. It includes a must-be-seen-to-be-believed, amazingly realistic simulation of a face in this second segment of the opening day keynote at GTC 2013." Thanks GamesHQMedia.

Slick wrote on Mar 21, 2013, 17:40: The L.A. Noire facial animation was far superior to anything that Crytek has put out, mainly because it's just capturing the performance of the actors: what you see is what you get. The problem was that only the head was captured this way, while the body was still traditional mo-cap. This made it look a bit weird and disconnected, but that was a limitation of the processing power of last-gen consoles.

I never liked the technology used in L.A. Noire or what it looked like. I'm not sure if it was an uncanny-valley thing or something else, but it just looked weird to me. When I describe what the technology looks like to people who haven't played L.A. Noire, I'd say this:

Pull a tight fabric over your head, like white non-transparent pantyhose or whatever, and then use a projector to project a video of your close-up face talking into the camera onto the pantyhose. That's the impression I got from L.A. Noire's "revolutionary" technology.

PHJF wrote on Mar 21, 2013, 10:48:Ummm the shit they used in LA Noire was fucking tits... why the fuck isn't EVERY GAME using it by now?

Because it looked shit, was hugely exaggerated, and limited the game to running at 30fps. It was like they recorded an actor and then stretched the video over a 3D model - it plummeted right into the uncanny valley. Both Far Cry 3 and Crysis 3 had better facial animations overall; maybe not as nuanced, but they looked more "real".

Don't get me wrong, I respect what L.A. Noire accomplished, but the technology just wasn't right, even though they spent an insane amount of time developing it.

The L.A. Noire facial animation was far superior to anything that Crytek has put out, mainly because it's just capturing the performance of the actors: what you see is what you get. The problem was that only the head was captured this way, while the body was still traditional mo-cap. This made it look a bit weird and disconnected, but that was a limitation of the processing power of last-gen consoles.

Also, the 30fps cap was an artistic decision. It made the game look more movie-like, which was entirely the design direction they were going for. Films are all 24 frames per second; watching a movie at 60fps makes it look like a home video - it loses that "cinematic" feeling.

It's due to the lower number of frames being fed to the brain: the extra space between them leaves the brain free to "draw" the frames between frames. This is part of the magic of movies. Ever seen a 24p film on your 120Hz interpolating big-screen TV? Notice how it suddenly looks really "fake"? Same phenomenon.

the more you know.
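For what it's worth, the "fake" look those interpolating TVs produce comes from the set inventing in-between frames itself instead of leaving the gap to your brain. A toy sketch of the idea in Python/NumPy - real TVs use motion-vector estimation rather than a plain cross-fade, so this is only the simplest possible illustration:

```python
import numpy as np

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Fabricate an in-between frame by blending two real frames.
    Real TVs search for motion vectors; this plain cross-fade is
    just the crudest version of 'drawing the frame between frames'."""
    return (1.0 - t) * frame_a + t * frame_b

# Two tiny grayscale "frames": black, then bright
a = np.zeros((2, 2), dtype=np.float32)
b = np.full((2, 2), 100.0, dtype=np.float32)
mid = interpolate_frame(a, b)  # the invented frame a 120Hz set would show
```

At 24p the display repeats real frames; an interpolating set slots synthetic frames like `mid` in between, which is exactly what kills the film look for some viewers.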

(Regarding SW:Battlefront II) Frostshite is a horrible piece of shit engine that makes games look artificial as if you were playing on a movie set instead of the actual location. -CJ_Parker

SXO wrote on Mar 21, 2013, 09:31: The sad part is that the GK110 chip the TITAN uses was originally supposed to power the GTX 680, with the GK104 the 680 currently uses being meant for the midrange 660 cards. Unfortunately, due to a lack of competition, the extreme performance these chips produced, and nVidia's overt greed, they decided to use the GK104 as the high-end card and saved the GK110 for the "refresh" cycle. Now that the refresh cycle is here, they have instead opted to create a separate category for the GK110 and price it out of reach of most people with an obscene price tag.

Now I know a bunch of people are going to come in crying about how capitalism works, but greed is greed, and it hurts business as well as consumers.

First off, lack of competition? lol? The 7970 SMOKES the 680. It's already gotten 50% performance improvements in BF3 since launch. And it overclocks from 925/1375 to 1125/1575 without even touching the voltage, on stock cooling! You're talking an 80-90% FPS boost over most benchmark numbers in BF3 since it launched over a year ago. ATI is so happy with it, they've decided not to even refresh it this year with an 8000 series; they're still seeing such performance improvements from new drivers that they're just skipping a generation.

The GK110 was never built for the consumer market; it was built for GPGPU computing on very, very expensive clusters. They had contracts for this - that's why it exists.

Nvidia isn't greedy; they just know they can't compete with the 7970 this generation, and they want to keep refining their next-gen consumer cards, especially with the upcoming console launches. So instead they released a consumer version of the GK110. All comments claiming this was intended as a consumer release in the first place were speculation. They did this to re-take the single-GPU performance crown, even if it's incredibly expensive (still WAY cheaper than the original GK110 model, which runs about $2.5k).

I am not well versed enough in the technobabble to have much more to say about all of this than... "Damn, that looks pretty cool!"

If the goal of the video was to say that they have crossed the uncanny valley and taken the creepy out of facial rendering... I think they have accomplished their goal, or at the very least are well on their way.

Quiboid, how many times in the game are you going to need to be this close to a face? This is perfect for talking-head close-ups, and lower-resolution fallbacks can be used for crowds.
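That close-up-versus-crowd fallback is just distance-based level of detail applied to faces. A minimal sketch of the selection logic - the tier names and distance cutoffs here are invented illustrative values, not anything from the demo:

```python
def pick_face_lod(distance_m):
    """Choose a face rendering tier by camera distance.
    Tiers and thresholds are made-up example values: only a
    talking-head close-up pays for the full high-detail rig."""
    if distance_m <= 2.0:
        return "closeup"   # full skin shading, every wrinkle map
    elif distance_m <= 15.0:
        return "mid"       # blendshape animation, cheaper shading
    else:
        return "crowd"     # static low-poly head, no facial animation
```

The point is that the expensive path only ever runs for one or two heads at a time, which is what makes a demo like this plausible in a real game.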

Btw, L.A. Noire I actually believe used a 3D-digitized face. That is, the face was recorded with 3D cameras, so each polygon mesh of the face was determined and pre-recorded. I think it came out pretty well, if just a little uncanny, but the tech was useless for any on-the-fly animation. Also, the texture rendering is half the work of being believable, and it was just awesome here. I noticed they did translucent scattering on the ear lobes. Anyone else hate how light scattering has been around for a few generations now (DX10?) and no one uses it? They'll do all kinds of glass refractions, but anything more opaque than glass and they make it a solid.
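The translucent look on ear lobes can be approximated cheaply with "wrap" diffuse lighting, a common skin-shading trick that lets light bleed past the shadow terminator instead of cutting off hard. A sketch of just that one term in Python - the `wrap` value of 0.5 is an arbitrary choice, and real skin shaders layer several such terms:

```python
def wrap_diffuse(n_dot_l, wrap=0.5):
    """Cheap subsurface-scattering stand-in. Plain Lambert diffuse
    clamps to zero the moment the surface faces away from the light;
    wrap lighting remaps the term so light 'wraps' around the
    terminator, softening thin features like ears and nostrils."""
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

# A surface angled slightly away from the light: Lambert would be
# pitch black, but wrapped skin still picks up some light.
lit = wrap_diffuse(-0.2)
```

Proper translucency (like the ear-lobe glow in the demo) goes further with texture-space or screen-space diffusion, but the wrap term is the one-liner version of the same idea.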

Yes, similar to what Crystal Dynamics did with Tomb Raider's hair physics. It seems to process a lot less from far away, which helps relieve some stress on the GPU, but the second they do a close-up it lags all to hell. It's actually the opposite of what we can expect if this technology really takes off.

I can imagine practically zero facial animation during gameplay, but when in-game cinematics/cutscenes break out, we'll get some awesome close-up shots with extremely believable dialogue between characters. It could really do away with pre-rendered CGI videos in games once the in-game graphics/animations can rival them.

The sheer amount of development that must go into this, in terms of modelling and texturing, makes me wonder if we'll ever see it very often. I suppose the day will come when Hollywood's latest star gets scanned, high-definition models/textures are created from them, and then we'll get games starring Ryan Gosling or whoever.

If this took half the card's processing power, it's a long way off anyway. How much gameplay are you going to get when a single person's head is taking half the graphics card's resources?

"The only thing necessary for the triumph of evil is for good men to do nothing." - Edmund Burke

Xero wrote on Mar 21, 2013, 11:01: I used to be giddy when I saw these types of technology demos during the GeForce 3, 4, FX, and 6 series. Now I just brush them off, like, sure, sure. By the time the games come out that use that kind of technology, I'll be able to run it on a mid-range priced card ($250) with decent/great performance.

Same here. I used to spend so much money to have the latest and greatest, but it was a huge waste, since games did not take advantage of that tech until much later. By then you can buy these cards for a lot lower price than they go for now.

This is what I always preach. We are seeing what a card can do at its maximum potential. That's all fine and dandy, but I'm not buying a card to view a technology demo on it...

When developers actually implement this type of technology into their games, which will be a couple of years away at least, we will already be on to the next few series of graphics cards that power this technology.

I used to be giddy when I saw these types of technology demos during the GeForce 3, 4, FX, and 6 series. Now I just brush them off, like, sure, sure. By the time the games come out that use that kind of technology, I'll be able to run it on a mid-range priced card ($250) with decent/great performance.

Just watched a 470/480 compilation, and it seems like most of what that card added was physics stuff, lighting and overlay stuff, and tessellation, all of which are pretty ubiquitous in today's games. Maybe we will see this sooner than we think.

SpectralMeat wrote on Mar 21, 2013, 10:23: Well that was sort of my point, in an exaggerated way. Sure, you can use a Titan now for that one or two games out there on multiple monitors or super-HD displays, but does that justify the cost of the Titan? For most people it won't. To see graphics like this in games the consoles will have to be able to do this as well; otherwise devs will not put in the extra effort and work for PC gamers. I did not look at the PS4/Xbox 720 specs, but I doubt they will be close to anything like a modern PC with a Titan video card or even a GTX 680.

Yep, I got ya. I think I heard that the PS4 will output 4K, but who knows what that will actually mean. The current consoles all say they do HD, but it's not awesome 60fps 1080p. We all seem to be saying the same thing: it's killer, but where's the game?

Now that I think back on all the Nvidia demos, I'm not sure any of them translated into a game, even years after. I may have to go back and look some up. What is the best-looking game out? Crysis 3? Not sure that even looks like a tech demo from two cards back.

SpectralMeat wrote on Mar 21, 2013, 10:00: This. By the time devs take advantage of this tech we will have a Titan 3 on store shelves. I wouldn't worry much about what that card costs today; unless you run 20 monitors, you won't need a card like that for a while.

I'm not sure about that. You can buy a 4K TV today, and this card supports that. I know it's high-end, but there are some 680 benchmarks for Battlefield 3 that suggest 20fps at 4K on ultra settings. I realize that's luxurious, but it's not like there isn't a reason to get a Titan now.
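Those numbers line up with simple pixel-count scaling: 4K (3840x2160) pushes exactly four times the pixels of 1080p, so a fill-rate-bound estimate just divides the frame rate by four. A back-of-envelope sketch - the 80fps reference figure is an assumed 1080p number for illustration, not a measured benchmark:

```python
def estimate_fps(fps_ref, ref_res, target_res):
    """Crude estimate: assume fps scales inversely with pixel count.
    Real games are not purely fill-rate bound, so treat the result
    as an optimistic ceiling rather than a prediction."""
    ref_px = ref_res[0] * ref_res[1]
    target_px = target_res[0] * target_res[1]
    return fps_ref * ref_px / target_px

# An assumed 80fps at 1080p would predict ~20fps at 4K (4x the pixels)
fps_4k = estimate_fps(80, (1920, 1080), (3840, 2160))
```

Geometry, CPU, and memory bandwidth all break this rule in practice, but it explains why 4K benchmarks of that era landed in the 20fps range.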

Well that was sort of my point, in an exaggerated way. Sure, you can use a Titan now for that one or two games out there on multiple monitors or super-HD displays, but does that justify the cost of the Titan? For most people it won't. To see graphics like this in games the consoles will have to be able to do this as well; otherwise devs will not put in the extra effort and work for PC gamers. I did not look at the PS4/Xbox 720 specs, but I doubt they will be close to anything like a modern PC with a Titan video card or even a GTX 680.

SXO wrote on Mar 21, 2013, 09:31: Unfortunately, due to a lack of competition, the extreme performance these chips produced, and nVidia's overt greed, they decided to use the GK104 as the high-end card and saved the GK110 for the "refresh" cycle.

It had nothing to do with that. The yields were simply appalling, meaning nVidia had to substantially scale them back - that's why the Titan is only being made available in very limited quantities. If nVidia could produce a card that would blow away the competition and further increase their market share, they would.

As for the technology here, it's quite impressive but it all comes down to what developers do with it. It's useless without substantial improvements to AI, which haven't been forthcoming. We are years away from a game being able to include a single enemy that provides a challenge without simply giving them huge amounts of health or making them hard to hit - that's why games have cannon fodder enemies, cutscenes and boss fights. We are years from enemies that demonstrate true contextual emotions, like fear, anger and empathy. Developers are getting better at concealing the limitations but AI has improved at a fraction of the pace that graphics and physics have.

timesten wrote on Mar 21, 2013, 09:25: Pretty fancy. I'm not buying a thousand-dollar video card to make this happen on my machine, but it is neat. Along the lines of the Final Fantasy movie at the time.

I doubt it will be used in any games for a while, but I wonder if it scales. If not, it can't be put into play for a few years.

This isn't even using half of the Titan's power, so a 680 or even lower could do this with no problem.