Posted
by
timothy
on Saturday March 19, 2016 @08:22AM
from the leadering-to-better-graphics dept.

MojoKid writes: This week at GDC 2016 the team at Unity revealed their stable release of the Unity 5.3.4 game engine along with a beta of Unity 5.4. There are a number of upgrades included with Unity 5.4, including in-editor artist workflow improvements, VR rendering pipeline optimizations, improved multithreaded rendering, customizable particles which can use Light Probe Proxy Volumes (LPPV) to add more realistic lighting models, and the ability to drop in textures from tools like Quixel DDo Painter. But for a jaw-dropping look at what's possible with the Unity 5.4 engine, check out the short film "Adam" that Unity has developed to demo it. The film showcases all of Unity Engine 5.4's effects and gives a great look at what to expect from Unity-based games coming in 2016. Unity will showcase the full film at Unite Europe 2016 in Amsterdam. But perhaps what's most impressive about Adam is that Unity says this is all being run in real time at 1440p resolution on just an upper-midrange GeForce GTX 980 card.

It's interesting that we're at the point where you probably have to have actual experience working in the field to spot the 'cheats' that real-time rendering uses. Things have come very far and it looks very impressive. The quality of the cheats has just gotten crazy over time.

No indirect lighting, no caustics, no hair/fur. Lots of places where the lighting looks complex but can be baked into the textures. Very carefully staged so those omissions/limitations aren't noticeable.

Oh, I'm very keenly aware and have recently seen it (it avoided the same things that realtime-rendered scenes avoid today). It would be silly to claim realtime can't best Toy Story. The fastest supercomputer in the world the month Toy Story came out would be bested by a single quad-core Haswell at 3.0 GHz (though only barely). However, as pre-rendered cinematics have been able to start featuring more and more of the things they originally had to skip, realtime hasn't been able to do the same for everything.
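For what it's worth, here's a back-of-the-envelope sketch of that comparison, using approximate published figures (the supercomputer score and the per-cycle FLOP count below are my assumptions, not from the post):

```python
# Back-of-the-envelope check (all figures approximate/assumed).
# Toy Story shipped in November 1995; the #1 machine on the
# Top500 list around then was Fujitsu's Numerical Wind Tunnel,
# with a measured Linpack score of roughly 170 GFLOPS.
nwt_rmax_gflops = 170

# A quad-core Haswell at 3.0 GHz with AVX2 FMA can retire
# 16 double-precision FLOPs per core per cycle
# (2 FMA units x 4 doubles per vector x 2 ops per FMA).
haswell_peak_gflops = 4 * 3.0 * 16  # cores * GHz * FLOPs/cycle

print(haswell_peak_gflops)                    # 192.0
print(haswell_peak_gflops > nwt_rmax_gflops)  # True, but not by much
```

Comparing a theoretical peak against a measured Linpack score is apples-to-oranges, but it does line up with the "though only barely" claim.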

Unity still has a way to go to catch up with Unreal Engine in terms of impressive demos. That real-time mo-cap fed straight into real-time rendering live on stage at GDC was bloody impressive in several technical areas.

Especially considering that we're not talking about an engine with a price tag that makes AAA studios stagger. This is affordable, high quality rendering.

One of the last big strongholds of AAA gaming, i.e. high-speed, high-quality graphics, is coming to an end. Certainly we won't see everyone who happens to have an idea for a game crank out something over the weekend that dethrones the next clone in the Battlefield series, but this could well mean we're heading toward a time when "indie games" are no longer games that have to win us over with content, wit, or charm because they can't simply blow us away with effects.

Though the engines are capable, it still takes a great deal of work to actually bring out those capabilities. You still need laborious modeling and texturing work if you want something 'AAA' and can't just assemble things out of stock content from the asset stores.

Titles developed on a budget have not come remotely close to the aspirational demos of the engines they have used to date, and I wouldn't expect that to change anytime soon.

Sure, but from what I see there, it seems the limiting factor for whether or not you get a good game off the ground is shifting more and more toward modeling, texturing, and generally the "artsy" parts rather than programming and optimization.

Which I'd consider a good thing. That's after all what the user gets to see in the end and that's where the time and money should go.

Some game companies now have only 10 or so programmers for graphics, AI, and gameplay, while there will be over 500 3D modelers, texture artists, concept artists, and testers. What you put in is what you get out in terms of artwork: detail and texture layers going all the way up to normal and displacement maps.

That's not quite true: Sure there are fewer programmers working on adapting the actual engine, since few game studios create their own engines now. But, you have more programmers in the art department instead, programming/optimizing shaders, particle systems etc

So, working in C/C++, interfacing directly with graphics APIs, drivers, and the renderer, programming complex shaders to simulate physical attributes in the fastest way possible, is not programming? Just because they work together with the art director and the animators?

So, working in C/C++, interfacing directly with graphics APIs, drivers, and the renderer

That isn't what you said. And I quote:

Sure there are fewer programmers working on adapting the actual engine, since few game studios create their own engines now. But, you have more programmers in the art department instead, programming/optimizing shaders, particle systems etc

Creating shaders isn't programming. In fact, these days it's all done in a WYSIWYG interface. Particle effects are done mostly with WYSIWYG interfaces too, but even when they are done manually, it will be an artist creating them. A programmer handles the glue logic between the engine, the driver, and the particle generators in the engine; they would not know what looks correct or aesthetically pleasing when it comes to creating the actual visuals, which includes shaders.

Not all shaders are created with WYSIWYG interfaces (in fact Unity doesn't even ship with a graphical material editor, though there are 3rd party assets that add this feature). Still, for very complex effects sometimes there is no other way than to go in and code it in HLSL/CG/GLSL or whatever. Whether that is creating art or programming is a whole different question, people specialized in the field are usually referred to as "technical artists" as it is a bit of both. You simply aren't going to create a co

It's partly that, and it's partly just the fact that commodity game development hardware has simply gotten that insanely good. These game engines all use the exact same low-level APIs and hardware. There's nothing inherently special about one rendering engine vs. another, despite what they'd have you believe.

Interestingly enough, contrary to what most users believe, the runtime rendering engine is actually a surprisingly small part of what a game engine does. A game engine's most important feature i

Also, among the reasons cinematics (even if real-time rendered) can look so much better when specifically targeted like this:

Moreover, all of the animation, camera work, and storyboarding is a custom job from start to finish. What struck me most about the short film was how exquisitely detailed the animation is (although perhaps this is common now in video game cutscenes), but try to make it a Quake clone with a human player walking around in the middle of that scene with the cyborg, and none of it will work. What if you're blocking the passage, or poking him with an axe, or whatever else? Well, you can't do that.

True, but animation in realtime cutscenes is pretty much all mo-capped custom animation these days, and the player is typically not interacting with the game while the cutscene is running. So, I think doing custom animation work is not really "cheating", per se. Some games, including ones I worked on, used to do really horrible cutscenes (by today's standards) using scripts that direct characters around using pre-set animations, but no AAA games do that these days, thank goodness.

No it isn't. Pre-baked lighting, very slow camera movements, and nothing to render in the background. It's smoke and mirrors from the shitty Unity engine that makes fast PCs and consoles run like shit. Give a mouse/controller camera control and watch the misleading engine demo fall to pieces.

Anyone with more than 10 minutes' experience of game engines can spot how much cheating is going on, and how contrived this "demo" actually is. You're on /.; you should know better. So you're a shill, rumbled!

The GTX 980 is not an upper-midrange card. It's $500, and about the only thing faster is a GTX 980 Ti. I know it sounds like a big deal for it to run well on a so-called midrange card, but if they wanted to do that they'd need to try it out on a GTX 950 or 960.

If I were the dictator of "video card performance nomenclature," it would be more like this:

Oh yeah, I forgot about the Titan-X. That's more expensive, but it's not really any faster than a 980 Ti; it just has more GPU memory. Which is definitely something to put it even higher end, but at the end of the day they're basically both "I have money and want to be parted with it" solutions.

I'd get a GTX 970 if I were in the market for a GPU, but I don't play enough brand-new games to warrant upgrading currently.

The 980 is probably best for neural networks in terms of dollars per capability, while the Titan-X has more memory but requires extra money. Google's Deep Dream runs 100x faster on the GPU than on a CPU, but for video it runs out of memory for anything over HD. Not sure if the Titan-X can do 4K. Supposedly both of them get blown away by the new stuff this year. Another benchmark, but not completely disconnected from games.

Well, if two 980 Tis are higher end than one 980 Ti, then three 980 Tis are higher end than two. If you just start stacking more of the same thing, it's certainly a more high-end overall solution, but it doesn't make the product that comprises that solution more high end.
A Porsche 911 GT3 is a high end car, but a Bugatti Veyron costs more. The fact that you can always find something even more expensive to throw your money at doesn't mean that anything that was previously high end is now midrange.
Of course, I d

This is running on the 980, Nvidia's current high-end consumer card. When Pascal, the next generation of Nvidia's cards, comes out, I would expect their midrange card, the 1060, or at least the 1070 (the budget high end), to be able to run it. So, in six months, $800-$1k.

That being said, as others have pointed out, the scene is also very custom-built to create the sense of effects that would require much more power if you could actually look around at will, so it's not exactly apples-to-apples with real games.

Even with the high-priced 980 you should be able to build something for under $1K. The next most expensive part is the CPU (a non-overclockable Core i5), and the rest is cheap: a low-end motherboard, 8 gigs of memory, and a power supply (a high-quality 400 watt or a good 450/460 watt unit).
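As a rough sanity check, a hypothetical parts list at 2016-ish street prices (every price below is an illustrative guess, not a quote) does come in under $1K:

```python
# Hypothetical 2016-era street prices, for illustration only.
build = {
    "GTX 980": 500,
    "Core i5 (non-overclockable)": 190,
    "low-end motherboard": 60,
    "8 GB RAM": 40,
    "450 W power supply": 50,
    "1 TB hard drive": 50,
    "case": 40,
}

total = sum(build.values())
print(total)  # 930
```

That leaves a little headroom under the $1K figure even before hunting for sales.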

Still, that's $1K to play a few games. A 10-year-old PC with maxed-out RAM handles most of the other things you'd do with a PC nowadays.

I just upgraded to an i5 and GTX 980 from a Core 2 Duo and for my situation it was totally worth it. I can now play emulated GameCube and PS2 games with no cheats and higher resolutions and graphics options turned on. (I could play them before on ye olde C2D but I had to turn everything down and sometimes they ended up looking worse than the original.)

Most new games are so boring.

I do like The Witcher 3 though....

On the main topic, that demo looks pretty cool. I don't think it looks like a movie at all but

They really should focus on support for dynamic global illumination. Real-time GI does ten times more in terms of creating atmosphere than any localized shader effect: https://www.youtube.com/watch?v=VHbHOQ1NRuw [youtube.com]

All these HQ-graphics videos keep increasing their resolution, but if you pause the video, each frame is ugly because of the lossy inter-frame motion compression. They show something spectacular and then cripple it at the final delivery stage. Although I don't know what format I'd choose myself; MJPEG is too big, but maybe some MP4/VP9 settings could tune down the motion-compression impact.