Originally Posted by Sthu:But how do you respond to clients that will say something like, "the other company could do all this in real time, and it looked just fine to me?"

Or for 10 bucks? Not an argument. For some projects it might work, for others not. I don't know much about the engine, but the lighting is quite ordinary, just a regular skylight outdoors. It looked just like in-game footage. The animation and direction are good, but the lighting and shading aren't up to today's standards. It might get better in five years, though.

What is possible nowadays with realtime is stunning, for sure. But there will always be a need for both offline rendering and realtime rendering. Those are simply two different goals, which lead to two different solutions.

Realtime content is optimized for frame rate. The goal here is 60 fps, not the most realistic look possible. You have to fake a lot: GI, prebaked lightmaps, normal maps, etc.

Stills and movies aim for realism. Time doesn't matter as much here. You can work with huge polygon counts, use time-consuming render calculations, and so on.
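To put a number on that gap, here's a back-of-the-envelope sketch; the 30-minutes-per-frame figure is an illustrative assumption, not a measurement of any particular renderer:

```python
# Illustrative comparison of real-time vs offline time budgets.
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

realtime_ms = frame_budget_ms(60)        # ~16.7 ms for everything: shading, GI, post
offline_minutes_per_frame = 30           # assumed offline render time (illustrative)
offline_ms = offline_minutes_per_frame * 60 * 1000

# The offline renderer gets roughly 100,000x more time per frame,
# which is why it can afford brute-force calculations a game engine must fake.
ratio = offline_ms / realtime_ms
print(round(ratio))                      # prints 108000
```

At that ratio, every trick that moves work out of the 16 ms window (baked lightmaps, precomputed AO) buys the real-time side something the offline side can simply compute.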

Originally Posted by Sthu:But how do you respond to clients that will say something like, "the other company could do all this in real time, and it looked just fine to me?"

A workflow that allows you to preview in real time and then render in high quality? I'd show them the real-time version first; if they say it looks good, you've saved yourself further work. If they can spot the difference, show them the high-quality version and explain the time requirements for each.

Originally Posted by mister3d:It's artistically great, but technically it's scanline of 1990.

"it extensively uses global illumination and physically based rendering... Combine that with the dynamic lighting, which has been rasterized twice for depth pre-pass, and a custom ambient occlusion solution which rivals HBAO, delivering soft shadows without any dithering"

I don't remember seeing anything like that in 1990 that wasn't ray-traced, and definitely not on the desktop. I'm trying to think of a single example.

Originally Posted by moogaloonie:"it extensively uses global illumination and physically based rendering... Combine that with the dynamic lighting, which has been rasterized twice for depth pre-pass, and a custom ambient occlusion solution which rivals HBAO, delivering soft shadows without any dithering"

I don't remember seeing anything like that in 1990 that wasn't ray-traced, and definitely not on the desktop. I'm trying to think of a single example.

It doesn't say whether it could be pre-baked, which it probably is. But if it's UE4, it uses some clever techniques. It's just obvious it doesn't use raytracing at all: it's all diffuse and shadows. AO also helps to hide the obvious lack of global illumination.
No, lighting has definitely made a great step forward. I personally enjoyed the MGS 5 prologue. It's just still so far away from offline rendering.
I perhaps should have said the 1990-2000 timeframe, particularly the end of it, when raytracing techniques were still too expensive to use even for cinematics.
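Since the point above leans on ambient occlusion standing in for missing GI, here is a minimal sketch of what an AO term actually estimates: the fraction of the hemisphere above a point that is unblocked. The `occluded` callback is a hypothetical stand-in for a real ray cast or a baked visibility lookup:

```python
import math
import random

def ambient_occlusion(occluded, samples=256, seed=1):
    """Monte Carlo estimate of hemispherical ambient occlusion.
    `occluded(direction)` returns True if a ray in that direction
    hits geometry (stand-in for a ray cast or baked visibility)."""
    random.seed(seed)
    hits = 0
    for _ in range(samples):
        # Uniform direction on the upper hemisphere (z >= 0).
        z = random.random()
        phi = 2.0 * math.pi * random.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        if occluded(d):
            hits += 1
    return 1.0 - hits / samples   # 1.0 = fully open, 0.0 = fully occluded
```

Engines typically precompute (bake) or screen-space-approximate this term rather than tracing rays per frame, which is exactly the kind of fake being discussed.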

Originally Posted by Tiles:What is possible nowadays with realtime is stunning, for sure. But there will always be a need for both offline rendering and realtime rendering. Those are simply two different goals, which lead to two different solutions.

Realtime content is optimized for frame rate. The goal here is 60 fps, not the most realistic look possible. You have to fake a lot: GI, prebaked lightmaps, normal maps, etc.

Stills and movies aim for realism. Time doesn't matter as much here. You can work with huge polygon counts, use time-consuming render calculations, and so on.

For the Kite demo, it was running on a Titan X; they were shooting for quality first and performance after. It wasn't designed to run at 60 fps on the average PC.

There are doubts that game engines will become the norm for "serious" rendering in the future.

Basically, for something like Marmoset or Unreal, I believe there is no full raytracing implementation except for reflections. Everything else is based on DirectX or OpenGL. This is sometimes viewed as a "lower benchmark" against true full-calculation rendering.

So, I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But that doesn't mean an active chase toward the kind of OpenGL/DirectX solutions games use to "fake" things instead of doing the actual calculations (which will always be the goal of full renderers).

That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed, real-time rendering of models in the viewport and lead to a faster lighting model.

There are also developments to enable PBR for finished renders.

PBR, of course, got its start in games. Or at least, it was in video games that the rendering method became popular.


Originally Posted by CGIPadawan:Basically, for something like Marmoset or Unreal, I believe there is no full raytracing implementation except for reflections. Everything else is based on DirectX or OpenGL. This is sometimes viewed as a "lower benchmark" against true full-calculation rendering.

Hmm, no, it's not.
The fact that it uses OpenGL or DX for certain things doesn't mean everything the engine does is predicated on those libraries. That's a pretty gross misunderstanding of what the libraries do, what the tech does, and how those things mesh and are implemented.

Quote:So, I doubt that's going to happen. What has happened is a shift of resource usage to capitalize on GPUs. But that doesn't mean an active chase toward the kind of OpenGL/DirectX solutions games use to "fake" things instead of doing the actual calculations (which will always be the goal of full renderers).

You haven't really seen or used many of the tools discussed here, have you?

Quote:That said, there will be elements that make the transition. For example, the Blender Foundation just announced that a PBR mode will be implemented in Blender's viewport, which may allow more detailed, real-time rendering of models in the viewport and lead to a faster lighting model.

That is what the Unreal rendering engine, with the Disney approximation model for shader description, already does.
There is also nothing in the new Unreal PBR/PPS engine, or Unity's counterpart, that forces you to run it at 60 fps or anything like that. They can be used offline if you so wish.
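For concreteness, the core of that Disney-derived shading model is small enough to sketch. This is the GGX/Trowbridge-Reitz microfacet distribution term with Disney's roughness remapping (a sketch of the published formula, not engine source):

```python
import math

def ggx_distribution(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution term D(h), the
    specular-highlight shape used in Disney-derived PBR models.
    n_dot_h is the cosine between surface normal and half vector."""
    a = roughness * roughness            # Disney remapping: alpha = roughness^2
    a2 = a * a
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

Nothing in the math cares about frame rate: the same term can be evaluated per pixel at 60 Hz or sampled millions of times in an offline renderer, which is the point about the engines being usable offline.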

Quote:PBR, of course, got its start in games. Or at least, it was in video games that the rendering method became popular.

It certainly didn't. None of the acronyms people like to toss around (PBR, IBL, PPS) got their start in games, or even saw early adoption there.
Debevec and others were doing those things offline a full decade before games even caught the scent of it, and film was adopting them almost as fast as they came out of the oven.

The title of the thread is a silly question to begin with. What is happening is, quite simply, a mix of convergence in technology and the world catching up to heterogeneous hardware paths. Rendering is rendering these days, and a lot of effort is going toward unifying things (with varying degrees of success) in terms of models and descriptions. The main issue is that these efforts started late, and there is still a big gap between a lot of players, but something is better than nothing, and it will eventually smooth itself out.
Eventually things will converge and scale, and the occasional specialization will determine which product you use on which platforms.

None of those problems have much to do with the games-vs-film framing people bring to this (there is obviously such a gap, because the needs are different, but certainly not in the form or for the reasons perceived).
Common models, universal descriptors, ideal hardware-path abstraction, etc. are the current set of problems preventing full convergence. The whole games-vs-film thing, or CPU vs GPU, is being transcended and is only going to remain incidental for a few more years.

The current distinction, if you really want to draw one, lies more in the shaders than in the engine: things like post-processing and how it meshes into deferred rendering shaders, which offline rendering doesn't care much about, while it's pretty important at 30/60 Hz.
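To illustrate the deferred point: the geometry pass writes surface attributes into a G-buffer, and shading (plus post-processing) happens later over that buffer. A toy sketch, with names that are illustrative rather than any engine's API:

```python
def geometry_pass(meshes):
    """Rasterization stand-in: record surface attributes per 'pixel'
    instead of shading immediately."""
    return [{"albedo": m["albedo"], "normal": m["normal"]} for m in meshes]

def lighting_pass(gbuffer, light_dir):
    """Shade once per pixel from the G-buffer. Post-process shaders
    read these same buffers, which is why post and deferred shading
    have to be designed together."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [tuple(c * max(dot(px["normal"], light_dir), 0.0) for c in px["albedo"])
            for px in gbuffer]

pixels = geometry_pass([{"albedo": (1.0, 0.5, 0.25), "normal": (0.0, 0.0, 1.0)}])
shaded = lighting_pass(pixels, (0.0, 0.0, 1.0))
print(shaded)   # [(1.0, 0.5, 0.25)]
```

An offline path tracer has no equivalent constraint: it shades wherever a ray lands, so its shaders don't have to agree with a fixed set of screen-space buffers.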
