An intro to 3D on the Mac, Part II: Animation and Rendering

In Part II of our epic introduction to 3D on the Mac, you'll learn the basics …

Rendering and compositing

This is a pretty huge section with a lot to cover, so let's jump right into the features offered by professional 3D renderers. We'll also look at how they work with compositing applications, where the final stages of rendering and color grading often happen.

32-bit Floating-point Rendering

If you've ever done a series of levels or brightness/contrast tweaks in Photoshop, you've probably seen the kind of problems that arise when you do a lot of harsh edits on an 8-bit image:

Colors start getting patchy and posterized, noise becomes more prominent, and everything just goes to crap eventually. This degradation is due to "rounding errors" in integer-based images. It's less of a problem with 16-bit integer images but, to get the most out of a render, you should render to a floating-point image, where rounding errors are no longer an issue and white or black pixels aren't thrown away once they pass either end of the range. Making the same changes to a 32-bit rendering of the image above yields much better results: you can darken and lighten the whole thing without posterization. I often render separate 32-bit images for my different lights so I can finely control the lighting without having to re-render:

My can light composite in Nuke. Any banding you see is just video compression.
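If you want to see those rounding errors for yourself, here's a minimal sketch in Python with NumPy (my tooling choice for illustration, nothing the article's applications require). It runs the same darken-then-brighten edit repeatedly on an 8-bit image and on a 32-bit float image:

```python
import numpy as np

# The same mid-gray gradient in two precisions:
# 8-bit unsigned integers and 32-bit floats.
gradient = np.linspace(0.2, 0.8, 256)
img_8bit = (gradient * 255).astype(np.uint8)
img_float = gradient.astype(np.float32)

# Simulate a series of harsh edits: darken heavily, then brighten back.
for _ in range(10):
    img_8bit = (img_8bit * 0.3).astype(np.uint8)   # each edit rounds to integers
    img_8bit = np.clip(img_8bit * (1 / 0.3), 0, 255).astype(np.uint8)
    img_float = img_float * 0.3                    # the same edits, no rounding
    img_float = img_float * (1 / 0.3)

# The 8-bit image collapses to a fraction of its original tonal levels
# (posterization); the float image comes through essentially untouched.
print("unique 8-bit levels:", len(np.unique(img_8bit)))
print("unique float levels:", len(np.unique(img_float)))
```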

The other advantage of using a 32-bit frame buffer is that your rendered image will have a wider dynamic range, which is very important if you have a high-contrast daylight scene. But working with 32-bit images creates a separate problem: how to see them, since they contain data far beyond what can be displayed on the screen. This means that your renderer needs to do the job of an HDR conversion program like Photoshop or Photomatix, tone-mapping the resulting HDR image for display on the screen.
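To give a sense of what that tone-mapping step involves, here's a rough sketch of the classic Reinhard operator applied to linear HDR pixel values. Real renderers and tools like Photomatix use far more sophisticated and configurable operators, so treat this as illustration only:

```python
import numpy as np

def reinhard_tonemap(hdr, gamma=2.2):
    """Map linear HDR values (0..infinity) into the displayable 0..1 range.

    The Reinhard curve x / (1 + x) compresses unbounded highlights
    smoothly; the result is then gamma-encoded for an sRGB display.
    """
    ldr = hdr / (1.0 + hdr)       # compress the dynamic range
    return ldr ** (1.0 / gamma)   # encode for the monitor

# A pixel eight times brighter than anything a monitor can show still
# lands inside the displayable range instead of clipping to pure white.
print(reinhard_tonemap(np.array([0.18, 1.0, 8.0])))
```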

Linear Workflows

In the lighting section, I mentioned that once you involve a sun/sky system, you invariably get into the complexities of a linear workflow. Basically, a linear workflow is a solution to a problem with physically simulated light. All monitors display images with a gamma correction curve of around 2.2 (sRGB). Since this is the accepted norm, devices like digital cameras work around this standard (let's ignore larger color spaces like Adobe RGB for now), and all the images you see are encoded accordingly, so that they appear natural when shown on your screen. But when you add light to a 3D scene without compensating for this gamma, natural dynamic range and contrast become harder to achieve because the light intensities are being skewed by the gamma curve. The end result is that you can spend a bunch of time swinging light values back and forth, trying to get an image that isn't washed out. This is especially problematic with sun and sky systems, which deal with light at realistic intensities.
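To make the problem concrete with a hypothetical example (the numbers here are mine, not from any particular renderer): doubling a light should double the physical brightness of a pixel, but doing that math on gamma-encoded values gives the wrong answer.

```python
# Doubling light on a gamma-encoded pixel vs. a linear one,
# using the simple 2.2 power approximation of the sRGB curve.
GAMMA = 2.2

encoded = 0.5                  # mid-gray as stored in an sRGB image
linear = encoded ** GAMMA      # its actual physical intensity, ~0.22

# Right: double the physical light, then re-encode for display.
correct = (linear * 2) ** (1 / GAMMA)   # ~0.69, a brighter gray

# Wrong: double the encoded value directly, which is effectively
# what happens without a linear workflow.
wrong = encoded * 2                     # 1.0 -- blown out to pure white

print(correct, wrong)
```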

The solution is a linear workflow. "Linear" here doesn't mean the opposite of non-linear, as in games or video editing; it means working with a flat, linear gamma (1.0) inside your renderer. This process of removing the sRGB gamma curve is sometimes called a "degamma."

If you're finding this hard to picture, imagine the sRGB gamma curve is the current in a stream: anything you try to do in that stream has to be done while fighting the current. It's not that you couldn't get something done while fighting it, but your work will be a whole lot easier if you just turn off the current. That's the appeal of a linear workflow: it may not be essential for a good rendering but, once you sort out the workflow, it's a lot easier to get accurate light and contrast since everything is working in the same linear space. Getting there means dealing with the gamma problem. The exact workflow is particular to the renderer you're using but, for most applications, the process is similar: apply a gamma correction to the textures and colors used in your shaders.

The sphere at left without a gamma correction and the sphere at right with a gamma correction of 0.454 (the inverse of the sRGB 2.2 gamma) applied to the same sRGB image. Notice how blown out the highlights are in the uncorrected image and how washed out it looks overall.
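That correction is just a power function. Here's a minimal sketch of both directions of the conversion, using the simple 2.2 power approximation rather than the exact piecewise sRGB formula (the function names are mine):

```python
def srgb_to_linear(value, gamma=2.2):
    """Remove the sRGB encoding: the "degamma" applied to textures
    and color swatches before the renderer does its light math."""
    return value ** gamma

def linear_to_srgb(value, gamma=2.2):
    """Re-encode the finished linear render for an sRGB display.
    Raising to 1/2.2 (~0.454) is the inverse of the step above."""
    return value ** (1.0 / gamma)

# Round trip: a texture pixel is linearized for rendering, and the
# final frame is re-encoded on the way out to the display.
pixel = 0.5
assert abs(linear_to_srgb(srgb_to_linear(pixel)) - pixel) < 1e-6
```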

If you've ever opened an HDR image in Photoshop and it looked washed out, that's because it's encoded as linear but Photoshop assumes it's sRGB and applies the 2.2 gamma to it. You can apply a degamma in Photoshop with an Exposure adjustment:

You could do your 3D linear workflow degamma in Photoshop but it's better to do this non-destructively in your 3D program.

Does that seem tedious, having to gamma-correct every single color and texture node in a scene? Yes, it's very tedious. The better implementations of a linear workflow are in renderers like Maxwell Render, V-Ray, Cinema 4D R12, and Modo. These renderers handle the linear workflow behind the scenes, so you never have to concern yourself with per-texture gamma correction:

Cinema 4D R12's linear workflow is activated by default for new scenes.

Even color swatches and procedural textures are corrected. Blender 2.5, currently in beta, has a similar linear workflow. Autodesk started to implement a linear workflow for Mental Ray in Maya 2011 but it's only half done.

3D Stereoscopic Rendering

Whether you're a hater or a fanboy, 3D film and television is here and it's growing in popularity. I'll be honest—I don't know much at all about stereoscopic 3D rendering. If you're interested in learning stereo 3D, it's best to learn from a well-reputed source like fxPhD, since they work closely with people in the film industry. The workflows will likely be geared toward certain programs like Nuke and Maya, but the theory is the important thing to understand about stereoscopic rendering. If you don't understand the theory correctly, you won't be producing awesome animations, you'll be producing migraines.