The aim was to push for quality, so very high-resolution textures were used and the model has just over 1 million faces.

There is no baked lighting in this scene.

The first part of the demo has a fast-moving sun. The second part has more localized lighting: a spotlight from a fellow maintenance robot lights up the environment in addition to the headlight of the robot the viewer is piloting. In both parts there is considerable environment lighting.

Due to how the scene is laid out, there is a lot of bounced lighting and also quite distinct penumbrae caused by indirect lighting. For example, the v-shaped columns cast a very sharply defined indirect shadow onto the ceiling, which is especially visible in the nighttime part of the video.

Using high resolution real-time lightmaps

When the lighting changes, these penumbrae and the overall lighting gradients have to change significantly. To do this with global illumination, the Enlighten-powered real-time lightmaps feature was employed. Traditionally, Enlighten is used in-game at relatively low resolutions (1-2 pixels per meter). This works well because the bounced lighting is generally quite low-frequency.

In this demo, a much higher density is used to capture the fine details in the lighting: an overall density of 5 pixels per meter. There are about 1.5 million texels in the real-time lightmaps in total. The resolution screenshot below gives a sense of the density in relation to the scene size.
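As a rough sanity check on those numbers: at 5 pixels per meter, each square meter of lightmapped surface occupies 25 texels, so 1.5 million texels implies on the order of 60,000 m² of lightmapped surface. A back-of-the-envelope sketch (illustrative only; real chart packing adds padding and the density varies per surface):

```python
def lightmap_texels(surface_area_m2, pixels_per_meter):
    """Approximate texel count for a lightmap at a uniform density."""
    return surface_area_m2 * pixels_per_meter ** 2

# 1.5 million texels at 5 px/m implies roughly this much surface area:
implied_area = 1_500_000 / (5 ** 2)  # ~60,000 m2
```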

At this resolution, the precompute took about 2.5 hours. The scene is automatically split into systems in order to make the precompute phase parallelizable; this particular level was split into 261 systems. The critical path through the precompute (i.e. the sum of the most expensive job in each stage along the pipeline) is about 6 minutes, so there are significant gains to be made by distributing the precompute. Indeed, going forward, one of the things we will address is distribution of the GI pipeline across multiple computers and in the cloud. We will look into this early in the 5.x cycle.
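The gap between 2.5 hours serial and a 6-minute critical path is what makes distribution attractive: with enough workers, each pipeline stage only takes as long as its slowest per-system job. A toy illustration of the two bounds, using hypothetical job costs (the function names and numbers are made up for this sketch, not Enlighten internals):

```python
def serial_cost(stage_jobs):
    """Total time if every job runs one after another.

    stage_jobs: list of pipeline stages, each a list of per-system
    job costs (seconds).
    """
    return sum(sum(stage) for stage in stage_jobs)

def critical_path(stage_jobs):
    """Lower bound with unlimited workers: each stage is gated by
    its most expensive job."""
    return sum(max(stage) for stage in stage_jobs)

# Hypothetical costs for 4 systems across 3 pipeline stages.
stages = [[30, 10, 20, 5],
          [40, 35, 10, 10],
          [15, 25, 5, 5]]
```

Here `serial_cost(stages)` is 210 seconds while `critical_path(stages)` is 95 seconds, mirroring (at miniature scale) the 2.5 h vs. 6 min gap quoted above.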

Interactive lighting workflow

Once the precompute is done, the lighting can be tweaked interactively. Lights can freely be animated, added, removed and so on. The same goes for emissive properties and HDR environment lighting. This demo had two lighting rigs: one for daytime and one for nighttime. Both were driven from the same precompute data.

“I’m able to move the sun / time of day and change material colors without having to rebake anything. I can play with it in real-time and try combinations out. For a designer like me, working iteratively is not only easier and faster, but also more fun,” says Alex Lovett.

Lighting 1.5 million texels with Enlighten from scratch takes less than a second, and the lighting frame rate is decoupled from the rendering loop, so it does not affect the actual rendering frame rate. This was a huge workflow benefit for this project: interactive tweaking of the lighting across the animation, without interruption, drove up the final quality.

To make this a real-time demo, some rudimentary scheduling of the individual chart updates would have to be added, such that visible charts are updated in real time while occluded charts and charts in the distance are updated less aggressively. We will look into this early in the 5.x cycle.
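One way such scheduling could work is a simple per-frame budget: visible charts get updated first (nearest first), and any leftover budget goes to whichever hidden charts have gone longest without an update, so they still converge over time. A minimal sketch of that idea; the function, field names, and policy are hypothetical, not the actual Enlighten scheduler:

```python
def schedule_chart_updates(charts, budget):
    """Pick which lightmap charts to update this frame.

    charts: list of dicts with 'id', 'visible' (bool),
            'distance' (float), and 'staleness' (frames since
            the chart was last updated).
    budget: max number of chart updates allowed this frame.
    """
    # Visible charts first, nearest first.
    visible = sorted((c for c in charts if c['visible']),
                     key=lambda c: c['distance'])
    # Remaining budget goes to the stalest hidden charts.
    hidden = sorted((c for c in charts if not c['visible']),
                    key=lambda c: -c['staleness'])
    return [c['id'] for c in (visible + hidden)[:budget]]

example_charts = [
    {'id': 'a', 'visible': True,  'distance': 10, 'staleness': 0},
    {'id': 'b', 'visible': True,  'distance': 2,  'staleness': 0},
    {'id': 'c', 'visible': False, 'distance': 50, 'staleness': 7},
    {'id': 'd', 'visible': False, 'distance': 5,  'staleness': 3},
]
this_frame = schedule_chart_updates(example_charts, 3)
```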

Acknowledgements

A big thanks to Alex Lovett, owner of shadowood.uk, who has been tirelessly stress testing the GI workflow since it was in alpha. Also thanks to the Geomerics folks, especially Roland Kuck.

We use the standard PBS in this demo. The previous mobile demo used custom handwritten PBS shaders, since it was made before the PBS feature was ready. If we made it today, we would probably use the standard PBS shaders.

Man, this looks really nice. Would love to see tutorials when Unity 5 comes out, as even the blog posts on many Unity 5 features, like those in lighting, are quite beyond my reach. Granted, I don't have much experience, but I would love to be able to walk through how to implement some of these great features from a basic level (or relatively so; I might need to learn some things beforehand, as long as I can figure out what that would be).

The Learn team is doing a great job creating tutorials; see unity3d.com/Learn. There is a great deal more coming that will spill the beans on how to get the most out of PBS/GI/RefProbes in Unity 5. The team has created some awesome content that will be shown at GDC, so stay tuned.

Several Reflection Probes, with and without Box Projection. Getting them to look good is an exercise in frustration, but with enough fiddling you can get good results. There's a bunch of reflection tech coming that will add to the arsenal, which I can't wait for. Though I really want Aras's Planar Reflection to be adapted to support roughness (either overall or masked with a map), or at least a specular map, for use with large flat floor areas.

No Camera Plugin, just anchoring to other objects so you can pivot the camera / rotate it around, plus a lot of curve editing. A bug in Unity lets you drag the curves and handles while staying at a fixed place in the animation, which helped a lot (normally it would follow your edit); that should be coming as a real feature at some point :-)

Thanks for your reply! It is very helpful.
I will definitely share my project with you at some point. And yes, reflection probes are a bit stressful :)) especially after 3D packages and traditional visualisation.
Speak to you later.

The exposure changes from shot to shot with keyframing. Auto exposure is definitely something that is needed, though, even in a rudimentary form; generally the exposure is only slightly adjusted by hand here.
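A rudimentary auto exposure of the kind mentioned above is typically just a feedback loop: measure the scene's average luminance, compute the exposure that would bring it to a target (middle grey), and blend toward it over time so adaptation is gradual. A minimal sketch of one step of that loop; the function name, parameters, and constants are illustrative, not a Unity API:

```python
def adapt_exposure(current_exposure, avg_luminance,
                   target_luminance=0.18, adaptation_rate=0.05):
    """One step of a simple auto-exposure loop.

    Nudges exposure so the scene's average luminance drifts toward
    the target (0.18 ~ middle grey). adaptation_rate controls how
    quickly the 'eye' adapts per step; the epsilon guards against
    division by zero in a fully black frame.
    """
    desired = current_exposure * (target_luminance / max(avg_luminance, 1e-6))
    return current_exposure + (desired - current_exposure) * adaptation_rate
```

Called once per frame, a too-bright frame (average luminance above target) slowly lowers exposure, and a too-dark frame slowly raises it.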

Post effects: Amplify Color with a custom LUT done in Photoshop, Amplify Motion for motion blur, Vignette / Chromatic Aberration from the Standard Assets, Sonic Ether's Bloom / Dirty Lens, a bit of Noise, and the two Ambient Occlusion effects included with the Standard Assets. Sonic Ether has a cool-looking SSAO coming soon as well, but I couldn't get hold of it for this. A simple custom exposure post effect does the fade to black.

Really nice visualization with Unity 5. I'm really curious how all this is done, and I can't wait to get my hands on that scene somehow. Is there a slight chance we could access that scene for learning purposes? Might Unity use that scene for creating their GI tutorials? I would love that, because I've been playing with Unity 5 but didn't get near that quality, and I really want to know how he achieved it by looking at everything. I know people that would kill to get their hands on such a scene…. (looking around like… it wasn't me) ^_^

A custom Light Cookie that I spent way, way too much time on ;-) IES support in some form or another is coming, I think, but how useful IES profiles will really be I don't know, as they are hard to edit, and they have infinite falloff, so they can't be done 100% physically correct for performance reasons. Better to have a tool that turns an IES light into a 3D Light Cookie and lets you edit it. But yes, Cookies do quite a good job of getting most of the look of an IES light; it just takes patience to make a good map.. a lot of patience :-P

As discussed in the post, the resolutions were set so high that updating all the pixels leads to a lighting framerate that is interactive rather than real-time. However, as the lighting runs asynchronously, it does not affect the rendering framerate much, which meant Alex was able to use a very nice lighting workflow for authoring the demo. Once we add scheduling such that we only update the visible parts of the scene, we will be able to get this running in real time.

I have the Unity 5 RC, and I too have gotten amazing results. But I have two problems with the rendering system. The first is that real-time indirect shadows for point and spot lights aren't yet supported. I know that it might not seem that important, but in many cases you can see artifacts like light at the corners of walls. Second, and I think the worst issue with Unity 5 yet, is the reflection probes. The reflection baked by them is only good if the camera is at the origin of the reflection probe; otherwise they look really bad. A workable hack is to set the reflection probe type to custom and use scripts from the community (also available in my free asset Realtime Reflections) to render a cubemap from the camera and assign it to the reflection probe. And I just remembered, the Resources folder in Unity 5 RC1 seems to have no effect, which means assets like Shadow Softener don't work at all! I will file a bug report tomorrow. Shadow Softener or one of its alternatives is a must for me.

Arav, the hack for the reflection probes that you mention might work for your specific use case, but it's not something you should generally recommend. In general, reflection probes are great for two setups: 1. objects reflecting an environment that is assumed to be infinitely far away (say, a car reflecting the sky); 2. objects reflecting an environment that is assumed to lie on the faces of the box gizmo (say, the floors and walls of a room, because rooms are typically cuboid). For the second case you need to enable box projection on the Reflection Probe component.
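The idea behind box projection is that instead of looking up the cubemap with the raw reflection direction (which is only correct at the probe's capture point), you intersect the reflection ray with the probe's box and look up the direction from the capture position to that intersection point. A minimal sketch of that math for an axis-aligned box, written in Python for clarity (in Unity this correction happens in the shader; the function name here is made up):

```python
def box_project(position, reflect_dir, box_min, box_max, probe_pos):
    """Box-projected cubemap lookup direction.

    position:    world position of the shaded point (3-tuple)
    reflect_dir: reflection direction at that point (3-tuple)
    box_min/max: corners of the axis-aligned projection box
    probe_pos:   capture position of the reflection probe
    Returns the direction from the probe to where the reflection
    ray exits the box, i.e. the corrected cubemap lookup vector.
    """
    # Find the smallest positive t where position + t*dir hits a box face.
    t = float('inf')
    for i in range(3):
        d = reflect_dir[i]
        if d > 1e-6:
            t = min(t, (box_max[i] - position[i]) / d)
        elif d < -1e-6:
            t = min(t, (box_min[i] - position[i]) / d)
    hit = tuple(position[i] + t * reflect_dir[i] for i in range(3))
    return tuple(hit[i] - probe_pos[i] for i in range(3))
```

With this correction, two points on a floor reflecting the same wall fetch different cubemap texels, which is why box projection fixes the "only looks right at the probe origin" problem for roughly cuboid rooms.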

Screen space reflections can improve a bunch of cases not handled well by the current system. More on that in the near future.

Question: How soon before we can start playing with (true) multithreaded scripting?

There are no plans ATM to make the entirety of the Unity API thread-safe as we don’t think that is a good approach to the problem. Of course, on a case-by-case basis, it can make sense for specific APIs.

There are, however, plans/thoughts/ideas to provide increased opportunity for jobifying scripted logic and running those jobs on Unity’s own job system.