Media Molecule’s PS4 exclusive Dreams is extremely interesting, not only for the creativity that pervades its gameplay, but also for the intensive rendering experimentation that led to the unique look we saw at E3.

That look is the fruit of many failures that led to the final success, and Media Molecule Co-Founder and Technical Director Alex Evans showcased them in a talk at Siggraph titled “Learning from Failure: a Survey of Promising, Unconventional and Mostly Abandoned Renderers for ‘Dreams PS4’, a Geometrically Dense, Painterly UGC Game.”

As a high-level introduction, here’s the description of the talk, which explains how the game’s rendering engine works:

“Over the last 4 years, MediaMolecule has been hard at work to evolve its brand of ‘creative gaming’. Dreams has a unique rendering engine that runs almost entirely on the PS4’s compute unit (no triangles!); it builds on scenes described through Operationally Transformed CSG trees, which are evaluated on-the-fly to high resolution signed distance fields, from which we generate dense multi-resolution point clouds. In this talk we will cover our process of exploring new techniques, and the interesting failures that resulted. The hope is that they provide inspiration to the audience to pursue unusual techniques for real-time image formation. We will chart a series of different algorithms we wrote to try to render ‘Dreams’, even as its look and art direction evolved. The talk will also cover the renderer we finally settled on, motivated as much by aesthetic choices as technical ones, and discuss some of the current choices we are still exploring for lighting, anti-aliasing and optimization.”

In the beginning, Evans worked on PC using PS Move controllers; the idea was to use platonic shapes with simple distance fields as primitives. Each primitive was called an “edit,” and it was possible to add and subtract them, color them in different hues, and blend them in either a hard or a soft way.
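To give a rough idea of what such distance-field “edits” can look like, here is a minimal Python sketch. The function names and the polynomial smooth-minimum blend are illustrative assumptions, not Media Molecule’s actual code:

```python
import math

def sd_sphere(p, center, radius):
    # Signed distance to a sphere: negative inside, positive outside.
    return math.dist(p, center) - radius

def op_add(d1, d2):
    # Hard "add" edit: the union of two shapes.
    return min(d1, d2)

def op_subtract(d1, d2):
    # Hard "subtract" edit: carve the second shape out of the first.
    return max(d1, -d2)

def op_soft_add(d1, d2, k):
    # Soft "add" edit via a polynomial smooth minimum;
    # k controls how wide the blend region is.
    h = max(k - abs(d1 - d2), 0.0) / k
    return min(d1, d2) - h * h * k * 0.25
```

Evaluating the combined field at many sample points yields the sculpt’s surface wherever the distance crosses zero.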

Models were made of 1 to 100,000 edits, and below you can see, among others, the “Dad’s head” model, made of 8,274 edits.

Then it was time to run animation tests on the models, but unfortunately the resulting data proved too large, and that wasn’t convenient for uploading, which is important for this kind of game. To address that, an “evaluator” was written to evaluate the number of edits affecting each voxel of the model.
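As an illustration of what such an evaluator might do, here is a hypothetical Python sketch that conservatively counts which edits can influence a given voxel. The sphere-of-influence model and all names are assumptions made for clarity, not the talk’s actual algorithm:

```python
import math

def count_edits_per_voxel(voxel_center, voxel_half_size, edits):
    # edits: list of (center, radius, blend_width) spheres of influence.
    # An edit is counted if its region of influence can touch the voxel.
    voxel_radius = voxel_half_size * math.sqrt(3)  # voxel's bounding sphere
    count = 0
    for center, radius, blend in edits:
        reach = radius + blend  # soft blends extend an edit's reach
        if math.dist(voxel_center, center) <= reach + voxel_radius:
            count += 1
    return count
```

Edits whose influence region misses a voxel entirely can be skipped when evaluating that voxel, which keeps the cost proportional to local detail rather than to the whole model.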

The first renderer was called “engine 1: the polygon edition,” built on the original histopyramids marching cubes. It worked, but it created very dense meshes with mushy edges and slivers. Edges were eventually improved, but problems persisted.

Another attempt was made with Gigavoxels.

This led to the second engine attempt, named the “Brick Engine,” born from a hybrid between gigavoxels and volumetric billboards. It ended up being the main engine many of the studio’s artists used for a couple of years. Unfortunately, it was too slow and created unavoidable artifacts.

The third engine attempt was the “Refinement Renderer.” It didn’t last long, and as such there aren’t many screenshots of it, but Evans believes it could be used for future projects.

Unfortunately it was four to ten times slower than what was needed to run on PS4, barely reaching 30 FPS even in the scenes showcased below. Memory for all the gigavoxel bricks was also an issue.

At this point, Evans realized that directly rendering the distance field sculptures left very little room for imagination and made creating good-looking sculptures difficult, clashing with the game’s art direction. The results just looked like untextured models in Unreal Engine, but slower. While promising, it was not right for Dreams.

It was the beginning of 2014, and all the engine prototypes had been rejected; the Art Director believed that none of them actually fit the game.

Some assets were produced using the hard variant of the brick engine, and the ones that looked good did so more thanks to the art than to the engine itself. They still looked like the output of an untextured polygonal engine, with all the additional memory and time costs. The studio was growing uneasy about it.

The disparity between the concept art and the finished product was also growing. This caused a showdown between Art Director Kareem Ettouney and Evans. The former pointed at an oil painting, saying that he wanted the game to look like that. Evans countered that the game engine was a re-interpretation of the art, but Ettouney insisted that he wanted the game to look literally like the oil painting. The argument took hours.

That caused Evans to go back and check the results of the evaluator, looking at point clouds instead of gigavoxel-style bricks. The new plan was to generate dense point clouds on the surface of the sculpts.

This is the current solution, and below you can see an example, with points thinned by a factor of two to show what’s going on.

As a result, a scene in Dreams is now composed of a whole cloud of sculpts, each of which is a point cloud, and each point in that cloud becomes a mini point cloud as well when it gets close to the camera. Basically, the current engine is “a cloud of clouds of point clouds.”
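That “cloud of clouds of point clouds” idea can be pictured as a recursive level-of-detail selection. This hypothetical Python fragment refines a node into its own mini cloud only when the camera is close enough; the data layout and the distance threshold are assumptions for illustration:

```python
import math

def collect_points(node, camera_pos, refine_factor=50.0):
    # node = (position, radius, children); children is a list of finer nodes.
    # Far from the camera, the node renders as a single point; near the
    # camera, it expands into its child point cloud.
    position, radius, children = node
    distance = math.dist(position, camera_pos)
    if children and distance < refine_factor * radius:
        points = []
        for child in children:
            points.extend(collect_points(child, camera_pos, refine_factor))
        return points
    return [position]
```

The same rule applies at every level, so a sculpt far away costs a handful of points while a nearby one resolves into dense detail.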

This also allows for an interesting implementation of depth of field. Instead of blurring in post-processing, you simply “explode” the points a little bit. It’s not a blur, but Evans considers it cool, and it’s the only depth of field effect in the game at the moment. You can see a test below.
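A toy version of that “explode the points” depth of field might look like the following Python sketch, which scatters each point within an assumed circle of confusion rather than blurring the image afterwards. The linear circle-of-confusion formula is a simplification, not the game’s actual math:

```python
import math
import random

def explode_for_dof(points, focal_depth, aperture, rng=None):
    # Scatter each (x, y, z) point inside its circle of confusion;
    # the scatter radius grows with distance from the focal plane.
    rng = rng or random.Random(0)
    out = []
    for x, y, z in points:
        coc = aperture * abs(z - focal_depth)  # 0 at the focal plane
        angle = rng.uniform(0.0, 2.0 * math.pi)
        r = coc * math.sqrt(rng.random())      # uniform sample of a disc
        out.append((x + r * math.cos(angle), y + r * math.sin(angle), z))
    return out
```

Points exactly at the focal depth stay put, while out-of-focus points spread out, which is why the effect reads as defocus rather than a screen-space blur.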

Here are a few examples with pretty “vanilla” lighting, made with deferred shading and a cascaded shadow map sun.

Below you can see the results of the screen space ambient occlusion used at the moment.

At this point we’re finally in 2015, and the screenshots below are pretty recent work-in-progress material, with more complex lighting and a view of the point cloud (with one side thinned to show how it works).

Finally, you can check out some more artwork, and images from the presentation.

Ultimately, this is extremely intriguing, and I’m actually surprised that this kind of tech runs on the PS4. If I was quite excited about Dreams before, now I’m rabidly eager to see it in action, even just for its value in the exploration of new visual technologies.

Just remember that all the images in this article are strictly work-in-progress, including the ones at the bottom, which belong to the current rendering engine. It’ll be interesting to see how they’ll compare with the final game, down the line.
