Realization

"I only had a few seconds to realize it: if this is me, this is not my world. The broken headset, real or not. My wife and my son, both blind as I was. And then, them: observers or masters, two for each one of us, two for each puppet. Staring at them as they stare at me, probably for just a few seconds before I and my feelings simply become another failed iteration. And then, a new instance of my consciousness will be rendered again in this metaverse, but I won't know it."

Humanity has been lost: a few decades have passed since the last human being, a woman, died. But we were ready: we had found a new home and its nearly infinite energy. So we built them to pursue the ultimate goal for us, even after our death: to bring the human species back to life. And now, only one last step is missing: rendering back our feelings, gathered from our memories.

The concept

Realization is a personal project I conceived to explore visual storytelling through VR content. The occasion was provided by the Oculus & OTOY "Render the Metaverse" contest, which required the submission of a huge 360° stereo panorama image, rendered in OTOY's Octane and viewed on a Samsung Gear VR. I spent all the free time I had in one week on this project.

The final image is a first-person view of someone inside a futuristic nest, surrounded by a world he can't recognize. A broken virtual head-mounted display in front of him is being scanned; the memories he was living have been interrupted. He discovers holograms of his wife and son in other nests around him. They are blinded by visors, unaware of the mechanical observers guarding them. Then he discovers that he is being observed too. He finally realizes that he is actually the product of a failed simulation of the life he once lived. These are probably the last instants before he is shut down: another instance of his consciousness will replace him.

The setting is a world that may not be the Earth. The artificial intelligence we once developed is now trying to bring us back from extinction. After years of work, there is only one thing the machines can't properly recreate: our feelings. So they keep researching, while solar and geothermal energy sustain them in this quest. But sometimes little accidents happen: a simulated human instance becomes aware of what's going on outside its metaverse.

Inside the nest: the diagnostic scan panel, the virtual Oculus and the memories load status

Making-of

I did this project with Maya 2014 and the Octane plugin for Maya. I modelled using a subdivision workflow to obtain perfectly smooth lines, even in high-resolution renders. I had very little time to complete it, so I left some triangles in certain spots :) Each nest counts 2.7M triangles, and the whole scene was around 35M triangles. Even so, the scene remained fully interactive because I made careful use of both Maya instances and Octane geometry scattering. I also laid out simple UVs for all the meshes, in order to later use a UDIM texturing workflow.
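The payoff of instancing can be sketched with a back-of-the-envelope calculation using the triangle counts above; the nest count and per-triangle byte cost below are illustrative assumptions (positions only, no normals or UVs), not measurements from the actual scene:

```python
# Why instancing keeps a 35M-triangle scene interactive: unique geometry is
# stored once, and each copy only adds a transform (ignored here).
TRIS_PER_NEST = 2_700_000   # from the article
SCENE_TRIS = 35_000_000     # from the article
BYTES_PER_TRI = 3 * 3 * 4   # 3 vertices x 3 floats x 4 bytes, positions only

def geometry_bytes(unique_tris: int) -> int:
    """Rough memory cost of storing the given number of unique triangles."""
    return unique_tris * BYTES_PER_TRI

naive = geometry_bytes(SCENE_TRIS)          # every copy stored explicitly
instanced = geometry_bytes(TRIS_PER_NEST)   # one nest mesh shared by all copies

print(f"naive: {naive / 1e9:.2f} GB, instanced nests: {instanced / 1e9:.2f} GB")
```

The ratio, not the absolute numbers, is the point: the instanced scene costs roughly one nest's worth of geometry instead of the full 35M triangles.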

The Observers are obviously inspired by modern industrial robotic arms. I also added a few custom ideas to the classic "6 degrees of freedom" design. One is the hydraulic Stewart platform in the base, which allows some unexpected flexibility in their poses.

The disassembled head-mounted display hologram is a tribute to the Oculus Rift designs dating from 2013. The scanning sheet highlights the malfunction in the left lens that led to the viewer's accidental freedom.

The UI in the panels was painted and assembled in Photoshop. I created a small library of custom widgets to combine. The left diagnostic panel shows the scan of the Oculus logic scheme. The right one monitors the viewer's emotions and shows the memories to be loaded. In this case, a sunny day I spent with my wife and my son in the country some weeks ago :)

To create the hologram of the woman, I scanned a pose of my wife using Agisoft PhotoScan: it proved very efficient for getting a rough base mesh in a matter of minutes. I then retopologized the scanned geometry in TopoGun and projected the texture back onto the new mesh.

Most surfaces have a glossy material to give a "painted coat" feeling. First I baked some maps (curvature, AO, object-space normals, etc.) and used Quixel DDO to quickly obtain diffuse, specular and roughness textures. To keep more freedom at render time, the specular and roughness maps were meant to be multiplied by procedural values (Octane RGB Spectrum or Float textures). Finally, to get more dust and dirt in some places, I added a fully procedural layer combining several different Octane Dirt and Noise textures.
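The multiply-at-render-time trick is simply a per-texel product clamped to the range a shader input expects. A minimal Python sketch of the idea (the function name is illustrative, not Octane's node API):

```python
def multiply_map(baked: float, procedural: float) -> float:
    """Combine a baked texture sample with a procedural scalar,
    clamped to the [0, 1] range a roughness or specular input expects.
    Tweaking the procedural value retunes the whole material without rebaking."""
    return max(0.0, min(1.0, baked * procedural))

# A baked roughness of 0.6 dialed down by a 0.5 procedural float gives 0.3;
# out-of-range products are clamped back into [0, 1].
print(multiply_map(0.6, 0.5))
print(multiply_map(0.9, 2.0))
```

The clamp is what makes the trick safe: the procedural multiplier can exceed 1.0 for artistic control without ever producing an invalid roughness value.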

The Oculus hologram material uses a curvature map and some falloff nodes to control its emission and opacity. I used similar materials on the UI panels and on the holograms of my wife and my son.
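A falloff node of this kind is essentially a facing-ratio curve: the surface is transparent where it faces the camera and glows toward grazing angles. A minimal sketch of that idea, assuming a simple power falloff (not Octane's exact node math):

```python
def falloff_opacity(cos_theta: float, exponent: float = 3.0) -> float:
    """Facing-ratio falloff for a hologram edge glow.
    cos_theta is dot(surface normal, view direction): 1.0 means the surface
    faces the camera head-on (fully transparent), 0.0 means a grazing
    angle (fully opaque). The exponent sharpens the transition."""
    return (1.0 - abs(cos_theta)) ** exponent

print(falloff_opacity(1.0))  # head-on -> 0.0
print(falloff_opacity(0.0))  # grazing -> 1.0
```

Plugging a curve like this into both emission and opacity is what gives holograms their bright rim and see-through center.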

The sky was a fun challenge. I chose a great HDRI panorama from Dutch Skies 360° as a starting point. To add the "cloud sea" I followed a quick HDR compositing workflow. First I created an Octane scene with an infinite plane and the chosen sky. Then I assigned to the plane a fog-like specular material featuring displacement and subsurface scattering. I rendered the scene on Octane Render Cloud using a spherical camera. Finally I composited the clouds layer on top of the HDRI in Photoshop, fixed the horizon, painted some storm and voilà.
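Under the hood, the Photoshop step boils down to the standard "over" composite applied to linear HDR values. One channel of it, sketched in Python:

```python
def over(fg: float, bg: float, alpha: float) -> float:
    """Composite a foreground sample over a background sample using the
    foreground's alpha - the operation behind layering the rendered
    cloud sea on top of the sky HDRI, one channel at a time."""
    return fg * alpha + bg * (1.0 - alpha)

# An opaque cloud pixel hides the sky; a transparent one lets it through.
print(over(0.8, 0.2, 1.0))  # -> 0.8
print(over(0.8, 0.2, 0.0))  # -> 0.2
```

Working in linear HDR (rather than on tone-mapped pixels) is what keeps the bright sky values intact through the composite.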

The final stereo image (to be viewed in Gear VR) is a huge 28-Mpixel render (18432 x 1536 pixels, John Carmack's VR cubemap format). On my workstation the estimated render time was over 10 days. Fortunately it took less than 2 days on Octane Render Cloud :)
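The resolution follows directly from the layout: as I understand the format, six 1536x1536 cube faces per eye are packed side by side in a single row, times two eyes for stereo. A quick check of the arithmetic:

```python
FACE = 1536          # pixels per cube-face edge
FACES_PER_EYE = 6    # one full cubemap per eye
EYES = 2             # stereo: left + right

width = FACE * FACES_PER_EYE * EYES
height = FACE
megapixels = width * height / 1e6

print(f"{width} x {height} = {megapixels:.1f} Mpixels")  # 18432 x 1536 = 28.3 Mpixels
```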

Conclusion

This project was a great way to approach visual storytelling through VR content. I also improved my Octane skills and once again found it an invaluable tool for creating great renders very quickly. I would have liked to spend more time properly designing and texturing all the elements, but I'm quite happy with the outcome of this one-week spare-time project. See you in the Metaverse! :)