
Google’s newly announced Seurat rendering tech purportedly makes use of ‘surface light-fields’ to turn high-quality CGI film assets into detailed virtual environments that can run on mobile VR hardware. The company gave Seurat to ILMxLab, the immersive entertainment division of Industrial Light and Magic, to see what they could do with it using assets directly from Star Wars.

Google says Seurat makes use of something called surface light-fields, a process which involves taking original ultra-high-quality assets, defining a viewing area for the player, then sampling possible perspectives within that area to determine everything that could possibly be viewed from inside it. The high-quality assets are then reduced to a significantly smaller number of polygons (few enough that the scene can run on mobile VR hardware) while maintaining the look of the originals, including perspective-correct specular lighting.
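Google hasn't published the details of the pipeline described above, but the core idea (sample viewpoints inside a bounded viewing area, keep only the geometry any of them can see, discard the rest) can be sketched in a few lines. The following toy Python is a loose illustration under assumed simplifications: the "scene" is just a set of surface points, and a trivial distance check stands in for a real renderer's visibility tests; none of the names come from Seurat itself.

```python
import itertools

def sample_viewing_positions(center, half_extent, steps):
    """Sample candidate head positions on a grid inside the viewing box
    (the 'headbox' the player is allowed to move within). steps >= 2."""
    axes = [
        [center[i] - half_extent + 2 * half_extent * s / (steps - 1)
         for s in range(steps)]
        for i in range(3)
    ]
    return list(itertools.product(*axes))

def visible_surfels(position, scene, max_distance):
    """Toy visibility test: a surface point counts as 'seen' from this
    position if it lies within max_distance. A real pipeline would
    render the scene from each viewpoint instead."""
    return {
        p for p in scene
        if sum((p[i] - position[i]) ** 2 for i in range(3)) <= max_distance ** 2
    }

def build_simplified_scene(scene, center, half_extent, steps, max_distance):
    """Union of everything visible from any sampled viewpoint: geometry
    that no viewpoint inside the box can see is dropped, which is what
    shrinks the asset down to mobile-friendly size."""
    kept = set()
    for position in sample_viewing_positions(center, half_extent, steps):
        kept |= visible_surfels(position, scene, max_distance)
    return kept
```

For example, with a half-meter headbox at the origin, a surface point 100 units away is never "seen" from any sampled position and is culled, while nearby points survive. The real system additionally bakes view-dependent shading into the retained surfaces, which is how effects like specular highlights stay perspective-correct.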

As a proof of concept, Google teamed with ILMxLab to show what Seurat could do. In the teaser video heading this article, xLab says they took their cinema-quality CGI renders, the kind that would normally take a long time to render each individual frame of final movie output, and ran them through Seurat so that they could play back in real time on Google's mobile VR hardware.

“When xLab was approached by Google, they said that they could take our ILM renders and make them run in real-time on the VR phone… turns out it’s true,” said Lewey Geselowitz, Senior UX Engineer at ILM.

Star Wars Seurat Preview

When I put on the headset I was dropped into the same hangar scene as shown in the video. And while there's no replacing the true high-quality ray-traced output that comes from the cinematic rendering process (which can take hours for each frame), this was certainly some of the best graphics I've ever seen running on mobile VR hardware. In addition to sharp, highly detailed models, the floor had dynamic specular reflections, evoking the same sort of lighting you would expect from some of the best real-time visuals running on high-end PC headsets.

What’s particularly magic about Seurat is that—unlike a simple 360 video render—the scene you’re looking at is truly volumetric, and properly stereoscopic no matter where you look. That means that when you move your head back and forth, you’ll get proper positional tracking and see parallax, just like you’d expect from high-end desktop VR content. And because Google’s standalone headset has inside-out tracking, I was able to walk around the scene in a room-scale area, with a properly viewable volume that extended all the way from the floor to above my head.

I’ve seen a number of other light-field approaches running on VR hardware, and typically the actual viewing area is much smaller, often just a small box around your head (and when you exit that area the scene is no longer rendered correctly). That’s mainly for two reasons: first, it can take a long time to render large areas, and second, large areas create huge file sizes that are difficult to manage and often impractical to distribute.

Google says that Seurat scenes, on the other hand, result in much smaller file sizes than other light-field techniques. So small that the company says that a mobile VR experience with many individual room-scale viewing areas could be distributed in a size that’s similar to a typical mobile app.