The Heretic, Megacity release, real-time ray tracing, and more news from GDC 2019

We kicked off GDC 2019 in the best way possible – with a keynote filled with major announcements, jaw-dropping demos, and appearances from some of the industry’s most storied franchises. Read on to get the highlights.

Create, operate, monetize

Whether you’re a developer opening the Unity Editor for the first time, or an artist reshaping the world with the power of real-time rendering, you inspire us to keep making Unity a better, more performant, and empowering platform. We believe the world is a better place with more creators in it and we’re here to build the Unity platform that serves you. Today we support your ideas and success by providing: the technology to create, the services to operate, and the platform to monetize.

We help you CREATE the best product. You can work efficiently across teams worldwide with developer and artist workflows that support you throughout the process.

Once your game is live, we help you OPERATE in the real world with services that welcome all creators, regardless of engine.

Your passion to create drives you, and we support that passion with one of the most successful video ad networks in the world so that you can keep creating. We help you drive success with solutions to MONETIZE your game, acquire new users, and grow your community.

Technology to create

The Heretic

The Heretic is a short film by Unity’s award-winning Demo Team, the creators of Adam and Book of the Dead. We unveiled a teaser during the GDC keynote, with a full short film release coming later this year. The current demo runs at 30 fps on a high-end gaming PC.

The Demo Team uses and stretches Unity’s latest and greatest improvements in graphics. The work on The Heretic has helped us evolve our next-level rendering capabilities, and is our exploration of the problem of creating realistic CG humans.

The demo is built on Unity 2019.1, with various powerful customizations on top of Unity’s Scriptable Render Pipeline (SRP) rendering architecture. Using the latest installment of Unity’s High Definition Render Pipeline (HDRP) with its integrated post-processing stack allowed the team to achieve a cinematic look that closely emulates how physical cameras work – and renders in real time.

The Heretic is also Unity’s first ever showcase of real-time digital humans. The internal production team combined data acquired through both 3D and 4D scanning, and then built a complete pipeline from data acquisition to real-time rendering in Unity, integrating offerings from several commercial service providers. On the rendering side, the team created all necessary shaders on the basis of Unity’s HDRP to achieve the desired look.


Next-generation graphics

Lightweight Render Pipeline

Since introducing SRP last year, we’ve been listening to your feedback and evolving our render pipelines to give you even more control and flexibility. We’re happy to say our Lightweight Render Pipeline (LWRP) is now out of preview and will be ready for production use in the 2019.1 release.

This rendering pipeline has been designed from the ground up to offer the most scalable solution for Unity graphics, with a lean, optimized approach that delivers optimal performance across the widest range of supported Unity platforms. We reworked shaders, batching, and rendering to be lighter on bandwidth and give you the best performance. That’s why LWRP delivers visual quality as well as faster rendering on performance-constrained platforms.

See how much of a difference LWRP can make for your project. During 2019, we’re adding a lot of cool features and new renderers. We’re excited to see what you can build with LWRP!

Real-time ray tracing

We are working closely with NVIDIA to solve the hard problem of bringing ray tracing to everyone. Once thought impossible to achieve in real time, performant ray tracing via the NVIDIA RTX platform delivers photorealistic image quality and lighting for any task where visual fidelity is essential – such as design, engineering, or marketing – at a fraction of the time of offline rendering solutions. The technology gives creators a key advantage because it mimics the physical properties of light, allowing developers to build and dynamically alter creations that blur the line between real time and reality.

To highlight what is achievable with real-time ray tracing, we’ve collaborated with NVIDIA and the BMW Group to showcase the 2019 BMW 8 Series Coupe in our demo “Reality vs Illusion: Unity Real-time Ray Tracing”. The video contains 22 sequences in total: half were filmed on location with a real-world car, and the other half feature a CG vehicle. The CG car is rendered in Unity at 4K at interactive frame rates, using the power of ray tracing in HDRP on an NVIDIA RTX 2080 Ti.

Unity leverages the full power of HDRP, combined with the best implementations of real-time ray-tracing algorithms for global effects and high-end rendering, to deliver best-in-class visuals. Unity’s ray tracing technology is built on top of HDRP with C# scripts and shaders, making it quick to reload and iterate. We provide a wide variety of ray-traced effects, but if you want to create your own, hybrid HDRP gives you full access to all of its runtime resources, and you can swap between rasterization and ray tracing at any time.

We are delivering production-focused real-time ray tracing, with a full Preview coming in HDRP in fall 2019, including an early Preview of a full-frame path tracer.

Even better, you can start working with an Experimental HDRP build on GitHub on April 4.

Real or rendered – can you tell the difference between the real-time car and the physical car?

Data-Oriented Technology Stack (DOTS)

Burst Compiler

Our new Burst compiler technology takes C# jobs and produces highly optimized machine code. It’s out of preview with 2019.1, and ready for production. Join the 2019.1 beta to try it out now.
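As an illustration, here is a minimal sketch of what a Burst-compiled job can look like. The job and field names are hypothetical; the `[BurstCompile]` attribute, the `IJob` interface, and `NativeArray` come from Unity’s Burst, Jobs, and Collections packages, and the script only runs inside a Unity project.

```csharp
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;

// Hypothetical example: tagging a C# job with [BurstCompile] lets the
// Burst compiler translate its Execute() method into optimized machine code.
[BurstCompile]
public struct ScaleJob : IJob
{
    [ReadOnly] public NativeArray<float> Input;
    public NativeArray<float> Output;

    public void Execute()
    {
        // Tight numeric loops like this are where Burst shines.
        for (int i = 0; i < Input.Length; i++)
            Output[i] = Input[i] * 2f;
    }
}
```

Scheduling the job (for example, `new ScaleJob { Input = input, Output = output }.Schedule()`) works exactly as with any other C# job; Burst changes how the code is compiled, not how it is written.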

Megacity shows how big an impact DOTS already has on large productions today, with the Unity 2019.1 beta and the new prefab workflows released with Unity 2018.3. We’re now releasing it for you to download. Take it apart, experiment with it, and get inspired for your upcoming projects.

A few developers from Nordeus took on the Megacity demo to show how DOTS and LWRP help seamlessly scale a PC project down to mobile platforms. Read our case study to learn more about the production of the mobile version of this demo.

Physics for DOTS projects

We’re excited to announce our partnership with Havok to build a complete physics solution for DOTS-based projects in Unity.

We wanted to give the power back to you, the creators, and give you the options that best fit your production needs. Whether you need a simple, lightweight physics engine or a robust, high-end one, you’ll have what you need to create rich, interactive, and dynamic worlds.

These solutions both use the DOTS framework, allowing us to create a single data layer that is shared between Unity Physics and Havok Physics. This allows you, as creators, to build your content and game code once and have it work seamlessly with both solutions.

Game Foundation

In the past, some of the most common game systems, such as inventory, currency management, in-game stores, player progress, and persistent game states have taken a lot of valuable engineering time to develop. These game systems are present in almost every successful free-to-play mobile game. Time spent building these common systems could be better used focusing on unique gameplay instead.

We are going to help you get this time back so that you can focus on what matters – making your game truly unique and fun. We want to save you weeks or months of engineering time with Game Foundation.

AR Foundation

It’s an exciting time for developing AR as new features and technology are being improved and released across both handheld and wearable devices. People are getting used to the idea of using their phones for more. From taking pictures with dinosaurs at the park to visualizing what furniture would look like in their home before they buy it, we’re just scratching the surface of the potential of AR.

AR Foundation is a framework specifically for AR creators that gives you the power to build your app once and deploy it to both ARKit and ARCore devices. We’ve added features to help you overcome some of the common challenges of AR development, such as anchoring digital objects in the real world, improving the visual fidelity of those digital objects, and more.
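As a sketch of how this cross-platform abstraction looks in practice, the component below reacts to surfaces detected by AR Foundation. The component and log message are hypothetical; `ARPlaneManager` and its `planesChanged` event are part of the AR Foundation package, and the script requires a Unity project with AR Foundation installed.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: reacting to planes detected by AR Foundation's
// ARPlaneManager, which works on both ARKit and ARCore devices.
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Each detected plane is a point where digital content can be anchored.
        foreach (var plane in args.added)
            Debug.Log($"New plane detected at {plane.transform.position}");
    }
}
```

Attached alongside the `ARPlaneManager` (typically on the AR Session Origin), the same script runs unchanged on ARKit and ARCore devices.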

One of the features we’re most excited about is the AR Remote. Delivered as part of AR Foundation, it significantly reduces iteration time by streaming sensor data from an AR-enabled mobile device directly into the Unity editor. This allows developers to see changes on their target device in real time without having to publish to the device.

AR is just another build platform for Unity. All of the workflows, features, and new tech we’re building for Unity, from the Lightweight Render Pipeline (LWRP) to the Entity Component System (ECS), work for mobile AR development. Not only can you use the workflows you know and love to create mobile AR experiences, but as Unity mobile game creators, you can take the assets you’ve created for your non-AR experiences and use them to build an AR experience that looks great, runs smoothly, and is available on the one billion AR-enabled mobile devices around the world today.

Services to operate

Vivox

The most recent addition to the Unity family, Vivox is the leading provider of voice and text chat and is trusted by some of the biggest multiplayer online games like Fortnite, PUBG, and League of Legends. Vivox is a managed, hosted service, and the proven solution for delivering the best player communication experience. The Vivox SDK supports all major platforms and is easy to integrate with any engine. During the keynote, we announced that Unreal developers will be able to start integrating Unity’s Vivox service directly into their Unreal games in a few weeks.

Multiplay

Multiplay is the leading game server hosting provider for some of the largest online multiplayer games, like PUBG and Apex Legends. Delivering the most resilient and scalable server solutions, and backed by an elite team of infrastructure experts, Multiplay offers customized game server hosting solutions that work on any engine and any platform.


Adaptive Performance

We continue to invest in mobile game performance. Adaptive Performance will provide developers with unprecedented data about thermal trends, including information on whether your game is CPU- or GPU-bound, at runtime. The data this feature provides will give you the flexibility to adapt your game dynamically and proactively to best utilize the hardware. The result is a decrease in thermal buildup and a more predictable frame rate, enabling a much more enjoyable player experience.

Our partnership with Samsung gives you the ability to use Adaptive Performance on Galaxy devices, starting with the latest flagship devices like the Samsung Galaxy S10 and the Galaxy Fold, with support expanding to many more devices in the future.

In a few weeks, we’ll be releasing Adaptive Performance through the package manager as a preview for 2019.1, together with the Megacity demo mobile sample code, to get you started.

Platform to monetize

Unity Distribution Portal

We’ve also announced the open beta launch of the Unity Distribution Portal (UDP) today. UDP allows developers to scale their business and expand their player base globally by distributing mobile games on app stores in emerging markets.

Distributing a mobile game to different app stores comes with a lot of hurdles, complexities, and challenges. UDP automatically creates builds for each of the participating stores and enables distribution regardless of individual store requirements. UDP is a one-stop shop for game developers to configure and deploy all publishing requirements for global distribution channels, helping you reach millions of players around the world.

UDP currently enables you to publish on Catappult (Aptoide) and Moo Store, with connections to ONE Store and Jio GamesStore coming soon, and more to follow.

Delivering for developers today

Some of the industry’s most storied franchises joined us at GDC to talk about how they are using Unity to create, operate, and monetize their newest games. Beyond sharing the technical process of creating these games, they debuted some never-before-seen game footage and made some exciting announcements!

Disney’s Sorcerer Arena

Like Disney, Glu Mobile is all about storytelling, and with some of Unity’s latest features, they are able to iterate quickly and make changes to the story rather than spending hours reworking it. With the Lightweight Render Pipeline (LWRP) and Shader Graph, Glu Mobile captured the iconic hand-drawn look of Disney while expanding it across a roster of over 100 characters – all while serving 60 frames per second on mobile devices two or more years old. During this segment, Franz Mendonsa, Senior VFX Artist at Glu Mobile, demonstrated how Shader Graph can be used to quickly create a range of sophisticated visual effects without any coding.

Call of Duty: Mobile

One of gaming’s most popular franchises is going all in on mobile. Chris Plummer, VP of Mobile at Activision, revealed Call of Duty: Mobile in action for the first time. Call of Duty’s signature multiplayer experience requires great visuals as well as great performance: high frame rate and low latency. Achieving this was made possible by the incredible talent at Tencent and, of course, Unity. After some never-before-seen footage, Shadow Guo, Director of Technology at Tencent, cracked open the editor to show some of the ways they captured the Call of Duty look and feel for mobile, all thanks to the power and flexibility of Unity. We can’t wait to try it. If you can’t either, you can pre-register for the game.

System Shock 3

Warren Spector (Studio Director) and Elizabeth Legros (Graphics Programmer) from OtherSide Entertainment gave us an exclusive look at their highly anticipated upcoming game, System Shock 3. Warren shared insights into the world players will experience in the game – a world where players are alone in a strange and dangerous place, underpowered and unprepared for the challenges posed by a renegade AI called SHODAN. “Horror” and “dread” were used to describe the overall vibe and visual style necessary to deliver the mood they were after. He described it as “noir-like,” with deep shadows punctuated by areas of intense, focused light. Unity’s latest graphics innovations bring this world to life. HDRP and its Unified Lighting Model, Volumetric Fog, Deferred Decals, and Shader Graph have helped the team quickly and reliably create dark, highly atmospheric environments, and add details without hurting the framerate too much. Overall, the new features made game graphics better and easier to implement.

Oddworld: Soulstorm

Industry legend Lorne Lanning took the stage to introduce Oddworld: Soulstorm, an all-new sequel to New ’n’ Tasty. It’s a hero’s journey that tells an epic story about a species on the brink of extinction and a volatile society pushed to its limits. The team is making a big visual and cinematic leap, and Bennie Terry, Executive Producer for the game, gave the audience insights into their pipeline, from conception to completion. He highlighted how they’re testing the new features of HDRP to bring their characters to life before talking about using Timeline and Post Processing v2 for creating cinematics in real time.
