Unite 2016 Keynote Wrap Up: News on Graphics, Platforms, VR and More

Unite ’16 Los Angeles has officially begun! With three jam-packed days of learning, idea sharing, networking, showcases, and more ahead of us, you won’t want to miss any of the action. Be sure to follow @unity3d and official hashtag #Unite16 to stay updated throughout the week.

The opening keynote finished just moments ago, and we have heaps of exciting news to share with you. Here’s a brief recap of the morning for those of you who missed it.

Quality and workflows

On stage, our CEO John Riccitiello explained how we’re improving the stability and quality of our releases. We’ve decided not to merge any features to trunk until we have enough real-life usage feedback from Unity developers such as yourselves. To that end, we’ve been releasing experimental builds that you can try and give feedback on. It’s thanks to our excellent community and your willingness to test and send valuable feedback that we’re able to continually improve.

We’ve also changed the way we work. QA is now embedded in our engineering teams, enabling us to catch bugs earlier. And we’re working directly with game development teams to better understand your needs and issues.

Our CTO and co-founder Joachim Ante came on stage to talk about some of the long-term work we’re doing to improve the way Unity scales across the board, from performance to optimizing workflows. One project we’re making good progress with is rewriting the asset import pipeline. Eventually, you’ll no longer see a progress bar when you import a project folder – it’ll open instantly. Unity will ensure the asset or scene you’re working on is imported, and those you aren’t currently using will quietly import in the background.

Another initiative is a hot-reload to device feature, which will speed up your iteration time when testing on mobile devices, which is especially crucial on mobile VR platforms. And we even have a team that’s dedicated to optimizing load times in general.

Performance

Joachim also talked about our work on multithreading parts of the engine, especially the renderer. We aim to get to 100% multicore utilization. In order to improve the performance of all simulation code, we’ve also rewritten our transform component so it can be safely accessed from jobs. With the rewrite, we’ll not only be able to jobify the code but also optimize the transform component’s performance when used only on the main thread.

Joachim then jumped into a demo showing off the impressive new C# job system. You won’t want to miss all the details, so be sure to tune into the keynote video to check it out.
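The exact API shown on stage was still experimental, but to give a feel for the programming model, here is a rough sketch of what jobified transform code might look like. The type names (`IJobParallelForTransform`, `TransformAccess`, `TransformAccessArray`) are taken from the C# Job System as it later shipped, not from the keynote itself, and may differ from what was demoed:

```csharp
using UnityEngine;
using UnityEngine.Jobs; // transform-aware job interfaces

// A job that moves every tracked transform upward, spread across worker threads.
struct MoveUpJob : IJobParallelForTransform
{
    public float deltaTime;

    public void Execute(int index, TransformAccess transform)
    {
        // TransformAccess is the job-safe view of the rewritten transform component.
        transform.position += Vector3.up * deltaTime;
    }
}

public class MoveUpSystem : MonoBehaviour
{
    public Transform[] targets;
    TransformAccessArray accessArray;

    void OnEnable()  { accessArray = new TransformAccessArray(targets); }
    void OnDisable() { accessArray.Dispose(); }

    void Update()
    {
        var job = new MoveUpJob { deltaTime = Time.deltaTime };
        // Schedule the job across the worker threads, then wait for it
        // to complete before the rest of the frame reads the transforms.
        job.Schedule(accessArray).Complete();
    }
}
```

The key design point is that the job only touches data handed to it through job-safe containers, which is what lets the scheduler run it safely off the main thread.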

Graphics and Artist Tools

By now many of you have seen our short film Adam. Now, you can download the standalone executable of the demo and play around with it yourself in real-time. We’ve also released the character, environment and GFX packs, all available now on the Unity Asset Store. For more info and all the download links, check out the blog from earlier today.

As part of our commitment to and investment in graphics, we’ve also recruited a talented team of technologists in Paris who specialize in solving some of the toughest problems for artists, from advanced particles to lighting to photogrammetry. We’re hoping to build a set of new tools that artists will love using.

Apple Strategic Partnership Manager and R&D Developer Filip Iliescu spoke about how we’re deepening our support of the Metal API for iOS and macOS. In Unity 5.5 we’ve updated our shader cross-compiler, enabling us to use advanced features like instancing. Full Metal compute shader support, tessellation, native shaders and the Unity Editor running on Metal will be coming later. Filip then ran a demo on an iPad Pro running several post-processing effects that showed the impact instancing has on performance.

Our enhanced Vulkan support is also coming soon, which is very exciting news for developers who want to take full advantage of the latest generation of devices: out-of-the-box testing has shown performance improvements of 30 to 60% across all platforms.

Want to get early access to these Metal features, or interested in trying Unity’s Vulkan renderer yourself? Download our experimental preview build and give it a test: unity3d.com/experimental

Matt Dean from our Graphics team came on stage to demo Unity’s post-processing stack, showing off effects like antialiasing, depth of field, bloom, color grading, and more. It combines a complete set of image effects into a single post-process pipeline, which allows many effects to be combined into a single pass, and it features an asset-based configuration system for easy preset management. It’s currently in beta and fully open source, so check it out on GitHub.
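Because the stack is configured through profile assets, switching presets at runtime is just a matter of swapping the asset assigned to the camera. A minimal sketch, assuming the beta stack’s `PostProcessingBehaviour` component and `PostProcessingProfile` asset type from the GitHub repository (names may change while the stack is in beta):

```csharp
using UnityEngine;
using UnityEngine.PostProcessing; // namespace used by the beta stack

// Attach alongside PostProcessingBehaviour on the camera to swap
// between two preset profiles at runtime.
public class ProfileSwitcher : MonoBehaviour
{
    // Profile assets authored in the editor.
    public PostProcessingProfile dayProfile;
    public PostProcessingProfile nightProfile;

    PostProcessingBehaviour behaviour;

    void Start()
    {
        behaviour = GetComponent<PostProcessingBehaviour>();
        behaviour.profile = dayProfile;
    }

    public void SetNight(bool night)
    {
        behaviour.profile = night ? nightProfile : dayProfile;
    }
}
```

Since all effects live in one profile asset, a single assignment retargets the entire pipeline instead of toggling individual image-effect components one by one.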

Technical Director Lucas Meijer showed off our new VFX Image Sequencer tool, which provides VFX artists with an in-engine toolchain, enabling them to use techniques like flipbook image authoring. This should make your VFX iteration times much faster. To check out our experimental build of Image Sequencer, head to the dedicated forum.

Our Head of Cinematics Adam Myhill showed off a demo of our new visual tool, Timeline, which allows you to create dynamic shots using procedural cameras, tracking, and composition blending. It has the standard features you’d expect from a sequencer, like support for animation and audio, allowing artists to focus on storytelling, not coding. A preview for this will be available soon.

Otoy Founder and CEO Jules Urbach also came on stage to show off some of the work they’ve been doing in movies and television, like the opening title sequence for HBO’s Westworld – all done with their impressive Octane renderer, the world’s first and fastest GPU-accelerated, unbiased, physically correct renderer. We’re excited to announce that we are partnering with Otoy to ship Unity with full Octane integration next year. You’ll be able to import any cinematic asset, drop it into your timeline and have it automatically rendered in a physically correct way. The integration with Otoy’s Octane renderer also means you’ll be able to render and publish ORBX VR media from Unity.

VR

Our new head of AR/VR Tony Parisi came on stage to touch on how we’re planning to invest in the future of AR/VR.

And we were thrilled that Nathan Martz, lead product manager for the Daydream developer platform, was able to come on stage to show off the latest from Google Daydream, from Daydream Home, to accessing Google Play right in VR, to confirming the official street date for Daydream View — available in just a few days on Nov 10! We’re so excited to have mainline support for Google Daydream in Unity 5.6. Until then, check out our preview!

We also heard from our VR Principal Engineer Dioselin Gonzalez who announced our completely new video player. We’ve rebuilt this from scratch with performance in mind, so we can support 360 VR videos and smooth 4K playback. The video player will be coming to Unity 5.6, with a preview coming soon.

From our Labs team, Principal Designer Timoni West and Principal Engineer Amir Ebrahimi took to the stage for a live demo of our VR authoring tool, EditorVR. (Thanks to Campo Santo for letting us use assets from their game Firewatch!) EditorVR is built as a Unity package, so you can import it into your project using the open Unity Editor API, just like any other tool you’d find on the Asset Store. We’re pleased to say that EditorVR is fully extensible and open source, and a preview will be coming in Dec 2016. For more information on EditorVR, keep an eye on http://labs.unity.com.

Platforms & Services

We’re aspiring to make developers’ lives easier with services like Unity Collaborate, which enables entire teams to create together. Since its soft launch in March, over 12,000 creators have signed up for the closed beta. Now, we’re pleased to announce that we’re opening the beta to everyone. To sign up, please head to unity3d.com/collaborate.

At this point, our Chief People Officer, Elizabeth Brown, took the stage to talk about our new marketplace for work. If you’re looking for a way to showcase your skills and connect with interesting projects and potential employers, Unity Connect aims to do just that. And what’s more, Unity Connect goes into open beta today! To learn more and sign up, head to connect.unity.com.

John Riccitiello then returned to the stage to talk about our platform strategy. Did you know that this year alone, we added support for Amazon FireOS, Microsoft HoloLens, Google Cardboard, Google Daydream, and SteamVR? That brings us to 28 platforms and technologies. Not stopping there, we announced today our exciting partnerships with Xiaomi, Nintendo (for the Nintendo Switch), and Vuforia.

Director of Global Games Partnerships at Facebook Leo Olebe then came on stage to officially unveil the name of Facebook’s new dedicated PC gaming platform — Facebook Gameroom. The export-to-Facebook feature in the Unity Editor is now entering open beta, so be sure to check out the public preview.

Made with Unity

We also brought several Unity developers up on stage to talk about how they’re using Unity to create their upcoming titles. We’re so humbled that the games and experiences you have all been making span every genre and platform.

A big thanks to Visionary Realms’ Brad McQuaid and Corey LeFever, who came on stage to demo Pantheon: Rise of the Fallen, an open-world, high-fantasy, group-focused MMORPG that shows just what can be done with Unity.

We’ve also been working with some developers to test out our new progressive lightmapper. We’re thrilled that Lucky’s Tale developer Playful’s Juan Martinez could come on stage to show off how they’ve been using its new features, with a sneak peek at Playful’s next project. We’ll be releasing a preview of the progressive lightmapper soon, so keep an eye on our experimental feature forum for updates.

We were excited to welcome Gamevil President Kyu Lee to the stage to show off Royal Blood and talk about their development process, from the various graphics techniques they used for more efficiency, to how they’ve used custom built editor extensions to great success. Royal Blood is scheduled for global release in the first half of 2017.

Ben Cousins took to the stage to debut The Outsiders’ upcoming Viking RPG, Project Wight. The impressive demo’s graphics were created in only three months, using off-the-shelf Unity 5.4 with assets from the Asset Store.

And finally, our VP of Marketing Katrina Strafford closed out the keynote by thanking all our wonderful Unity developers for making such a diverse range of games and experiences. With great excitement, we are thrilled to say that Super Mario Run is made with Unity.

Interested in the “talented team of technologists in Paris” solving tough problems for artists, from advanced particles onward – has anything been disclosed about the new particle features?

Well, I don’t know yet how these jobs are supposed to be used, but it’s great that we’ll be able to use them… I’m actually wondering about one specific thing: will I be able to call the SkinnedMeshRenderer.Bake method from a C# job on another thread? This would be used in a multi-threaded mesh-raycast system. I want to move the whole procedure off the main thread, from mesh baking to intersection testing and returning results. The system is multi-threaded already, except for the baking part…
Thanks for your response!

Would this job access potentially include read-only physics spatial queries as well? For what I’m working on, all I need is:
* The ability to read/write a Transform on a job thread.
* The ability to do read-only Physics.Raycast (and other) queries on a job thread.
* The ability to supply each job with its own copy of a buffer object.
* The ability to join the jobs and apply calculated changes back on the main thread.

My use case is mostly custom platformer physics and AI for VR, but the ability to scale that to a larger number of AI character agents would be handy.