Session 609, WWDC 2016

SceneKit is a fully featured high-level graphics framework enabling your apps and games to create 3D animated scenes and effects. Witness the biggest leap forward in SceneKit yet with the introduction of its new Physically-Based Renderer (PBR). Dive into new APIs for accurate materials, physically-based lights, HDR effects, and enhancements in Model I/O. Walk through an example game using PBR and see how to integrate its workflow into your development.

[ Music ]

[ Applause ]

Good morning.

Good morning and welcome to Advances in SceneKit Rendering.

My name is Amaury and I'm delighted to be here to present how we brought SceneKit to the next level with state-of-the-art graphics.

So we have a lot to cover today.

So I will start with a quick intro on SceneKit before we dive into these new rendering advances.

Next Jean-Baptiste and Sebastien will join me on stage to present a cool demo, explain how we built it, and present all the new features such as great new camera effects.

And finally, Nick will join us to present updates to Model I/O.

So in a nutshell.

As you know, SceneKit is a high level API under the GameKit umbrella and it focuses on 3D graphics.

It plays nicely with [inaudible] and it's built on top of Metal and OpenGL.

And you can use SceneKit in any situation where you need to display 3D graphics on screen.

And when you start to think about it, 3D graphics are used in a great many places.

For instance, we just introduced Swift Playgrounds, where SceneKit makes scenes more visual and helps kids in their first steps in learning how to program.

In Xcode we use SceneKit to create an innovative and extremely useful interface that helps you debug your app's view hierarchy.

In iBooks and iBooks Author people can create rich books with enhanced illustrations which are interactive.

And of course SceneKit can be used for games.

Last, but not least: you guys found all sorts of use cases for SceneKit and 3D graphics.

You published thousands of SceneKit-based applications to the Store.

So thank you.

[ Applause ]

Now, as you know, SceneKit is tightly integrated with the system.

It works seamlessly with all the Apple technologies and it makes the most of macOS and iOS, where it's been available for a few years now.

And since we last talked at WWDC we also introduced SceneKit on tvOS.

All we had to do for the [inaudible] sample code was to add support for game controllers and it was ready to be played on the big screen.

So it's absolutely fantastic to see how the same game and code can run on macOS, iOS, and tvOS.

And this year we are closing the loop with SceneKit coming to watchOS.

[ Applause ]

Thank you.

[ Applause ]

So SceneKit on watchOS is a great opportunity to start to think about new interactions and ways to present content on your wrist.

Now, as you might imagine, there's a lot to say about the [inaudible] for the Apple Watch.

And we won't have time to cover this today.

But we have a dedicated session, "Game Technologies for Apple Watch," on Friday, where you will learn more about what's available and how to play with SceneKit, SpriteKit, and other technologies.

And if you are new to SceneKit and want to learn more, you can always go online to check previous WWDC sessions where we explained basic, but also really advanced, features of SceneKit.

Okay. So now let's dive into these new rendering capabilities.

Well, this year SceneKit puts physically based rendering in the hands of everyone.

It means that developers, you guys, get to have stunning graphics for your apps and games.

Now the thing is that, for shading to be correct, all the lighting information and equations must be expressed in a linear space.

So in a non-linear pipeline, what you end up with is color that is read from gamma-encoded textures, processed using linear equations, and the resulting image is written to a texture or framebuffer.

And, as you might imagine, that's not correct.

For the final image to be correct, all the operations need to happen in a linear space.

So, as an illustration, here is a scene with lighting occurring in gamma space.

And here is the same scene with shading in linear space.

And if you compare them, you will see how light fall-offs and edges appear harsher in linear rendering.

Now linear rendering is essential for physically based rendering, but it actually applies to any of the SceneKit lighting models, because it just makes the rendering right.

Now, as you know, color is a big thing this year at WWDC, so in addition to gamma correction, SceneKit now performs color management automatically.

So what does that mean?

It means that the color profile that is assigned to a texture will now be honored.

In any operation that happens between the moment the image is loaded from disk and the moment it's handed to the system to be displayed on screen, we will respect the integrity of the color data.

So a SceneKit-based application will provide the same color accuracy as a professional imaging application.

Now, as you know, some images are just raw data that happens to be stored as colors.

And SceneKit knows that, and so it won't color-match such images.

Now to help you with that there's a great new feature in Xcode 8 asset catalogs: texture sets.

In a texture set one can specify whether an image holds color data or raw data, and then Xcode can automatically convert these images to CPU- and GPU-efficient texture formats.

But to learn more about that we have a session right after lunch, "Working with Wide Color," where the Metal team gets into all the details.

Now, in addition to textures,color management also applies to color objects.

So color components are no longer assumed to be sRGB.

And so if you are creating colors programmatically, it's now really important that you use the right initializer.

So here is an illustration with two color objects, one Display P3 and the other sRGB, that were created using the same components.
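As a minimal sketch of what that looks like in code (assuming a UIKit app; AppKit has matching `NSColor` initializers):

```swift
import UIKit

// Same components, two different color spaces. On a wide gamut
// display these two colors render differently, so pick the
// initializer that matches your intended color space.
let p3Red   = UIColor(displayP3Red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
let sRGBRed = UIColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
```

The plain `init(red:green:blue:alpha:)` creates an sRGB color, so the two objects above are not interchangeable.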

When working with color pickers pay attentionto the color space that you choose.

In the menu we let you choose from several color spaces, including device-independent ones, such as sRGB and Display P3.

And there's also a handy option to display values as floats rather than integers so that they can be easily copy-pasted into code.

And speaking of which, as you know, shader modifiers are a great feature in SceneKit that allows you to customize the rendering.

Now, as I said, this year shading happens in linear space.

So you must be sure to convert your colors to the linear extended sRGB color space before these components are used in your shader modifiers.

Now a few notes about backward compatibility now.

Linear rendering and color management are automatically enabled whenever you link your app against the new SDK.

There's no performance cost in enabling them, but they will dramatically change the look of existing scenes.

So, for instance, here's last year's demowhich did not use linear rendering.

But if you want to deploy your application to older versions of the system, or want to opt out of linear rendering and color management for some reason, we have a way to do that.

You can opt out at the app level by specifying a key in your app's Info.plist file.
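The transcript does not name the keys; to the best of our understanding, the relevant Info.plist entries are `SCNDisableLinearSpaceRendering` (and, further down, `SCNDisableWideGamut` for the wide gamut opt-out). A sketch, assuming those key names:

```xml
<key>SCNDisableLinearSpaceRendering</key>
<true/>
<key>SCNDisableWideGamut</key>
<true/>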

And then there is wide gamut content.

So, as you know, wide gamut color spaces such as extended sRGB and Display P3 exist, and they are really important when working with modern hardware.

The new iPad Pro and iMac with Retina Display have wide gamut displays that SceneKit can take advantage of automatically.

All you have to do is to bring your wide gamut content, so textures or colors, and SceneKit will handle that transparently.

Now wide gamut textures and framebuffers will require more memory to hold that data, and that will lead to an increased bandwidth usage.

So should you experience any performance issue, we offer a way to opt out, again at the app level.

Now let me mention the color gamut showcase sample code that we built in collaboration with the Cocoa and Cocoa Touch team.

It's a SceneKit-based application that will allow you to see the out-of-gamut color components, and it's also really useful because on a wide gamut display you will be able to see what this display brings, because you can simulate a non-wide-gamut display.

So to learn more about working with wide colors and how to convert color components between color spaces, again we have a great session this afternoon.

And so that was it for accurate rendering, which is a requirement of physically based rendering.

Now what is physically based rendering, and why?

Well, you might think that beautiful images come from scenes with detailed models.

And that is definitely true.

But shading is what makes objects tangible.

So all you see here on the screen used to be a soup of polygons.

And shading is the process of finding the right color for each detail on the screen.

So all the highlights, shadows, and the sense of depth, it comes from shading.

Shading is that magical operationthat can bring a scene to life.

Now how does it work?

Well, first there is light which is emitted from a source.

And when light hits an object, it interacts with matter according to properties of the surface, and then light is reflected to reach your eye, or a camera in this case.

Now this interaction between light and matter is something that is really complex.

And over the years many mathematical models weredeveloped to try to best describe it.

Physically based rendering is an approximation of light transport that relies on such mathematical models, which take into account the physical properties of light and matter.

But, as you know, SceneKit is a high-level API and we want to allow anyone to benefit from this new lighting model.

So we expose a super easy-to-use API so that you can achieve the physically based rendering that artists love.

So at the end of this session you will be able to get from this rendering, which is standard, to a physically based one.

Okay. So in SceneKit we approach physically based rendering from two angles.

First, physically based materials and then,physically based lights.

So first, physically based materials.

Here is a description of a point on the surface, with a normal indicating its orientation in space.

And when light hits that point, it's split into two terms, diffuse reflection and specular reflection.

Now diffuse reflection corresponds to light that goes underneath the surface and is scattered so many times and in so many directions that it appears uniform.

The color of the diffuse reflection is the albedo, or base color, of the object.

So when designing the interface for physically based materials in SceneKit, we wanted to use an albedo map.

Now specular reflection does not behave that way.

Specular reflection is just made of light that bounces off the surface, and so it keeps the color of the incoming ray.

So here is what we call a cube map.

It's a collection of six faces that represent the environment around a location in 3D space.

And when we place a perfectly specular object in such an environment, we will see that it acts like a mirror.

Now let's take a more realistic example with a plastic ball.

As you can see, it's not a perfect mirror.

At the center, the reflection is dim, but as you move closer to the edge it gets brighter.

And actually for grazing angles all light is reflected.

Now not all materials have the same reflectivity amount.

What you see on the top is a curve which represents the reflectivity values as a function of the incident angle, from zero to 90 degrees.

And you will see that this reflectivity value stays almost constant from zero to 45 degrees, and actually we can use this value to reconstruct the whole curve.

Now gold is an interesting example because it has different reflectivity values for the red, green, and blue components.

One last thing to note here is that metals, such as aluminum and gold, have high reflectivity values, whereas non-metals, or dielectrics, have low reflectivity values.

And this difference in reflectivity is actually essential for the final look of the object.

So in SceneKit we wanted to expose a metalness map, which indicates which parts of the object are metallic and which are not.

So in addition to having different reflectivity values, also note that metals absorb all light beneath the surface, whereas dielectrics scatter light.

So the visual effect of this is that metals have a strong specular reflection and no diffuse reflection, and dielectrics have a lot of diffusion, with specular reflection almost only seen at grazing angles.

So in SceneKit we reuse the diffuse material property to store the reflectivity values of metals and the albedo of dielectrics.

And for the reflectivity values of dielectrics we just use a global low constant.

So we just reuse the diffuse material property that you already know.

Now one last aspect I would like to talkabout is the surface roughness.

So, as you know, no surface is perfectly smooth.

As a microscopic level you always have tiny bumpsand cracks that will affect the specular reflection.

So the rougher the microsurface is, the blurrier the reflection will be, because reflected rays of light are no longer aligned.

So again in SceneKit we wanted to expose a roughness map, which indicates which parts of the surface are rough and which are smooth.

And this one is a grayscale image.

So we just saw how we define three fundamental properties, and each of them has a clear meaning and is derived from physical properties of the surface.

Now creating a physically based materialin SceneKit is straightforward.

You first create a material, then set its lighting model to the new physically based lighting model, and finally you provide your maps.
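Those three steps can be sketched like this (the texture file names are placeholders):

```swift
import SceneKit

// 1. Create a material.
let material = SCNMaterial()

// 2. Opt in to the new physically based lighting model.
material.lightingModel = .physicallyBased

// 3. Provide the three fundamental maps.
material.diffuse.contents   = "cart_albedo.png"     // albedo / base color
material.metalness.contents = "cart_metalness.png"  // grayscale
material.roughness.contents = "cart_roughness.png"  // grayscale
```

The `metalness` and `roughness` material properties are new this year and accept the same kinds of contents (images, colors, numbers) as the existing properties.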

So let's take an example.

We start with a mine cart and only a diffuse map.

We will then add a roughness map.

So, for instance, take a look at coal.

Coal is rough, so there is almost no specular reflection.

And finally we will add a metalness map.

So, for instance, take a look at rails and wheels.

Let's take another example.

We have a fire truck.

Again, we start with a diffuse map.

Now we will add a metalness map.

And finally a roughness map.

So for instance, take a look at tires.

Now one thing I would like to mention.

For the metalness, roughness, and ambient occlusion maps, please use grayscale images.

Having different channels for the red, green, and blue would just be a waste of memory.

And even more of a waste if you add an alpha channel in.

Now, furthermore, if you want to use the same value over the whole surface, you can use a color object, or even better, these material properties now accept plain numbers.

So we just saw how we can create a really simple and high-level API to create a wide variety of materials.

Here is the same object, and on one axis we change the metalness value and on the other axis we change the roughness value.

Now remember how we said that we would approach physically based rendering from two angles.

Let's now have a look at physically based lights.

Well, in SceneKit lights can be split into three categories.

I will start with image based lighting, or IBL,then cover light probes, and finally point lights.

So image based lighting.

As I said, you can use a cube map to describe the environmentaround a location in 3D space.

So when shading a point on the surface, we can consider the hemisphere above that point according to its normal, and derive the lighting information from the colors that are stored in this cube map.

So for instance, here is an object which is lit only using image based lighting.

There is no light in that scene.

And you can see how changing the cube map dramatically affects the look of the object.

Using image based lighting, all the objects in your scene will have a coherent look and will work nicely together.

Using image based lighting in SceneKit is really straightforward.

We added a lighting environment property of the scene.

And you can simply set a cube map to its contents.

And what's great is that it works perfectly with the background property.

So for instance, if you take an object and set the same image to the background and lighting environment properties, you will be able to display an object in its context.
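In code, that is just two properties on the scene (the cube map file name is a placeholder):

```swift
import SceneKit

let scene = SCNScene()

// Use the same cube map for lighting and as the backdrop, so the
// object appears in the environment that is lighting it.
scene.lightingEnvironment.contents = "env_cube.png"
scene.background.contents          = "env_cube.png"
```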

Now a cube map can only capture the distant environment, and it's static.

So when shading a point on the surface, it's possible that this environment is not visible, because you're in a cave or there's another object between them.

And that cannot be taken into account with image based lighting.

So it does not work very well for occluded objects.

Luckily we have a solution for that: Light probes.

Light probes are local lights that are placed throughout the scene, and they capture the local diffuse contribution.

So when shading a point on a surface, we can find the four closest light probes and interpolate lighting from these probes.

So as I said, light probes are local lights, and so they can account for occlusion.

And they are implemented in such a way that they are really lightweight and efficient.

You can have dozens of light probes in the scene.

And we actually recommend that.

Because the more probes you have, the finer the sampling will be, and the better local lighting information you will have.

So creating a light probe is easy.

You create the light and then change its type.

That can be done either programmatically or within the Xcode SceneKit scene editor.
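Programmatically, a probe is just a light with the probe type, attached to a node at the position you want sampled:

```swift
import SceneKit

// A light probe is an SCNLight with the .probe type; its diffuse
// contribution is baked later, in the Xcode scene editor or via API.
let probeLight = SCNLight()
probeLight.type = .probe

let probeNode = SCNNode()
probeNode.light = probeLight
probeNode.position = SCNVector3(0, 1, 0)  // where to sample lighting
```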

Now just like cube maps,light probes capture static lighting information.

And this information can be baked into the probe easily using the Xcode scene editor or the API.

So we just saw how, using IBL or light probes, you can have indirect lighting in the scene.

But of course if you want direct lighting, you still have access to all the other kinds of lights.

So omnidirectional, directional, and spot lights work with physically based rendering.

And actually we have new properties so that you can better configure them.

For instance, we added the light's intensity.

A light's intensity is expressed in lumens, with a default of 1000, which is in the order of magnitude of a light bulb.

We also added a light's temperature, which is expressed in kelvins and from which we derive a color.
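A quick sketch of those two new photometric properties (the values here are arbitrary examples):

```swift
import SceneKit

let sun = SCNLight()
sun.type = .directional

// Intensity in lumens; the default is 1000 (roughly a light bulb).
sun.intensity = 2000

// Temperature in kelvins; the default is 6500 (pure white).
// Lower values are warmer (more orange), higher values cooler (more blue).
sun.temperature = 5000
```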

And one great new feature, we added a new kind of lights, IES lights.

So IES lights, or photometric lights, can account for any attenuation shape.

So while a spot light or omnidirectional light has a really symmetrical attenuation curve, IES lights can better approximate the behavior of a real-world light.

And, for instance, it can account for [inaudible].

It can account for shadows.

For example, due to the frame of the light.

Now creating photometric lights in SceneKit is really easy.

Again, you create the light.

Then you change its type.

And finally you provide the URL to a photometric profile, which can, for instance, be downloaded from the website of a manufacturer.
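Those three steps, sketched in code (the profile path is a placeholder, not a real file):

```swift
import SceneKit

// 1. Create the light.
let light = SCNLight()

// 2. Change its type to a photometric (IES) light.
light.type = .IES

// 3. Point it at a photometric profile, typically an .ies file
//    downloaded from a light manufacturer's website.
light.iesProfileURL = URL(fileURLWithPath: "/tmp/downlight.ies")
```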

So as a quick recap, we just saw how simple it is to create a physically based material in SceneKit, how all these properties derive from real-world properties of the surface, so they are really easy to understand, and how we can work with lights in the context of physically based materials.

So with that, please welcome Jean-Baptiste and Sebastien for great demos.

[ Applause ]

So thank you, Amaury, for this great presentation of the new rendering capabilities of SceneKit.

So let's see them in action in the Xcode scene editor.

So, as you will see, almost everything that has been presented by Amaury is actually available in the Xcode scene editor.

You will be able to tweak properties and see the result in real time.

So I have a very simple scene opened here, with just one light on this truck.

I go to the Materials inspector.

As you can see, we have just two materials on this object.

One for the body and one for the accessories, et cetera.

So I'm going to select those two.

We are continue using the Blinn lighting model so we'll switchto the physically based lighting model.

Now I've set the two materials as metallic.

And, as you can see, there is an issue because we don't see the reflection of the environment.

So we can go to the Scene inspector, and we have to set the lighting environment for the project.

So for that I will use a cube map, for example this cube map of a parking lot, as the lighting environment.

So, shortly, I'm focusing on the three main properties of the physically based lighting model.

So let's now move to the roughness value.

The roughness indicates how smooth the surface is.

So you will see that the rougher the surface is, the blurrier the reflection will be.

So if I move the value of the roughness closer to one, I have a blurry reflection.

And then almost no reflection at all when we reach one.

So if I move back to zero, I have a very smooth surface and, as you can see, the whole environment is reflected in the metal.

So now I'm using just one constant value for the roughness.

And I would like to be able to specify a value for each part of the object.

For that I just have to use a roughness map.

So let's use a roughness map for the body.

And a roughness map for the accessories.

So we have the same kind of issue with the metalness.

So we want to be able to specify which partof the object is metallic or not.

So for that we use a metalness map.

So let's set the metalness map for the accessories.

A different map.

So, as you can see, the body parts of the object are nonmetallic, while the front radiator grill is completely metallic.

The final touch is to add the albedo.

And we will be done.

So that's it.

So we have a full physically based rendering of this fire truck.

I can now switch to the Scene inspector and change the cube map.

For example, this cube map of a lighting environment with trees.

I can set it in the background.

So that's it.

So, as you've seen, it's very simple to use the new SceneKit features in the scene editor.

And, you know, now to demonstrate this kind of rendering in action we've built a cool demo that I'm going to show you now while Sebastien is presenting it.

Hello.

[ Applause ]

Thank you.

[ Applause ]

So I'm delighted to present our new furry friend for this year.

Bub. Bub is a badger.

He rides in the mining cart.

And he tries to catch gems and boosters for speed.

So everything you see is rendered with the new SceneKit renderer.

All the materials are physically based.

All the lights, too.

We also used the usual features of SceneKit, such as actions, animations, and everything you're used to.

It's a Swift application that runs on macOS, iOS, and tvOS.

It's fully built with Swift, about 700 lines of code.

We placed light probes along the track to take into account the change of local light.

And pay attention to the light that changes when we go in the caves or in the tunnels.

We have also added new effects such as motion blur, which you can see when Bub catches a speed bonus, just like this.

We have a new HDR camera, which is why the light adapts when there is a bright light or when the environment changes.

We also use IBL for the lighting environment.

Again, we love the motion blur.

You can also see bloom when there are bright lights.

And all the materials, as you see, are completely PBR, so we get free reflections for the crystals, and for all the bonuses and the gems.

Once again, you will see the light change.

It's tone mapping doing the work.

Thank you.

[ Applause ]

So, let's go to the slides now.

I will tell you a bit more about this demo.

And the first thing that we're very glad to tell you this year is that, as usual, the demo is available as sample code.

Yes. Thank you.

[ Applause ]

You can download the code and all the assets from the developer website, and play with it, inspect the scene and code, see how we built it.

And it's 700 lines of Swift code.

We think it's pretty simple to understand, and we hope you really like what you see and learn a couple of things from the demo.

So this year, to decide what to do for the demo, we discussed it with our artists.

And we produced some drafts to take into account the design ideas we had.

And once we agreed on the design ideas and the workflow, the artists started to model the world.

And as it's an iterative process, we really needed tools to be able to ingest the models as they were built and to start programming right away, without waiting for the final assets.

So we have a custom tool written in SceneKit that uses the full power of SceneKit in a command line application.

The tools ingest the DAE files, convert the units to meters, and also place light probes automatically in the scene, because there are more than 200 light probes and we didn't want to place them by hand each time the scene changes.

We have used image based lighting.

So we have a cube map for the background image,another cube map for the lighting environment.

We used the lighting environment to add the reflections.

And it's, as you've seen, great for outdoor scenes.

We have also used light probes.

You can see these light probes as they were displayed in Xcode, and we've highlighted them.

You see that only from this point of view there are already many light probes, so you can imagine how many there are for the whole scene.

So the custom tools placed them in the environment and computed them.

You can also do it by hand in Xcode but, of course, the more light probes you have, the more tedious it gets.

It's essential for the insides, but it also adds a nice touch to the view outside, to reflect small changes in the scene.

We have added light maps for the insides, because they override the lighting environment, which is very important for the caves, as the light is very different there. So we have the probes and the light maps that change the light and the mood of this part of the scene.

Of course we use normal maps as usual to add details to the models.

We also use baked ambient occlusion maps for much better lighting and rendering.

We have one big point light to simulate the sun.

It's very high in the sky in the scene, and we use it to create dynamic shadows and to improve global lighting.

As I said, all the materials you see in the demo are 100% physically based materials.

So we get the nice water ponds reflecting the environment, as well as the crystals.

Talking about crystals, this is a very simple material that we built.

It has no texture map so it's very simple to create.

It's fully metallic and has no roughness at all.

And just a diffuse color.

So it's a very nice way to create a gem that reflects the environment almost for free.

On the other side of the spectrum, you can see this tower, which is one object with metallic parts and nonmetallic parts.

We used, of course, metalness and roughness texture maps to create that.

And, as you see, we still have the diffuse color and the normal map to add detail.

So basically the demo used all the new capabilities of SceneKit.

Physically based shading, all the SceneKit API for materials and lights.

We used Xcode integration and also new custom tools we built for the workflow.

And we think it's a great showcase for this year's capabilities and a great sample code for you to learn new things.

But we also had to change how the camera behaves, because now that we have great materials and light, we also needed a much better camera.

And now that we have lights that are realistic, we needed to have an HDR camera, or High Dynamic Range, because the usual camera used to have Low Dynamic Range, which is 8 bits per component.

Now we have floats per component, so we can have very dim lights, such as a candle or a light bulb, all the way to, for example, the sun, which is a very, very bright light.

So this creates a very high dynamic range that we need to remap to the dynamic range of the screen.

And for that we used tone mapping.

Tone mapping is the action of remapping the dynamic range of the rendering to that of a device with smaller capabilities.

So we need to enable the HDR camera.

It's not automatically set by default.

You can set that in the API or in Xcode.

And you can configure the tone mapping.

You can change the gray point, the white point,and the range you want to expose.

And you can also force the exposure offset.

So, for example, you can have this nice look of the scene, but you can create a low-key one with underexposed rendering, or overexpose it, just by changing the offset.

It's very simple.
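Enabling the HDR camera and configuring tone mapping, sketched in code (the tone mapping values are example settings, not recommendations):

```swift
import SceneKit

let camera = SCNCamera()

// HDR rendering and tone mapping are opt-in, not on by default.
camera.wantsHDR = true

// Tone mapping controls: gray point, white point, and the exposure
// range you want to expose.
camera.averageGray     = 0.18
camera.whitePoint      = 1.0
camera.minimumExposure = -2.0
camera.maximumExposure = 2.0

// Or force an offset for a low-key (underexposed) or
// high-key (overexposed) look.
camera.exposureOffset = -1.0
```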

We have also added very nice effects thanks to the new HDR camera.

The first one is bloom.

Bloom is a way to simulate being blinded by very bright lights from the scene and reflections.

And it is created by bleeding the light onto the surrounding pixels.

So you can see in this example it's a very nice effect, and we can see how it looks in action with a reflection on the roof of the tower.

I think it's a very nice way to see how the light bleeds on the surrounding pixels.

And it adds a very nice touch to the rendering.

Next we have added motion blur as you've seen in the demo.

So it smoothens the camera movements.

And the thing is, when you just add motion blur to the whole scene, this is what you get, so sometimes we don't want to blur everything.

For example, we wanted the badger and the cart to be sharp and crisp.

So we have a new API that enables us to exclude some objects from the motion blur, and the result gives you a nice, crisp look for the subjects.

We have added a couple of aberrations from real-life camera lenses this year.

The first one is vignetting.

Vignetting is an aberration in real-life lenses that creates shading in the corners of images.

So you can change it from this image to this one.

And you can also change parameters to control the falloff going from the center of the image to the border of the image.

Another aberration we have simulated this year iscolor fringe.

Color fringe is a diffraction of light that happens in real lenses, in the glass of real lenses.

So it creates a magenta and cyan shadow of the lights in the rendering.

And we go from this look to this look.

This is a very exaggerated one.

You can go more subtle to get a nice look.

We have also added a very nice way to change the mood of your scene with color correction.

So you can change the saturation, go for an almost black-and-white look, or overblow the colors if you want to.

And you can also change the contrast of the scene.

So you can have the normal look, or a desaturated one, or an oversaturated image, and change the contrast.

And the last one is a really great effect.

It's color grading.

Color grading enables us to completely remap the colors of the scene to completely different colors.

So we use a strip or a square image to create the 3D color cube that we use as a lookup table to remap the original colors into new ones.

For example, in this case we remap the normal colors that you see on the upper side to a sepia tone.

So we go from this look to that one, in sepia.

And it's very simple to use and we think it's great.

It's a very nice look.

So that's all we have for HDR camera this year.

We think it's a very nice upgrade for cameras.

We can't wait to see what you do with that.

We got brand new effects that are cumulative, so you don't have to choose between, for example, bloom or motion blur.

You can use all at the same time.

Of course it has a cost, but you can really create a very nice image and very cool looking scenes.
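The camera effects described above are all properties on SCNCamera and can be combined freely. A sketch with example values (the grading lookup image name is a placeholder):

```swift
import SceneKit

let camera = SCNCamera()
camera.wantsHDR = true

// Bloom: bright areas bleed onto surrounding pixels.
camera.bloomIntensity = 1.0
camera.bloomThreshold = 0.8

// Motion blur for camera movement.
camera.motionBlurIntensity = 0.5

// Vignetting: shading toward the corners of the image.
camera.vignettingPower     = 1.0
camera.vignettingIntensity = 1.0

// Color fringe: simulated chromatic aberration.
camera.colorFringeStrength = 0.5

// Color correction.
camera.saturation = 1.2
camera.contrast   = 0.1

// Color grading: a lookup-table image remapping colors.
camera.colorGrading.contents = "sepia_lut.png"
```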

So now I hand over to Nick to tell you about Model I/O improvements for this year.

Thank you very much.

[ Applause ]

All right.

Hi, everybody.

So I'd just like to start out by covering a little bit of what's improved on input and output of models in SceneKit.

So this year SceneKit can import models in their native authored format, i.e., not necessarily just triangles as before, but in the topology of quadrilaterals or arbitrary polygons that the authors originally created their content in.

SceneKit, if necessary, will automatically triangulate for you in order to perform rendering.

And the thing is, if you want to use our new tessellation facilities, you're going to want to have accurate tessellation for good shapes.

So you'll need to opt in, using the preserve original topology flag.

That flag corresponds to the same flag in Model I/O; you bring in the assets and you specify this.

It will preserve holes, and creases, and all the things that are important for an accurate rendition of the object.
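As a sketch, the opt-in might look like this; "model.obj" is a placeholder path, and the option and flag names are assumed to match the preserve-original-topology API described here:

```swift
import SceneKit
import ModelIO

let url = URL(fileURLWithPath: "model.obj")  // placeholder path

// SceneKit side: keep quads, polygons, and creases as authored.
let source = SCNSceneSource(url: url, options: [.preserveOriginalTopology: true])
let scene = source?.scene(options: nil)

// Model I/O side: the corresponding preserveTopology flag.
let asset = MDLAsset(url: url,
                     vertexDescriptor: nil,
                     bufferAllocator: nil,
                     preserveTopology: true,
                     error: nil)
```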

Now this year we have upgraded our subdivision algorithms to the new OpenSubdiv 3 system from Pixar.

You can see in this example here that previously we would have imported as triangles, and when you do the tessellation, that box which we want to smoothly subdivide comes out a little bit lumpy.

Now if you bring it in preserving its topology, you can see that the quads go to a uniformly round surface and it looks very nice.

So this kind of facility is great for having lightweight objects that can scale in resolution to your scene, and so on and so forth.
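A small sketch of opting a geometry into render-time subdivision; subdivisionLevel is the SCNGeometry property that drives this, and the box here stands in for an imported quad mesh:

```swift
import SceneKit

// A stand-in geometry; in practice this would be a quad mesh
// imported with its original topology preserved.
let box = SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0)
box.subdivisionLevel = 2   // 0 disables subdivision; each level refines the surface further
let node = SCNNode(geometry: box)
```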

Now the other aspect of input and output that I want to emphasize is that last year we introduced physically based materials and things to Model I/O.

They bridge naturally onto all of the SceneKit stuff.

So if you have a high dynamic range camera specified in Model I/O, it will come across without losing any attributes.

So on to Model I/O.

Yeah. Quick refresher.

As it says on the [inaudible], it's for input and output of models into our frameworks and systems.

You need this, obviously, to bring your data from the apps where you've created things, to translate objects between frameworks, such as SceneKit and MetalKit, and so on.

And we provide support for a number of standard file formats.

Now file formats are the method by which things come from your art program into your tools.

And historically the formats that we had have been quite narrowly specialized.

For example, they might just bring in a model.

Or they just might bring in bulk data.

Now the really exciting thing that I'm bringing to you this year, we're bringing to you, is that in conjunction with our friends at Pixar we're introducing support for Universal Scene Description.

Now Universal Scene Description is a new open standard.

And the thing that's really interesting and exciting about it is that it's not only a file format that can be either easy to read in ASCII or efficient for loading in binary, but it also includes a scene composition engine.

And it introduces, once again unique to this format as an open format, file layering to enable concurrent workflows.

Now concurrent workflows is kind of an awesome thing.

Here is a representation that you might get in Universal Scene Description for a typical scene in a film.

We have a shot layer; the shot is layered from components: background, characters.

The characters themselves might be made out of many components.

Now you can see there are layers in that image of a shot layer there.

That's because not only can you just create the scene with all of these things composed, but you can make variations.

And so the scene description will know that this is, like, take three; maybe the characters come in a little bit faster or a little bit slower.

And you can have all of those variations embodied in one file, ready for review.

Now another really unique aspect of Universal Scene Description is, as far as I know, it's the only open source file format that allows the specification of classes and variations of objects.

Now, you can imagine that you might have some sort of a situation where you have lots of monsters, and they all want to go to university and stuff, and there's, like, books.

Now in a traditional workflow you're probably going to find yourself creating your books in your program, slaving out millions and millions of different files for every little book, and then placing them on your bookcase, and getting it out for rendering like that.

Now that is tedious.

In games you have things like teams of characters; maybe they all differ in, like, hairstyle and shirt.

And you might have to bake those all out.

Now Universal Scene Description allows you to specify classes of objects in a single file.

So the class represented here obviously is a book.

So the file can represent many different geometrical interpretations of books.

Like, you've obviously got a wide one, and a tall one, and a thick one.

And when you instantiate your book into the bookcase you can tell Universal Scene Description, "I want this book and I want it to be this wide and that tall."

And it will provide the information that you need to instantiate that into your runtime, or your shot, or whatever.

The variations that you can have in a single file can vary along many axes.

In this case I'm changing some shading properties.

So previously I had all those books.

I can make them whatever color I want as well.

And the magic of that is, I place the book, and when I finally ask, for the purposes of rendering, "What color is the book on the shelf in this place?"

it will work out, according to all of the logic about scene composition that the file and the engine embody, the way that it should be represented.

Now beyond that you can also represent, in a single file, different capabilities.

So what I'm showing here is that on the very low end, like say for a wearable device, I might have a low-poly version.

The same file can have one that's suitable for use in the highest rendering capability that you've got.

If you import a Universal Scene Description file into Model I/O (I don't expect you to be able to read that), you're going to be able to get a hierarchy of familiar Model I/O objects with all the properties that were in the Universal Scene Description file exactly represented, so that you can use the tools that are provided in Model I/O, such as placing light probes and evaluating them towards optimal positions.

However, beyond that, let's say that you're working on a project and your art team just gave you a folder full of stuff.

You can just open that window in the Finder with all the stuff that you just got, and Finder will prepare thumbnails for you so you can see what's there.

And Quick Look works with it as well.

So you can select one of these things, whack the spacebar, and it'll pop up and you can tumble it.

Now, of course Quick Look shows you one thing at a time.

If you want to hold things up for comparison, or maybe your USD file has multiple cameras or something in it that you want to inspect individually, you can bring that up in Preview, and Universal Scene Description is working great there.

And if you're bringing a Universal Scene Description file into Xcode, it imports via Model I/O into SceneKit with an exact representation of what was in that file, so that you can inspect it in the hierarchy browser, you can look at the properties, you can move things around, you can add cameras.
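A hedged sketch of the same import in code, assuming the Model I/O bridge into SceneKit; "robot.usd" is a placeholder file name:

```swift
import SceneKit
import SceneKit.ModelIO   // the bridge that converts an MDLAsset into an SCNScene
import ModelIO

let asset = MDLAsset(url: URL(fileURLWithPath: "robot.usd"))  // placeholder path
let scene = SCNScene(mdlAsset: asset)

// The imported hierarchy is ordinary SceneKit content: walk it,
// inspect properties, move things around, add cameras.
scene.rootNode.enumerateChildNodes { node, _ in
    print(node.name ?? "<unnamed node>")
}
```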

You can edit scenes and send them back out to USD.

And then you can send it back to your artists and say, "Hey, you know, I've got some edits for you.

Can you, do you know the rep?"

So finally, it's incorporated into SceneKit.

And so our friends at Pixar supplied us with Mr. Ray from "Finding Dory."

And this is just stock out-of-the-box SceneKit with the new physically based shading that you just heard all about.

And we're just playing the movie asset with three seconds of animation, and it looks really, really nice.

So plugins are the thing that you're going to need in order to incorporate Universal Scene Description into your workflows.

So that will enable the motion of your assets between people, your content creation programs, and the apps that you make.

Now the plugins, and the open source information, and all the availability, and schedules, et cetera, are available on the openusd.org website, which I encourage you to go visit to find out how you can use this in your pipelines and processes.

So that's Universal Scene Description.

[ Applause ]

So a quick summary.

SceneKit is available across our entire ecosystem, on every platform.

It's kind of an amazing thing.

We have physically based rendering for state-of-the-art looks and state-of-the-art representation, just a beautiful look.

And HDR cameras and effects give you control over how things are represented and how they look, really high quality.

And we've got support for Universal Scene Description, which we're really happy to get behind and think is going to make a big difference in workflows in coming days and months.

More information on this session, which was 609, is available on the site.

There are related sessions: Visual Debugging with Xcode, Wide Color, and Game Technologies for Apple Watch, which you can attend today and tomorrow.

And thank you very much.

[ Applause ]


ASCIIwwdc

Searchable full-text transcripts of WWDC sessions.

An NSHipster Project

Created by normalizing and indexing video transcript files provided for WWDC videos. Check out the app's source code on GitHub for additional implementation details, as well as information about the webservice APIs made available.