
Unity 2018.2 beta: What’s new

The recently launched Unity 2018.1 marks the start of a new cycle that will enhance Unity’s core technology. It is built upon the early stages of two innovations: the Scriptable Render Pipeline and the Entity Component System. Together, they make it easier for creators to deliver beautiful graphics while unlocking the performance of modern hardware to make richer experiences possible.

The beta for 2018.2 is now available and continues the 2018 cycle with a range of improvements across all of these new systems. We’re also adding several brand new features.

Texture Mipmap Streaming

The Texture Mipmap Streaming system gives you control over which mipmap levels are loaded into memory. Normally Unity loads every mipmap level stored on disk; with this system, you decide which levels are loaded.

Typically a system like this is used to reduce the total amount of memory required for textures by only loading the mipmaps needed to render the current camera position in a scene. It trades a small amount of CPU cost for potentially large GPU memory savings.
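As a hedged sketch of how this can be driven from script (member names are taken from the 2018.2-era scripting reference and should be double-checked against your Unity version):

```csharp
using UnityEngine;

// Sketch, not verified against a live project: enable mipmap streaming
// globally and override the mip level for one streamed texture.
public class MipStreamingExample : MonoBehaviour
{
    // This texture must have "Streaming Mip Maps" enabled in its importer.
    public Texture2D streamedTexture;

    void Start()
    {
        QualitySettings.streamingMipmapsActive = true;
        QualitySettings.streamingMipmapsMemoryBudget = 512; // target texture memory, in MB

        // Optionally override the automatic choice: request mip 2
        // (quarter resolution) instead of the full-resolution mip 0.
        streamedTexture.requestedMipmapLevel = 2;
    }
}
```

By default the system picks mip levels automatically from the camera position; the per-texture override above is only needed when you want direct control.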

Package Manager updates

We improved the Package Manager in a number of areas, including the UI, package status labels, the ability to dock the window, and easy access to both documentation and the list of changes.

You can see more of the latest developments and join the discussion on the Package Manager forum.

Particle System improvements

Unity now allows up to eight texture coordinates to be used on meshes and passed to shaders. Particle Systems will also now convert their colors into linear space, when appropriate, before uploading them to the GPU.

We’ve added two new modes to the Shape module for emitting from a Sprite or SpriteRenderer component. Starting with Unity 2018.2, the tail of the Trail Renderer will retract smoothly, instead of vertices vanishing abruptly. Finally, Unity 2018.2 introduces new scripting APIs for baking the geometry of a Particle System into a Mesh.
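The baking APIs can be sketched like this (a minimal, unverified example; check the exact signatures in the 2018.2 scripting reference):

```csharp
using UnityEngine;

// Sketch: copy a Particle System's current geometry into a Mesh
// each frame and display it on a MeshFilter.
[RequireComponent(typeof(ParticleSystemRenderer), typeof(MeshFilter))]
public class ParticleBakeExample : MonoBehaviour
{
    Mesh baked;

    void Start()
    {
        baked = new Mesh();
        GetComponent<MeshFilter>().mesh = baked;
    }

    void LateUpdate()
    {
        // Bake in local space (pass true to apply the transform instead).
        GetComponent<ParticleSystemRenderer>().BakeMesh(baked, false);
    }
}
```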

Camera Physical Properties

In the 2018.2 beta, we added a real-world physical camera model that can drive a standard Unity Camera object. This gives cinematographers a familiar interface and a ‘real camera’ experience.
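A hedged sketch of the new properties (the member names below are the 2018.2-era ones and are assumed rather than verified here):

```csharp
using UnityEngine;

// Sketch: configure a Camera using physical properties instead of
// setting the field of view directly.
public class PhysicalCameraSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.usePhysicalProperties = true;
        cam.sensorSize = new Vector2(36f, 24f); // full-frame 35 mm sensor, in millimetres
        cam.focalLength = 50f;                  // a classic 50 mm lens
        // Unity derives the vertical field of view from focal length and sensor size.
    }
}
```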

High DPI Monitor Support

If you own a 4K monitor, you can now enjoy high-DPI scaling support in the Unity Editor on both Linux and Windows.

Animation Jobs C# API

In Unity 2018.2, we are improving the Animation Playables by allowing our users to write their own C# playables that interact directly with the animation data. This allows integration of user-made IK solvers, procedural animation, or even custom mixers into the current animation system.
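A minimal sketch of such a job (in 2018.2 these types lived in an experimental namespace, which may differ in later versions; treat the names as assumptions to verify):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Animations; // moved to UnityEngine.Animations in later versions

// Sketch: a custom animation job that reads and writes a transform
// directly in the animation stream, on the animation thread.
public struct OffsetJob : IAnimationJob
{
    public TransformStreamHandle joint; // bound via Animator.BindStreamTransform
    public Vector3 offset;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        // Read the animated position, then write a modified value back.
        Vector3 pos = joint.GetPosition(stream);
        joint.SetPosition(stream, pos + offset);
    }
}
```

The job would then be wrapped in an `AnimationScriptPlayable` and connected into a PlayableGraph like any other animation playable.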

Other improvements

We also added support for managed code debugging on iOS and Android for IL2CPP, added support for serializing MinMaxCurve and MinMaxGradient in scripts, and added Vulkan support in the Unity Editor on Windows and Linux.

Check the release notes for the full list of new features, improvements and fixes. We’ll be happy to hear your feedback on the new features on our forum!

How to get early access to the new features

You can get access to all of the above right now simply by downloading our open beta. Joining the open beta not only gives you access to all the new features, it also helps us find bugs, ensuring the highest-quality software.

As a starting point, have a look at this guide to being an effective beta tester to get an overview. If you would like to receive occasional emails with beta news, updates, tips and tricks, please sign up below.

The Unity 2018.2.0b3 release notes mention that TLS 1.2 support has been added and works on all platforms. When we tested it, it still does not work and throws an error; the same code works in a .NET Framework 4.5 application.

Error: SecureChannelFailure (One or more errors occurred.) UnityEngine.Debug:Log(Object) CheckTLS:GetAuth() (at Assets/CheckTLS.cs:31) UnityEngine.EventSystems.EventSystem:Update()

Is there a dependency or a specific setting required? Please help us resolve this issue.

Hence, it is meaningless for Texture Mipmap Streaming to cache the texture data on the CPU side to save GPU memory, unless it reloads the texture data from disk every time, but that latency would be unacceptable.


There is no concept of “the portion of RAM that the GPU uses” in the UMA architecture.
The “CPU memory” and the “GPU memory” are the same thing.
The CPU can use the portion of RAM that the GPU uses, just as the GPU can use the portion of RAM that the CPU uses.
It makes no difference whether the texture data is on the CPU side or the GPU side, because they are the same thing.

I think the “Texture Mipmap Streaming” feature is meaningless for mobile GPUs, because the architecture of mainstream mobile GPUs is UMA.
In the UMA architecture there is no dedicated video memory: the CPU and GPU share system memory, so there is no difference between “CPU memory” and “GPU memory”.
For more information about UMA, search for “VK_MEMORY_HEAP_DEVICE_LOCAL_BIT” on Google or read the “Device Memory” chapter of the Vulkan specification.

Why? It can still save memory even in a UMA architecture (in the portion of RAM that the GPU uses). The texture data does not remain in CPU-side RAM once textures are uploaded to the GPU (i.e., to the GPU’s portion of RAM) in a UMA architecture.






SRP is not really ready to be used as marketing material for this engine. You are misleading your customers that they can get either beautiful visuals (HD) or universal mobile support (LW) just by downloading the latest version. Never mind that their entire projects will have to be rebuilt, and all shaders will have to be rewritten. Then rewritten again in a few weeks/months when the SRP spec changes. Furthermore, without a SurfaceShader-like abstraction layer for SRP, so that one shader can be used for BOTH pipelines, you are going to fragment your community and run the risk of killing the Asset Store. Your engineering and graphics teams need to think harder about this. SRP is not ready.

Well it’s either that or we stay stuck with the terribly archaic legacy renderer forever. Big improvements like this take time, and it’s unreasonable to think that things are just supposed to get automagically converted from past projects. That would be absolute nonsense.

If you keep complaining about massive new features like this not being ready instantly, then I hope you have an actual solution to offer instead

My solution is for them to develop a shader code abstraction layer, like Surface Shaders do for current shaders. This is really a mandatory feature in my opinion, and I don’t see any reason they can’t do this, given they are essentially doing the same thing with their visual shader node system (which is also not ready for making complicated shaders yet). I’m not complaining about new features, I’m complaining about the lack of vision when implementing them. This is a huge change that will eventually become the default, and I don’t think some people realize the magnitude that will have in the interim. Being able to write one shader and material and apply that to both pipelines when necessary is a worthy goal and should be considered very seriously by the engineering team.

What I don’t understand is we can still use the standard renderer and keep using our Standard shader materials, right? Is it only if we switch the render pipeline that we have to use new materials? Can’t we just use the standard shader and have Unity strip out or add what’s missing?

When were we going to get the Tilemap tools added in by default? While I’m on the topic, is there any news about the isometric grid? I’m glad to see the various performance improvement for sure, but it would be nice to have proper Tilemap support instead of needing to grab it off GitHub and add it and the extras to each new project that uses it.

Hi Mick,
As soon as the 2D tools are a little bit more mature, we’ll push the packages to our production repository and you’ll be able to install them from the Package Manager UI. Moreover, we are planning on having templates to create new 2D projects that will automatically add these packages. Thanks for your feedback!

Hi Mahdi,
Yes, texture streaming should speed up initial load times when enabled. The system loads less data initially, as it will not load the higher mips until it detects that the current camera location requires them.