Mozilla and Unity today announced that Unity 5, to be released later this year, will include an early access preview of a version of the 3D engine that supports WebGL and asm.js, enabling plug-in-free Unity games on the Web.

The Unity game engine has found huge success among game developers because it can target Windows, iOS, Android, OS X, Linux, PlayStation 3, Xbox 360, and more. Unity games can also be deployed on the Web, but doing so currently requires a browser plug-in, the Unity Web Player. The early access release will remove the need for the plug-in. Initially, it will support only desktop Firefox and desktop Chrome, due to their performance and (in Firefox's case) explicit support for the high-performance asm.js subset.

A bunch of Unity games running in the browser with WebGL.

While the WebGL/asm.js version of the Unity engine is not as fast as the plug-in version, the companies say that it's still good enough to hit 60 frames per second in a range of games. Unity is looking for feedback from developers during this early access preview to refine and improve the engine prior to producing a final version. The generated code takes advantage of some of the new browser APIs that Mozilla is pushing, such as support for gamepads.

Mozilla first approached Unity to investigate the possibility of porting its engine to Web technologies two years ago. Unity has seen considerable developer interest in a WebGL version of its engine, so the two companies worked together to make it happen. Unity's long-term goal is to enable portable, safe code that will run on any of its supported platforms, using Unity's C# scripting. The native Web support converts that C# to C++ code and then produces asm.js using the Emscripten compiler.

56 Reader Comments

I'm just trying to figure out how the caching/downloading will work. If you have to download a significant amount of data to play anyway, I'm not seeing the advantage of running in a browser. Maybe easier cross-platform support, but we'll see.

Downloading/preloading a large amount of assets each time you want to play though doesn't sound like a great idea.

Modern browsers have the concept of Local Storage which allows a developer to store data in a more permanent and manageable way (rather than automagical browser caching).

Also, Firefox (and I think Chrome as well) supports application manifests as well, which further extends the Local Storage concept to allow offline applications as well.
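
For anyone curious what that looks like in practice, here is a minimal sketch of the Web Storage API. The save/load helper names are made up for illustration, and the in-memory fallback exists only to keep the example runnable outside a browser:

```javascript
// Sketch of using the Web Storage API to persist game data between
// visits. localStorage only exists in browsers, so this falls back to
// a plain in-memory object elsewhere (e.g. when run under Node).
var backingStore =
  (typeof localStorage !== "undefined")
    ? localStorage
    : {
        _data: {},
        setItem: function (k, v) { this._data[k] = String(v); },
        getItem: function (k) {
          return (k in this._data) ? this._data[k] : null;
        },
      };

function saveProgress(slot, state) {
  // Values are always stored as strings, so serialize to JSON first.
  backingStore.setItem("save:" + slot, JSON.stringify(state));
}

function loadProgress(slot) {
  var raw = backingStore.getItem("save:" + slot);
  return raw === null ? null : JSON.parse(raw);
}

saveProgress(1, { level: 3, health: 80 });
console.log(loadProgress(1).level); // 3
```

Unlike the automatic browser cache, nothing here gets evicted behind the developer's back without notice, which is what makes it attractive for game saves and small assets.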

I think this may have been an issue several years ago, but with network speeds as fast as they currently are, I don't think this is going to be a problem. Besides, I'm no programmer, but I would assume that browser games could be designed to "load as you play", or to only load the next required section after certain checkpoints, kind of like ajax for WWW browsing.

The other part of the question is what limit may this put on web based games? I see many simple games that are 1GB+. The smallest games I have are 2D vector graphics, but everything else is 3D and have huge texture packs.

This is really about Chrome breaking support for native plug-ins, so Unity and Netflix and everyone else have to scramble around looking for a workaround that works almost as well. The companies' claim of "good enough to hit 60 frames per second in a range of games" is disingenuous, since "a range of games" could mean Rogue and text adventures.

As a developer, it's frustrating that I need to dial back poly and entity count to compensate for all these layers of transcompilation and emulation, because the browser developers keep yanking the rug out from under us. Yes, hypothetically one day WebGL and asm.js might be faster -- but right now, for anything but carefully limited tech demos, they're only a fraction as efficient as native code.

Cross-platform support is a plus alright, especially now that Microsoft have relented and allowed WebGL support in IE. Another plus is that you can visit a site and be playing within seconds, no install, no commitment, no fuss.

And no, you wouldn't have to stream assets every time, just the first time; they can even be streamed at a low level of detail first, then upgraded to better quality during the first few seconds of play. BF4 does something similar on the PS4 (the first ~20 seconds of matches have muddy textures, low-poly models and no sound), so I wouldn't see why it'd be a problem for a more low-tech browser game.

Besides, I'm no programmer, but I would assume that browser games could be designed to "load as you play", or to only load the next required section after certain checkpoints, kind of like ajax for WWW browsing.

You're bang on actually, I'm making a WebGL/Three.js game myself and I'm implementing dynamic loading to allow instant play. As the platform is the web, the protocol used is just plain ol' AJAX/XMLHttpRequest.
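
As a sketch of the idea (not the poster's actual code; all names here are invented), checkpoint-based loading can be as simple as a manifest plus a cache of in-flight downloads:

```javascript
// Hypothetical sketch of checkpoint-based asset streaming: each level's
// assets are fetched only when the player reaches it. `fetchAsset` is
// injected so the same logic works with XMLHttpRequest, fetch(), or a
// stub in tests; it should return a promise for the asset's data.
function createAssetStreamer(manifest, fetchAsset) {
  var cache = {};
  return {
    // Returns a promise for all assets a level needs, starting each
    // download at most once no matter how often the level is loaded.
    loadLevel: function (level) {
      var names = manifest[level] || [];
      return Promise.all(names.map(function (name) {
        if (!cache[name]) {
          cache[name] = fetchAsset(name); // kick off the download once
        }
        return cache[name];
      }));
    },
  };
}
```

A real game would kick off `loadLevel` for the *next* checkpoint while the current one is being played, so the download finishes before the player ever notices.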

Hm... good timing, if the rumors about Amazon putting out a set-top Android box for gaming are true.

That thing's likely to have a web browser, likely to have a gamepad, likely to be hooked up to a living room TV, but not likely to ship with support for Flash or other browser plugins.

I'm thinking specifically of stuff like the web-based games for Hub TV shows, targeting little kids, or maybe also Adult Swim web games. An Amazon set-top box won't be able to play today's Flash-centric versions, but could perhaps be able to play Unity-based stuff via this new mechanism.

All I had going through my head while reading this was Kerbal Space Program in my browser and imagining the possibilities...

I wouldn't get too excited about this just yet...

KSP uses the Unity-provided physics engine, which currently can only calculate physics on a single core. The game can run like a dog even on well-specced systems due to this limitation; add to that the fact that JavaScript executes much slower than C++ (despite improving a lot in recent years)... and yeah, you get the picture.

Unless these games somehow get around it, the limit of local storage is about 5MB. Also local storage must be saved as a string, which is terrible when you need to store some binary data like textures and models.

Application manifests however, will allow an app to 'cache' itself for offline mode (or just subsequent loads later) so hopefully that gets around any local storage limitations...
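
Since Local Storage only holds strings, binary assets have to be encoded before they can go in. A rough sketch of one workaround (helper names invented; note that JS strings cost roughly two bytes per character, so this roughly doubles the footprint of the stored data):

```javascript
// Sketch of round-tripping binary data through a string-only store.
// Each byte maps to one character code (0-255), so the result survives
// JSON and localStorage, at the cost of memory overhead vs raw bytes.
function bytesToString(bytes) {
  var out = "";
  for (var i = 0; i < bytes.length; i++) {
    out += String.fromCharCode(bytes[i]);
  }
  return out;
}

function stringToBytes(str) {
  var bytes = new Uint8Array(str.length);
  for (var i = 0; i < str.length; i++) {
    bytes[i] = str.charCodeAt(i);
  }
  return bytes;
}

var texture = new Uint8Array([0, 127, 255, 64]);
var encoded = bytesToString(texture);  // store this string
var decoded = stringToBytes(encoded);  // read it back later
```

That overhead, on top of the ~5MB quota, is why a store designed for structured and binary data is a better fit for game assets.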

Look into the history of the JRE browser plugin, or the Flash plugin, or the Adobe PDF plugin. Chrome did not break plugins. Plugins broke themselves, and it was up to browser developers to find a better way to present content than the proprietary bunch of security holes that plugins end up being. To my knowledge, Unity's webplayer is one of the better-behaved major browser plugins-- but its virtues do nothing to cancel out the fact that plugin architecture is a maintenance-heavy headache that creates more problems than it solves.

Well, what should be the alternative? What should we use instead of plugins that lets me write code that runs exactly as fast as native C++ inside a browser?

(Disclaimer: I am probably completely wrong)

They'll likely be using IndexedDB, as it's designed for large objects. It's supported on all modern browsers and platforms except Safari and iOS (which, I'll admit, is a pretty huge limitation for mobile).

Another option is WebSQL, which works in Chrome, Opera, Safari, and the Android browser. If some framework abstracts the difference so you can store stuff via either WebSQL or IndexedDB, that'd get you an awful lot of total coverage.

Well, what should be the alternative? What should we use instead of plugins that lets me write code that runs exactly as fast as native C++ inside a browser?

asm.js currently has 67% of the performance of native C++ (up from 40% last year). I think that is enough for a similar experience.

I seriously want to know what the deal is with IndexedDB. It truly is the worst designed modern API that I've ever had the displeasure of using. So verbose and convoluted. What was wrong with a Redis-like key-value store, or the SQLite-like WebSQL?

The "official" reason WebSQL ended up getting rejected was due to lack of completely independent implementations. Everyone just ended up using it to wrap sqlite, so there was only ever really one implementation that everyone used, which the standards-folk didn't like.

I got the impression that a big factor was that Microsoft just didn't like it for some reason. (Maybe because Apple was so quick to implement it? Or does sqlite perform badly on Windows or something?)

I'm curious why this supposedly will not work with Internet Explorer 11 since it uses WebGL, IndexedDB, asm.js, and has a reasonably fast JavaScript engine. Perhaps I'm wrong, but all of the press about this only mentions Firefox and Chrome.

Lol, I'm sorry, what exactly are Mozilla and Unity releasing? Something that will primarily only work on desktop with Chrome and Firefox?

No. HTML5 Unity (or whatever they're calling it) should run on every modern platform -- with the current exception of iOS. (WebGL is locked behind a flag on Safari at the moment.) All you'll need to do is enter a URL and start playing.

Instead of rolling their own engine, game developers should be able to use Unity to achieve a similar experience. This will make targeting a platform that doesn't (currently) have good browser support (i.e. iOS) easier.

asm.js is JavaScript, but it only allows language features that can be aggressively optimized by browsers. It was originally created by Mozilla, but Google has updated V8 to improve asm.js performance too.

Emscripten is a source code transpiler that can convert languages like C++ into asm.js compliant JavaScript -- often with much less work than you'd fear. (That's likely why Unity is going from C# to C++; it makes sense to use emscripten to do most of the hard work!)
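
For the curious, here is a tiny hand-written module in the asm.js style (not actual Emscripten output, which is far larger): the "use asm" pragma and the `x|0` annotations mark 32-bit integer types so a supporting engine like Firefox's can compile the module ahead of time. In any other engine it just runs as plain JavaScript.

```javascript
// Minimal asm.js-style module. Integer multiply must go through
// stdlib's Math.imul to stay within the asm.js type rules.
function ArithModule(stdlib, foreign, heap) {
  "use asm";
  var imul = stdlib.Math.imul;
  function add(a, b) {
    a = a | 0;          // declare a as a 32-bit int
    b = b | 0;
    return (a + b) | 0; // result coerced back to a 32-bit int
  }
  function square(x) {
    x = x | 0;
    return imul(x, x) | 0;
  }
  return { add: add, square: square };
}

var arith = ArithModule(globalThis, null, null);
console.log(arith.add(2, 3));   // 5
console.log(arith.square(4));   // 16
```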

I think it's unfair to call it "disingenuous" just because you think it might mean something else.

But more to the point, it is in fact 100% accurate: the 60fps figure includes full 3D games like the ones seen in that video. They run just as well as the video shows, even on standard hardware (I'm on Linux with Intel graphics, for example).

asm.js currently has 67% of the performance of native C++ (up from 40% last year). I think that is enough for a similar experience.

That was the first thing that came to my mind, but then I thought about SIMD instructions. I highly doubt JS is going to make use of SSE/etc, so C++ can still be dramatically faster for games. But I am excited to see how this goes.

Aren't the SIMD instructions precisely what's going to be covered by the WebGL part of this? Which means JS is just going to be used to pass them off to the GPU, and those bits should end up as fast as in C++ applications.

I think newer browsers can ask if you want more storage. There's a lot of options now at least. You have the ability in JS to create databases (which you could, if you were crazy, use to cache info), and local storage, and cookies, and offline manifests.

The future is exciting (as much as I still feel the web isn't the right place for high performance games).

asm.js currently has 67% of the performance of native C++ (up from 40% last year). I think that is enough for a similar experience.

I looked around, but I can't find any mention of what kind of benchmark they used to arrive at that number.

Without specifying that, any number such as 67% is rather useless, because depending on the benchmark it may range from really bad to pretty good.

asm.js currently has 67% of the performance of native C++ (up from 40% last year). I think that is enough for a similar experience.

It really isn't. As an engine programmer I spend most of my time trying to eke out 2% improvements in performance so that the designers can get just a few additional enemies onscreen, or enable shadows, or run at 1080p. People take to forums with torches and pitchforks when they learn that one platform's version of a game has slightly fewer pixels than another's! The idea of losing 33% just to overhead is alarming.

Also, it's not clear where that 67% number comes from. What exactly were they benchmarking? Are the results the same on other interpreters and other machines? Were they CPU or GPU bound?

The issue isn't just the SIMD used in the heavy math hot loops. It's the little efficiencies everywhere. Memory management and garbage collection are a huge part of JS's cost -- easily more than 10% of total CPU time. In our engine it's <1% because we use a tuned block allocator and preallocate as much as possible. We unroll loops, manage the L2 cache, remove indirect function calls and pipeline bubbles.

All of this hypothetically could be done with asm.js and a perfect optimizing compiler. But until those exist, I either need to work to the lowest common denominator and compromise my console experience for the browser, or give up on in-browser Unity and ship only on console and Steam.
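
On the JavaScript side, one common way to keep garbage collection out of the hot loop is object pooling: preallocate everything up front and recycle, rather than letting the GC churn every frame. A minimal sketch (names invented for illustration):

```javascript
// Sketch of an object pool for a particle system: particles are
// preallocated once and recycled, so the per-frame hot loop performs
// no allocations and creates no garbage for the collector.
function ParticlePool(size) {
  this.free = [];
  for (var i = 0; i < size; i++) {
    this.free.push({ x: 0, y: 0, vx: 0, vy: 0, alive: false });
  }
}

ParticlePool.prototype.spawn = function (x, y) {
  var p = this.free.pop();   // no allocation: reuse a dead particle
  if (!p) return null;       // pool exhausted: caller decides what to drop
  p.x = x; p.y = y;
  p.vx = 0; p.vy = 0;
  p.alive = true;
  return p;
};

ParticlePool.prototype.release = function (p) {
  p.alive = false;
  this.free.push(p);
};
```

It's the same idea as a native engine's block allocator, just expressed within what the JS runtime allows.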

I looked around, but I can't find any mention of what kind of benchmark they used to arrive at that number.

Without specifying that, any number such as 67% is rather useless, because depending on the benchmark it may range from really bad to pretty good.

It's a bit old but I think that's what they are referring to. I'd imagine current performance is better. Also, they are working on accessing SIMD, amongst other things. Check out the Mozilla Hacks blog if you're interested.

The first is a set of benchmarks ranging from specific microbenchmarks up to realistic codebases like Box2D, Bullet, zlib and Lua. The second is a further confirmation on Box2D by a person who has been benchmarking that engine extensively for a few years, and whom I would consider the authority on the subject.