Mozilla making the Web a gaming platform with Unreal Engine 3 in a browser

High-performance JavaScript and WebGL could put high-end games on the Web.

Mozilla wants the Web to be a platform that's fit for any purpose. That's why the company is investing in Firefox OS, to fight back against the proliferation of platform-specific smartphone apps. And it's why the company has been working on WebGL (to bring 3D graphics to the browser), Emscripten (a tool for compiling C++ applications into JavaScript), and asm.js (a high-performance subset of JavaScript).
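
To give a flavor of the approach, here is a small hand-written asm.js-style function; this is a sketch for illustration, not actual Emscripten output. The "use asm" directive and the type-coercion idioms are what let a supporting engine compile the code ahead of time.

```javascript
// A hand-written asm.js-style module. The "use asm" pragma and the
// coercion idioms (x|0 for int, +x for double) let a supporting engine
// compile this ahead of time to near-native code; other engines just
// run it as ordinary JavaScript.
function FastMath(stdlib, foreign, heap) {
  "use asm";
  var sqrt = stdlib.Math.sqrt;

  // Euclidean distance between two 2D points, all doubles.
  function distance(x1, y1, x2, y2) {
    x1 = +x1; y1 = +y1; x2 = +x2; y2 = +y2; // coerce arguments to double
    var dx = 0.0, dy = 0.0;
    dx = x2 - x1;
    dy = y2 - y1;
    return +sqrt(dx * dx + dy * dy);
  }

  return { distance: distance };
}

// Link the module; engines without asm.js support run it as plain JS.
var mod = FastMath(window, {}, new ArrayBuffer(0x10000));
console.log(mod.distance(0.0, 0.0, 3.0, 4.0)); // 5
```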

The organization doesn't just want simple games and apps in the browser, however. It wants the browser to be capable of delivering high-end gaming experiences. At GDC today, the company announced that it has been working with Epic Games to port Unreal Engine 3 to the Web.

Unreal Engine 3 running inside a browser.

With this, Mozilla believes that the Web can rival native performance, making it a viable platform not just for casual games but for AAA titles.

However, there's more to a game than JavaScript and WebGL. One problem with current WebGL applications (most of which are proofs of concept rather than fully developed games) is load times. Even though traditional games have high-speed access to textures and models stored locally on a hard disk or optical medium, their load times are significant.

Streaming a gigabyte of map data and textures from a Web server just to play a game is obviously a non-starter; you wouldn't be waiting 30 seconds for a level to load, you'd be waiting 30 minutes. As an example, BioShock Infinite, an Unreal Engine 3-powered high-end title, takes about 17 GB on disk, the vast majority of which is game data. That's not something that you want to have to wait for mid-game.

The Khronos Group, the organization responsible for the development of OpenGL, WebGL, and other related specifications, has set its sights on this problem. It's early yet, but the group is planning to develop a common set of data formats for the 3D models, textures, and other resources that 3D applications need, as well as a system for negotiating these resources.

With this in place, an online game could tell a remote asset server information such as how much bandwidth it has and what its screen resolution is, and be sent a set of resources appropriate to it. So, for example, a system with a slow Internet connection could be sent simpler models and lower-resolution textures in order to load quickly.
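
The details don't exist yet, but the client side of such a negotiation might look something like the sketch below; the /negotiate endpoint, the profile fields, and the manifest format are all invented for illustration.

```javascript
// Hypothetical client side of an asset negotiation: report the device's
// display and connection characteristics, and let the server choose an
// appropriate quality tier of models and textures.
function estimateBandwidthKbps() {
  // Stub: a real client would time a small known-size download.
  return 5000;
}

function requestAssetManifest(levelId, callback) {
  var profile = {
    level: levelId,
    screenWidth: screen.width,
    screenHeight: screen.height,
    bandwidthKbps: estimateBandwidthKbps()
  };

  var xhr = new XMLHttpRequest();
  xhr.open("POST", "/negotiate", true);
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.onload = function () {
    // The server replies with URLs for the chosen tier, e.g.
    // { "textures": ["tex/rock_512.jpg"], "models": ["mdl/rock_lo.bin"] }
    callback(JSON.parse(xhr.responseText));
  };
  xhr.send(JSON.stringify(profile));
}
```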

Every time I see people talking about WebGL I can't help but think what a huge security hole it must be. Graphics drivers are hugely complex and optimized for speed, not security. I suppose as long as you can lock it down and only allow it to load on certain sites it won't be so bad, but it still makes me nervous.

Then again, I thought JavaScript, Java, and ActiveX were going to be huge security nightmares when they first appeared in browsers back in the old days. I guess we know how that turned out.

Well, JavaScript is not a big security problem, and GPUs differ from each other, with different driver configurations, so it shouldn't be easy to exploit them. And on Windows everything goes through ANGLE, so that's another layer between you and the metal. And it would make driver makers think more about security.

This seems kind of silly. The article mentions the data-size issue for games, and then says something like "but they are working on solving it by making standard formats for this stuff," as if that actually matters. Unless they invented some OMG crazy real-time compression for this stuff, it's not going to be useful. This is the opposite end of the cloud-gaming problem: if you render it on the server you have latency issues (like OnLive); if you render it locally but host it in the cloud you have bandwidth issues. Neither is pleasant for games.

Intelligently loading level 2 while you're still playing level 1 would help too.

Well, you would still have to load level 1 the first time. Also, you would have to balance the traffic needed to play the game against downloading the next level. Supposing it all worked, it would still likely eat bandwidth like nuts. People with download caps would be in a really poor position too.

Conceptually, this is pretty cool. I'm a big fan of playing with data distribution and finding new ways to do things, but from a practical standpoint, I have to ask, "why?" Who is the target market for this? The article wasn't clear on specifically what this is trying to accomplish, so maybe the goal is to be able to run games on any OS by using the browser as a medium? If that's the case, I could see this being useful to Mac and Linux users, since it's really rare for games to be ported to these systems (and those that are ported often arrive much later than the PC release, and buggier). But I can't see any Windows users ever making use of this, and if the end result of this project is still OS-specific, I can't see anyone using it.

Intelligently loading level 2 while you're still playing level 1 would help too.

Some games do this. I wish I had a list--I can't think of any examples off the top of my head--but I was under the impression that dynamically loading areas as you go is a fairly common practice, particularly for open-world games. The only time I can think of when there are loading times in Fallout, for example, is when you enter buildings. The overworld loads in chunks as you go. There's also the approach the Metroid Prime games took, where the entire level is loaded into memory during an obfuscated loading screen (the elevator cutscene). If more games used cutscenes as a mask for loading, it would do a lot to make those transitions seem quicker.
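
In a browser, that kind of background loading is easy to sketch; the asset URLs below are invented, and a real engine would throttle these requests so they don't compete with live game traffic.

```javascript
// Sketch of background prefetching: while level 1 is being played,
// quietly fetch level 2's assets so the transition needs no long load
// screen. Fetched bytes are held in memory for instant use later.
var LEVEL2_ASSETS = [
  "assets/level2/geometry.bin",
  "assets/level2/textures.jpg",
  "assets/level2/audio.ogg"
];

var prefetched = {};

function prefetchNextLevel(urls) {
  urls.forEach(function (url) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", url, true);
    xhr.responseType = "arraybuffer";
    xhr.onload = function () {
      prefetched[url] = xhr.response;
    };
    xhr.send();
  });
}

// Kick off once level 1 is up and running.
prefetchNextLevel(LEVEL2_ASSETS);
```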

The download thing is obviously an issue, but why not just skip it? Install the game and all its assets. Why bother with WebGL then? In theory, it's totally cross-platform. Download the game as usual (through Steam? Sure.), and it runs on Linux or Mac OS or QNX or whatever can support WebGL and has the hardware behind it.

Simple games can still stream their assets, but downloading 17 GB once up front clearly isn't that big of an issue now: how many people are downloading BioShock Infinite off of Steam right this moment?

I think talking about AAA titles is a bit premature, but you could certainly use Unreal Engine 3 to make some impressive titles that are more specifically designed for streaming play, with lower bandwidth requirements, etc. You can use Unreal Engine 3 to create artistic but relatively primitive games just as much as you can use it for high-detail realism.

The key to making this kind of thing popular is to get access to huge caches on the user's machine, and lots of intermediate caches to serve up the data at the highest speeds possible. It's a pretty tall order though.

For the raw downloading I'd love to see a peer network like BitTorrent used to solve this kind of problem, as such networks are able to grow in capacity with demand while keeping data served from a range of sources; optimise it for geolocation and skew the emphasis towards downloading in order (lots of badly behaved BitTorrent clients already do this) and you could have yourself a solution. The key problem here is allowing browsers to create that kind of connectivity.
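
The huge-local-cache half of that is already sketchable with IndexedDB; the database layout and URLs here are invented, and a real implementation would need quota management and eviction.

```javascript
// Sketch of a persistent asset cache in IndexedDB, so a game's
// gigabytes of textures can survive between visits instead of being
// re-streamed every time.
var openReq = indexedDB.open("game-assets", 1);

openReq.onupgradeneeded = function (event) {
  event.target.result.createObjectStore("assets"); // keyed by URL
};

openReq.onsuccess = function (event) {
  var db = event.target.result;

  function loadAsset(url, callback) {
    var get = db.transaction("assets").objectStore("assets").get(url);
    get.onsuccess = function () {
      if (get.result) {
        callback(get.result); // cache hit: no network traffic at all
        return;
      }
      var xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);
      xhr.responseType = "arraybuffer";
      xhr.onload = function () {
        db.transaction("assets", "readwrite")
          .objectStore("assets")
          .put(xhr.response, url); // store for the next visit
        callback(xhr.response);
      };
      xhr.send();
    };
  }

  loadAsset("assets/level1/textures.jpg", function (bytes) {
    console.log("got " + bytes.byteLength + " bytes");
  });
};
```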

I don't know, I kind of LIKE downloading games and not being on the web. A web browser is just one of many internet clients. Why do we have to shoehorn EVERYTHING onto the web? The whole point of the internet is that you aren't tied to one solution. Sounds like Mozilla is just trying to establish its own walled garden (of sorts), governed by a committee.

That said, I'm all for having 3D on the web that actually delivers (we've gone from VRML to WebGL to Unreal... pretty nice). I just don't think it needs to be (or should be) at the expense of non-web-based software.

The download thing is obviously an issue, but why not just skip it? Install the game and all its assets. Why bother with WebGL then? In theory, it's totally cross-platform. Download the game as usual (through Steam? Sure.), and it runs on Linux or Mac OS or QNX or whatever can support WebGL and has the hardware behind it.

Simple games can still stream their assets, but downloading 17 GB once up front clearly isn't that big of an issue now: how many people are downloading BioShock Infinite off of Steam right this moment?

Exactly. I seem to recall reading about some sort of Firefox OS phone on this very site which had "web apps" that pointed to local resources. Instantly cross-platform (if not particularly optimized) does have some appeal. Download the files, play in your web browser, and finally a same-day launch for Windows, Mac, and Linux.

I don't know, I kind of LIKE downloading games and not being on the web. A web browser is just one of many internet clients. Why do we have to shoehorn EVERYTHING onto the web? The whole point of the internet is that you aren't tied to one solution. Sounds like Mozilla is just trying to establish its own walled garden (of sorts), governed by a committee.

That said, I'm all for having 3D on the web that actually delivers (we've gone from VRML to WebGL to Unreal... pretty nice). I just don't think it needs to be (or should be) at the expense of non-web-based software.

There's no reason your web browser can't load and run stuff from a local file system. It doesn't have to be remotely stored and streamed.

Every time I see people talking about WebGL I can't help but think what a huge security hole it must be. Graphics drivers are hugely complex and optimized for speed, not security. I suppose as long as you can lock it down and only allow it to load on certain sites it won't be so bad, but it still makes me nervous.

Then again, I thought JavaScript, Java, and ActiveX were going to be huge security nightmares when they first appeared in browsers back in the old days. I guess we know how that turned out.

LALALALA! -the posters downvoting this.

This and many other similar security-related topics have been downvoted lately. Just because you've never been hacked, or don't want to be, doesn't mean it isn't possible, or happening this very minute to someone. And even if it does happen to you, I guarantee you will NOT know it happened. LALALALALA!

Drivers are utterly insecure (because their devs never had to worry about it, as only the local user had access until recently), and drivers have direct access to hardware. Add the two together, insecurity and remote access, and what do you get? Tell me.

Intelligently loading level 2 while you're still playing level 1 would help too.

Well, you would still have to load level 1 the first time. Also, you would have to balance the traffic needed to play the game against downloading the next level. Supposing it all worked, it would still likely eat bandwidth like nuts. People with download caps would be in a really poor position too.

Perhaps this kind of thing would cause prospective smartphone buyers to flip the bird to companies which only offer plans with said caps (e.g., AT&T, Verizon) and instead turn toward carriers that offer unlimited data plans, like Sprint and T-Mobile (among various others, both in the U.S. and abroad).

360p video to show off the graphics? As a hardcore gamer (from the PC master race of gamers) I want to throw up every time Mozilla talks about gaming. They are just so sad. It seems they last played a real game in the late 90s and are not aware that graphics and gameplay have advanced in the meantime.

For those who are worried about the streaming of gigabytes of data, probably the best bet for the moment would be to move to Kansas City. Second-best would be to push your ISP to upgrade their pipes.

This isn't technology that'll be ready for mainstream use in a year. Mozilla is preparing for the world as it is in ten years, and doing a fine job of it.

Of course, developers would want to prepare their games with this kind of delivery in mind. Blizzard (and probably other companies) already do some "prioritisation" of downloads: you can play World of Warcraft long before you've downloaded the last pixel. And there will presumably be some decisions to make about what assets need to be on the user's PC as opposed to what game assets can stay in the cloud. But Mozilla is at least thinking about how to deliver quality content, using a standard tool (Unreal Engine), through a web browser.
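
A toy version of that prioritisation, with invented URLs and priorities: fetch what the opening area needs first and let the long tail trickle in behind it.

```javascript
// Sketch of prioritised asset delivery: sort the download queue so
// critical assets arrive before nice-to-haves, and fetch one at a time
// so the long tail never starves the front of the queue.
var downloadQueue = [
  { url: "assets/core/ui.png",          priority: 0 }, // needed at once
  { url: "assets/zone1/terrain.bin",    priority: 1 },
  { url: "assets/zone2/terrain.bin",    priority: 2 }, // can trickle in
  { url: "assets/cinematics/intro.ogv", priority: 3 }
];

downloadQueue.sort(function (a, b) { return a.priority - b.priority; });

function drainQueue() {
  if (downloadQueue.length === 0) return;
  var item = downloadQueue.shift();
  var xhr = new XMLHttpRequest();
  xhr.open("GET", item.url, true);
  xhr.responseType = "arraybuffer";
  xhr.onload = function () {
    // Hand xhr.response to the engine here, then fetch the next item.
    drainQueue();
  };
  xhr.send();
}

drainQueue();
```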

Isn't there already a "universal" web-connected platform? I won't say that Java failed so much as it never quite lived up to its promise. Nonetheless, Java would be far more suited to such a project, without all of the drawbacks of being saddled atop a single-threaded, 32-bit browser; investing in this would be a total waste of time. However, with a seemingly endless stream of security issues, Java is dying a slow death in the hands of Oracle.

There's no reason your web browser can't load and run stuff from a local file system. It doesn't have to be remotely stored and streamed.

Just because we CAN doesn't mean we SHOULD.

Won't someone think of the carbon!

Wasting CPU cycles on running this stuff in a browser, when it could run natively with far less resource consumption, is a waste.

Wasting bandwidth on streaming data over the network when it could be downloaded once and run natively is a waste. Sure, it could be downloaded permanently and stored locally, but then, what's the point of THAT vs just doing it natively anyhow?

Opening up the browser to run highly complex GPU code when it could run better natively is a huge security risk, for what? So mozilla can compete in the marketplace? Sounds like it's all about feathering mozilla's nest, rather than being a good thing for users.

There's no reason your web browser can't load and run stuff from a local file system. It doesn't have to be remotely stored and streamed.

Just because we CAN doesn't mean we SHOULD.

Won't someone think of the carbon!

Wasting CPU cycles on run this stuff in a browser, when it could run natively with far less resource consumption is a waste.

Wasting bandwidth on streaming data over the network when it could be downloaded once and run natively is a waste. Sure, it could be downloaded permanently and stored locally, but then, what's the point of THAT vs just doing it natively anyhow?

Opening up the browser to run highly complex GPU code when it could run better natively is a huge security risk, for what? So mozilla can compete in the marketplace? Sounds like it's all about feathering mozilla's nest, rather than being a good thing for users.

But then, that seems to be what mozilla is all about lately.

Because of today's fragmented market technology (wait, Ubuntu's coming next year to unify it), this is the way to go.

These arguments raise interesting questions to work on, and some of them I like.

Quote:

Wasting bandwidth on streaming data over the network when it could be downloaded once and run natively is a waste. Sure, it could be downloaded permanently and stored locally, but then, what's the point of THAT vs just doing it natively anyhow?

Firstly: downloading 600 MB makes me lose patience (imagine xx GB).

Secondly: between 600 MB of data and xx GB, we can find a compromise.

So, I love the good compromise: winner-winner, download the core of the game, let the server do the work of rendering, the heavy AI calculation, and the actions of other players, and send back only the result. The definitive settings are another question to debate. And this introduces the security question.

Quote:

Opening up the browser to run highly complex GPU code when it could run better natively is a huge security risk, for what? So mozilla can compete in the marketplace? Sounds like it's all about feathering mozilla's nest, rather than being a good thing for users.

In the case of a terminal-server architecture run through a secure browser such as Firefox, the only security hole for the end user is the bandwidth: it's up to the game developers to think about optimizing and rethinking their work. Agreed, in this case everyone needs a very fast connection. But wasn't that also the case with native software yesterday?

Anyone who can afford the internet connection needed to make this worthwhile can afford a computer that makes it entirely unnecessary. And with no one intelligent using it for games, WebGL's success will have to depend on a return to web page splash videos.

Downloading a AAA title can be done, and people often do it on Steam. The biggest part of any game is the "content" (levels, textures, and audio). The code part is actually quite small. So with this, the browser could download and store all of the "content" and check for updates on the server. If a developer re-uses textures for other games, they can save on bandwidth for subsequent titles, sequels, and expansions.

Apart from AAA titles, there are also much smaller indie games where downloading the whole game wouldn't be out of the question. And it'd enable the devs to support any OS (potentially including mobile) where Firefox is installed. Mozilla just needs to make sure Firefox can talk to the GPU through DirectX, OpenGL, etc., on whatever OS it's on.
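
One way to get that cross-title texture reuse is to key stored assets by a hash of their contents rather than by game. A sketch, assuming an open IndexedDB store like the one in the earlier cache example, with invented manifest entries:

```javascript
// Sketch of content-addressed asset storage: each asset is keyed by a
// hash of its bytes, so a texture shared by two titles is downloaded
// and stored only once. Assumes `db` is an open IndexedDB database
// with an "assets" object store, as in the earlier cache sketch.
var manifest = [
  { hash: "sha1-a3f1c2", url: "cdn/textures/brick_1024.jpg" },
  { hash: "sha1-9b07de", url: "cdn/models/crate.bin" }
];

function syncAssets(db, entries) {
  entries.forEach(function (entry) {
    var get = db.transaction("assets").objectStore("assets").get(entry.hash);
    get.onsuccess = function () {
      if (get.result) return; // already stored, perhaps by another game
      var xhr = new XMLHttpRequest();
      xhr.open("GET", entry.url, true);
      xhr.responseType = "arraybuffer";
      xhr.onload = function () {
        db.transaction("assets", "readwrite")
          .objectStore("assets")
          .put(xhr.response, entry.hash); // keyed by content hash
      };
      xhr.send();
    };
  });
}

// Usage, once the database is open: syncAssets(db, manifest);
```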

I believe the majority of gaming today is done through Flash, and often on Facebook. This technology could easily capture that market and take it forward tremendously. There is absolutely a huge market for web-hosted, asset-light games, and these users haven't begun to see what WebGL could do for them, even with current bandwidth limitations.

Unless the engine is just used to play games in the MB size range, which is a possibility, this isn't going to be workable or practical for AAA titles.

First, as some have pointed out, you're going to have terrible latency. Latency is still bad in MMORPGs, and they design those games specifically to deal with latency issues and reduce their impact on the quality of the experience.

Then there's trying to stream or preload gigabytes of data/files. Even with super magical compression, this won't solve the problem. Nor will setting "common objects" or scaling quality down. For an example of how playable this actually is, you only need to look at Blizzard's implementation of their new patcher for WoW. At around the 50% downloaded mark, you can log in and play... You would be excused if you literally threw your PC out the window. You can log in, but it's not exactly playable.

Heck, even at 75% completion, you're getting choppy gameplay, long stalls, and freezes. In an RPG, OK, that's still acceptable since much of the activity is pretty static; but in an action game/FPS, it's literally unplayable.

And we're talking about maybe 1 GB or 2 GB patches here.

Local installs are still the best solution. Take time to download it once, then process the game locally for the life of the game, giving you the best experience. I'm just unconvinced the streaming direction is the best use of the internet for this type of activity. I truly believe that the MMORPG market has shown the way forward here and used the medium to best effect: as the layer that connects players. You offload certain types of processing to the cloud, but the actual assets need to be processed and loaded locally.

Edit: I'd also add that if this solution caches the assets, it becomes no different from actually installing the game locally, but with the caveat that it could possibly remove those assets at random. Say I play game X today. Tomorrow I play game XX. But what happens when I want to go play game X after that? Are we to expect the cache to store 10-20 GB of assets per game?

Edit 2: Also, are we to expect that current or even near-future browsers are robust enough to handle this type of activity? Hell, RIGHT NOW, FF still crashes on certain types of web pages. It still chokes when it has to load a large gallery of thumbnails. I can see incremental improvements over time, but to expect reliability and stability for gaming-intensive use is rather unrealistic in even the medium term.

So now the quality of my graphics will depend on my connection speed? No, thanks.

How about Mozilla take the money they're spending on this and use it to buy some Congressmen/lobbyists to push telcos/ISPs to move in a consumer-friendly direction?

Stupid as I think this idea is, there are myriad other reasons I'd like a fat, cheap pipe to the internet. The ISPs and telcos are moving in the exact opposite direction, and there's no indication that things will shift back in the direction of the people.

While Mozilla thinks it'll get somewhere putting the Unreal engine in the browser, ISPs are scheming on ways to keep jacking up prices and capping bandwidth. These two things won't co-exist.

For an example of how playable this actually is, you only need to look at Blizzard's implementation of their new patcher for WoW. At around the 50% downloaded mark, you can log in and play... You would be excused if you literally threw your PC out the window. You can log in, but it's not exactly playable.

Heck, even at 75% completion, you're getting choppy gameplay, long stalls, and freezes. In an RPG, OK, that's still acceptable since much of the activity is pretty static; but in an action game/FPS, it's literally unplayable.

And we're talking about maybe 1 GB or 2 GB patches here.

Agreed. Tested. But.

The patch or install happens at the same time you play, so the latency goes down once the upgrade or install stops; tested too, and the server sends the remainder. After that, an action game/FPS is completely unplayable. An example of a game fully playable since the end of the 90s: Battlestar Galactica, with no lag, no connection issues, no install.

Edit:

Quote:

How about Mozilla take the money they're spending on this and use it to buy some Congressmen/lobbyists to push telcos/ISPs to move in a consumer-friendly direction?

We have the same issue in Europe, and from what I see it seems worse in some parts of America, but in Europe the situation is changing because enterprises need better QoS.

So, I love the good compromise: winner-winner, download the core of the game, let the server do the work of rendering, the heavy AI calculation, and the actions of other players, and send back only the result. The definitive settings are another question to debate.

Wait, are you talking about something very similar to cloud gaming (a.k.a. OnLive) here, except that you have a small core of the game installed (effectively a dedicated client)? If so then no, just no.

The more you "offload" onto the remote server, the more you are at your connection's mercy for reactions and other such fast-paced actions. In many countries this would also put an effective cap on how much people can play per month, due to the use of data caps.

This will work perfectly fine for real-time financial graphing. I'm already porting Flash graphics to WebGL for just that purpose. Stochastic graphs of the major indexes look great. Click on a time region to get a 3D cube of volumes versus moving average. Probably not for gamers and hardcore physics, though.

Compiling machine code to JavaScript, which is then going to be compiled back into machine code? Must be an idea from the Department of Redundancy Department. I think time and effort would be better spent on defining (yet another) bytecode VM with a JIT to put in the browsers, one that you can target directly, if portability is a concern (which rules out Google's NaCl).