Posted
by
Unknown Lamer
on Monday November 12, 2012 @07:50PM
from the because-they-said-we-couldn't-do-it dept.

Emscripten is an LLVM-based compiler from dozens of languages to JavaScript (previously demoed as a repl and used to port Doom to the browser), and some recent changes have made it a bit faster, and allowed it to compile itself. Some highlights include a redundant variable eliminator, parallelization of the optimizer and compiler, and a new relooper. From the developer's weblog: "With all of the emscripten optimization passes now in JavaScript, I then worked on parallelizing that. ... The speedup can be close to linear in the number of cores. ... For the LLVM to JS compiler, I made the emscripten compiler parallel as well: It splits up the LLVM IR into 3 main parts: type data, function data, and globals. The function data part is unsurprisingly by far the largest in all cases I checked (95% or so), and it can in principle be parallelized - so I did that. Like in the optimizer, we use a Python process pool which feeds chunks of function data to multiple JavaScript compiler instances. There is some overhead due to chunking, and the type data and globals phases are not parallelized, but overall this can be a close to linear speedup. ... [On the new relooper] Note that this update makes Emscripten a 'self-hosting compiler' in a sense: one of the major optimization passes must be compiled to JS from C++, using Emscripten itself. Since this is an optimization pass, there is no chicken-and-egg problem: We bootstrap the relooper by first compiling it without optimizations, which works because we don't need to reloop there. We then use that unoptimized build of the relooper (which reloops properly, but slowly since it itself is unoptimized) in Emscripten to compile the relooper once more, generating the final fully-optimized version of the relooper, or 'relooped relooper' if you will."
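The chunking step described above can be sketched roughly like this (a hypothetical illustration, not Emscripten's actual code; the function name `chunkFunctions` and the example data are made up):

```javascript
// Hypothetical sketch: split an array of function bodies into roughly
// equal chunks, one per worker, as a parallel compiler pool might.
function chunkFunctions(funcs, numWorkers) {
  const chunkSize = Math.ceil(funcs.length / numWorkers);
  const chunks = [];
  for (let i = 0; i < funcs.length; i += chunkSize) {
    chunks.push(funcs.slice(i, i + chunkSize));
  }
  return chunks;
}

// Each chunk would be handed to a separate compiler instance, and the
// compiled results concatenated in the original order afterwards.
const chunks = chunkFunctions(['f1', 'f2', 'f3', 'f4', 'f5'], 2);
console.log(chunks); // two chunks: ['f1','f2','f3'] and ['f4','f5']
```

The chunking overhead the developer mentions comes from exactly this step: splitting, shipping the chunks to workers, and stitching the outputs back together in order.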

Now what we need to do is compile VirtualBox with it, so that we can run any PC operating system in the browser, even several of them at a time. This could be a great way to benchmark your browser and its JS engine. Of course it would be hellishly slow, but a fun project to attempt.

Browsers don't typically spin around in tight timer loops making heavy use of array buffers. So while it might be an interesting test, it is not reflective of what browsers are doing for most of their lives.

It's neat to see it working and there will be lots of ways it could be applied but really browsers need something like PNaCl - where LLVM bitcode can be compiled or interpreted by the browser with as little overhead as possible in a sandboxed environment.

Emscripten compiles low-level bitcode into a high-level language, which ends up being parsed and compiled / interpreted inside the browser as a JS app. It's inherently wasteful.

Aside from the overheads of the above, JS is executed inside a browser through events. You might have seen a message in a browser along the lines of "JavaScript is taking too long to return, should I kill it?" if a JS snippet does not exit within some preset duration. The browser is essentially frozen while it's executing that script.

I also didn't mention that the time slicing above would also have to simulate threading for C/C++ apps which need it - a bit like the way Java used to do pseudo-threading circa 1.0.2. Everything would be running on the one thread, though.

If you checked out the demos, such as BananaBread, Emscripten is fully capable of doing JavaScript threading. That is, any work that does not need DOM access can be done on background threads. The "freezing" and "time slicing" are not particularly interesting objections. Any application that has to interact with the DOM has the same issue, which is why JavaScript hasn't solved it on the main thread, apart from the perfectly reasonable approach of event-based code. And yes, this means using setTimeout and returning to the event loop.
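The setTimeout approach mentioned above can be sketched like this (a hypothetical example, not Emscripten output; the helper name `processInSlices` is made up): a long loop is split into slices, and between slices control returns to the event loop so the browser never shows the slow-script dialog.

```javascript
// Hypothetical sketch: process a large array in slices, yielding back
// to the event loop between slices via setTimeout(..., 0) so the
// browser stays responsive while the work completes.
function processInSlices(items, sliceSize, work, done) {
  let i = 0;
  function step() {
    const end = Math.min(i + sliceSize, items.length);
    for (; i < end; i++) {
      work(items[i]); // one unit of work
    }
    if (i < items.length) {
      setTimeout(step, 0); // yield to the event loop, then continue
    } else {
      done();
    }
  }
  step();
}

// Usage: sum the numbers 0..9999 in slices of 1,000.
let sum = 0;
processInSlices(
  Array.from({ length: 10000 }, (_, n) => n),
  1000,
  (n) => { sum += n; },
  () => console.log('sum =', sum) // eventually logs: sum = 49995000
);
```

This is exactly the trade-off the thread is circling: the main thread stays responsive, but the work is serialized into slices rather than running on a real second thread, which is what Web Workers provide for DOM-free work.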

PNaCl *is* bitcode - LLVM bitcode with some APIs that it can see for audio, display and minimal interaction with the browser surrounding it. So instead of translating bitcode to JS and compiling / interpreting that again through the constraints of the browser, JS and the DOM, the browser could execute it directly or hand over execution to a plugin, even one running with reduced privileges in its own process space, and only interacting with the browser for repaints. That's the point. Performance would be a lot better.

Chrome is already available as an IE browser helper object. If the goal is to improve the standards compliance of Internet Explorer, the only advantage of making a browser-within-a-browser in JavaScript is to allow it to run on systems where the user lacks the privilege to install Chrome Frame.

Any serious management team will discard the "Chrome Frame" option as it requires the user to go through extra hassles, and like you said, in some cases where the user does not have the proper privileges it simply does not work. Therefore, this solution is actually a non-solution, unfortunately.

In accordance with Mozilla’s copyright infringement policy, this is to notify you of activity occurring on the Mozilla site listed below which infringes on the exclusive intellectual property rights of Id Software LLC, a wholly owned subsidiary of ZeniMax Media Inc. The copyrighted work at issue is Id Software’s proprietary software game DOOM® (“DOOM”). The link below offers an unauthorized derivation or version of Id Software’s DOOM game.

DOOM is a registered trademark and the game assets are copyrighted material. Use of the mark DOOM and copyrighted assets without our authorization and consent, directly violates our trademark and copyright rights in and to such intellectual property. We hereby demand the immediate removal of all such links from your website and written assurance that you will prevent any further infringement of Id Software’s intellectual property rights. I have a good faith belief that use of the copyrighted materials described above as allegedly infringing is not authorized by the copyright owner, its agent, or the law. I swear, under penalty of perjury, that the information in this notification is accurate and that I am the copyright owner or am authorized to act on behalf of the owner of an exclusive right that is allegedly infringed.

The wikipedia page you reference states explicitly that he was acquitted of all charges in the criminal suit (and is therefore NOT a known criminal), and he agreed to be banned from banking to make a civil suit go away. That suit was never resolved, so we don't really have any idea whether he was guilty or not - he might have agreed on the basis that he was tired of defending himself in court and didn't care much for being a banker anyway. Let's face it, settlements in the face of expensive yet questionable government prosecutions are hardly rare, especially in the USA.

The rest of the wild claims you assert aren't supported by any of the wiki pages either. Maybe it's all true, but it's impossible to tell from your post. At any rate, it has nothing to do with the fact that DOOM's artwork and level data is still under copyright. If you want to get pissy at somebody about that, go complain to Carmack - he's the one who decided to open source the code but not the game data. Or not: he does a lot more for open source than most game company CEOs do.

The problem is they can't distribute the WAD files. Other Doom engines such as prboom, zdoom, jdoom (my favorite), etc. have done just fine for over a decade, as they only provide a "Doom compatible" engine. The user must provide the WAD files themselves, even if they intend to play the shareware WADs that are freely available. They should have made or searched for a freely distributable total-conversion WAD to avoid any problems. I am sure there are plenty to be had off Doom fan sites.

The reason is, there is no one at id that knows who owns the copyright to what on the Doom-era games. It would take way too much effort and resources just to figure that out, only to let some indie license it for a loss.

Yes, total PR disaster. I remember being unable to pick up a newspaper or turn on a news program without hearing the outrage. I'd be walking around on the street or in a store and everyone was complaining about how evil Zenimax was for their takedown notice.

The big issue is probably sandboxing, so you'd need a VM that was sandboxed like Java or JavaScript, and then to deal with the constant security headaches as people find vulnerabilities. If it were a trivial problem, someone would have done it by now, but please do go ahead. :)

No, the squabbling companies have agreed on JavaScript and the DOM as the runtime, and agreement on additional Web APIs for audio, game pads, fullscreen mode, orientation, etc. is likely soon. It works, it's spectacularly compatible in the latest browsers, and there's little need for anything more.

LLVM isn't a runtime; NaCl, or better PNaCl, talking to the Pepper API is a runtime, but it's an underspecified, Google-controlled approach that Apple, Mozilla, Microsoft and Opera will never adopt.

Won't client-side runtime "compilation" limit the complexity of the software that gets written, similarly to how GLSL run-time compiled shaders lost (on PC) versus offline-compiled HLSL? Not to mention the problems with keeping your source code to yourself.

These are two aspects where a bytecode (or native) runtime would certainly be better. I don't see the benefits of a source-level runtime... it's no easier to audit sources security-wise than to audit a VM, and it can break in so many more ways.

Emscripten is ahead-of-time compilation of massive, complex C code into JavaScript that truly runs anywhere. In most cases keeping your source code to yourself is unimportant; Facebook and Google ship MBs of JavaScript source to billions of browsers every minute. My impression is that Java has more security vulnerabilities than JavaScript, but then again it's able to do more.

Java in the browser did deliver benefits; yet Java in the browser is almost completely dead. Maybe a bytecode or native runtime would have delivered them too.

I don't feel any "astounding progress"; in my humble opinion the user experience of doing things in the browser hasn't changed much since the Java applet days: it's still a slow, unresponsive and unreliable interface, and it loses to native clients every time one exists. Perhaps some progress has been achieved over the years, but it's not obvious to me how JavaScript's source-levelness has had anything to do with that. If Microsoft weren't so Java-averse (and if the rest of the industry weren't so Microsoft-averse), things might have turned out differently.

Remember, machine language is just another language, except one tailored to processors. So I should have said "and/or" instead of just "or" in the parent post. For operating systems, the APIs can be thunked at a high level, so you could do something like translate DirectX calls on Windows 7 to an equivalent library on an ARM Linux machine. The program code is recompiled, but the libraries the program code calls are native to the destination device. DirectX is also just another "hardware feature."

(Clue allows C to be compiled, badly, into a variety of scripting languages including Lua, Javascript and Perl as well as Java. Some nutter even contributed a Common Lisp backend. It was an experiment to see whether exploiting certain vaguenesses in the ANSI C spec concerning pointer representation was useful. Unlike Emscripten, Clue doesn't have a big array of bytes representing the C memory; instead pointers are represented as object-offset tuples. It worked really well, but unfortunately nearly all existing code out there doesn't work right on a system where sizeof(int)==sizeof(double)==sizeof(char)==1 and sizeof(void*)==2. Plus, the compiler frontend I was using had a number of major issues. But it works well enough to run benchmarks.)

(And before you ask, yes, compiling C into Perl 5 is a total, utter, complete waste of time.)
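The pointer-representation difference described above might be sketched like this (a hypothetical illustration, not actual Clue or Emscripten output; all names here are made up):

```javascript
// Two ways to model C memory in JavaScript.

// Emscripten-style: one flat array of bytes; a pointer is an integer
// index into that array, so pointer arithmetic is byte-accurate.
const HEAP = new Uint8Array(16);
let flatPtr = 4;        // "char *p" is just the number 4
HEAP[flatPtr] = 42;     // *p = 42;
flatPtr += 1;           // p++ advances one byte

// Clue-style: a pointer is an (object, offset) tuple, so each C object
// lives in its own JavaScript array and element sizes need not be
// byte-accurate - which is why sizeof(int) can equal sizeof(char).
const buf = [0, 0, 0, 0];
let cluePtr = { obj: buf, off: 1 };              // "int *q" into buf
cluePtr.obj[cluePtr.off] = 42;                   // *q = 42;
cluePtr = { obj: cluePtr.obj, off: cluePtr.off + 1 }; // q++

console.log(HEAP[4], buf[1]); // both writes landed: 42 42
```

The tuple model keeps JavaScript engines happy (plain arrays, no manual byte layout) but breaks C code that assumes a flat address space, which matches the sizeof() caveat in the comment above.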