Andreas Ritter has managed to encode JPEGs in JavaScript. This blog post explains how he did it, shows some benchmarks, and provides a demo and a downloadable library so you can play along at home.

It was surprising how easy it was to get the first JS-encoded JPEG displayed in the browser. Of course I didn’t want to stop there: I wanted to optimize things as much as I could to make the encoder fast, which took me several days. I found optimized encoder versions for Flash and Haxe floating around the net (Faster JPEG Encoding with Flash Player 10) and tried those optimizations in my JavaScript version. As you can see in the benchmarks below, I was quite successful.

Another idea was to use the new Web Workers to do the heavy lifting in a separate thread, without blocking the GUI. This is something Flash can’t do. So I created a version that uses a Web Worker for the encoding.

The API gives you a JPEGEncoder, or alternatively a JPEGEncoderThreaded. Usage is straightforward:
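The sketch below shows how such an API might be called. The two class names come from the post, but the constructor arguments, method names, and callback shape are my assumptions, not the library’s documented signatures — check the download for the real API:

```javascript
// Hypothetical usage sketch -- signatures are assumptions, not the real API.
// The pixel data comes from a canvas 2D context.
var canvas = document.getElementById('source');
var ctx = canvas.getContext('2d');
var imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);

// Synchronous version: encodes on the main thread and returns the result.
var encoder = new JPEGEncoder(75);            // quality 0-100 (assumed)
var jpegDataURI = encoder.encode(imageData);  // "data:image/jpeg;base64,..."

// Threaded version: encoding runs in a Web Worker, so the GUI stays
// responsive and the result arrives via a callback instead of a return value.
var threaded = new JPEGEncoderThreaded(75, function (dataURI) {
  document.getElementById('result').src = dataURI;
});
threaded.encode(imageData);
```

The callback-based shape of the threaded version follows naturally from the Web Worker model, where results come back asynchronously via `postMessage`.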

I think the results show that JavaScript is quite fast (at least in Safari and Chrome). A little over 4 seconds for the non-threaded version is a very good result compared to the 3.3 seconds the optimized Flash JPEG encoder takes. Please note that JavaScript has no static types, no byte array, no Vector class, and is not pre-compiled. Taking these facts into account, Nitro and V8 are faster than the ActionScript 3 VM.

Comparing the different browsers, Nitro and V8 are an order of magnitude faster than TraceMonkey. Firefox 3.6b2 shows some improvements, but it still has a long way to go. Perhaps the Mozilla folks should consider adopting Nitro or V8?

Errr… why would anyone do that? Are there any use cases where this would be practical?
I’m beginning to get bored with “this JS engine is so fast that it could _____”… You can encode an image server-side in a dozen milliseconds and then send a much smaller picture to the browser. That way you save time both processing the picture and sending it.
I don’t remember ever having problems with the speed of JavaScript in any application.

@Schill: This is good, because the JS-based encoder does not currently work in Opera: the window.btoa() method is not available there.
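For what it’s worth, the missing piece is small: btoa() is just base64 encoding, which can be written in plain JavaScript as a fallback for browsers like Opera. A minimal sketch over a byte array (the function name is mine, not part of the library):

```javascript
// Minimal base64 encoder as a window.btoa fallback -- the alphabet and
// "=" padding follow the standard base64 scheme (RFC 4648).
var B64 = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

function base64Encode(bytes) {
  var out = "";
  for (var i = 0; i < bytes.length; i += 3) {
    var b0 = bytes[i], b1 = bytes[i + 1], b2 = bytes[i + 2];
    // Pack 3 bytes into 4 six-bit groups; pad with "=" at the tail.
    out += B64[b0 >> 2];
    out += B64[((b0 & 3) << 4) | (b1 === undefined ? 0 : b1 >> 4)];
    out += b1 === undefined ? "=" : B64[((b1 & 15) << 2) | (b2 === undefined ? 0 : b2 >> 6)];
    out += b2 === undefined ? "=" : B64[b2 & 63];
  }
  return out;
}
```

Feeding the encoder’s JPEG bytes through this instead of window.btoa() would make the data URI construction work in Opera too.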

@YangombiUmpakati: The native encoder should be much faster than the JS version, but I haven’t tried it. Please note that Firefox and WebKit also support toDataURL() and can create JPEGs; the only problem is that the quality parameter doesn’t work there yet. I think it should be a matter of days until that changes ;-)
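For reference, this is what asking the browser’s native encoder looks like; whether the quality argument is honored depended on the engine at the time, which is the limitation the reply describes:

```javascript
// Ask the browser's native encoder for a JPEG. The second argument is a
// quality hint between 0.0 and 1.0, which some engines ignored at the time.
var dataURI = canvas.toDataURL('image/jpeg', 0.75);
```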

@Breton: A PNG encoder wouldn’t make much sense, and porting the AS3 version would not be that simple, because most of the encoding is done in the native zip compression method of the AS3 ByteArray.

@mare: Originally I just wanted a JPEG encoder with variable quality for an Appcelerator Titanium application. Encoding unchanged images on the client side indeed doesn’t make sense, but a canvas-created image is something different. Mostly, creating the encoder was a) pushing the frontier and b) practice in optimizing JS code and using Web Workers.

@Breton
JavaScript is being used for more than just websites these days. It’s becoming a language for desktop/mobile apps (together with HTML/CSS) as well.

But for web apps, I can imagine an online service that lets you edit your photos and save them to your desktop using client-side processing only, avoiding HTTP requests as much as possible. Server traffic for a popular photo-editing service can be veeeery expensive if the actual computations had to be done server-side; use the client-side processor, I say ^^

@mare: This is an experiment to see what’s possible. Very often advances like this are created without any end product in mind, only to be used later in ways the original creator never intended.

Penicillin was a neat lab trick for a decade before scientists began realizing its true potential. Rubber was a useless natural substance until Charles Goodyear found a way to make it more durable. While this doesn’t rank up there with those advances, step back and take a breath before tearing down someone’s work.

Really good work!
It is sad that it doesn’t work in IE (well, IE doesn’t support canvas, so it would be hard to get a CanvasPixelArray anyway).
This could be the start of a multiple-format image encoder: why not GIF next? ;) (not sure that would be possible because of the license), or even better, APNG (and/or MNG). Even though BMP is a really bad format, an encoder for it already exists: http://www.nihilogic.dk/labs/canvas2image/

@ajaxianreader123: While I see your point, I think you’re presenting a weak analogy. Encoding images is a solved problem; penicillin solved an unsolved one. This solution does not improve on the current encoding solutions, whereas Goodyear improved tire rubber. My point is simply that the audience of Ajaxian knows there are inquisitive programmers who like to experiment, and I would agree that, in general, experiments (like Andreas’s!!) can lead to new, useful ideas. However, since the audience has several of these smart programmers in its ranks, maybe Ajaxian could focus mostly on useful things, not ideas that have no practical application now and probably never will.

How is this not useful? It’s a shame the fundamentals of JS aren’t great with images; I’d love to see quality image-resizing ability. Even today, with the latest Firefox and its ‘quality image rendering’, the results are poor; it’s still designed for rendering performance instead of being powerful.

If a user comes to your site with a 30MB 1080p bitmap, would you rather tell them to get lost and make the image smaller (potentially requiring them to research exactly how to rescale it and save it in a more compressed format), or would you rather your client-side code spotted the problem, downsampled the image, re-saved it as a JPEG, and ‘just worked’?
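That “just worked” path is already sketchable with the canvas API. A minimal version, assuming an already-loaded image element; the function name, the maxWidth limit, and the quality value are arbitrary choices for illustration, and browsers that ignored the toDataURL quality hint could feed getImageData() into the JS encoder from this post instead:

```javascript
// Sketch: downsample a large image client-side and re-encode it as a JPEG.
function downscaleToJpeg(img, maxWidth, quality) {
  // Scale down only; never upscale a smaller image.
  var scale = Math.min(1, maxWidth / img.width);
  var canvas = document.createElement('canvas');
  canvas.width = Math.round(img.width * scale);
  canvas.height = Math.round(img.height * scale);
  canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
  // Native encoder path; a JS encoder could consume getImageData() instead.
  return canvas.toDataURL('image/jpeg', quality);
}
```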

I know which I would prefer. Just because we understand the limitations that exist today doesn’t mean we should accept them; the fact that users have had to learn such technical details in order to use websites isn’t a good thing.

@meandmycode
Your example is ridiculous. I am pretty sure a JavaScript engine would die/choke processing 30MB of data (browsers chug badly enough with more than a few MB of HTML). In this case you could let them upload it and resize it on the server.

row1… clearly you missed the point where I said “JavaScript should be able to”. You cannot even resize realistically today, so how you decided it would probably die/choke on 30MB of data, I have no idea…

JS needs to expose more control over image manipulation, and then yes, processing 30MB of data would be faster than uploading 30MB of data. Sure, the server could resize 30MB of data a ton faster, but uploading 30MB takes a LONG time; most internet connections today still have terrible upload capacity.

Not to mention the sheer reality here: regardless of today’s JS abilities, do you think it makes sense that a client program needs to upload data to another machine in order to manipulate it? That’s ludicrous. Apparently you missed the whole point where I said ‘just because that is today’s reality doesn’t mean we should accept it’…