Posted by CmdrTaco on Monday September 28, 2009 @08:39AM from the hate-when-that-happens dept.

Nithendil writes "guyhersh from reddit.com describes the situation (warning: title NSFW): Based on what I've seen today, here's what went down. Reddit user Empirical wrote javascript code where, if you copied and pasted it into the address bar, you would instantly spam that comment as a reply to every comment on the page. Later xssfinder posted a proof of concept where, if you hovered over a link, it would automatically run Javascript. He then got the brilliant idea to combine the two scripts, tested it, and it spread from there."
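For the curious, here is a rough sketch of how those two pieces combine (all names and selectors are hypothetical; the actual worm code was specific to reddit's markup and is not reproduced here):

    // Piece 1 (hypothetical): fill in and submit every reply form on the page.
    function spamAll(payload) {
      var forms = document.querySelectorAll('form.reply'); // illustrative selector
      for (var i = 0; i < forms.length; i++) {
        forms[i].elements['text'].value = payload; // assumes a 'text' field
        forms[i].submit();
      }
    }

    // Piece 2: the payload itself carries an onmouseover handler, so merely
    // hovering over any posted copy re-runs piece 1. The real worm quoted its
    // own source into the payload so that each copy could spread again.
    var payload = '<a onmouseover="/* worm source goes here */">funny link</a>';
    spamAll(payload);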

Hi there - you must have just popped in from some alternate universe... did Michael Jackson die there too? Was he black?

In this universe, browsing with javascript is noticeably slower - in many cases it's so slow as to be unusable. I've tried it from both my home and work desktops (quad-core, 4 and 8GB of RAM respectively), and from my Netbook (EeePC 901). It's *always* slower with javascript enabled.

Yep. It's called Google Chrome -- or, more accurately, the Chromium nightly. Javascript executes quickly, and I don't have to wait for an entire separate page to load. Additionally, if I have to wait, the "submit" button has a countdown timer.

And regardless of speed, it is convenient to have that much more context on the page. For example, right now, I can see your post and mine, and I can expand the parents if I need to. If I was replying from the main discussion, I could scroll up to see the whole discussion. Yes, I know about tabs, but even switching with keyboard shortcuts isn't as nice as being able to actually see a few posts of context as I type.

In this universe, browsing with javascript is noticeably slower - in many cases it's so slow as to be unusable.

Just as exploits in the image processing components of web browsers will hopefully educate people to surf in Lynx? Or exploits in their HTML rendering will hopefully educate people to surf by piping wget through less?

This was not because of Javascript, nor is Javascript going away because of this.

Just as exploits in the image processing components of web browsers will hopefully educate people to surf in Lynx? Or exploits in their HTML rendering will hopefully educate people to surf by piping wget through less?

There's a huge difference in complexity between image/HTML renderer and Javascript. Image file formats and HTML pages are not Turing complete, while Javascript is. Consequently, the former are "safe" in that it's possible to prove that a particular implementation is free of exploits that would allow running arbitrary code, while Javascript by definition can never be; the whole point of Javascript is to allow arbitrary code execution, so the best you could ever prove is that the code never leaves the confines of the Web browser - but having a script post comments does not require that.

This was not because of Javascript, nor is Javascript going away because of this.

Yes, this was because of Javascript, but no, sadly it won't be going away.

No it's not. The Reddit hack was a Cross Site Scripting [wikipedia.org] attack made possible by bugs in their markdown implementation which let javascript through the parser. It was not a SQL injection attack, it did not attack the database directly, and no commands were run to directly put data into the database. It's an entirely different vector and an entirely different vulnerability; all the stored procedures, escaping of apostrophes and parametrised SQL in the world would not have stopped this.

Filtering user input properly would have stopped this though. It is not an attack which relies on a flaw specific to javascript - the flaw is a very general one - using untrusted user input without aggressive filtering.
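As a minimal sketch of the kind of filtering meant here (the function name is illustrative, not reddit's actual fix), the characters HTML treats as markup have to be neutralized before user text is ever written into a page:

    // Escape the five characters HTML treats as markup. The '&' replacement
    // must come first so already-escaped output isn't double-mangled.
    function escapeHtml(text) {
      return text
        .replace(/&/g, '&amp;')
        .replace(/</g, '&lt;')
        .replace(/>/g, '&gt;')
        .replace(/"/g, '&quot;')
        .replace(/'/g, '&#39;');
    }

    // escapeHtml('<a onmouseover="evil()">x</a>')
    // => '&lt;a onmouseover=&quot;evil()&quot;&gt;x&lt;/a&gt;'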

What exactly does being Turing complete have to do with it? If a scripting language weren't Turing complete, but had direct read/write access to your file system, would it be any safer than JS?

The problem with Reddit isn't JavaScript but rather their markdown implementation. And the security threat here isn't to the user whose system is running the JS, but instead to the Reddit site. If you're using an up-to-date & secure browser, there's typically minimal risk to enabling JavaScript. That JavaScript can sometimes be used to do mischievous things...

The problem with Reddit isn't JavaScript but rather their markdown implementation. And the security threat here isn't to the user whose system is running the JS, but instead to the Reddit site.

Yes that's what makes this case special. Most javascript security problems are externalities to the websites that over-use javascript - they don't normally suffer the consequences of enabling javascript in the browser - the users do. This time the website is paying the price for their poor decisions. Finally the gander is getting goosed.

If you're using an up-to-date & secure browser, there's typically minimal risk to enabling JavaScript. That JavaScript can sometimes be used to do mischievous things...

No. Javascript vulnerabilities come in two flavors - exploited bugs and deliberate abuses. All of the web-tracking systems enhance their tracking of people via javascript...

Then what makes the code for implementing a Turing-complete sandbox inherently less secure than the code for a less-than-Turing-complete sandbox?

And what does any of this have to do with the exploit TFA mentions? It wasn't about Javascript escaping the client-side sandbox, nor is there any particular reason for users to enable NoScript. It was entirely Reddit's fault.

Yes, this was because of Javascript, but no, sadly it won't be going away.

So, all bots that crawl forums to spam them are written in Javascript? Honestly, if Javascript could do this, I wonder what a more complex bot could have done. Are we all going to lament about the programming language that some forum bot was written in? C? Python?

"Yes, this was because of C, but no, sadly it won't be going away."

Can't see why people get such a hardon bashing Javascript. "Because it's not a real programming language!"? I guess it's the same mentality that leads people to bash PHP, Perl, Ruby, ASP, etc. etc.

I look at it this way. Javascript is a tool and bad programming is bad programming and sadly, bad programming won't be going away.

``There's a huge difference in complexity between image/HTML renderer and Javascript. Image file formats and HTML pages are not Turing complete, while Javascript is. Consequently, the former are "safe" in that it's possible to prove that a particular implementation is free of exploits that would allow running arbitrary code, while Javascript by definition can never be; the whole point of Javascript is to allow arbitrary code execution''

Err, no. There is a huge difference between being Turing complete and being unsafe.

Just as exploits in the image processing components of web browsers will hopefully educate people to surf in Lynx?

How many exploits in image processing components of web browsers have there been? I count 4 [about.com] for raster images. (Of course, that article is a few years old - have there been any recently?) If there were as many holes in JPG rendering libraries as there have been in javascript, then yes, disabling images would be an entirely reasonable solution.

If your security model is built on everyone else playing nice, you're fucked.

The problem here is in the browser allowing the hijack.

It's not the browser here that's assuming everyone else is playing nice. It's Reddit's site. How were you modded insightful? You're just wrong.

I agree with your sentiment (that you shouldn't assume everyone else is playing nice) but blaming Reddit's problems on browsers misbehaving is like blaming potholes on cars. Sure, nobody crashes if nobody's driving, but potholes are usually caused by ice breaking up the asphalt, not by drivers driving on roads.

Back in the old days, there was a mod button at the bottom of the screen. You had to mod all your comments in batch.

Lower-ranked comments were hidden on separate "Too many comments" pages, and if you clicked one of those links to read them, you would lose all your mod selections. When I got Firefox (Phoenix at the time), tabbed browsing made the process so much easier.

Years ago I actually proposed to the W3C and the mozilla bunch to add a tag to disable dynamic stuff like javascript.

Basically it would work something like this:

<shield lock="some_random_hard_to_guess_string_here" enabled="basic_html_only">The browser will only recognize basic HTML stuff here, it won't recognize javascript or any _future_ dynamic stuff that the W3C or browser people think of</shield unlock="some_random_hard_to_guess_string_here">

The some_random_hard_to_guess_string_here would be different for each page.

The idea is that while the website should still have filters, even if in the future the W3C or browser wiseguys create some newfangled way of inserting javascript or some other dynamic content that the filters do not protect against (since it's new and the filters have not been updated), the browser will just ignore the new stuff that some hacker inserts when it's between the tags.

To me the current state of things is a bit crazy - basically it's like having a car with 1000 gas pedals (tags) and to stop the car you have to make sure all 1000 pedals are not pressed (escaped or filtered). There is not a single brake pedal! And worse, the W3C or MS or Mozilla or whoever could introduce a new gas pedal, and you the website operator have to filter out the new gas pedal when it's introduced.

With something like this tag there is a brake pedal, so even if you don't manage to filter out all the 1000 gas pedals, the brake helps to keep stuff safe.

If they had implemented such a tag, the google and myspace worms would not have worked for so many browsers.

FWIW, these sort of worms are not new. I managed to find a hole in advogato some years ago (iframe worm) - and hence my suggestion to the W3C and Mozilla.

But it seems to me that NONE of them are really interested in improving security. They're all just interested in inventing new gas pedals for people (and hackers) to step on. They're not even interested in creating a single brake pedal. They just pay lip service to security.

See, the thing is, it's not too difficult to code a browser to go "OK, from now on there's no such thing as javascript till I see a valid unlock tag". So even if there is a browser parsing bug and a hacker manages to insert javascript via a stupid browser bug (which the website filters naturally do not and cannot cater for), it does NOT matter, since javascript will be disabled. Between those tags the browser will be respecting the flag that says "I do not know javascript, java and all that fancy stuff"; it does not even have to parse javascript, since for all intents and purposes, between those tags, the browser does not know there's such a thing as javascript (or activex or flash etc).

This is very useful for sites that have to include 3rd party content - sites like slashdot or webmail sites or even sites that serve up ads from 3rd parties.

No, the entire point is that any exploit would need to know the "unlock" code in order to operate, yet the unlock code is generated when the page is sent to the browser, long after the exploit has been submitted.
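A sketch of what generating that per-page lock string could look like server-side (hypothetical code assuming a Node.js-style runtime; the <shield> tag itself is the proposal above, not real HTML):

    var crypto = require('crypto');

    // Wrap untrusted content in the proposed shield tags, with a fresh
    // unguessable token per response. Content submitted earlier can never
    // contain a matching unlock tag, so injected script stays inert.
    function wrapUntrusted(untrustedHtml) {
      var lock = crypto.randomBytes(16).toString('hex');
      return '<shield lock="' + lock + '" enabled="basic_html_only">' +
             untrustedHtml +
             '</shield unlock="' + lock + '">';
    }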

You can't assign attributes to end tags. XML/HTML won't let you do that, and extending it to be able to would be a bit of a revolution; too many existing parsers rely on the current behaviour. But maybe you could do something along the lines of '''<startshield key="lalala"/> stuff <endshield key="lalala"/>''', although I believe that'd also be a bit of a hack.

What we actually really need, and what is the real solution, is just a little more careful programming on the server side. Write a function that properly filters untrusted input, and use it everywhere user content is output.

Tools like that aren't foolproof, especially since browsers go out of their way to attempt to parse malformed input (unless you're serving content as application/xml, in which case the browser will just show an ugly parse error). I can't speak to that tool, not having used it, but all it takes is one hacker finding yet another way to create a broken script tag that a browser will still run - one the tool's authors don't yet know about - and all your efforts are for nothing.

There are many situations other than forum posting where it is desirable to include third-party content in your site. Advertisements are the first thing that jump to mind, but web widgets are also becoming popular. Having some browser markup that will limit what the third-party code can do would enable this to be done safely, without having to trust the third party or load and filter third-party content server-side.

Reddit does escape all of those symbols, and they use Markdown [wikipedia.org] for adding links. Still, they managed to get owned by an obscure vulnerability that was discovered only because their code is open source.

And that's the point TheLink was trying to make. It would be far simpler to tell the browser not to accept javascript in a certain block of code than it is to explore all the possible exploits that could be leveraged against your alternative markup language. There are hundreds if not thousands of places you can make mistakes, and it could be remedied by a single mechanism that prevented javascript from existing in certain blocks of code.

There is not a single brake pedal! And worse, the W3C or MS or Mozilla or whoever could introduce a new gas pedal, and you the website operator have to filter out the new gas pedal when it's introduced.

Undid my mods, but I had to post this.

There used to be a brake pedal. I think it was Firefox 1.5 where this code didn't evaluate any tags:

element.appendChild(document.createTextNode(sText));

The solution, therefore, was to manually parse italic/bold/a tags, to append those elements - and then create a text node inside. A perfect working DHTML/DOM solution, properly sanitized!

However, with Firefox 3, text nodes now evaluate HTML tags. This handy function went out with eval usage for local callbacks. :/ Opera and Chrome also evaluate some (all?) tags for appended text nodes.
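For reference, a sketch of the whitelist-and-text-node approach described above (illustrative code, not the poster's original function; <a> is omitted for brevity since its href would need separate filtering):

    // Only <i> and <b> survive as elements; everything else is inserted
    // via createTextNode, so it is never parsed as HTML.
    function appendSanitized(parent, sText) {
      var parts = sText.split(/(<\/?[ib]>)/i); // trivial tokenizer for demo
      var current = parent;
      for (var i = 0; i < parts.length; i++) {
        var m = parts[i].match(/^<(\/?)([ib])>$/i);
        if (m) {
          if (m[1]) { // closing tag: step back out one level
            if (current !== parent) current = current.parentNode;
          } else {    // opening tag: create a real element and descend
            var el = document.createElement(m[2]);
            current.appendChild(el);
            current = el;
          }
        } else if (parts[i]) {
          current.appendChild(document.createTextNode(parts[i]));
        }
      }
    }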

NoScript comes from a broken way of thinking: "you can identify attacking sites and trusted sites". The attack code for this was coming from reddit.com (a site you have to allow in order to use reddit). The only way this sort of bug can be protected against is by use of javascript filtering tools such as controldescripts [mozdev.org] that filter javascript requests by type and domain; with such a tool it would be possible to protect yourself much more effectively.

* mouse click is submitting info -> allow
* mouseover is requesting data -> allow
* mouseover is submitting data -> request user confirmation
* javascript function is doing something weird -> request user confirmation
* javascript is trying to use a known exploit* -> deny and notify user (as a workaround for 0-days, simply blocking the bad JS calls will protect users much faster than browsers usually get patched)
* ...etc
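In code, such a behavioural filter might boil down to a rule table like this (a hypothetical format, not controldescripts' actual configuration):

    // Map (trigger, action) pairs to verdicts; first matching rule wins.
    var rules = [
      { trigger: 'click',     action: 'submit',  verdict: 'allow'   },
      { trigger: 'mouseover', action: 'request', verdict: 'allow'   },
      { trigger: 'mouseover', action: 'submit',  verdict: 'confirm' },
      { trigger: '*',         action: 'exploit', verdict: 'deny'    }
    ];

    function verdictFor(trigger, action) {
      for (var i = 0; i < rules.length; i++) {
        var r = rules[i];
        if ((r.trigger === trigger || r.trigger === '*') && r.action === action)
          return r.verdict;
      }
      return 'confirm'; // unknown combination: ask the user
    }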

You could also combine this with domain checking to have lists of pages where you allow:

* no-js (untrusted)
* simple-JS (google, youtube, etc) [it might allow functionality but could prevent tracking]
* complex-js (facebook, etc) [all the ajax stuff means simple-JS wouldn't work]
* all-JS (fancynewsite.com) [even the complex list of functions you allow just isn't enough]

Such tools could also help the paranoid among us use websites that require JS, by disabling mousetracking and the sending of data on non-click actions.

As long as people stick to the broken thinking of trusted/untrusted domains, there is little chance of this actually happening. The worst thing about noscript is that for an unknown site you often have to allow JS on it to see what it looks like, so unless you plan on only browsing sites you've already been to and those that don't use javascript, it is completely useless - yet its users claim, nay genuinely think, they are more secure!

The worst thing about noscript is that for an unknown site you often have to allow JS on it to see what it looks like, so unless you plan on only browsing sites you've already been to and those that don't use javascript, it is completely useless - yet its users claim, nay genuinely think, they are more secure!

If I go to an unknown site and it doesn't display anything useful without JS then I generally go somewhere else; if the developers are so inept that they can't make their site do something useful without it then the site is probably a heap of steaming monkey poo or a malware distributor.

Back in the real world, it's hard to see how allowing arbitrary JS to run on your system can be considered 'more secure' than only running it from sites you trust. This 'exploit' is nothing to do with browser insecurity; it's to do with Reddit failing to filter what users post.

Neither aedes nor I work for reddit. We were simply reporting what was known at the time to prevent further spreading and panicking users (people were thinking they were going to get banned for spamming, worrying about loss of karma, et cetera).

The admins acted within an hour. KeyserSosa is an admin; his username is highlighted in red and has an [A] next to it.

That's the user opting to execute extra javascript on your page; if this breaks your web site/application for more than that user, then you are not doing it right. The posted hack is something much more fun.

Indeed, it will educate people to surf with javascript turned off, and it will hopefully educate webmasters to stop programming their sites in a way that requires javascript even for basic functionality.

Anyone who believes this has simply never written a web application. Javascript and cookies are absolutely essential to any web programmer who wishes to have any type of dynamic content on a page. It annoys me to no end when someone says the solution to security holes is to turn these features off. The solution is for programmers to stop being idiots and write secure code, both in web applications and in the browsers themselves.

As a web developer, I beg to differ. There is absolutely no excuse for writing a page that doesn't 'fail gracefully' when javascript isn't present. Let's face it, for every reputable page out there (att.net, youtube.com, etc.) there are a hundred others designed by average joe-schmo web programmers. And lord only knows if they designed their pages securely, and lord only knows if someone has hacked them and injected malicious scripts. I seem to recall hearing a few weeks ago that the majority of malicious scripts were being put into hollywood celebrity gossip sites that people were hitting off their google searches.

For me, the solution is to just whitelist the sites I visit frequently, only allowing scripts/cookies when I know they can be trusted. I'm not saying that you shouldn't use javascript, but I am saying that you shouldn't assume that everyone visiting your page is going to have it. Besides, how hard is it to write a page that vomits up its contents in a readable form when the javascript doesn't run to position all the css objects? It doesn't have to look pretty, but it should be usable.

There is absolutely no excuse for writing a page that doesn't 'fail gracefully' when javascript isn't present.

Yes there is. Making your page fail gracefully takes extra time and resources, which could be put to better use than supporting the 1% of users who choose to handicap their browsers by turning off javascript.

Failing gracefully is an important concern, but it's not the only concern, and should be balanced against other priorities.

The idea is to build the page in fail-state first, and then use JavaScript to enhance it. Or in other words, build your DOM and then restyle, add event listeners, etc.

It doesn't take extra time, and it's a great technique for future-proofing your pages. It also makes them accessible to people who, for whatever reason, can't take advantage of the javascript. If your website is in the US, and is big enough for anyone to care, ADA compliance pretty much requires it.
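A minimal sketch of that enhance-after-load pattern (the id and field name here are illustrative):

    // The form works as a plain HTML POST with javascript disabled:
    //   <form id="reply" action="/comment" method="post">
    //     <textarea name="text"></textarea> <input type="submit">
    //   </form>
    // This script only adds behaviour on top of the already-working page.
    window.addEventListener('load', function () {
      var form = document.getElementById('reply');
      if (!form) return; // page still works without this script
      form.addEventListener('submit', function (e) {
        e.preventDefault(); // enhanced path: post in the background instead
        var xhr = new XMLHttpRequest();
        xhr.open('POST', form.action);
        xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        xhr.send('text=' + encodeURIComponent(form.elements['text'].value));
      });
    });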

Absolutely right for your personal homepage. A professional web designer would not be able to get away with this. This kind of laziness translates directly into additional support costs for the client. And each time Microsoft recommends turning off Javascript due to a 0-day exploit, you are cutting off more than 1%.

I can't think of any cases where it is OK not to fail gracefully. I hope you are not talking about just using client-side validation, one of the most common uses for Javascript, but one where you must still validate on the server anyway.

I think you're talking about Section 508 of the Rehabilitation Act. And yes, it can apply to more than US Government web sites. Target found that out the hard way after refusing to provide alt tags and other accessibility changes to their web site. After getting slammed with a $6 million judgement, no one else is bothering to contest what has become established case law.

I might also add that Section 508 covers much more than screen readers and javascript.

The solution is for programmers to stop being idiots and write secure code

Yeah because that mantra has really caught on, especially with Microsoft employees.

Face it, programs are written by people, people are made to f*** up on an epic scale [wikipedia.org], and therefore you need to be ready to handle epic f*** ups or just not play ball. Granted, you don't get the same dynamic experience, but that's the trade-off. I'm sure the guy you're quoting understands that.

Anyone who believes this has simply never written a web application. Javascript and cookies are absolutely essential to any web programmer who wishes to have any type of dynamic content on a page.

So by advising people to disable Javascript, I'm doing my part for killing off "Web Applications" and getting us back to good old Web Pages. Excellent.

Seriously, why would I want "dynamic content" when all that really means is a thousand pauses as more data is fetched? Give me static pages whenever possible. Better yet, give me a single large static page rather than a dozen small pages, so I don't have to wait while the next page is being loaded and rendered.

The solution is for programmers to stop being idiots and write secure code, both in web applications and in the browsers themselves.

The solution is to understand that most web sites are not applications, from the user's point of view, and stop stuffing them full of scripts that do nothing but slow things down.

Well, neither cookies nor JavaScript are strictly necessary. REST demonstrates that URLs suffice. JavaScript certainly makes it more pleasant, and cookies can be used to address some usability problems (though they are currently abused).

KeyserSosa
Thanks for this (and thanks, aedes). I'm going to steal his idea and post here as well.
We've fixed a couple of underlying bugs in markdown.py, and will write a blog post for those interested once the dust settles. We've also gone through and deleted the offending comments.
This exploit was a good old-fashioned worm, and its only purpose seems to have been to spread (and spread it did). The effect was limited to the site, and no user information was compromised.

I'm a long-time slashdotter and now spend equal time on reddit. What draws me to reddit is the spartan interface.
Of course, the content on reddit is halfway between slashdot's and digg's, so I (unfortunately) have to keep coming back.

Over the years I've also spent quite a bit of time on social sites like Slashdot, Fark, Metafilter, Digg, etc., but now spend the majority of my time on Reddit. I actually like the design (it's simple, efficient and useful). But the beauty of Reddit is the organized structure of the sub-reddits. If I'm short for time, I can just quickly browse the frontpage. If I have more time, I can browse my favorite sub-reddits where people know me. The commenting system is easy on the eyes and easy to follow.