I asked a question yesterday, Should I Bother to Develop For JavaScript Disabled?. I think the consensus is: yes, I should develop for JavaScript disabled. Now I just want to understand why users disable JS. It seems many developers (I assume the people who answered the question are developers) disable JS. Why is that? Why do users disable JS: for security, for speed, or for something else?

I think when you say 'users', the consensus was from developer users, NOT Joe Bloggs users....
–
DarknightDec 14 '10 at 15:35

4

I think you're making assumptions based on anecdotal evidence. The fact is, 99.7% of users do not turn off JS. In fact, if they really had JS turned off, they wouldn't have answered the question here, because this site does not work without JS.
–
vartecMay 28 '11 at 18:16

@vartec, @kirk: I know a lot of people who do, or who at least partly do. A lot of security-conscious people will only allow JavaScript on sites they whitelist, for instance. And I know a lot who disable JS on their smartphone because it often isn't worth the battery it drains.
–
haylemJun 6 '11 at 16:44

I have JavaScript disabled in Chrome by default, for security reasons, though I enable it for websites that are worth it. I really dislike that so many websites don't work without JavaScript; more and more of them won't work at all.
–
Czarek TomczakMay 23 '13 at 19:02

8 Answers

One disables JavaScript in a browser environment because of the following considerations:

Speed & Bandwidth

Usability & Accessibility

Platform Support

Security

Speed & Bandwidth

A lot of applications use way too much JavaScript for their own good... Do you need parts of your interface to be refreshed by AJAX calls all the time? Maybe your interface feels great and fast over a broadband connection, but when you have to downgrade to a slower connection, a more streamlined interface is preferable. Switching off JavaScript is a good way of preventing dumbstruck web apps from refreshing the world every 15 seconds or so for no good reason. (Ever looked at the amount of data Facebook sends through? It's scary. It's not solely a JS-related issue, but JS is part of it.)

We also tend to off-load more and more of the processing to the client, and on minimalistic (or simply outdated) hardware that is painfully slow.
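To put rough numbers on the bandwidth point above: even a small payload adds up fast when it is re-fetched every few seconds. A back-of-the-envelope sketch (all figures here are illustrative assumptions, not measurements of any real site):

```javascript
// Back-of-the-envelope cost of a "refresh the world" poll loop.
// All numbers are assumptions for illustration, not measurements.
function dailyPollCost(bytesPerPoll, intervalSeconds) {
  const pollsPerDay = (24 * 60 * 60) / intervalSeconds;
  return pollsPerDay * bytesPerPoll; // total bytes transferred per day
}

// A widget that re-fetches ~20 KB every 15 seconds:
const perDay = dailyPollCost(20 * 1024, 15);
console.log(`${(perDay / (1024 * 1024)).toFixed(1)} MB/day`); // 112.5 MB/day
```

On a metered or low-bandwidth connection, that is a very real cost for a page on which "nothing visible is changing".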

Usability & Accessibility

Not all user interfaces should be expressed in a dynamic fashion; server-generated content might be perfectly acceptable in many cases. Plus, some people simply don't want this type of interface. You cannot please everybody, but sometimes you have the chance, and the duty, to satisfy all your users alike.

Finally, some users have disabilities, and thou shalt not ignore them, ever!!!

The worst-case scenarios here, in my opinion, are government websites that try to "modernize" their UIs to appear friendlier to the public but end up leaving behind a big chunk of their intended audience. Similarly, it's a pity when university students cannot access their course content: because they are blind and their screen reader doesn't support the site; or because the site is so heavy, and requires so many ad-hoc modern plug-ins, that it won't run on the refurbished laptop they bought on eBay two years ago; or again because they go home to another country for the spring break and the local bandwidth constraints cannot cope with the payload of the site.

Not everybody lives in a perfect world.
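To make the server-generated-content point concrete: the same data a script-heavy widget would fetch and build client-side can simply be rendered as static HTML on the server, so the page stays readable with JavaScript off and works fine with screen readers. A minimal sketch (the function name and data shape are invented for illustration; input escaping is omitted for brevity):

```javascript
// No-JS fallback sketch: render a list server-side as plain HTML,
// instead of having the client fetch the data and build it with script.
// (Escaping of the titles is omitted here for brevity.)
function renderCourseList(titles) {
  const items = titles.map(t => `  <li>${t}</li>`).join("\n");
  return `<ul>\n${items}\n</ul>`;
}

renderCourseList(["Algebra", "Optics"]);
// → "<ul>\n  <li>Algebra</li>\n  <li>Optics</li>\n</ul>"
```

The client that receives this needs nothing but an HTML renderer, which is exactly what the oldest browser, slowest laptop, and every screen reader can handle.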

Platform Support

This point relates to the two previous ones and tends to be less relevant nowadays, as browsers embed JavaScript engines that are an order of magnitude more efficient than they used to be, and this keeps getting better.

However, there's no guarantee that all your users have the privilege of using modern browsers (whether because of corporate constraints, which force us to support antediluvian browsers for no good reason, really, or for other reasons that may or may not be valid). As mentioned by "Matthieu M." in the comments, remember that a lot of people still use lower-quality hardware, and that not everybody uses the latest and coolest smartphone. As of today, a significant portion of people still use phones whose embedded browsers offer only limited JavaScript support.

But, as I mentioned, things do get better in this area. You still need to remember the previous points about bandwidth limitations, though, if you keep polling very regularly (or your users will enjoy a hefty phone bill).

It's all very inter-related.

Security

While you might think that nothing particularly dangerous can be done with JavaScript, considering it runs in a browser environment, that is totally untrue.

You do realize that when you visit P.SE and SO you are automatically logged in if you were logged in on any other network site, right? There's some JS in there. That bit is harmless in itself, but it relies on concepts that can be exploited by malevolent sites. It is entirely possible for a website to use JavaScript to gather information about things you do (or did) during your browsing session (or past sessions, if you don't clear your session data every time you exit your browser or use the now-common incognito/private browsing modes extensively) and then simply upload it to a server.

Plus, let's not forget that if your browser's security model is flawed, or the websites you visit don't protect themselves well enough against XSS attacks, then someone might use JavaScript to simply tap into your open sessions on remote websites.
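The standard first line of defense on the site's side is to escape untrusted input before writing it into a page, so injected markup renders as inert text instead of executing. A minimal sketch of the idea (a real templating library's context-aware escaping should be used in practice):

```javascript
// Escape the five HTML-significant characters so user-supplied text
// cannot break out of its context and run as script.
function escapeHTML(s) {
  const map = { "&": "&amp;", "<": "&lt;", ">": "&gt;", '"': "&quot;", "'": "&#39;" };
  return s.replace(/[&<>"']/g, c => map[c]);
}

escapeHTML('<script>send(document.cookie)</script>');
// → "&lt;script&gt;send(document.cookie)&lt;/script&gt;"
```

A user with JavaScript disabled is protected even when a site skips this step, which is exactly the trade-off NoScript-style whitelisting makes.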

JavaScript is mostly harmless... if you use it for trusted websites. Gmail. Facebook (maybe... and not even...). Google Reader. StackExchange.

All this being said, there might be perfectly good situations where you don't need to bother supporting clients without JavaScript. But if you offer a public-service website, do consider accepting both types of clients. Personally, I think a lot of modern web apps and websites would work just as well using the older server-generated content model, with no JavaScript at all on the client side, and would still be great while consuming considerably fewer resources.

Facebook, for example, is a terrific drain on your CPU. I've come across some sites that were so poorly coded (or seemed to be) that they would basically freeze my computer by fully loading the CPU (with a few other tabs open).
–
Mark CDec 14 '10 at 3:35

3

@Mark C: I consider sending back as much as 140 KB for clicking "send" when I type a comment a slightly exaggerated use of web forms, honestly. That might have applied to specific cases at the time, which have hopefully been fixed since then. I lived for a short while in a country with a, hmmm, restrictive stance on internet censorship and not-so-great connection quality, and that makes you appreciate good ol' text-based websites a lot more!
–
haylemDec 14 '10 at 4:08

7

+1 for mentioning accessibility. Half the damned web is completely unusable to me, and I'm neither a casual computer user nor do I need to rely on JAWS (yet).
–
Stan RogersDec 14 '10 at 5:09

2

@Stan Rogers: it matters a lot to me. I had a chance to work with a blind guy at university, who happened to be both student and teacher, and I was blown away by his abilities. And I find it rather sad that big companies and even educational institutions now come up with crappy artsy web-sites where these users are left out.
–
haylemDec 14 '10 at 14:00

2

+1 for accessibility. I work for a site that is very heavily related to healthcare. (Thankfully not the abysmal one in the news.) As much benefit as JS gives us, I'm greatly saddened by our priorities.
–
Katana314Dec 12 '13 at 15:18

+1 for the funny analogy. Though the fact that it's Turing-complete has *nothing* to do with the dangerousness of the execution.
–
haylemDec 14 '10 at 14:56

3

@haylem: Being Turing-complete means that it is impossible to mechanically prove secure in the general case. Heck, it's even impossible to prove basic things like that it doesn't run forever. For a more restrictive language, it would be possible for the client browser to prove that the script isn't doing something dangerous.
–
Jörg W MittagDec 14 '10 at 15:25

13

Turing-completeness is only about computability. It says nothing about whether the interpreted language is allowed to open files, make HTTP requests, etc. The only inherent danger in Turing-completeness is the possibility of an infinite loop.
–
dan04Feb 26 '11 at 7:01

@dan04 Or that it tries to emulate an x86 processor running Windows running a Desktop application which gets projected into your browser window - all in Javascript. Turing completeness is scary
–
sinni800Mar 11 '13 at 10:32

I am not a web developer, and I have only a moderate understanding of the way the internet works. So this is an answer from a user.

My experience leads me to believe many sites are simply poorly coded, whether out of laziness or ignorance: when I would view a basically static web page, such as a Facebook page, my CPU usage would increase by something like 15%, and drastically more with multiple tabs. Eventually it got to the point where I would have to wait for a response after clicking a button or link, and my CPU would overheat and lock up.

On many of these worst offenders (sites), nothing visible is changing and nothing interactive is happening. I could only suppose the site's code was constantly making excessive refreshes, polls, and endless loops.

This drove me to install NoScript to free my CPU usage and stop browsing from becoming a frustrating chore.

Facebook doesn't provide static pages: it's using a technique called long polling to check for new notifications, IM messages, and newsfeed items. All of those things require JavaScript and some amount of CPU power.
–
user8Dec 14 '10 at 15:38

1

@MarkTrapp Yes, that is why I said "basically static", although it is not strictly speaking a static page. HyperPhysics would be an example of a site with static pages. I realize there is probably a need to do that kind of thing, otherwise the boxes would never disappear and you would not see notifications until you refreshed the page, but it seems each site helps itself to more of your resources than it should, similar to the situation where a professor or teacher expects you to put their work first.
–
Mark CDec 15 '10 at 0:00

If you think that Facebook is a static page, then you shouldn't comment on this question.
–
DainiusJul 15 '13 at 8:01

For me it's all about security. I use NoScript to allow certain websites to run JavaScript while disallowing most.

In the end you really never know where the danger lies (nobel web site infected on techspot.com). Many zero-day (and other) exploits use JavaScript; closing this one avenue of attack feels like a step in the right direction.

You need brackets around something for that link to kick in. That reminds me: I learned only last winter that the Yahoo! Sports ads were infected with some kind of malware (or would infect you). The young man administering the home network where we were staying blacklisted numerous sites that had infectious ads.
–
Mark CDec 14 '10 at 5:11

My main reason is that it suppresses the most annoying ads. I'd rather not use AdBlock Plus, since that can affect revenue for the sites I visit (and I've used a site or two whose terms of service said I was not to disable ads). NoScript limits the potential obnoxiousness of ads, and I'm willing to live with the rest of them.

There's also the security consideration, and that's largely related to ads also, since any site that sells ads has to be considered potentially hostile.

Moreover, I don't necessarily know a site is dodgy before I visit it. Some people enjoy sending out links to sites, and aren't necessarily honest.

Because browsers used to have slow JavaScript implementations, and too many n00b web designers just used it for irrelevant things like button rollovers.

On a fast machine, with a modern browser, nobody in their right mind disables it all the time. Which is not to say there aren't plenty of very "security conscious" people and others without the funds, desire, or know-how to be running a modern browser on a fast computer... It was only recently that IE6 stopped being the most popular browser on the internet!

"and others without the funds, desire, or know-how to be running a modern browser on a fast computer". I can understand and agree with the "funds" part. I can understand with the "desire" part, though I'd think it usually would be more a matter of "need" as an imposed constraint that a refusal to have a decent computer. But I don't really get the "know-how" part. How can you be unskilled to the point of not buying a recent computer? Or if you do, of misusing it to the point of not installing the bundled browser and using an older one instead? :)
–
haylemDec 14 '10 at 4:20

With JavaScript activated, any website may execute code on my computer. I don't even know whether a particular website executes code, or what that code does. Even worse, someone else may insert code into a normally harmless website without my knowledge (XSS). Recently the well-known German computer magazine c't ran an article in which a 16-year-old tried the online banking sites of the most common banks in Germany. Many of them, including the biggest, were vulnerable to XSS. And you don't even notice that your online banking site is executing some JavaScript that changes, for example, the target and the amount of a transaction. With JavaScript disabled, an XSS attack in the context of a trusted site is useless: I don't execute the malicious code.