Dylan Schiemann of Dojo and SitePen has weighed in, asking: what is bloat?

We hear a lot of people talking about the X kb of JavaScript they are willing to put in a page or app, yet bloat is about much more than that (as Dylan attests).

At the core of the article we hear this loud and clear:

– We need better tools to minimize bloat in its various flavours

– We need better tooling to help us stay mindful of bloat

– When users have six tabs open, each with a rich Ajax app inside, it isn’t fun right now

Where does Dojo fit into this?

Dojo will be working over the coming months to improve performance, which is something we have been doing with each release of Dojo. But improving toolkit performance is not enough… we also need to share best practices on how to get the most out of our browsers, and to ask for more from browser vendors (they’re already listening). We will be writing a series of posts on performance tuning and testing, and we look forward to reading your comments and trackbacks, and learning from your performance tips. We as a community need to fix this issue before Ajax suffers the same fate as DHTML. After all, no one likes to be bloated.

I believe this is an extremely important point. As we move towards more sophisticated RIAs, JS library size is becoming very important, and speed is one of the single biggest factors in making an enjoyable user experience. This is essentially why Ajax became popular in the first place: we want to be able to communicate with the server without the lag of a full page refresh.
Dojo does have some sophisticated tools for creating JS packages, but I believe most JS libraries are limited by the static nature of the general approach to loading them. Generally, it is assumed that all JS code that could ever possibly be called must be loaded when the page loads. However, as applications become more sophisticated, more and more features become reachable from a single page, because much more can be done with Ajax without needing a page refresh. This means more and more JS must be loaded up front. What we need is to start moving to dynamic loading of JS: if an entire package in your library is only needed when a user clicks a particular button, doesn’t it make more sense to load it at that point? Dojo has done an admirable job of dealing with this through its dojo.require/package system, but it relies on synchronous (SJAX) calls to do so, which is very unpleasant for end users (their browser locks up). Solving this problem is not trivial, and I believe continuation systems like Narrative JS are an important tool for it. I wrote an article that discusses this issue further with regard to my own technology; if you would like to read it, it is at http://www.authenteo.com/#Dynamic%20JavaScript%20Loading (under dev notes/dynamic javascript loading)
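One way to avoid the synchronous-call lock-up is to inject a script tag and fire a callback once the file has loaded. A minimal sketch follows; the function name, URL, and `openEditor` are all hypothetical, and the two load events reflect the browser split of the era (onload for Firefox/Opera/Safari, onreadystatechange for IE):

```javascript
// Sketch of asynchronous package loading via a <script> tag instead of
// a synchronous (SJAX) call, so the browser never locks up while the
// file downloads. All names here are invented for illustration.
function requireAsync(url, callback) {
  var done = false;
  function fire() {            // guard: some browsers fire both events
    if (!done) { done = true; callback(); }
  }
  var script = document.createElement("script");
  script.type = "text/javascript";
  script.src = url;
  script.onload = fire;                        // Firefox, Opera, Safari
  script.onreadystatechange = function () {    // IE 6/7
    if (this.readyState === "loaded" || this.readyState === "complete") {
      fire();
    }
  };
  document.getElementsByTagName("head")[0].appendChild(script);
}

// Hypothetical usage: pull in an editor package only when first needed.
// button.onclick = function () {
//   requireAsync("/js/editor-package.js", function () { openEditor(); });
// };
```

Because the download is asynchronous, the page stays responsive; the price is that everything depending on the package has to move into the callback.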

The downloading and caching of JavaScript need to be addressed by browsers providing a new technical structure for the task, something faster and more reliable than what browsers currently do.

Users of my site frequently receive corrupted JavaScript files (in both IE and Firefox) after having used the site for weeks or months. The current mechanisms for downloading and caching JavaScript are simply unreliable and arcane.

For example, someone could offer all of the Prototype/Dojo/Moo/etc. versions on one server, with a different URL for each version, and if people can use them unmodified, then various sites could just load their JS from that one host.

I’m not sure if there’s a business model there, and it’s true you’re adding an additional potential point of failure to your application, but you’re also gaining:
– Speed, because the download won’t count against the browser’s “two connections per domain” limit
– Speed, because if other sites are using the same link, the odds of that JS file already being in the local browser cache increase

The host might also offer compressed/uncompressed versions of the libraries. Is anyone doing this?

This is really about addressing the load-time or wire-time “bloat”, and doesn’t speak to the “bloat” of what the libraries themselves are actually doing once loaded (CPU, memory, ugly syntax, etc.).

That’s the idea behind nomadic functions (e.g. nofunc.com): providing the same experience with less code.

Another idea that would be really cool is a JavaScript parser that removes all unneeded functions from Prototype or MooTools (or whatever other library you might be using), based on the code you are writing. It would need to descend through your JavaScript and cross-reference it with the library. I’ve never seen anything like this online; has anyone else?

Very often in an Ajax application, the entire JavaScript codebase is not used every time, or at least not all at once: there may be some code specific to submitting comments, which is only needed once the user has read the article and wants to respond… That code is, therefore, useless at the start of the application. Would it not make sense to load that particular set of functions later, to improve the initial page-load time?

So, how can this be done? One would obviously need a clever server-side script to deal with everything, but the real key is this: use the XMLHttpRequest (i.e. Ajax) object to download the JavaScript code from the server, then store the result in a variable (or even in the DOM if you like). Then, when a function is needed, all you have to do is call eval(…) on it and run it:

There are, of course, a number of issues with this. Firstly, security: you may argue that arbitrarily executing code downloaded from the server is bad practice (much as it proved to be with the server-side SQL equivalent, injection). However, it may be argued that
a) the initial JavaScript is also downloaded from the server, so downloading a snippet later can’t do much more harm…
b) JS is the user interface only; the entire operation is carried out and authorized on the server side, so if the UI messes up, F5 should help.
c) …

Secondly, there are unfortunately performance implications to using eval, but how do they compare with the gain in initial page-load speed? Also, once the functions are no longer used, they can be freed (fun = null) so that the browser uses less memory overall…

This is a hypothetical suggestion only. It works (I tried it), but I cannot judge the performance side of the equation. Please comment and give your opinion on whether this would be a good or bad idea in the setting of a large application…

Comment by Franchie — December 30, 2006

None of my javascript components use libraries for the simple fact that I am theoretically a lousy programmer and apply premature optimization.

For a scripting language like JavaScript, often every CPU cycle counts. For most of my own work, I write routines tailored specifically to the task and nothing more, eliminating as much code around them as possible.

The results are, imho, sometimes amazing. I can deliver well-architected pieces of code with only the parts that are really needed, and as such deliver more performance, or at least the expected performance.

Comment by M. Schopman — December 30, 2006

Yeah, my JavaScript stuff is all out of whack. I went from nice, neat, flat PHP frameworks to Ajax because Ajax offered a better application interface. The organization of my code since the switch is sloppy at best and doesn’t lend itself to anything reusable. I tend to follow the path of just getting the site working, and I had not considered that I would have to go back and change things in a million places to accommodate my clients’ requests for changes. But the libraries available to us are fairly good, like Dojo, and Scriptaculous with Prototype.

I love the Yahoo! UI Library. It feels much more efficient and also has -min versions of everything, making the JS size a lot smaller…

Comment by Satish — December 31, 2006

I recently encountered a problem that I never hear mentioned… I was using my laptop on battery power, and my battery died about 50% sooner than expected. It turned out that a website I had open was using enough CPU (even though it was in a background tab) to prevent my laptop (a Core Duo) from entering its optimal power-saving mode. There was obviously a lot of bloat there, and the effect was, imho, very serious.