Google CDN Naming Conventions (and You)

This is a way you can load a JavaScript library like jQuery directly from Google's CDN (Content Delivery Network). You can get quick copy/paste access to these from ScriptSrc.net.
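Presumably the snippet being referenced looked something like this (the standard Google CDN path for jQuery 1.4.4; the protocol is an assumption):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
```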

See in that above URL how it is pointing to 1.4.4 specifically? That little part of the URL can be tweaked. Perhaps you've seen it used that way before.

/1.4.4/

Loads this very specific version of jQuery which will never change.

/1.4/

Today, this will load 1.4.4. If and when jQuery goes 1.4.5, this URL will now point to that. If and when jQuery goes to 1.5, it will link to the last release of 1.4.

/1/

Today, this will load 1.4.4. If and when jQuery goes 1.5, this URL will now point to that. If and when jQuery goes 2.0, it will link to the last release of jQuery 1.x.
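Spelled out, the three flavors look like this (hostname and full paths follow Google's CDN scheme and are assumed here):

```
https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js   → always exactly 1.4.4
https://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js     → latest 1.4.x release
https://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js       → latest 1.x release
```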

As far as I know there is no super reliable way to link to the "absolute latest" build of jQuery (i.e. one that would still resolve after a 2.0 release, or that would include nightly builds). Let's figure out which one we should use.

A Little Reminder Why We Do This At All

Decreased Latency - the file is served from a geographically closer server.

Increased Parallelism - browsers limit how many resources can be downloaded simultaneously from a single domain, some as low as two. Since this is google.com not yoursite.com, it doesn't count toward that limit.

Better Caching - There is a pretty decent chance that your visitors already have a copy of this file in their browser's cache, which is the fastest possible way to load it.

To which I would add:

Saves Bandwidth - The minified version of jQuery 1.4.4 is 82 KB. So if you had 1,000,000 pageviews with no local caching, that's about 78 GB of data transfer, which is not insignificant.
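To double-check that arithmetic (treating 82k as 82 × 1024 bytes and dividing down to GiB), a quick sketch:

```javascript
// Rough bandwidth math for serving jQuery 1.4.4 yourself with no caching
const bytesPerCopy = 82 * 1024;                       // ~82 KB minified
const pageviews = 1000000;
const totalGiB = (bytesPerCopy * pageviews) / (1024 ** 3);
console.log(totalGiB.toFixed(1) + " GiB");            // ~78.2 GiB
```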

Caching Headers

#1 and #2 above are going to help no matter what, but caching needs a little more discussion. It turns out the naming convention you use matters a lot for caching. Paul Irish has some basic research on this. I re-did that research and the results are below. Turns out, only linking to a direct exact version gets proper far-future caching headers; the partial-version URLs get just a one-hour expires.
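The results table itself didn't survive in this copy; reconstructed roughly from the one-hour / one-year figures discussed here (exact header values are assumptions):

```
Exact version:
  /ajax/libs/jquery/1.4.4/jquery.min.js
  Expires: ~1 year out   (e.g. Cache-Control: max-age=31536000)

Partial versions:
  /ajax/libs/jquery/1.4/jquery.min.js
  /ajax/libs/jquery/1/jquery.min.js
  Expires: ~1 hour out   (e.g. Cache-Control: max-age=3600)
```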

One hour in this context is kinda useless. It does kind of make sense though: if 1.4.5 came out, anyone using a /1.4/ link with a one-year expires header would still be getting 1.4.4, and that's no good.

Latency, Parallelism, and Bandwidth are still significant things, but pale in comparison to caching. So if caching is super important to you, linking to a direct version is your only option.

Which to Choose

/1.4.4/

Will never change, so will never break code. Best caching. Clearest to understand.

/1.4/

Possible but unlikely to break code with future updates (sub-point releases are more bug-fixy than API-changy). Fairly useless level of caching.

I would say your best bet is to use the exact-version links in almost all scenarios. The point-release links (/1.4/) are pretty useless. The major-version-only links (/1/) I could see being useful for personal demos where you kind of want your own demo to break so you know how to fix it.

Combining Scripts

If you are able to squish JavaScript files down into one file and serve them like that (like perhaps this or this) you may be better off doing that, even if they are coming from your own server or CDN. One request is almost always better than multiple.

Not Just jQuery

This same naming convention / structure is used for all the JavaScript libraries on Google's CDN. I tested MooTools and all the exact same stuff applies.

Other CDNs

There are other CDNs that host jQuery as well: Microsoft's and jQuery.com itself. Neither of these uses the same kind of naming convention, so the version-path tricks above don't apply there. However, do note that a direct link on Microsoft's CDN does get the nice one-year cache.

Just a note on the “Combining Scripts” portion:
While you are quite right that multiple requests are bad, IMO it cannot hurt to use the caching that Google's and other CDNs provide for something the size of jQuery (which I believe is around 30k) on higher-traffic sites. I tend to lean on the side of using Google's CDN to pull down jQuery and then combining all of my site-specific scripts into one file. Although I am interested in everyone's thoughts on the way I do this.

I used to have PHP serve a custom-built JavaScript file per page, but it's a lot of extra programming for not much benefit; caching is more valuable. I sometimes still include a local fallback for the CDN (see below), but only when jQuery is actually mission-critical.

This is in use on the examples on this very site. The “view fancy source” thing is jQuery powered, but not all examples use jQuery. So in the footer include (invisible, just analytics basically) I also handle the view source thing, and it uses this to determine if it needs to load in jQuery or not.
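The snippet being referred to didn't survive in this copy, but the common pattern for this kind of conditional load with a local fallback looks something like the following (the local path is hypothetical):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
<script>
  // If the CDN copy didn't load, write out a local copy instead
  window.jQuery || document.write('<script src="/js/jquery-1.4.4.min.js"><\/script>');
</script>
```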

If you can combine those in a build script, that’s cool in theory, but any change to any file will invalidate that combined file and you’ll need to push a new one out. And don’t forget, you’ll need a cache-busting mechanism in this file like a date stamp added to the file name: combinedJS_20101126.js.

Is the code in the version of jQuery you’re using going to change? No. Is the lightbox.js going to change? Odds are no (you can modify it in dev, but in production it’s probably not going to change).

What will change? Any site-specific js, i.e., utils.js.

Also, files don’t stay in your cache for as long as you think. Browser caches aren’t really keeping up with the times; I think most still ship with a 50 MB default, though IE9 will be bumping theirs up, from what I’ve read. You can easily fill that up browsing around for an hour. Let’s face it, a lot of home pages are pushing nearly a megabyte at you (if not more) nowadays.

HTML5’s cache manifest will be interesting, but also sure to be abused.

“Decreased Latency – the file is served from a geographically closer server.”

I would like to point out that that is a simplification of the truth. Quite often Google’s server may not be nearer at all, and even when it is, it can occasionally still take several hundred milliseconds of latency to retrieve the file. This all depends on various parameters, as well as sheer luck. Therefore, I find the improved caching odds a much bigger advantage than the actual latency improvements.

Using the Google CDN: can we do this on sites that have secure information? What if Google’s CDN gets hacked (although the possibility is very minor) and somebody puts a malicious script in jquery.min.js? Wouldn’t that affect all sites that use it?


CSS-Tricks is created, written by, and maintained by Chris Coyier and a team of swell people. It is built on WordPress, hosted by Media Temple, and the assets are served by MaxCDN. The fonts are Source Sans Pro and Source Code Pro. It is made possible through sponsorships from products and services we like.