Trimming the Fat

When I unveiled a new version of this site last year, I hoped the design would slowly evolve.

An update in February improved the responsive layout and saw some initial performance optimisations. The last few weeks have seen further iteration. Although the design looks remarkably similar, much has changed below the surface. Where each page previously requested at least 14 assets weighing a total of 385kB, now only 9 requests are needed, and with an unprimed cache, these total just over 100kB. I thought it would be interesting to detail the changes I’ve made, and this time, I’ve got graphs!

JavaScript

Uncomfortable with having 30kB of jQuery as a dependency, I made JavaScript my first target for weight loss. In reviewing the jQuery functions I was using, I realised many were unnecessary:

The Awesomersands function that allowed me to style ampersands was actually replacing the original glyph with a much uglier version. It also produced a distracting ‘flash of unstyled ampersand’.

A function that added thin spaces around em dashes could instead be incorporated into my Movable Type templates. In making this change, I decided to now use spaced en dashes instead.

The HTML5 history.pushState function used on journal entry pages was fragile at best, so became a candidate for removal.

A function that wrapped a <div> around video embeds to give them a fluid width was unnecessary when I could add this manually.

Using MapBox embeds in place of Leaflet JS meant I could simplify adding interactive maps to pages. Well, almost. Displaying paths requires an additional layer to be created in TileMill – hopefully the ability to add vector lines in MapBox isn’t too far off.

With this code removed, the only behavioural enhancement required was for the responsive navigation. I’d like to thank Anthony Williams for helping me rewrite this using pure JavaScript. However, I’m still calling jQuery on pages displaying slideshows, so I’m actively looking for an alternative that will allow me to shed this dependency entirely.
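At its heart, a jQuery-free responsive navigation boils down to toggling a class on an element. As a rough sketch – the helper and element names here are hypothetical, not the actual code Anthony helped with:

```javascript
// Hypothetical sketch: a pure helper that adds a class if absent,
// or removes it if present, without touching jQuery.
function toggleClass(current, name) {
  var classes = current.split(/\s+/).filter(Boolean);
  var index = classes.indexOf(name);
  if (index === -1) {
    classes.push(name);
  } else {
    classes.splice(index, 1);
  }
  return classes.join(' ');
}

// In the browser, wired up something like:
// document.getElementById('nav-toggle').onclick = function () {
//   var nav = document.getElementById('nav');
//   nav.className = toggleClass(nav.className, 'expanded');
// };
```

Keeping the string manipulation in a small pure function like this also makes the behaviour easy to test outside a browser.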

JavaScript: Bytes downloaded (requests)

Before: 35.00 kB (2)
After: 1.28 kB (1)

CSS

While helping out on a recent project at Clearleft, Mark introduced me to LESS, a CSS pre-processor I became eager to use here. With LESSphp compiling LESS on the server, comments are stripped out and the generated CSS is easier to compress, too.
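To give a flavour of what LESS adds – this snippet is illustrative, not taken from my actual stylesheet – variables and nesting compile down to plain CSS, and the comments disappear on the server:

```less
// Illustrative only: a colour variable and nested rules
@accent: #b52025;

.masthead {
  a {
    color: @accent;
    &:hover { color: darken(@accent, 10%); }
  }
}
```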

By removing unused style rules and refactoring others, my raw stylesheet shrank by 19kB. Yet you’ll note that the compressed CSS file is still larger than it was before. That’s because the small background noise texture shown on larger viewports has been embedded as a base64 string, removing a further request.

CSS: Bytes downloaded

Before: 8.25 kB
After: 9.02 kB

SVG

In February I began using an SVG image sprite, falling back to a PNG image for browsers that don’t support the vector format. To prevent both images loading, a subsequent update saw me move the following detection script into the <head>, before any CSS can be downloaded:
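The script itself is tiny. A minimal sketch, assuming the widely used createSVGRect feature test – the exact code on this site may differ:

```html
<script>
  // Hypothetical sketch: test for basic SVG support and flag it on <html>.
  // Placed in the <head> so the class exists before any CSS is applied.
  if (document.createElementNS &&
      document.createElementNS('http://www.w3.org/2000/svg', 'svg').createSVGRect) {
    document.documentElement.className += ' svg';
  }
</script>
```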

If support for SVG is detected, an svg class is added to the <html> element. This allows me to create rules like this:

.icon {
  background: url(/path/to/sprite.png) no-repeat 0 0;
}

.svg .icon {
  background-image: url(/path/to/sprite.svg);
}

Going further

Besides stripping out the metacruft added by software like Illustrator, further optimisation can be found by using the <defs> and <use> elements. These allow you to define common objects, reducing the number of shape descriptions appearing in your document.

To demonstrate how this works, I’ll use three icons from my sprite image: a grey RSS feed icon (#feed), a Flickr icon (#flickr) and an orange and white feed icon (#feeds). In my original file, each was defined separately:
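A simplified version of those original definitions might look like this – the shapes and colours are illustrative stand-ins, not my actual path data:

```xml
<svg xmlns="http://www.w3.org/2000/svg">
  <!-- Illustrative only: each icon repeats the rounded square,
       and the feed glyph is described twice -->
  <g id="feed">
    <rect width="32" height="32" rx="6" fill="#888"/>
    <path d="M9 23a3 3 0 1 0 0-6 3 3 0 0 0 0 6zM9 9v4a10 10 0 0 1 10 10h4A14 14 0 0 0 9 9z" fill="#fff"/>
  </g>
  <g id="flickr" transform="translate(40,0)">
    <rect width="32" height="32" rx="6" fill="#888"/>
    <circle cx="11" cy="16" r="5" fill="#0063dc"/>
    <circle cx="21" cy="16" r="5" fill="#ff0084"/>
  </g>
  <g id="feeds" transform="translate(80,0)">
    <rect width="32" height="32" rx="6" fill="#e57000"/>
    <path d="M9 23a3 3 0 1 0 0-6 3 3 0 0 0 0 6zM9 9v4a10 10 0 0 1 10 10h4A14 14 0 0 0 9 9z" fill="#fff"/>
  </g>
</svg>
```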

Note how the square shape, the feed icon and the circles used within the Flickr icon are described multiple times. The <defs> element means we can define these just once and reference them later with <use> and the xlink:href attribute, like so:
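Continuing the illustrative example above, the deduplicated version defines the square and the feed glyph once inside <defs>, then references them:

```xml
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
  <defs>
    <!-- Shared shapes, defined once (illustrative stand-ins) -->
    <rect id="square" width="32" height="32" rx="6"/>
    <path id="glyph" d="M9 23a3 3 0 1 0 0-6 3 3 0 0 0 0 6zM9 9v4a10 10 0 0 1 10 10h4A14 14 0 0 0 9 9z"/>
  </defs>
  <g id="feed">
    <use xlink:href="#square" fill="#888"/>
    <use xlink:href="#glyph" fill="#fff"/>
  </g>
  <g id="flickr" transform="translate(40,0)">
    <use xlink:href="#square" fill="#888"/>
    <circle cx="11" cy="16" r="5" fill="#0063dc"/>
    <circle cx="21" cy="16" r="5" fill="#ff0084"/>
  </g>
  <g id="feeds" transform="translate(80,0)">
    <use xlink:href="#square" fill="#e57000"/>
    <use xlink:href="#glyph" fill="#fff"/>
  </g>
</svg>
```

Because the shapes in <defs> carry no fill of their own, each <use> can apply its own colour – which is how the grey and orange icons can share the same square.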

It’s easy to assume that gzip will take care of reducing file sizes, but manual optimisation beforehand can result in even larger reductions. For example, I was able to reduce my original SVG sprite (9.48kB, 3.36kB gzipped) to 7.34kB, which compressed down to just 2.84kB – comparable in size to the PNG sprite. 500 bytes seems like a small reduction, but using this technique on larger SVG images will have an even greater impact.

Image sprite: Bytes downloaded

PNG: 3.42 kB
SVG before: 4.31 kB
SVG after: 3.80 kB

Fonts

Earlier this year I cut the number of webfonts I was using from four to three by using a single font family. This reduced page download sizes a little, but changing my web font provider to Adobe Edge Web Fonts produced a far greater saving – although at the cost of being able to use Akagi (I’m now using Source Sans Pro). In fact, such was the reduction that I decided to include a fourth font again, choosing the monospaced Source Code Pro – useful on code-heavy pages such as this.

A free service without limitations or account management, Adobe’s new service is stupidly easy to set up. A single line of JavaScript provides a neat URL interface to various settings, and as the script includes WebFont Loader, there’s no need to add a chunk of JavaScript to the top of each page. Load times are brilliantly fast, and with fonts combined into a single file, the number of requests is the same regardless of how many you decide to use.
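For reference, that one line looks something like the following – the URL pattern is my best recollection of the Edge Web Fonts interface, so treat it as illustrative:

```html
<!-- Families separated by semicolons; one request, one script -->
<script src="http://use.edgefonts.net/source-sans-pro;source-code-pro.js"></script>
```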

Of course, there is a trade-off here. Services like Fontdeck provide an extensive library of premium webfonts while free services like Adobe’s only offer a small selection of open source fonts. Yet with simpler set-up and greater performance, they’re an attractive option.

Other Optimisations

I’m now serving content via CloudFlare, a smart service that optimises content and intercepts dubious requests. With this in place, I no longer need PHPminify for CSS and JavaScript minification. It also acts as a CDN, so static content has been moved from Amazon S3 (which I discovered isn’t actually a CDN) back to this domain, where it’s easier to manage.

Calling a single PHP include from each page allows me to specify the character set in the HTTP header. Adding the async attribute to my analytics script means this will now download and execute without blocking other assets.
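Specifying the character set this way is a one-line header call. A hedged sketch of the shared include – the call itself is standard PHP, the surrounding context is assumed:

```php
<?php
// Hypothetical shared include: declare the charset in the HTTP header
// rather than relying on a <meta> element later in the document.
header('Content-Type: text/html; charset=utf-8');
```

The async attribute is similarly terse: adding it to the analytics <script> element is all that’s needed for non-blocking download and execution in supporting browsers.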

There have been a few design related tweaks too. I simplified the IA by moving links to my articles and academic essays to within the Portfolio section. I’ve also increased the base font size on content pages from 16px to 18px.

In February, I concluded my write-up of these performance optimisations by including results from Google Page Speed, YSlow and webpagetest.org, which means I can measure the effectiveness of these latest changes. Both Google Page Speed and YSlow scores have increased by two points, to 96 and 98 respectively. Comparing results saved from webpagetest.org, the following improvements on the homepage can also be recorded:

If I Had More Time, I Would Make the Website Quicker

Arguably, many of these optimisations are overkill, especially given some of the modest reductions. Still, this exercise was useful in understanding where performance gains can be found, and I can apply this knowledge on future projects.

Website optimisation can be a cruel game; everything has a number that begs to be reduced, but doing so requires a lot of experimentation, research and testing. And when you’re playing with the last hundred or so kilobytes, there’s little reward for your effort. Hopefully this overview will save you from playing the same game I have.