5 Ways to Make Your Site Smaller and Faster

Confession: about once a week I genuinely wish I were a kid who spent his work day cutting grass and doing landscaping. Why? Because at the end of the day, they can say "the grass is cut, the job is complete." As web developers, we can't ever say that, can we? A site can always be more efficient -- there are always strategies for eliminating bytes. Always. And as long as we realize that, we internally and eternally say "the site isn't good enough." To be a great everyday developer, we're almost destined to feel as though our work isn't good enough -- what a negative way to live our lives!

The good news is that there are a few methods for incredibly easy gains in the performance and load-time departments. Here are five improvements you can make in minutes to speed up your site for all users!

1. Squash Images - ImageOptim

Squashing images is the ultimate free pass for improving site load time. Photoshop and other image-editing applications are infamously inefficient at image compression, adding many kilobytes of extra download to each request. The good news is that there are many utilities to eliminate those extra kilobytes! My favorite Mac utility is ImageOptim.

You can gzip your responses as much as you'd like, but extra kilobytes in the source images are pure waste, so running an image-optimizing utility is as valuable as any other strategy you can use!

2. CloudFlare

CloudFlare, a service with a free starter tier, offers loads of enhancements:

CDN services

JavaScript, CSS, and HTML minification

Downtime backup services

DDOS prevention

Location-based asset serving

This isn't a placed advertisement -- davidwalsh.name uses CloudFlare and has used all of its features. My site has saved GBs of data transfer thanks to CloudFlare. Even when my server has been down, CloudFlare has served up the pages flawlessly. Using CloudFlare is a complete win.

3. Smaller Glyph Icon Libs with Fontello

Glyph fonts have been popular for a few years now, and I'll pass on listing the reasons why -- we know why they're awesome. The problem is that we lazily include entire glyph font files while only using a fraction of the glyphs within them. And though we seldom consider them, font files are usually massive. In an emoji: :(. Luckily, utilities like Fontello exist to build a font file containing only the icons you actually use.

4. Generate Static Files

We love our dynamic scripting, but why serve dynamic pages when static pages will do? This is a pattern often seen with WordPress -- the post content generally doesn't change, but the advertisements and comments may.

The answer? Find the key points when a page may change and generate static content when those points occur. A sweet WordPress utility called Really Static accomplishes this feat for the blogging platform. Of course, your non-WordPress CMS will require custom page generation, but the speed advantages will be well worth it.

If you have content that needs to rotate within those static pages, like advertisements or links to more current content, consider JavaScript and AJAX requests to fetch that content -- the page will be static and the JavaScript will be served from a CDN -- the only speed consideration will then be the AJAX request!
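A minimal sketch of that pattern, using the modern `fetch` API; the `/api/latest-posts` endpoint and the container element are assumptions for illustration:

```javascript
// Fetch rotating content (ads, "latest posts") into an otherwise static page.
// fetchFn is passed in so the same code works with window.fetch or a stub.
function loadFreshContent(fetchFn, url, container) {
  return fetchFn(url)
    .then(function (response) { return response.text(); })
    .then(function (html) {
      container.innerHTML = html;
      return html;
    });
}

// Usage in the browser (hypothetical endpoint and element id):
// loadFreshContent(window.fetch.bind(window), '/api/latest-posts',
//                  document.getElementById('latest-posts'));
```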

5. Lazyload Resources...or Embed?

A commonly known cause of site slowness is the number of requests each page generates. In the past we've remedied this problem with CSS/image sprites, concatenated JavaScript and CSS resources, and data URIs. You could also lazyload resources or simply embed them in the page:
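Here's a sketch of conditional loading -- the `prism.js` path and the `pre code` selector are assumptions, not a specific highlighter's actual setup:

```javascript
// Only pages containing <pre><code> blocks need the highlighter.
function needsHighlighting(doc) {
  return doc.querySelectorAll('pre code').length > 0;
}

// Inject the highlighter script only when the page actually needs it.
function lazyloadScript(doc, src) {
  if (!needsHighlighting(doc)) return false;
  var script = doc.createElement('script');
  script.src = src;
  script.async = true;
  doc.head.appendChild(script);
  return true;
}

// Usage in the browser:
// lazyloadScript(document, '/js/prism.js');
```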

The example above loads the syntax highlighter only if elements on the page require highlighting. And what if the syntax highlighter's CSS is just a few lines? You could save the extra request and embed it within the page:
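The simplest form of embedding is a `<style>` block written straight into the page template, but you can also inject one from script. A minimal sketch, where the CSS string is an illustrative assumption:

```javascript
// Inject a tiny stylesheet into the page instead of requesting a separate file.
function embedStyles(doc, cssText) {
  var style = doc.createElement('style');
  style.textContent = cssText;
  doc.head.appendChild(style);
  return style;
}

// Usage in the browser (hypothetical highlighter rules):
// embedStyles(document, '.token { color: #07a; } .comment { color: #690; }');
```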

Or you could concatenate the highlighter CSS onto your site-wide CSS file -- either way is a benefit!

As you can see, there are some incredibly easy speed and size gains to be had if you're willing to put in the few minutes of effort to make them happen. And when you think about the number of visitors your site gets, and then the number of pageviews, you can see why these micro-optimizations are so important!


Discussion

One thing to watch: it throws your original, uncompressed images into the trash once the process has completed, leaving the compressed ones in the originals’ location.

As I like to keep the source files as well as the compressed ones, I need to extract them from trash and move them somewhere. Not a major issue, just something to remember before you empty the trash!

Other image optimisation tools don’t usually overwrite the originals in this fashion, allowing you to rename/save the newly compressed ones where you want. To me this seems a better (safer) work method.

The good thing, though, is that it seems to be great at compressing – which is its main function after all :)

I think the JS to load a CSS file optionally is better solved simply with good cache headers. Yeah, they will load it once but ideally never again. Embedding isn’t a good option for saving bytes over the wire as it actually prevents caching.

Adam

If I have multiple CSS files currently being concatenated and minified (served locally), would I benefit from using a CDN that has these same libraries? I’m talking about the number of requests that are required. Where does it become unreasonable to use a CDN, and how would one go about attempting to determine that?

Steve

@Joe it’s important to know that TinyPNG.com uses **lossy** compression vs. **lossless**. Though the loss is extremely minor, there is some technical loss. If you use the online tool Smush.it (http://www.smushit.com/ysmush.it/) you can get the maximum compression without any image loss (lossless).

Robert

For image compression I always use https://kraken.io as it yields the best possible results (beats both ImageOptim and TinyPNG). I can choose lossy or lossless mode and they also offer an online web-interface so I don’t need to install anything on my computer.

I’m not sure if you guys have noticed it, but even after compressing images using tools like TinyPNG, when you run PageSpeed or GTmetrix on your website, they ask you to compress the images further. In fact, they offer a compressed version of the image which, surprisingly, is actually heavier than the one you got from TinyPNG etc.

Also I think it is better to use CDN hosted version for files like jQuery, Bootstrap etc.
