I’d like Scott’s feedback since he’s dealt with these issues while working on Respond and the like.

This is a pretty timely post for me. JavaScript performance with media queries is why I began working on a project to support those browsers that don’t perform well with polyfills. I also wanted to support more of the media features that modern browsers do, while keeping the file size down. For examples of what I mean by ‘don’t perform well’, I was doing some testing on jsPerf and jsFiddle using the matchMedia.js polyfill.
1. iPhone 4.3.3 gets 2 – 6 Ops/sec
2. IE <= 9 gets 40 – 90 Ops/sec
3. Chrome, Firefox, Safari and Opera fare much better: 900 – 2500 Ops/sec

Testing the same media query with native matchMedia, Chrome, Firefox, Safari and Opera get 36,000 – 110,000 Ops/sec.

So now we know what we have and what’s possible. With the project below (Media.match), I was shooting for somewhere in between but hoping for much more. In short, I got better results than I expected. In Chrome, Media is more than 50% faster than native matchMedia, but in Firefox it’s 50% slower. I’m really excited about the results in IE and iOS as well but need more testing data. It’s not just the numbers, either: IE < 9 feels like it can actually handle media queries. Again, I’m really pleased with the results and would like to share what I’ve been working on.
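To give a feel for the kind of call being benchmarked, here’s a minimal sketch. The helper name and the cache are hypothetical illustrations, not Media.match’s actual API; the idea is simply to avoid re-paying a polyfill’s query-evaluation cost on repeated checks by reusing the MediaQueryList object per query string:

```javascript
// Hypothetical wrapper around matchMedia (native or polyfilled).
// Caches the MediaQueryList per query so repeated checks don't
// re-parse and re-evaluate the query every time.
var mqCache = {};

function mediaMatches(query, mm) {
  // mm lets a polyfill (or a test double) be injected explicitly.
  mm = mm || (typeof window !== 'undefined' && window.matchMedia);
  if (typeof mm !== 'function') return false; // no support at all
  if (!mqCache[query]) mqCache[query] = mm(query);
  return mqCache[query].matches;
}
```

Native MediaQueryList objects update their `matches` property live, so caching them is safe there; whether a given polyfill behaves the same way is worth verifying before relying on it.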

I’d love to see if we can make these even better; feedback is surely welcome.

We came across a lot of these performance issues whilst developing our new company site over at etchapps.com.

I’ll be the first to admit we’re running fifty-something requests and a 3.2MB total page load on our heaviest page on desktop (ouch!), but we’ve done a lot to mitigate that pain.

The site is fully responsive (with a touch of adaptive) from around 290px all the way up to 1600px or so, with a lot of imagery, so serving up those images efficiently was essential for download speeds.

We lazyload our images with JS so the user can get hold of the content as fast as possible. They’re also resized at the server end so we don’t send any wasted pixels, and compressed with ImageOptim. If the user resizes the window and hits a breakpoint, we check whether there’s a more appropriate image size to serve up to maintain image quality.
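The resize-to-breakpoint swap described above essentially boils down to picking the smallest rendered size that still covers the current viewport. A minimal sketch of that selection step (the function name and candidate widths are hypothetical, not taken from the etchapps.com implementation):

```javascript
// Given the image widths available on the server, pick the smallest
// candidate that still covers the viewport width, so we never send
// wasted pixels but never upscale either.
function pickImageWidth(viewportWidth, candidates) {
  var sorted = candidates.slice().sort(function (a, b) { return a - b; });
  for (var i = 0; i < sorted.length; i++) {
    if (sorted[i] >= viewportWidth) return sorted[i];
  }
  // Viewport is wider than our largest rendition: serve the largest.
  return sorted[sorted.length - 1];
}
```

In practice you’d call this from a (throttled) resize handler and only swap the `src` when the chosen width actually changes, so a few pixels of window-dragging doesn’t trigger extra downloads.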

On the JS side, we’ve tried to keep plugin usage to a minimum and written our own where appropriate to keep filesize down.

We’re also using the lovely HTML5 boilerplate htaccess file (as you suggest!) and it does a great job of caching files for faster downloads where appropriate.

Based on your article we’ll be diving into conditional loading for CSS and JS, as I think we could squeeze a few more KB off the site size by not loading some of our more complex JS on mobile.

Our own performance optimisation attempts have given us the same number of total requests on mobile, but total page size on the heaviest page (3.2MB on desktop) is down to 1MB on mobile. As the images are loaded with JS, the initial download is only 15 requests and 400KB.

Great article! I’ll be using it as a reference checklist for future projects to help make sure we’ve got everything covered.

Very instructive, thank you. Regarding conditional loading for CSS, wouldn’t it be more appropriate to use the media attribute with a proper fix for IE – media="screen and (min-width: 40.5em)" – even if it adds an HTTP request, instead of a 2KB script like eCSSential?

It might be worth clarifying that in its current state, Picturefill is not a polyfill of the <picture> element, but rather a div-based pattern that mimics picture in ways that are safe to use today and moving forward. Also, by using media queries, client-side solutions like Picturefill offer levels of control that can’t be mimicked on the server, where only default known values such as device screen dimensions can be derived.

@Andy – I had the same thought about eCSSential initially, but the initial (though informal) testing I did actually showed pretty good improvements. I know WebKit does handle it rather smartly (at least some versions of it) but I haven’t seen any information yet about how many other browsers do it as well. I’d love to see more formal benchmarking. :)

@Jacques – Unfortunately the media property doesn’t stop all those extra CSS files from downloading. See Scott Jehl’s tests for a bit more on that.
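For what it’s worth, the general shape of the JS-driven alternative looks something like this. This is a hypothetical sketch of the idea behind tools like eCSSential, not their actual code: the <link> is only injected once the query matches, so non-matching stylesheets are never requested at all.

```javascript
// Sketch: append a stylesheet <link> only when its media query matches,
// so the browser never downloads CSS it won't use. The doc/mm parameters
// exist so the helper can be exercised outside a browser.
function loadCSSIfMatches(href, query, doc, mm) {
  doc = doc || document;
  mm = mm || window.matchMedia;
  if (typeof mm !== 'function' || !mm(query).matches) return null;
  var link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  doc.getElementsByTagName('head')[0].appendChild(link);
  return link;
}
```

The trade-off Tim mentions below is real, though: a script-injected stylesheet is invisible to the browser’s preload scanner, so it can’t be fetched as early as a plain <link> in the markup.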

Not all mobile operators ‘optimise’ images, and in my experience even those operators that do only seem to re-compress JPEGs.

For example, in the UK, O2 do re-compress JPEGs but Vodafone don’t, even on really low-throughput connections.

Even if all operators did re-compress images would you really want to leave the final quality of your images to the operator?

For some operators, e.g. O2, you can disable the re-compression of images by adding no-transform to the Cache-Control HTTP header.
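Assuming Apache with mod_headers enabled (adjust for your own server), the .htaccess rule might look something like this:

```apache
# Ask transforming proxies (e.g. an operator's image compressor)
# not to re-compress these images.
# "merge" appends no-transform without clobbering existing
# Cache-Control values set elsewhere in the config.
<FilesMatch "\.(jpe?g|png|gif)$">
  Header merge Cache-Control "no-transform"
</FilesMatch>
```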

Tim:

Great post Mr Kadlec!

I’ve got some concerns with eCSSential, as using JS to determine which CSS files should be downloaded interferes with the browser’s ability to pre-fetch and prioritise the order in which resources are downloaded.

Browsers are already making choices about which resources they should download first. For example, in this waterfall for enochs.co.uk, the stylesheets that are not applicable are deferred to the end of the page load:
http://www.webpagetest.org/result/121205_G8_9079577333c6bffac22fcbc6480e7220/1/details/ (requests 35, 36, 37)

I guess what we in the performance community need to do is a bit more work on benchmarking approaches like eCSSential vs. multiple CSS resources vs. a single merged CSS resource.

Great post!
There are a great many articles about responsive web design, but only a few deal with performance.
As I redesigned my site I was looking for ways to test the loading time on mobile, and I came across Weblenz; it’s available in the App Store. The lite version is free and tests the time, items and size of a website. I like it because I don’t have to simulate latency – I can test in the “real world” on my iPhone.

To the many fine incentives for lightening the (down)load given in the article itself, I’d like to add another. Please consider the millions of people in the developing world for whom infrastructure is a precarious thing.

I spend significant time in south-central Africa. Even in university departments where one might expect decent connectivity, web performance can be woeful.

The “world wide web” really is the world wide web! The more the kind of ethos advocated in this article can be grown, the greater the benefits will be for the most technologically challenged parts of the planet.

Great article man. It seems like designers think they have carte blanche when it comes to responsive design, and page speed is an afterthought. Might I add: DEATH TO THE HOMEPAGE IMAGE SLIDER – it’s a design crutch that has seen its day!

Absolutely spot on – responsive shouldn’t mean reduced functionality or just hiding images. Too many ill-thought-out, irresponsible responsive experiences… Obviously it’s not that “irresponsible” – it’s not like getting your dog to take the wheel as you crack open a beer – but you get the point.

Exactly. Having responsive sites that pump just as many or even more bytes than usual kind of defeats the whole purpose of serving a “mobile-optimized” site, doesn’t it? I’ve always wondered about this, and the article provides a great answer and great tips on cutting down on resource load. I’ve learned a lot this year already, and it’s only been the first week!

I just thought to add: making performance an essential component is great, but as always, spend time only on things that matter.

To wit: I really hope nobody confuses things like eliminating unnecessary HTTP requests and compressing resources with nonsensical notions of premature optimization in code, such as keeping everything in tight loops or never using descendant selectors. Notice how this article doesn’t mention the latter at all. That’s because that sort of stuff hardly ever matters, and I dare say not at all on a mobile site, where less is more.

Performance has been a concern since broadband connections allowed us to serve larger and larger assets. Just because our connections CAN handle more HTTP requests doesn’t mean we should stuff them to the gills. I’ve found YSlow absolutely invaluable for trimming the network fat. It goes without saying to always, always compress and gzip your assets, minify your CSS, and concatenate and minify your JavaScript.

Principles like OOCSS and its natural expansion (SMACSS) have come along to keep our stylesheets lean and mean, which on a site that leverages a lot of CSS3 can be critical. Furthermore, we have Modernizr to aid with conditional loading of scripts only when we need them. ‘display: none’ should be used sparingly, if at all, as that markup still adds page weight and it makes no sense to punish mobile visitors.

Think about whether you really need a library or framework before you use it, as even a small file size can be a death by a thousand cuts with enough of them. Again, if you don’t need it on every page, don’t load it on every page. If you do use a library, try to use native JavaScript as much as you can and limit dependencies. Dump your objects when they’re no longer in use to reclaim that memory.

Above all, I think responsive design encouraged us to really think about our content and embrace web design in its own medium rather than as an offshoot of print. It’s a sign that the web industry has matured beyond its print-inspired design roots to have its own idioms and conventions. And that’s what makes it so awesome. I’m happy to be a part of its growth.

That said, the mobile web networks aren’t as blazing fast as their broadband network siblings, and we should be planning for that from the onset. We’re not designing for 56k anymore, but performance shouldn’t suffer for our advancement. In print, it only has to work from a design perspective; in web it should work even better than it looks.

Great points made. Particularly like the part on not just plugging in frameworks as they come and the mention of testing with apps like Slowy.

Shaun: Not all mobile networks compress images/data, and the ones that do seem to do it to different extents. I got my hopes up when I recently read the article on Boagworld about this topic, but unfortunately it does not seem to be as straightforward as we’d like. It hardly ever is, is it?

Technology is moving faster than ever and people are moving right along with it. Having a company website that is mobile friendly is important, obviously. Just take a look around – everybody is on a smartphone. But just how important is mobile compatibility?

We cannot forget that publishers need not only to adapt their sites to many devices (make their layouts responsive) but also, in order to keep ad revenues, to think about responsive ads (without degrading the UX of their websites by showing ones that are too big!).

At Clearcode we see two main trends – either clients start developing native mobile apps, or they ask for a responsive web application instead. But almost everyone is starting to consider mobile as a part of their strategy.

Thanks for this post, Tim! In addition to your framework comment, the proliferation of plug-ins has made it too easy to increase the number of HTTP requests for a page. Learning how to consolidate and minify CSS and JavaScript will need to be part of most of my future workflows. Performance, I agree, is too often an afterthought.

This article is dead-on about how responsive design is not enough, and how web performance needs to be considered at the design level, not just in development. My suggestion for a meta-design principle, embodied over at http://sustainablevirtualdesign.wordpress.com, is to combine web performance and responsive design (part of inclusive design) as Sustainable Web Design. Other disciplines (architecture, industrial design) already pair adaptive design with performance, measured by carbon footprint and energy use. Similar ideas can be applied to web design. WPO needs to be pushed into the earliest layers of design, including UX wireframes and work done in Photoshop prior to prototyping in code. If we leave WPO to engineers, we get “streamlined hippos” – clunky, energy-wasting websites that a site engineer can only partly fix.

> Whatever you do, don’t serve a large image that works on a large screen display to small screens.

I want to be able to zoom in on an image and see reasonable quality. Obviously it shouldn’t be 600KB, but 40KB is not much at all and can be a decent size, both for desktop and mobile. All the ‘responsive image polyfilling’ is NOT good for performance. Every bit of JS you can skip, skip it.