I'm 100% on board with this idea. Let's serve up the images we want under the circumstances we declare. But is screen width the correct metric to use here? Perhaps it is: why would a screen that is 400px wide ever need an image that is larger than 400px wide? Well, I think the retina displays on certain new devices are answering that question for us. Those devices may respond to a certain width media query but really have twice the pixels available and can show much sharper images. Yes, we have device-pixel-ratio media queries for that as well, but I still think we're dancing around the issue.

That issue is: bandwidth. If I'm in a place / on a device that has super slow internet, I'd love to be served a very light web page so browsing is still acceptably fast. If I'm in a place / on a device that has super fast internet, by all means ratchet it up and deliver me more.

This doesn't mean ignore screen size. That's obviously relevant when it comes to layout. This extension of media queries would give us a more appropriate tool for media in a world where the size of your device is increasingly unrelated to its bandwidth.

So I wonder: is it possible?


Comments

That would be kinda neat. The real issue would still be user experience. Because I am on a much slower connection, does that mean I get to see a much lower-resolution, maybe even pixelated, image? Optimizing your site for mobile devices first can be a safe way to go.

I don’t see how specifying image sizes in em would be different from what we can already do. The problem is that, as they are bitmaps, scaling them up results in loss of quality and scaling them down results in wasted bandwidth (the file size is still the same). You can already scale images like this with CSS media queries, using % and min/max-width…

I’m sure a script could be written to “measure” your bandwidth. Something like sending a date stamp from the backend, then using JS to compare it with the time when the document and all its assets finished loading. I’d say that would give you an OK estimate (and since it’s JS calculating it, you can fire a callback afterwards depending on the results).
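A minimal sketch of that estimate: the backend embeds a timestamp, and once the page and its assets have loaded we compare it with “now.” The function name `estimateKbps` and the wiring below are made up for illustration; this assumes you know (roughly) the page weight in bytes.

```javascript
// Convert a payload size plus an elapsed time into a rough kbps figure.
function estimateKbps(payloadBytes, elapsedMs) {
  if (elapsedMs <= 0) return Infinity;
  const kilobits = (payloadBytes * 8) / 1000;
  return kilobits / (elapsedMs / 1000); // kilobits per second
}

// In the browser you might wire it up roughly like this:
//   const serverStamp = 1650000000000;   // emitted by the backend
//   window.addEventListener('load', function () {
//     const kbps = estimateKbps(pageWeightBytes, Date.now() - serverStamp);
//     // callback(kbps) ...
//   });
```

As the comments below point out, latency and clock skew make this a rough estimate at best.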

I agree we need a mix of screen size, screen resolution and bandwidth.

But bandwidth, especially in mobile conditions, is really not stable enough to serve as such a criterion. During one single surfing session on my iPhone, I can go from GPRS to WiFi/ADSL and back again to GPRS, so when/how would the OS/browser decide what my bandwidth is?

Experiments have been made with Yahoo!’s Boomerang, for example, but it’s not really accurate enough to be useful.

Latency is often even more important than bandwidth for mobile browsing.

Additionally, max-bandwidth != available bandwidth. How many times has your phone shown 5 bars of 3G (or AT&T “4G”) and not even been able to load a single webpage due to delayed connection drop? Also, if you’re doing a mass-update of apps, your available bandwidth drops significantly, even though you might still be multi-tasking and web browsing.

While the ability to craft even more responsive interfaces in CSS would be welcome, I expect any near-/mid-term solution to this problem would be JS-driven — not CSS-driven — as the problem is too complex to be addressed declaratively. Additionally, responding to bandwidth is more than a presentational problem – figuring out what scripts/assets to load should be the primary use case.

This sort of comment scares me. With a more and more global Internet, we need to remember that for some people, what we would consider a normal connection is way more than is even offered in their area.

The picture markup and -webkit-image-set are becoming more and more vital and need to be implemented soon. If your website has many US visitors, for instance, then having low-res/simplified content is becoming more and more important.

I would imagine that under this system, in certain cases I would just leave a lot of images out. When I’m on my iPhone and can only get an EDGE connection, I’m more interested in getting the information I was looking for than in seeing pretty images. While I love design, at a certain point my interest in actually seeing the page finish loading is far greater than my interest in what it looks like. It’s accessibility before design; if someone can’t see your site, it doesn’t matter to them how pretty it is.

I wonder if there’s any JavaScript-driven way to do this. Could you use an Ajax call to measure the loading speed of a page on the user side, then mark the html tag with a class like <html class="25Mbps">? Could you use that method to leave images unloaded until you can identify connection speed, then either load big images, load small images, or wait to load any images until the content is loaded, and use JavaScript to load the image src after the content? Maybe that could be a jQuery plugin.
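One way to sketch the class-tagging half of that idea: bucket a measured speed into a class name we can hang off the `<html>` element. The tier names and thresholds below are invented for illustration (the commenter's `25Mbps`-style literal classes would work too).

```javascript
// Map a measured speed (in Mbps) to a CSS hook for the <html> element.
// Thresholds are arbitrary example values, not a recommendation.
function bandwidthClass(mbps) {
  if (mbps >= 10) return 'high-bandwidth';
  if (mbps >= 1) return 'medium-bandwidth';
  return 'low-bandwidth';
}

// In the browser:
//   document.documentElement.className += ' ' + bandwidthClass(measuredMbps);
```

CSS could then scope heavy background images to `.high-bandwidth` selectors only.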

Great insight! How about a script that measures page load time relative to the page itself? The script fires in the head and measures the time from then until the entire page, including images, has loaded. It then stores a one-hour cookie with a speed value. This value can then, for the rest of the user’s time on the site, be used to serve appropriate images, etc.

That’s a good idea, Ash! That would mean even less loading for the user, and fewer HTTP requests. I might suggest that the cookie last a shorter period of time and be re-checked frequently for users who are passing through different connection zones.
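A sketch of that cookie, assuming the speed has already been measured: serialize it with a short lifetime so it gets re-checked as the user moves between networks. `buildSpeedCookie` and the cookie name `speed` are made-up names; the attribute syntax is standard.

```javascript
// Build a cookie string holding a measured speed, expiring after
// maxAgeSeconds so stale measurements age out quickly.
function buildSpeedCookie(kbps, maxAgeSeconds) {
  return 'speed=' + Math.round(kbps) + '; max-age=' + maxAgeSeconds + '; path=/';
}

// In the browser:
//   document.cookie = buildSpeedCookie(850, 600); // re-check every 10 minutes
```

The server can then read the `speed` cookie on subsequent requests and pick image sizes accordingly.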

A similar idea I had was to use JavaScript to set a cookie that says whether the user has JavaScript enabled or not. That way, you could use a server side if statement to only load sections of content and images if JavaScript was not enabled, and then use AJAX calls to fill in unloaded content otherwise. An offshoot of that idea was to try and get a lot of websites checking for that cookie, and setting it if it wasn’t there so that it might be loaded the first time a user visits your site. Maybe by combining those ideas, you could crowd-source a package of information like that. You could distribute a plugin that checks for that cookie, and then sets it, then many different sites could use it. Every site could help every other site keep the user’s cookie up to date as for all of their current browser information.

That’s exactly how RetinaSwitch works. When the page of any website that supports it loads, it makes a tunnel request to RetinaSwitch to check the “cookie” (it’s actually a localStorage variable) to see if the user has disabled retina images. In essence, this variable is global to everyone, in relation to the device that toggled the switch.

The same idea could be built around the general assumption of speed throttling, rather than just retina images.

There is definitely something here. The problem is creating something that doesn’t require too much in the way of HTTP requests, or too much happening all at once.

I disagree with you. I do a lot of surfing over the mobile internet in Germany, and in Berlin I have a fast connection (fast in German mobile internet means 1mbps :D) but a VERY slow ping. If my browser requests an image, it takes a load of time to resolve the server IP and make the HTTP request, but after that, the image loads quickly. If I measured the connection speed with Ajax in the way you suggested, the result would be influenced heavily by the ping time. To my mind, JavaScript can’t measure the connection speed accurately enough.
I think the browser/OS would have to measure the speed; that’s the only way to do precise speed measurement. During a page load, the browser calculates the average loading speed FOR EACH DOMAIN/IP (be aware of slow servers) and then saves it somewhere. And it would be nice if the page could access that information through a tiny JavaScript API.
A totally different solution would be a technique using HTML attributes. Maybe with the coming version of HTML, or with a new spec from the W3C, this could be implemented. To my mind, this would be better than the JavaScript solution, because the browser/client could decide whether to load the big or the small image. I’d introduce a new img attribute called “smallres” (with a second image source) to be integrated into an HTML page. As the browser parses the HTML, it recognizes that there is a small version and could load that depending on the connection. The browser could even ask the user about it.
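The "smallres" attribute doesn't exist anywhere; purely as illustration, here is how a script (or a browser, natively) might pick between the two sources the commenter proposes, given some notion of a slow connection. `pickSource` is a made-up helper.

```javascript
// Choose between the full-size source and the optional small one,
// based on whether the connection is currently considered slow.
function pickSource(src, smallres, connectionIsSlow) {
  return (connectionIsSlow && smallres) ? smallres : src;
}

// e.g. for a hypothetical <img src="hero.jpg" smallres="hero-small.jpg">:
//   img.src = pickSource(img.getAttribute('src'),
//                        img.getAttribute('smallres'), slow);
```

The nice property is that images without a `smallres` attribute degrade gracefully to their normal source.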

I think that all of your suggestions and reasons for them are good. My suggestion was more of an: “Experimentally, how can we do this right now?” kind of thing. Obviously, the premise of this post is that it would be nice if the browser exposed that information for us in a very reliable way.

Out of your suggestions, I like the attribute idea more than the JavaScript API idea. I’m still wary of anything that uses JavaScript where it doesn’t absolutely need to. I would prefer a way that didn’t require as much markup as the attribute idea would, though. Either way, it would certainly be nice for the browser to expose that information for us natively. I just wonder what the fallback would end up having to be for browsers that didn’t support it. Would we just fall back to assuming the slowest connection? What about the fastest connection? Would there be a default we could assume where we just do nothing about it, like we do right now?

We pretty much had exactly this. Back in the day, one could add a lowsrc attribute to an image. Combine that either with a user preference (“only download the low-quality version when available”) or automatic heuristics (“I’m on a terrible link, so I’m skipping huge JPGs”) and you’ve pretty much got bandwidth-sensitive images.

The speed of the connection is a good measure, but what about the COST of that data? A user could, in the foreseeable future, have a high speed connection from a cell network with a low data cap! I don’t want a website sending me larger files because the device I’m using is fast – especially when it costs me double.

I like the point you brought up because most of the “affordable” mobile data plans allow you to download 200-300MB in Germany using 3G and then your connection speed gets reduced to GPRS.
The following point might be a bit far fetched but what about mobile network capacity? Should people be tempted to generate high traffic because they can afford it? I guess you can hardly forbid someone to stream HD movies on his mobile device but maybe optimize websites to load fast regardless of the available connection speed.

Here’s my idea about it; it’s not super solid, since it would require every browser vendor to get on board:

Whenever you are actively browsing the web, the device measures the download speed of certain requests against their size. A global “constant” is updated all the time and can be read from JS (platform.currentBandwidth).

JS can hook up CSS classes (.mediumBandwidth/.highBandwidth).

For backwards compatibility, coders of course need to check for the existence of platform.currentBandwidth first.
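`platform.currentBandwidth` is hypothetical (no browser exposes it); this sketch just shows the existence check the commenter suggests, with a conservative fallback when the API is missing. The helper name is invented.

```javascript
// Read the hypothetical platform.currentBandwidth if it exists,
// otherwise fall back to a pessimistic assumed speed.
function currentBandwidthOr(platform, fallbackKbps) {
  if (platform && typeof platform.currentBandwidth === 'number') {
    return platform.currentBandwidth;
  }
  return fallbackKbps; // assume a slow link when we can't measure
}
```

Falling back to the slow assumption means un-upgraded browsers get the lightweight experience rather than a broken one.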

I’m quite sure you would need to do this on the server side, and it would require low-level access to networking facilities. If you calculate the average download speed of a session, the web app could respond and serve higher/lower-resolution content. Node.js would probably be a good solution for this!

I agree with Chris on this. It would be useful to use media queries to serve up different images, but it would be much better suited to do it server side somehow.

My thinking is that it would either ping and determine what image to serve up (either resolution-wise or jpg quality), or that it will load as much image as it can progressively and stop after a certain amount of time has passed.

I think we are starting to get to the point where more front-end developers are considering performance. We also need to determine what is done with html+css+js and what is done with server scripting.

That could really annoy me: say, if a site looked different on a slow machine at work and on my Mac at home. It would have to be used thoughtfully, say by serving lower-quality GIF images when the bandwidth is lower, but inevitably people would use it to change the amount or appearance of content, and that isn’t fair.

This is a fantastic idea, Chris. As some have said above, it’s more complicated than “max” (or even “current,” whatever “current” would be) transfer speed; it’s about latency, connection drops, etc., so there’s more work to be done. But you’re absolutely right that screen size is a pretty inaccurate predictor of connection speed.

I think the data cap and cost are way more useful than the speed checks. And I think some of the caps would surprise you as well. For example, in Canada, the last time I was looking for a phone plan, the average data option was a half-gig of data as a $25 add-on.

What if there were just a media query that checked against a preference in the browser?

Maybe there should be low-, normal-, and heavy-weight versions of the same picture. As said before, the speed of the connection does not define the user experience. The speed of a web site depends on bandwidth, latency, number of resources to load, number of threads downloading, etc. Moreover, you can browse in many tabs at the same time.
That’s why I think it would be better to give alternative sources with a description like low/normal/heavy weight. The browser could then select the weight it wants with the best strategy:
* Network is free and responds correctly
* Network is OK but many tabs to load
* Download low, then download the heavy-weight image when the user scrolls close to the img.
* User-defined strategy to not waste too much data quota.

And finally, this description will always be correct, unlike a bandwidth measurement. You cannot know today what the available bandwidths of the future will be. You cannot define today the one measure that will always define the speed of a connection. Maybe tomorrow there will be a next-generation proxy that permits downloading the high-res version at no cost, even if the bandwidth is low (it sounds a bit strange, but why not?).

I agree with Mike. Plus, I would add that bandwidth would be a tough measure; bottlenecks other than hardware appear and may lead to erroneous values. I mean, think of someone with a 4G device in a bad reception area, with an overloaded router, performing multiple downloads, etc. One could imagine a throughput test performed at regular intervals, but that seems a bit like forcing the issue.

Also, one unexplored area is how to re-use something we have had for a long time: progressive loading. What if you could load the image low-res and keep loading until you have the full resolution? Are there aspects of progressive loading that we can re-think to make it more practical?

The problem is that in order to be accurate enough, the download needs to be a good 400kb. You could have something similar to http://www.retinaswitch.com but for bandwidth. But problems arise on mobile, where the user could test as high-speed and then still be served high-speed websites when in a low-reception area.

I almost feel like my mind is being read sometimes when I visit this site.

I was pondering this in my head a few weeks back and thought “how much simpler life would be if…”

Bandwidth was one of the things it would be nice to query. The problem would be how to determine it without using lots of bandwidth in the first place, or without a JavaScript library that likely weighs a not-insignificant number of kilobytes (and relies on JS being enabled).

How accurate would/could it be? We know that a “screen” is a screen. The width/height tend to be fixed sizes, and it can easily be detected when they change to another set size. Bandwidth constantly fluctuates and can’t be determined “on the fly” as quickly, I don’t think.

It’s also interesting why the debate is occurring and what the potential uses are. I can see use cases for genuine images that are useful content, or for serving an HD video as opposed to an SD video. If it’s used as an excuse to serve (or not serve) bloatier designs with lots of images and backgrounds, then that’s a far less impressive use.

I would love to see your idea happen in real life. There is no reason why it couldn’t be a completely viable option. I tried solving this problem myself with http://www.RetinaSwitch.com but it is limited by those who use the API on their site. It would work if it was adopted but there should be a more elegant solution available for us to use.

Checking available bandwidth is fine but it shouldn’t dictate the user experience. I think the way it works now is ok, where the screen-size of the device is detected and then the appropriately sized image is served accordingly. You could further optimise these images to ensure excessive data is not used when serving a 320px image compared to a 1000px image.

Wow, there are many positive responses to this. Personally, I’m not that excited. Perhaps I’ve given it too much thought!

I’d hate to switch to a faster connection only to find sites are consuming more of it, potentially cancelling any benefit. It seems the faster the technology, the more it suffers from bloat/abuse. I can imagine many sites detecting I’m on a modest bandwidth thereby deciding to bombard me with their uninteresting or unnecessary marketing video material. I mean, ‘high quality’ is a subjective term.

Of course, network bandwidth is only an indication of the maximum potential. The user’s device may not be capable of processing the extra data (it could be low on memory, have a slow processor, etc), or the network bandwidth may be shared with many devices (usually is in my household).

I don’t see how a browser can decide the optimum speed also, because that varies greatly from site to site too, so it would have to be continuously measured.

It would be great to offer a ‘lite’ version of a site for very low bandwidth, but why media query it? Just have lite.css-tricks.com for example, so even people on high speed connections can use it if they wish. The problem with automatically choosing on behalf of the user is that the choice is not a user preference. A low-bandwidth user may not be that affected by visiting a website designed for high-bandwidth, particularly if client-side caching is effective.

Don’t like the syntax but like the general idea (why picture and not image?). Think lots of things like nested media queries, browser-version media queries & conditional tags should come first.

Don’t understand how the bandwidth query could be useful; surely with mobile devices, bandwidth is constantly fluctuating. So would you have to measure bandwidth on every page load? Also, doesn’t it defeat the point of having a media query if you then have to rely on additional scripts to calculate bandwidth?

Should it really be for a web developer to decide what is a suitable image size? Wouldn’t it be a good idea to have an image format that allows for multiple versions of the image (like .ico does) and then simply leave it to the browser to load the version it thinks is best? Just thinking out loud.

I think this would help the Web to be more global. There are a lot of first-world developers writing pages that work on first-world broadband and by being responsive to a low speed connection we could make pages accessible to more people.

I’m not just thinking in terms of giving lower-resolution images. How about ditching that massive background image in favour of just a background-color if you detect “dial-up”, or leaving out that huge JavaScript library? It can be used on a lot more than you first imagine.

Fair idea, Chris.
But I have a couple of comments on this.
#1
To identify bandwidth from the browser, it is better to download the large (actual) image. Bandwidth is not identical for a system every time; this would require a check on every HTTP request (otherwise it may be incorrect). The time needed to identify bandwidth should be less than the time an HTTP request takes.

#2
Providing a min-bandwidth feature in @media is not enough. CSS would also need to support nested styles, which we don’t have today.
Example:
@media (min-bandwidth: 10kbps) {
  .small-img img {
    background: url(small-img);
  }
}

It has been fascinating exploring this topic and people’s thoughts on it.

I found myself flopping back and forth between wanting to be able to set the user experience depending on bandwidth and feeling that ultimately it has to be the user’s decision.

I agree with Lee Kowalkowski to some extent: if we open up this technology, while most developers will probably use it wisely, it also opens a Pandora’s box for those not so ethical to clog up your viewing experience with lots of unneeded content (i.e. advertisements).

I think we need a bit of happy medium.

It would be nice to have the ability to choose for those users who are not savvy enough to choose for themselves, but it would also be nice to be able to set something in your browser that says either “broadcast my speed as …” or “I prefer lightweight sites.”

Ultimately for those that want to choose, they need to have the ability to choose. However, I don’t think this should slow the development of a standard that allows developers to find this information easily and reliably.

Simple: start with the low-bandwidth style sheet. Start a JS timer in the body, then time how long until the page-loaded event fires. The time tells you how long it took to load that content (and you know the size), so then you know the bandwidth. Then switch to the appropriate medium- or high-bandwidth stylesheet, or add ‘high res’ classes to things.

While we are at it, a media query class or id would be nice. Who wants to write media="(min-width: 400px)" for every img tag? It should be mediaclass="landscape-phone", paired with @media.landscape-phone (min-width: 400px) {...},
or something similar. Then we could turn the image source into a cascade that would automatically go up, even on resize. I don’t think you would ever want to load the smaller version once you had already loaded the large version.

A browser API that could be checked on page load would be great, where a user defaults to low quality and can go into the settings and raise it if they want to. I would hate to force every webpage to do a test load of data to test the connection on a mobile network.

I like the concept because we are increasingly needing to understand the user’s context (on-the-go vs on the couch) to create responsive websites. Bandwidth may be unstable but it would be nice to have a query for network type.

Don’t some video players adjust the video quality based on the user’s bandwidth?

I can’t find a good example with a quick search, but I’m pretty sure I remember watching the World Cup last go around and the quality would adjust according to a “bandwidth” bar at the bottom of the player.

Sure, but then I choose it for the specific video. To do that for the whole page (not just one video; we’re talking all images too), we’re back to the old “choose which browser you’re using” or “pick your resolution” links on a splash page…

I think the best idea would be to have a setting across all browsers that the user can modify, or that the browser can sort of sense if it ends up being variable, with a single set name.

Having it set to 100% or max would mean bring on the high data usage; 1% or min would mean “I’m sorry, but I have dial-up”; and 50% or half would be “totally average data speed in a developed country”.

And it would scale, percentage-wise, as devices move between networks.

It would be a user default like font-size:12px; and we would be editing the content bandwidth weight like a font size.

The only issue I see here is that bandwidth is a measurement over time when testing from a client. Therefore, you will need to take into account the time it takes to actually test the connection, in addition to the actual time to load the page. This information could be stored in HTML5 storage or on the server for future visits, but that partly defeats the purpose, as it would need to be remeasured each time your connection speed fluctuates. I could be overstating this, as it may be a negligible amount of time, but it is just a thought.
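One way to sketch that trade-off: cache the measurement so it isn't redone on every page view, while expiring it quickly enough to track a fluctuating connection. All names here are invented; the `storage` argument stands in for localStorage (or a cookie layer), and `remeasure` is the expensive test download.

```javascript
// Return a cached bandwidth sample if it is still fresh; otherwise
// remeasure and cache the new value. `now` is passed in (ms) so the
// logic stays testable; in the browser you'd pass Date.now().
function cachedBandwidth(storage, remeasure, maxAgeMs, now) {
  const raw = storage.speedSample ? JSON.parse(storage.speedSample) : null;
  if (raw && now - raw.at < maxAgeMs) return raw.kbps; // still fresh
  const kbps = remeasure();
  storage.speedSample = JSON.stringify({ kbps: kbps, at: now });
  return kbps;
}
```

Tuning `maxAgeMs` is exactly the tension the comment describes: too long and you miss network changes, too short and you pay for the test on every visit.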

a) We offer a lo-fi version, the user sets the option with a switch (icon, link, button, flag) and we set a cookie,

b) We detect bandwidth, set a cookie and offer an option

… and then…

a) We have a server side script that can create smaller (size/weight) images using GD or ImageMagick depending on the cookie set,

b) We set the options (mobile first?) using the data attribute. A javascript decides what to put on the screen depending on the cookie.

We could either dynamically generate CSS server-side or change the CSS file in use with AJAX.

Then again, we could simply set the rules for the lo-fi stuff in the same CSS file and attach them to a body ID, so you can change the rules, and deal with it all, just by adding (or changing) the body ID based on the bandwidth detection or the user selection… which can be announced when the page detects “Uh oh, this is taking a bit too long” (much like Gmail offers the simpler HTML version when loading takes too long).

I don’t suppose you even need to detect/measure bandwidth, just give the page x seconds to completely load. If it hasn’t, offer the lite option.

I’m not sure the CSS needs to be dynamically/responsively generated, and using AJAX may defeat the purpose too. I think it would be cleaner just to set a cookie or offer an alternate URI, e.g. lite.website.com, and use URL rewriting (e.g. Apache mod_rewrite) to select lightweight resources.

If you really wanted to emulate a kind of bandwidth media query, I suppose you could serve up low-file-size images by default and then perform a few different Ajax calls requesting the higher-quality images. Attach a low timeout value to the Ajax calls, and if the caller times out, then I guess it’s safe to assume they have low bandwidth. This would result in a funky effect where the site is littered with low-quality images and then the images just “magically” get better, though… and some front-end developers wouldn’t be able to hang with the amount of JavaScript involved. This would also be cumbersome on an image-heavy website, unless the img/div tags had some naming convention that was present within the .gif/.png/.jpg files themselves… there’s probably a handful of ways you could refine the process… but still a bit of a burden.

Perhaps you could also have options/settings for the user to request higher quality images by default and store their preference in a cookie for later visits.

This would be totally awesome and seems pretty logical to me. To be able to serve up a page based on bandwidth would be a boon to the many users who still don’t have access to the blindingly fast connection speeds common in Europe and the States.

That being said, I’m not sure we’ll see this rolled out anytime soon…which makes me sad. Is there anyone over at the W3C that we can slip 5 bucks to get this fast-tracked? ;)

I believe there is more to it than CSS can handle. My website has content coming from many different servers on the Internet. Some of the content is delivered from my 100Mbps connection; some comes from YouTube, Vimeo, Facebook, Rackspace, Akamai, Amazon S3, and so on.

What is CSS going to do if the user’s connection is fast, but the speed of one of the servers is slow? For example, if Amazon or Vimeo are having issues again, it is possible a page loads slowly on average, even though the rest of the content may be delivered at 100Mbps+ speeds. What will happen when some of the content is delivered by a transparent proxy at speeds much greater than the available Internet bandwidth?

I think it would be better to optimize content server-side. Let Apache and IIS dynamically reformat the images and content. Let’s have an option on the server where we can dynamically minify our HTML, CSS, and JS when they are syntactically correct. Of course, the web server should cache said minifications. Let Apache and IIS dynamically turn images into progressive downloads. Let me store an uncompressed image on my web server and rely upon the web server to dynamically resize it and make it progressive as the server determines is appropriate.

To differing degrees this exists with CloudFlare, MetaCDN.com, Amazon and Rackspace load balancers, Citrix NetScaler ADC, and so on. AOL dial-up had server-side image recompression.

This would be a great idea even if it were just whether the connection was WiFi or not. I keep thinking about those circumstances when rotating from portrait to landscape causes a new set of “wider” images to download.


Colophon

CSS-Tricks* is created, written by, and maintained by Chris Coyier. It is built on WordPress, hosted by Media Temple, and the assets are served by MaxCDN. The fonts are Source Sans and Source Code Pro. It is made possible by viewers like you who subscribe to The Lodge and through advertising for products and services I like.