A while ago Google introduced Google Web Light and I finally got around to taking a closer look at it. So far it hasn’t received much attention in general, which I suppose has to do with the fact that it’s only meant to be rolled out in a few specific countries. Nevertheless, it again underlines that performance is an important factor to consider when designing and building websites, and I find that Google is going quite a long way here to provide a faster web experience for countries less privileged in terms of mobile bandwidth.

I believe this is actually something that should be taken care of by us, the designers and developers of the web, but I’m not going to get into that here, since it would be more than enough material for another post. And there have been many.

From the official Google Webmaster Central Blog:

In two weeks, we’re starting a field test in Indonesia to provide streamlined search results and optimized pages when the user searches on slow mobile connections, such as 2G. Our experiments show that optimized pages load four times faster than the original page and use 80% fewer bytes. As our users’ overall experience became faster, we saw a 50% increase in traffic to these optimized pages.
[…]
Update on June 11, 2015: Following the successful field test in Indonesia, soon we will be bringing this feature to India and Brazil.

I have tried it out on a few sites and it seems to do its intended job quite well. The results aren’t too pretty, but in the majority of cases you do get to enjoy the main content, since it is mostly readable. I’d even argue that the quality of the site displayed through Google Web Light is a good indicator of how well and how progressively enhanced a site has been built.

There are quite a few things going on before a light site is shown to a user. For the most part it’s all about removal and compression. Below is a list of things that get stripped or modified. I don’t claim it’s complete, but it’s what I’ve noticed so far:

Main navigation and site logo get removed (I think this is usually the whole header, which even works if no <header> tag is used)

All media queries are removed from the stylesheet, which leaves you with the mobile version (sort of), no matter whether the site was built mobile-first or desktop-down (media-query-wise)

Class attributes are removed and some of the original site’s CSS is inlined (I’m not sure what gets stripped here and why, since some rules containing e.g. margin-bottom and margin-right only retain margin-bottom, but not margin-right. This behaviour doesn’t seem consistent, since not all properties get stripped the same way, as some of the observed pages have shown)

List style is set to none, removing bullets and thus even overriding UA default styles

All images are inlined and base64-encoded, after being resized and compressed a LOT. The compression for JPEGs seems to be around the ~10% quality mark. (In one of my tests a 340 KB image came down to under 6 KB this way)

I remember seeing the Chrome DevTools App a while ago, when it made the rounds on Twitter. It was quite a cool idea, but I didn’t really look into it and didn’t see its full potential. When I recently met Kenneth in Hong Kong and he used the chance to practice his talk “The Future of DevTools with RemoteDebug” in front of a small group of people at our friend’s office, I started to realise the full potential of what the standalone DevTools app was meant to prove. We then had him speak at our monthly Harbour Front event, where he demoed debugging with Chrome DevTools and BrowserRemote, which got me even more excited.

Last year I came across the iOS WebKit Debug Proxy, which I liked using and thought was a great tool, until it broke because of a change in the DevTools protocol. That has now been fixed, and I can happily debug Safari on iOS with the Chrome DevTools again ;)

Being able to use your favourite DevTools to debug any browser you want is just perfect and makes debugging so much more fun, since there is no need to leave your comfort zone (aka favourite work environment) to get things done.

Not only because of this, but for many other reasons too, the RemoteDebug Initiative deserves, and IMHO needs, more support and engagement. Try it out, spread the word and, for better workflows in our development future, support it in any way you can!

This has never been an issue before and I’m not sure why UnCSS now wants to parse these external URLs (even though I can see how that might make sense…), but it seems to be due to some of the recent changes in how UnCSS works. I filed an issue on this, but so far haven’t received any feedback.

As the current workaround to circumvent this behaviour and to avoid Typekit URLs being followed, I added a RegEx pattern to the ignoreSheets option:

options: {
  ignoreSheets: [/use.typekit/]
}
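The effect of that option can be sketched like this (a simplification of the matching, not UnCSS internals, and the stylesheet URLs are made up):

```javascript
// Any stylesheet URL matching one of the ignoreSheets patterns
// is skipped entirely instead of being fetched and parsed.
const ignoreSheets = [/use.typekit/];

// Hypothetical stylesheet URLs a page might reference:
const stylesheets = ['https://use.typekit.net/abc123.css', '/css/main.css'];
const parsed = stylesheets.filter(url => !ignoreSheets.some(re => re.test(url)));

console.log(parsed); // [ '/css/main.css' ]
```

So the Typekit sheet never gets followed, while local stylesheets are still processed as usual.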

Grunt UnCSS error #2 with .php input files:

I had always specified one HTML file and one PHP file for the input list, but that configuration now fails.

I’m quite surprised this doesn’t work anymore, since I’ve been specifying the PHP file for quite some time, even though it seems to be a fairly widespread problem with PhantomJS. Unfortunately, I haven’t found a better workaround than removing the PHP file from the input list for now :(

After the two steps listed above, Grunt UnCSS again runs as expected and works just fine. Ignoring the Typekit URL of course isn’t a problem at all, and I can also live with removing the PHP file from the input, even though I’d prefer being able to use it.
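Both workarounds together might look something like this in a Gruntfile (a sketch with hypothetical file names and paths, not my actual config):

```javascript
// Hypothetical Gruntfile excerpt combining both workarounds:
// only HTML files in the input list, and Typekit sheets ignored.
module.exports = function (grunt) {
  grunt.initConfig({
    uncss: {
      dist: {
        options: {
          ignoreSheets: [/use.typekit/]
        },
        files: {
          // The PHP file has been removed from this input list.
          'dist/css/tidy.css': ['index.html']
        }
      }
    }
  });
  grunt.loadNpmTasks('grunt-uncss');
};
```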

tl;dr: Being put on “lowered priority to access your network”, aka throttling, makes for a pretty sad experience of browsing and using the web. Unless there are no images involved, which is hardly ever the case. And it’s everybody’s job to try to make this better.

During December a WebRTC video chat sucked up my remaining 7 GB of data, reached the ‘fair usage data threshold’, and thus the provider ‘lowered my priority to access the network’. I’m not quite sure why they call it that, but since pictures speak a thousand words, here’s a screenshot of what the mystic term ‘lowered priority to access the network’ actually means:

To translate the above download speeds to some reference points:

0.02 Mbps = 20 kbps
0.15 Mbps = 150 kbps

GPRS: 60 kbps
EDGE, good connectivity: 250 kbps
3G, good connectivity: 850 kbps
DSL: 2 Mbps
Cable modem: 6 Mbps
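To put those speeds into perspective, here’s a back-of-the-envelope calculation for downloading a single, hypothetical 340 KB image, ignoring latency and protocol overhead:

```javascript
// Rough download times for one 340 KB image at different speeds.
// Speeds are in kilobits per second, so convert kilobytes to kilobits first.
const kilobits = 340 * 8; // 2720 kilobits

console.log((kilobits / 20).toFixed(0) + 's');   // throttled, 20 kbps → "136s"
console.log((kilobits / 850).toFixed(1) + 's');  // 3G, 850 kbps → "3.2s"
console.log((kilobits / 2000).toFixed(1) + 's'); // DSL, 2 Mbps → "1.4s"
```

Over two minutes for one image at throttled speed, versus under two seconds on DSL.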

From the screenshot above we can see that the download speed is very slow, somewhere between GPRS and EDGE most of the time. December 16th was the day I got throttled; the readings before that show the connection speeds I usually get to enjoy.

So, out of curiosity about how this actually affects real-life usage and browsing, I ran some tests on a few sites, which, as you can probably imagine, took quite some time to load.

This is not meant to be a comparison between sites, saying one is better than another; they all have different requirements, serve different content and so on. It’s more of a showcase of what happens to your browsing when you’re throttled like this.

I picked, somewhat at random, sites that have something of a reputation for caring about performance, or that might be part of your daily use.

These sites have all been tested in their ‘desktop’ versions on a late-2011 MacBook Air tethered to my throttled phone data connection. I actually wanted to test from the phone itself, but couldn’t get the iOS Remote Proxy to work properly, so load times have been calculated from window.chrome.loadTimes() in Chrome. Furthermore, all sites have been tested from Hong Kong, which might slightly slow things down; I believe sites load faster from Europe, or even more so from within the US, as I can see a difference quite frequently.

I was mainly interested in how long it takes until you actually get to see something on screen and can start reading, rather than in the full load times. Unfortunately, for some of the tests firstPaintTime and/or finishLoadTime wasn’t recorded correctly, which I didn’t notice at the time, so those values aren’t available. I wanted to recreate the conditions by throttling the connection with the Network Link Conditioner, but I somehow feel it doesn’t create the exact same conditions.
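For reference, the two metrics can be derived from the loadTimes() object like this (a sketch with a hypothetical helper and made-up sample values; chrome.loadTimes() was a Chrome-only API and has since been removed in favour of the standard Performance APIs; its values are seconds since the epoch):

```javascript
// Derive time-to-first-paint and total load time from a loadTimes() object.
function paintAndLoad(t) {
  return {
    firstPaint: +(t.firstPaintTime - t.startLoadTime).toFixed(2),
    totalLoad: +(t.finishLoadTime - t.startLoadTime).toFixed(2),
  };
}

// In the Chrome console this would be: paintAndLoad(window.chrome.loadTimes())
// Hypothetical sample values:
const sample = { startLoadTime: 100.0, firstPaintTime: 101.09, finishLoadTime: 149.08 };
console.log(paintAndLoad(sample)); // { firstPaint: 1.09, totalLoad: 49.08 }
```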

Here are the results from the sites I have correct data from. Unfortunately it’s also only one recorded result each, but I’m intrigued to go into throttle mode once again to make these tests more extensive. As I said before, these values should rather be seen as approximate, since depending on the throttled network, results might vary over the course of a day.

site                   time to first paint   total load time
smashingmagazine.com   1.09s                 n/a
creativebloq.com       6.74s                 n/a
etsy.com               n/a                   49.08s
lonelyplanet.com       12.30s                4.91s
cathaypacific.com      24.76s                146.21s
lufthansa.com          n/a                   124.91s
united.com             14.44s                70.72s
bostonglobe.com        29.35s                332.75s
lanecrawford.com       30.36s                146.52s

The one thing I noticed was that smashingmagazine.com was blazingly fast every time I loaded it, so much faster than any other site. Remarkable job and well done :) For all the others, including some sites I just tried while surfing around throttled, the web can turn into quite a painful thing unless there are no images involved, which is quite a rare case.

The throttling might be just an edge case, but it’s a great way to see whether things work ok-ish under these conditions, because then it’s probably a great (or rather, fast) experience when you’re on a fast connection. I’m glad to see so many people out there working hard on improving site performance and developing better performance tooling. But in the end it’s all of us who have to make a lot of smart and educated decisions about when to load what and how much of it, so we can improve not only our own, but every user’s life. Because not everyone has the privilege of being on a broadband connection every single day. Yet.

A happy, healthy and also more secure year to you, dear reader!
I generally don’t make any New Year’s resolutions, but of course there are things that need to get done at some point. I’ve been planning to update this site for quite some time, so the quiet start of the year is a good time to do that. Or at least to start somewhere, since this will probably be more of a step-by-step process over some time.

One thing I’ve been planning was to get an SSL certificate to serve these pages over a secure connection. There have been a few articles on that recently, and I also wanted to find out how much of an impact it has on performance. If you’re interested in finding out more about the why and how of serving pages via SSL, have a look at this question on Quora and this document from Google.

I started off with an SSL certificate from my hosting provider, but after setting it up and testing, it only achieved an overall rating of B, with issues that could only be resolved on the server, which in this case isn’t possible for me to do. Since I was looking into setting up a CDN as well, I went with Cloudflare as the CDN and also use their SSL certificate, which in the end was much easier to install and rates an overall A.

So far everything seems to work very well and I can’t notice any big difference in performance. SSL negotiation seems to slightly increase the wait for a first response, but overall it doesn’t affect the load times much. I can’t really say yet whether the use of a CDN makes asset loading faster in turn, but this will show over time, I suppose.
Have a good start into the new year and happy secure browsing!

A while ago I started working on a new side project called “Missed in HKG” with the main purpose to polish up my rusty JavaScript skills.

Here goes the back story: Many years ago I used to live in San Francisco and by that time, was a frequent reader of the two city magazines, the San Francisco Bay Guardian and SF Weekly. I can’t remember which one or if both, but there was a popular section called “Missed Connections”, which always was a fun read.

It also reminded me that I had been looking for a good idea to build a side project around: a project with a purpose and a manageable scope, so it wouldn’t be too much work, because these things might otherwise never get finished. It needed to have all the functionality I was interested in trying out, while being basic enough not to eat up too much time. Building a ‘missed connections’ site seemed like the perfect scope and a fun idea.

I started to look at what else was out there, and besides probably the most well-known missed connections section on Craigslist, I didn’t find much, and have to say that not one of the sites I found seemed any good. Design-wise it almost felt like a little time travel back to the early 2000s, and for obvious reasons none of them displayed nicely on mobile. To me, displaying ‘ok’, and thus being usable, on a mobile phone is the minimum requirement for such an app. To fill that gap and provide a prettier, better and faster mobile alternative was kind of a no-brainer, and my new side project was born.

I talked to a friend of mine about what I wanted to do and he agreed it was a fun little project to work on and offered his help to build a proper backend and API for it. This was a huge improvement to the whole project and we started working on it.

To make a long story short, the site was built and has been finished for some time now, only getting some minor tweaks and improvements here and there.

So far we never officially announced it, since there was always something more important to do. To not carry one more to-do over into 2015, consider this the launch announcement for “Missed in HKG”, our own version of Missed Connections for Hong Kong.

Time flies and it’s almost the end of the year… Still, we have planned the 4th edition of Harbour Front Monthly to send you off into the new year with some more practical takeaways. This time our speaker Kristin Low will focus on learning UX by doing UX.

How do you learn UX? As a teacher of UX, I’m often tasked with helping to impart a designer-ly view of the world into the mind of students. Although each student learns differently, one common factor applies to all students: learning by applying.

The event will take place on Wednesday, December 17th in our usual location at the Hive in Wan Chai. You can register here and we’d love to see you there.