It’s the World Cup again. Being a Brit, I am on tender hooks with the first England game coming up tomorrow with the USA. A family feud for me. We start to see great microsites such as the Twitter @worldcup site, and as we think about what the fastest goal will be… what about the fastest website?

Analysis: it takes almost 4s until the user sees a visual indication of the page loading (the Time to First Impression). That is definitely too long and should be improved

Recommendation: < 1s is great. < 2.5s is acceptable

Time to onLoad: 8.25s

Analysis: it takes the browser 8.25s to download the initial document plus all referenced objects before it triggers the onLoad event, which allows JavaScript to modify the page after it has loaded. Again, much too slow; nobody likes to wait more than 8s for the content to load

Recommendation: < 2s is great. < 4s is acceptable

Time to Fully Loaded: 8.6s

Analysis: the page loads additional resources triggered by JavaScript onLoad handlers. I consider the page fully loaded once all of these additional requests have been downloaded. I guess I don’t need to mention that 8.6s is not fast

Recommendation: < 2s is great. < 5s is acceptable

Number of HTTP Requests: 201

Analysis: 201 is a lot of elements for a single page. Many images are the main contributor to this load. My first thought: let’s see how we can reduce this number, e.g. by merging files (more details later)

Recommendation: < 20 is great. < 100 is acceptable (this one is a hard recommendation to give, as it really depends on the type of website, but it is a good start to measure this KPI)

Number and Impact of HTTP Redirects: 1/1.44s

Analysis: this is a very expensive and seemingly unnecessary redirect from http://www.fifa.com/worldcup to http://www.fifa.com/worldcup/

Recommendation: 0. Avoid redirects whenever possible
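A trailing-slash redirect like the one above can often be avoided by normalizing the path on the server and serving the page directly. A minimal sketch (a hypothetical helper, not FIFA’s server code):

```javascript
// Treat a directory-style URL with and without a trailing slash as the
// same resource, so /worldcup is served directly instead of answering
// with a 301 to /worldcup/ and costing the user a full round trip.
function normalizePath(requestPath) {
  return requestPath.endsWith('/') ? requestPath : requestPath + '/';
}
```

The routing layer would look up the resource under `normalizePath(req.url)`; the user sees the content 1.44s sooner.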

Number and Impact of HTTP 400’s: 1/0.71s

Analysis: there seems to be a JavaScript file that results in an HTTP 403 Forbidden response and takes a total of 0.71s.

Recommendation: 0. Avoid any 400’s and 500’s

Size of JavaScript/CSS/Images: ~370kb/220kb/890kb

Analysis: the size of individual mime types is always a good indicator and helps with comparisons to other sites and other builds. 370kb of JavaScript and 220kb of CSS can probably be reduced by using minification techniques or by getting rid of unused code or styles

Recommendation: it is hard to give a definite threshold value. Keep in mind that these files need to be downloaded and parsed by the browser; the more content there is, the more work for the browser. The goal must be to remove all information that is not needed for the current page. I often see developers packing everything into one huge global .js file. That might be a good practice, but too often only a fraction of this code is actually used by the end user. It is better to load only what is needed at the beginning and delay-load additional content when it is really needed
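The delay-loading idea can be sketched as a tiny caching wrapper: nothing is fetched until the first real use, and repeat uses cost nothing. `lazyResource` is a made-up helper, not an API from the article; in a real page `fetchFn` would inject a script tag or fire an XHR:

```javascript
// Wrap an expensive loader so it runs at most once, and only when the
// resource is actually needed, instead of during the initial page load.
function lazyResource(fetchFn) {
  let cached;          // holds the loaded value once fetched
  let loaded = false;  // tracked separately so falsy values cache too
  return function () {
    if (!loaded) {
      cached = fetchFn(); // first call pays the download cost
      loaded = true;
    }
    return cached;
  };
}
```

For example, code that drives a photo-gallery widget could be wrapped this way and only fetched when the user first opens the gallery.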

Max/Average Wait Time: 4.31s/1.9s

Analysis: this means that resources have to wait up to 4.3s to be downloaded, and 1.9s on average. This is way too much and can be reduced either by reducing the number of resources or by spreading them across multiple domains (Domain Sharding) in order to allow the browser to use more physical connections.

Recommendation: < 20ms is good. < 50ms is acceptable (as you can see, we are FAR OFF these numbers in this example)
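Domain sharding boils down to a stable mapping from resource path to one of a few hostnames, so the browser can open more parallel connections while each resource still always comes from the same host (which keeps browser caching effective). A sketch with hypothetical shard hostnames:

```javascript
// Hypothetical shard hostnames; a real site would point several
// CNAMEs at the same static-content servers.
const SHARDS = ['img1.example.com', 'img2.example.com', 'img3.example.com'];

// Deterministically pick a shard for a resource path. The same path
// always maps to the same shard, so cached copies stay valid.
function shardFor(resourcePath) {
  let hash = 0;
  for (let i = 0; i < resourcePath.length; i++) {
    hash = (hash * 31 + resourcePath.charCodeAt(i)) >>> 0; // stay in uint32
  }
  return SHARDS[hash % SHARDS.length];
}
```

Note that sharding trades extra DNS lookups and connects for parallelism, so two to four shards is usually the sweet spot; more tends to hurt.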

Single Resource Domains: 1

Analysis: from the timeline we can also see that there is one domain that serves only a single resource; in this particular case it seems to be serving an ad. We can assume that this might not be changeable, but this KPI is a good indicator of whether it is worth paying the cost of a DNS lookup and connect to download only a single resource from a domain

Recommendation: 0. Try to avoid single-resource domains. It is not always possible, but do it if you can

The KPIs tell me that the page is way too slow; especially the Fully Loaded time of 8.6s needs to be optimized. With the KPIs we can already think about certain areas to focus on, e.g. reducing network roundtrips or minimizing content size. But there is much more. Let’s have a closer look into 4 different areas.

They then get into analysis of the network, caching, and JavaScript execution, and it all adds up to an F :/

Browser Caching: F – 175 images have a short expires header, 4 have a header in the past
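Fixing short or past-dated expires headers usually means giving static, versioned resources a far-future lifetime so repeat visitors skip the download entirely. A minimal sketch of such a policy (a hypothetical helper emitting standard HTTP/1.1 caching headers):

```javascript
// Build far-future caching headers for a static resource. Callers pass
// the lifetime in seconds; `now` is injectable to keep the output
// deterministic for testing.
function cacheHeaders(maxAgeSeconds, now = new Date()) {
  const expires = new Date(now.getTime() + maxAgeSeconds * 1000);
  return {
    'Cache-Control': 'public, max-age=' + maxAgeSeconds, // HTTP/1.1 clients
    'Expires': expires.toUTCString(),                    // HTTP/1.0 fallback
  };
}
```

For example, `cacheHeaders(31536000)` sets a one-year lifetime; this only works safely if the resource URL changes (e.g. a version suffix) whenever its content changes.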

The site loaded immediately for me in (gasp) IE8; it took a little longer in Chrome 5, but the animations are much smoother. Page-to-page navigation is really fast as well. My experience is nowhere near the times the article claims. Nice looking site, I wouldn’t change a thing.

Being an American, I have no idea what a “World Cup” is for or what kind of drink it would be used for. It sounds impressive, but could it really be larger than a Super Ultra Mega Big Gulp? If you Brits want to challenge us Americans on cup size, you’d better come prepared.

Also, I believe you mean “tenterhooks”. And I know it has nothing to do with this article but, just for the record, there is only one Math.

Hopefully this is quite relevant. I’m pretty confident that you won’t find a faster micro-site than this demo we’ve thrown together to try and encourage people to enter our World Cup 2010 Real-Time Push Web App competition.

Ajaxian itself could learn from this article as well; I ran a similar test on the Ajaxian index with these results (ref. webpagetest.org, empty-cache result):

Time to first impression/drawing: 3.24s
Time to onLoad: 25.82s
Number of HTTP requests: 181
Number of redirects: 3
Number of HTTP 400’s: 2
Size of JavaScript/CSS/images/Flash: 238kb/75kb/2.6mb/1.1mb
Total number of DNS lookups: 36

All in all, Ajaxian scores an F as well: no long-term expires headers, no use of HTTP compression, no optimization of images (some images are just downscaled in HTML), way too many requests, and all in all a sluggish performance.

Besides, I have always wondered why some sites post full articles on their index page instead of just a lead and a link to the full article.