FYI, I'm contemplating some changes to the grades from WebPagetest and wanted to see how people felt about them:

CDN - Change this from a grade to a yes/no (and probably green/yellow) since CDNs don't make sense for all sites. The big red F has been hard for people to ignore, even if they serve a small market that is all geo-located close to their servers.

JS/CSS Combine - Change the grading for this to account for browsers not blocking on JS download and (usually) using 6 concurrent connections. I'm considering only starting to deduct points after 5 files before start render (to allow the base page to still be downloading) if they are on the same domain, or 6 on a separate domain, and then deducting a grade for every 6 files (any mix of JS and CSS). IE 6/7 will still suck pretty badly, but it's not nearly as much of a problem for the more modern browsers.
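A rough sketch of how that combine scoring could work. This is my own reading of the proposal, not a final spec; the function name, the ceil-per-6 grouping, and the grade ladder are all assumptions:

```python
# Hypothetical combine grading: count all JS + CSS requests before start
# render as one pool, allow 5 for free on the base page's domain (6 on a
# separate domain), then lose one letter grade per started group of 6 extras.
GRADES = ["A", "B", "C", "D", "F"]

def combine_grade(pre_render_files, same_domain_as_base=True):
    allowed = 5 if same_domain_as_base else 6
    extra = max(0, pre_render_files - allowed)
    steps = -(-extra // 6)  # ceiling division: any started group of 6 costs a grade
    return GRADES[min(steps, len(GRADES) - 1)]

print(combine_grade(5))                             # within the free allowance -> A
print(combine_grade(6))                             # one file over -> B
print(combine_grade(12, same_domain_as_base=False)) # 6 over the 6 allowed -> B
```

Whether the first extra file should immediately cost a grade, or only the 6th, is exactly the kind of ambiguity the thread goes on to discuss.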

First Byte Time - I've seen WAY more back-end problems than I would like (usually CMS systems like WordPress without caching implemented) and those are getting a complete pass currently. I'm considering adding a new grade for the first byte time, using the socket connect time as a baseline (since that is the native RTT). I'm thinking RTT + 100ms = A and then deduct a letter grade for every 100ms after that. Any redirects would pretty much trip this automatically.
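A minimal sketch of that first-byte rule as described (function and variable names are mine, and the grade ladder below F is assumed to just floor at F):

```python
# Hypothetical first-byte grading: the socket connect time stands in for
# one native RTT, so TTFB within RTT + 100 ms earns an A, and each
# additional 100 ms costs one letter grade.
GRADES = ["A", "B", "C", "D", "F"]

def first_byte_grade(ttfb_ms, connect_ms):
    target = connect_ms + 100            # RTT + 100 ms earns an A
    over = max(0, ttfb_ms - target)
    steps = int(over // 100)             # one grade lost per full extra 100 ms
    return GRADES[min(steps, len(GRADES) - 1)]

print(first_byte_grade(180, 50))   # target 150 ms, 30 ms over -> A
print(first_byte_grade(420, 50))   # 270 ms over -> two grades off -> C
```

Under this rule a redirect alone (an extra round trip plus server time) would usually blow well past the 100 ms budget, which matches the "redirects trip this automatically" point above.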

I'm with you on CDN and First Byte Time. Can you describe more of what the grading algorithm would look like for JS/CSS combine? Even with browsers that don't block and support concurrent connections, combining files makes sense, IMO.

JS/CSS Combine. This is not that simple.
a) The current check is for JS/CSS files in the HEAD only, so it fails on those ASP.NET pages with lots of JS files at the top of the BODY, which equally hurts rendering.
b) Modern browsers do lookahead when bumping into blocking JS files loaded with script src=file.js. This means the browser will look further into the HTML and start downloading resources, if a connection is available. But the browser will not continue rendering! So yes, in IE6/7 it's worse because these don't do the lookahead and scripts block downloading, but even in modern browsers, the rendering is blocked until that JS file has been loaded, parsed and executed.

Actually, the current JS/CSS check is for anything that loads pre-render, so even files at the top of the BODY should be detected.

Optimally it would be zero JS before render, but it may be a bit too soon to be pushing that hard. With the concurrent loading behavior, there should be no difference between 1 and 6, right? That's basically the change I'm suggesting, but instead of treating CSS and JS as different pools of allowed requests, lump them all together and put a limit on how many you can load pre-start render.

Another option would be to just detect the blocking activity explicitly and flag that but the logic around that might be more difficult. That way conditional styles or inline script would also be caught.

@obiwankimberly

Combining files may make sense (though Bryan had a good example of where separate files would actually be faster because of slow start) but collapsing down to 1 is not as critical as it used to be. Additionally, if you can get better re-use across your site by having 2 or 3 of each then you might actually be able to deliver smaller files and do incremental updates (a site-wide file, a template-specific one and a page-specific one).

I don't want to just make changes for the sake of changing things though (particularly to the main grades). The grades have always been the "these ALWAYS apply and fix these before you even consider looking at anything else" issues and I think the JS/CSS combining is losing some of that certainty.

Another thing Pat re: the JS/CSS combine.
If the browser is using the 6 connections to fetch CSS and JS files, it cannot fetch other assets. That is another reason to do combining.
- 1x JS
- 1x CSS
those start loading while the HTML is coming in
Browsers have 3 connections left for fetching other assets, probably images.
Once the HTML is in - and the JS and CSS are still loading - another connection is available for fetching a resource.
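The connection arithmetic above can be made explicit with a tiny sketch (assuming the usual 6 connections per hostname and everything served from one host; the function name is mine):

```python
# Back-of-the-envelope connection budget: the HTML document, each
# in-flight JS file, and each in-flight CSS file all occupy one of the
# ~6 per-host connections, leaving the rest for images and other assets.
PER_HOST_CONNECTIONS = 6

def connections_free_for_images(html_still_loading, js_in_flight, css_in_flight):
    in_use = js_in_flight + css_in_flight + (1 if html_still_loading else 0)
    return max(0, PER_HOST_CONNECTIONS - in_use)

print(connections_free_for_images(True, 1, 1))   # HTML + 1 JS + 1 CSS -> 3 left
print(connections_free_for_images(False, 1, 1))  # HTML done -> 4 left
```

It also shows the flip side of the argument: with, say, 3 JS and 3 CSS files in flight plus the HTML, nothing is left for images at all.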

But .... you may not always want to combine all CSS and all JS into one file, *especially* if you are putting that big JS file in the HEAD.

Re: the JS files at the top of the BODY ... I've often seen the grade not being an F, so probably those JS files were high up in the BODY but not blocking rendering entirely. From what I remember, nothing really was rendered, so the UX was bad, but yeah ... WPT can't know that.

Now how to create a good rule for this?
BTW, I totally agree on this:

"The grades have always been the "these ALWAYS apply and fix these before you even consider looking at anything else" issues and I think the JS/CSS combining is losing some of that certainty."

Maybe it makes more sense to just demote the combine check to be part of the optimization details score card (with warning icons on the resources that should be looked at) and remove it from the main grades (putting the first byte check in instead)? Particularly given that it's not all that clear-cut for all sites.

-1 for that, because I think in *many* cases combining will help.
I often see that moving the JS out of the HEAD and/or loading it in a non-blocking way is hard for devs; too little knowledge & experience to figure out how to do that in a way that works well cross-browser, meaning it takes a lot of time to do ... resulting in not doing it. Only good thing left to do: combine.

Hmm ... maybe you can indeed take it out of the main grades, put it in the Perf Optim list, and put a few pointers in there (try using a script loader like LABjs to load multiple files in non-blocking way, while preserving exec order and combine with inline scripts, etc etc).

First, combining these two, when they are up there in the rendering path, is a huge, and sometimes simple, gain. Our former site would probably serve very well as a "worst practice example", like the "CNN of JS/CSS concatenation". We had, when this portal moved under my responsibility, something like 30 CSS files and 15 JS files in the HEAD. And the impact IS huge. So I would still see it as a main grade.

Second, we have different behaviour on at least IE7 and IE8 regarding JS in the body of the base page included via script tag. I wrote something about my observations in my blog. The point is: I get different grades on this one if I test the page with IE7 compared to when tested with IE8. And that is somehow confusing. (The IE8 engine seems to look ahead and pull the download of external JS scripts in the body forward, which results in a worse grade.)

I am a little bit hesitant to take it off the main grades (as important as I think this still is), because a valid algorithm for this beast doesn't seem too obvious off the top of our heads yet.

Kind regards,
Markus

P.S.: @Pat: Just watched the Lightning Demo Video of yours on Velocity in front of my Laptop together with Volker Hochstein. Felt like home :-) And hats off for all the new stuff.

Time to first byte is not always in the control of a webmaster. (It is, but technically it's for the advanced only, not for your average WordPress blog owner.) Giving a grade will cause people to switch hosts or become extreme in their changes, often needlessly.

There is an awesome WordPress plugin called 'Debug Queries' that tells you just how SQL-intensive your site is. Other plugins like W3 Total Cache can cache database calls, and 'Debug Queries' will show that your pages go from 8 (up to 50 calls on some blogs) down to 2-3 database calls. Needless to say, at 2-3 calls the time to first byte improves and the % of page-load time swings to the code side. I like to see sites spending 97% of their page-load time on the code side and 3% or less on database calls.

Can webpagetest detect database optimization and incorporate it? If so I'd .