Morning, I'm trying to speed up my website by doing everything YSlow suggests. But I want to test along the way, and every testing website I've tried shows different numbers with each test. Is there a site that simply shows the same speed each time? I can't test conclusively if the numbers aren't reliable.

The different numbers ARE reliable. Files take a different amount of time to download every time, depending on the path from the server to the browser and the other traffic on that path at the moment.

The solution is to run, say, a dozen tests with the same files, record all the times, and take the average. Then make your changes, run another dozen tests, and take the average again to compare against the previous one.
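As a rough sketch of that averaging idea (assuming Python; the URL is just a placeholder, not your real site), you could time repeated fetches of a page and take the mean. Timing any callable keeps the averaging logic separate from the network call:

```python
import time
import urllib.request

def average_time(action, runs=12):
    """Call `action` `runs` times and return the mean elapsed seconds."""
    elapsed = []
    for _ in range(runs):
        start = time.perf_counter()
        action()
        elapsed.append(time.perf_counter() - start)
    return sum(elapsed) / len(elapsed)

def fetch_page(url="https://example.com/"):  # placeholder URL
    with urllib.request.urlopen(url) as response:
        response.read()  # download the full body, roughly as a browser would

# before = average_time(fetch_page)
# ...apply your YSlow changes, then...
# after = average_time(fetch_page)
```

This only measures raw transfer time from one location, so treat it as a supplement to the testing sites, not a replacement.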

Ok thanks felgall. Quick question: one of the first things I'm doing is combining all my JavaScript. Are you certain it's faster to have one external script with five large blocks of code in it rather than five separate links?

Combining five files together saves four file lookups - a small saving.

Offsetting that is the situation where visitors only view pages that need some of those scripts, so one or more of the five never runs for them. By not combining the scripts, those visitors save by not downloading the scripts that the pages they visit don't use.

Another consideration is that if all the pages reference the same file, then someone who visits multiple pages only needs to download it once. (This is why, with huge JavaScript files such as the jQuery library, you are best off referencing somewhere like the Google-hosted copy, which at least some of your visitors will already have cached from a prior site that referenced it.)

What you change to speed things up for one visitor may slow things down for other visitors so you need to work out how people are most likely to access your site to work out which will give the biggest savings for the most visitors.
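If you do decide to combine the scripts, the build step can be as simple as concatenating the files in order. Here is a minimal sketch (the filenames are hypothetical) that marks where each original file began, which helps when debugging the combined file:

```python
from pathlib import Path

def combine_scripts(sources, destination):
    """Concatenate JavaScript files into one, marking where each began."""
    parts = []
    for name in sources:
        code = Path(name).read_text()
        parts.append(f"/* --- {name} --- */\n{code}")
    Path(destination).write_text("\n".join(parts) + "\n")

# Hypothetical example:
# combine_scripts(["menu.js", "slider.js", "forms.js"], "site.js")
```

A real minifier would also strip whitespace and comments, but plain concatenation is enough to test whether one request beats five on your site.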

I agree. Any real test will have to travel across the real Internet, and possibly wait for other real traffic. It's expected that the result will fluctuate. You'll simply have to run the test several times to get a sense of the typical or average times.

I've used pingdom before as well, and it seems like a great tool. Another very similar and also great tool is WebPageTest.

The tests will always show different speeds because your server will never be consistent enough to load at exactly the same speed every time. The same goes for Facebook: you could run a speed test against it 1,000 times and get a different time on each run. Servers always respond differently depending on the load at that particular moment from other users, websites, etc.