Beacon support isn't there right now, but it shouldn't be hard to add. I was going to look at having it send up HAR files and raw results, so that falls right in line with what you're talking about.

Also on the list to consider is a way for doing recurring tests (at a minimum for the private installs but maybe a fixed set of pages on the public site as well) which combined with ShowSlow would give you interesting trending reporting. I'll ping you offline because I'm working with someone on possibly standardizing the various interactions which would allow for even more creative mashups.

First off, I really love the idea of automating page tests. It looks like you have a great start for doing just that.

Currently, when I run a test, I look over the results and pretty much never come back to them again, or at least not after a week or so.

I would love to see hourly, daily, weekly, monthly, and yearly trend data for a particular webpage or even an entire website. I realize this could be done manually, but it would be very time-consuming, especially when the entire process can be automated.

The first thing I would focus on is getting the testing process to use a database instead of files. That way you have better control over the data.

The test history page could be set up a bit differently. You could have unique URL links (or titles such as My Homepage) on this page. Clicking one of these links could open a new page that lists the dates and times that tests were run for that particular URL. You could have links there to view hourly, daily, weekly, monthly, and yearly trend data if enough data exists for that URL. You could also have a "run test now" link that opens the test form with the data already populated.

On the actual test form, you could have a drop-down box that's labeled something like "Run Test" with the options "Once, Hourly, Daily, Weekly, Monthly, or Yearly". You could make this option available only to registered members, which would encourage people to register.

If a test fails to run successfully for whatever reason, you could have an automated email set up that informs the website owner that something went wrong with the test. This could be an indication that their website is currently down and they need to fix it.
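To make the idea concrete, here's a minimal sketch of what building that alert email might look like. The sender address, wording, and function name are all made up for illustration, not anything WebPagetest actually does:

```python
from email.message import EmailMessage

def build_failure_alert(owner_email, test_url, error):
    """Build the notification email for a failed scheduled test.

    Hypothetical helper: sender address and wording are invented."""
    msg = EmailMessage()
    msg["To"] = owner_email
    msg["From"] = "alerts@example.com"  # hypothetical sender address
    msg["Subject"] = "Scheduled test failed for %s" % test_url
    msg.set_content(
        "Your scheduled test of %s failed: %s\n"
        "This may mean your site is currently down." % (test_url, error)
    )
    return msg
```

The actual sending (via smtplib or whatever the server already uses) would be a separate step, which also makes this part easy to unit test.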

For the trend data, you could have Flash graphs where data points are represented by circles. Clicking on a circle could go to the test results for the corresponding point.

I'll be adding trending support at some point but I'm not sure if it will ever see the light of day on the hosted instance here (more for use in the private installs). The main reason is the test system capacity. Doing recurring tests hourly for a large number of sites would consume a significant amount of the testing infrastructure. I may do something where I have a set of industry sites that are trended and may do something where I allow registered users a couple of pages but I haven't figured out the details yet.

It's also a space where there are a fair number of commercial offerings (Keynote, Gomez, BrowserMob, WebMetrics, etc) - some of which have free offerings to an extent.

Internally at AOL we already use the guts of the WebPagetest infrastructure for exactly that purpose, so I'm pretty familiar with the different pieces that would have to be covered for a useful offering. We also have hundreds of test systems and a fairly large database cluster to run it on (and even there we gather exponentially more data than people actually end up using).

The first thing I thought of when I saw you put the automated testing in place was the test system capacity. I know a lot of automated tests would put a heavy load on the server.

First, perhaps you could have a limit on the number of tests per time period that can be run from a single domain name. That number could be tweaked over time as more people use the website.
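A per-domain cap like that could be as simple as a sliding-window counter. This is just a sketch under assumed numbers (10 tests per hour); the class and parameter names are mine, not anything from the site:

```python
import time
from collections import defaultdict, deque

class DomainRateLimiter:
    """Allow at most `limit` tests per `window` seconds from one domain.

    Illustrative sketch only; the limits would be tweaked over time."""
    def __init__(self, limit=10, window=3600):
        self.limit = limit
        self.window = window
        self._hits = defaultdict(deque)  # domain -> timestamps of recent tests

    def allow(self, domain, now=None):
        now = time.time() if now is None else now
        hits = self._hits[domain]
        # Drop timestamps that have aged out of the window.
        while hits and now - hits[0] >= self.window:
            hits.popleft()
        if len(hits) < self.limit:
            hits.append(now)
            return True
        return False
```

In-memory state like this would reset on a server restart, which for abuse throttling is probably acceptable; a database-backed count would survive restarts.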

Second, you could have a script to automatically delete hourly data if it is more than 24 hours old. Thus, the trend data for a particular webpage would only show the last 24 hours. Likewise, for daily data, you could automatically delete data after 7 days. For weekly data, data could be deleted after 52 weeks. For monthly data, you could delete data after 12 months. For yearly data, you could delete it after 10 years. Done this way, a single webpage would never have more than 105 database entries (24 + 7 + 52 + 12 + 10).
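Those retention rules boil down to a small table of per-granularity limits. A sketch of the pruning step, assuming a hypothetical list of (timestamp, result) entries rather than whatever schema the site would actually use:

```python
# Maximum retained entries per granularity, from the scheme above.
RETENTION = {
    "hourly": 24,    # keep 24 hours
    "daily": 7,      # keep 7 days
    "weekly": 52,    # keep 52 weeks
    "monthly": 12,   # keep 12 months
    "yearly": 10,    # keep 10 years
}

def prune(entries, granularity):
    """Keep only the newest N entries for a granularity.

    `entries` is a list of (timestamp, result) tuples; hypothetical
    shape, not an actual schema."""
    limit = RETENTION[granularity]
    return sorted(entries, key=lambda e: e[0])[-limit:]

# Worst case per page: one slot for every retained entry (105 total).
MAX_ENTRIES_PER_PAGE = sum(RETENTION.values())
```

Running something like this from a daily cron job would keep per-page storage bounded no matter how long trending runs.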

Third, you could require members to log in every 60 or 90 days or their trend data gets deleted. This would encourage people to come back to the website.

Fourth, you could have a limit on the number of webpages that track trend data for a particular member. This number could also be tweaked over time.

Fifth, paying for a service would also be a possibility. If you do this though, I think you should have some sort of trial period or maybe even a number of free tests a single IP can do per day. That way you could still reel people in to use the website.

WebPagetest will always be free and there won't ever be subscription services. I like it that way as it's less complicated and simplifies building a collaborative community (it also means not having to support SLAs, etc). I am looking at various options along the lines of what you are talking about.

The code already supports running tests at different priority levels, so recurring tests would be given a lower priority than on-demand tests. I'd just have to keep track of capacity to make sure the system didn't get over-provisioned.
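The priority scheme amounts to a priority queue where on-demand work always jumps ahead of recurring work, with FIFO ordering within a level. A sketch (this is my own illustration, not the actual WebPagetest queue code):

```python
import heapq
import itertools

# Lower number = higher priority; on-demand tests outrank recurring ones.
ON_DEMAND, RECURRING = 0, 1

class TestQueue:
    """Priority queue sketch: on-demand first, FIFO within a level."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tiebreaker preserves FIFO order

    def submit(self, url, priority):
        heapq.heappush(self._heap, (priority, next(self._counter), url))

    def next_test(self):
        priority, _, url = heapq.heappop(self._heap)
        return url
```

Capacity tracking would then be a matter of watching how deep the RECURRING portion of the queue gets and refusing new recurring schedules past some threshold.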

As far as storage goes, even the hourly results are not a problem; what gets intensive is storing the screen shots, HTTP headers, and the transaction details needed to generate the waterfalls. For recurring testing it's likely that screen shots and HTTP headers wouldn't be kept at all, and the details for the individual requests would only be kept for X days.

One other option I'm considering is that users who run/sponsor test locations would get X testing slots (well above what would be made available for regular registered users) which would encourage more hosting of the testing infrastructure.

I'm actually probably going to support a beacon rather than an FTP dump, so that when a test is complete it can post the data in real time to an external system as well.

In either case you'd have to build something to deal with the data (or use something off the shelf). I'm working with a few people in the industry to try to standardize the interfaces which would make it a lot easier to plug and play.
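The receiving side of a beacon like that is basically an HTTP POST of the completed test's data. A rough sketch, with the payload field names entirely assumed since the interfaces aren't standardized yet:

```python
import json
import urllib.request

def build_beacon_payload(test_id, results):
    """Assemble the JSON body for a completion beacon.

    Field names ("testId", "results") are assumptions, not a settled
    format."""
    return json.dumps({"testId": test_id, "results": results}).encode("utf-8")

def send_beacon(beacon_url, test_id, results):
    """POST the finished test's data to an external collector."""
    req = urllib.request.Request(
        beacon_url,
        data=build_beacon_payload(test_id, results),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Whatever standard shakes out, keeping the payload as plain JSON (or a HAR file, as mentioned above) would make it easy for off-the-shelf tools to consume.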