Update on Add-on Performance Testing

Just over a week ago we announced our add-on performance initiative and have received lots of feedback from add-on users and developers. Thanks to our awesome community, many developers have updated their add-ons to have faster start-up time, and others have dug into the results and filed bugs to help us improve our testing framework. Our automated tests don’t work perfectly with every add-on, so if you aren’t able to reproduce our results, please file a bug and let us know.

One of the most popular components of our performance initiative is the Slow Performing Add-ons page. We’ve made some changes to the page to describe the performance results shown more accurately and to exclude add-ons with minimal performance impact from the list. Now, only add-ons with an impact of 25% or more are displayed, rather than an ordered list of every add-on tested. Additionally, we’ve corrected some wording that referred to these add-ons as the “slowest”. We apologize for this oversight.

We initially planned to begin displaying warnings for add-ons that add more than 25% to start-up time this week, but will delay that until we can verify and fix any major issues discovered by the community. However, all of the add-ons currently displayed on the page have been verified as causing a significant impact on Firefox start-up through manual testing and real-world data, in addition to the automated testing.

As we said in the announcement, this is only the beginning of our work to improve add-on performance and educate developers about it, and we’ll continue improving our tools and documentation to help.

All of them. These percentages are based on an average Firefox startup time of 500 ms. In the real world, however, what you get is cold startup (that’s what users mainly complain about) and a non-clean profile state, so Firefox startup takes 1 to 1.5 seconds, while add-on initialization times typically don’t change.
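To make the arithmetic behind this point concrete, here is a small sketch. The 500 ms warm baseline and the 1–1.5 s cold-startup range come from the comment above; the 125 ms add-on initialization cost is a hypothetical figure chosen to match the 25% warm-startup threshold:

```python
# Sketch of the warm- vs. cold-startup overhead arithmetic.
# 500 ms warm baseline and 1.25 s cold startup are taken from the
# discussion above; the 125 ms add-on init cost is hypothetical.

addon_init_ms = 125    # fixed add-on initialization time (doesn't change)
warm_base_ms = 500     # clean-profile warm startup (Talos-style measurement)
cold_base_ms = 1250    # cold startup with a well-used profile

warm_overhead = addon_init_ms / warm_base_ms   # reported as "25% slower"
cold_overhead = addon_init_ms / cold_base_ms   # what users actually see

print(f"warm: +{warm_overhead:.0%}, cold: +{cold_overhead:.0%}")
# → warm: +25%, cold: +10%
```

The same fixed add-on cost yields a much smaller percentage against the larger cold-startup baseline, which is the core of the disagreement in this thread.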

It’s not like this is the first time I’ve explained that. See bug 648742; nobody has replied there so far.

This is the cold vs. warm startup argument, which we have also discussed among ourselves. While I generally agree that cold startup overhead is the more realistic metric, I don’t think that warm startup overhead measurements are meaningless or nonsensical.

The difference would be that the overhead percentages would be lower, and so would our expectations. If we currently set the bar at 25% overhead for warm startup, it would probably be around 10% for cold startup, and I doubt the list of slow add-ons would look any different. Yes, the percentages would be less dramatic, and some users and the press would be less shocked by them, which I know is not a trivial concern.

However, we don’t have a system in place to properly measure cold startup performance, so we use what we have. It’s definitely worth looking into, and we should move in that direction as quickly as possible, but dismissing the current results outright as nonsensical seems counterproductive to me.

Jorge, I don’t see you finding a way to get realistic results any time soon (other than throwing Talos out the window and using ping data instead). It’s not only cold startup vs. warm startup; it’s also profile state – a clean profile allows much faster startup times than one with a properly filled places database (there are probably other contributing factors as well). That’s why you shouldn’t use that metric in the first place – absolute numbers seem far more suitable for helping users make a decision. You admit it yourself – the current numbers are only suitable for shocking users and the press, which will eventually result in killing off every complicated add-on out there.

And I don’t really care where you put the arbitrary boundary at which you call add-ons “slow”. This is a pointless discussion right now, given that you don’t have enough usable data to base a decision on. Once the most serious bugs in the Talos measurements are fixed, this can be looked into again.

I hadn’t read your latest blog post when I wrote that. If you are right that the add-on slowdown is not proportional to Firefox’s overall startup time, that is indeed an argument against percentages. However, this doesn’t change the fact that absolute numbers can also be misleading. So the question is, which sucks less?

You cannot make it right for everybody. So far, however, my impression is that absolute numbers allow better decisions. I don’t think they vary too much across current consumer hardware (worth investigating, but I can only test on hardware I have in my household). And somebody using a netbook or a 10-year-old PC will know to expect higher numbers than on the “reference platform”. Right now the effect is that *everybody* expects far too high slowdown numbers, which is IMHO unacceptable.

The previous comments have already more than covered the point I was going to make, which is that 25% sounds horrible when your start-up time might already be 5–20 seconds. If, as Wladimir seems pretty clear on, those percentages are nothing like what the average user will see in a cold start with a well-used profile, then it would make sense to give some real numbers instead of percentages…

For moderately sophisticated users, reporting the times in milliseconds rather than as percentages would be much more useful.

But the biggest problem I see is that the page at https://addons.mozilla.org/performance strongly implies that add-ons that increase Firefox startup time also slow down Firefox the whole time you use it. Is there any data to back that up? If not, the wording on the page should be changed, disclaimers added, etc.

On the contrary, add-ons like Adblock Plus (with, say, the EasyList filter list), which were present in the first version of the table, make your daily browsing significantly faster than a naked Firefox installation.

While a step in the right direction, the presentation of this data isn’t the best.

Also, any add-on that stores a disproportionate amount of data in prefs.js will slow down start-up times very significantly. For example: if you run a few Greasemonkey scripts that all make heavy use of GM_setValue (which stores its values as string preferences in prefs.js) to store, say, a few hundred KB in total, startup time becomes completely unbearable.
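To illustrate the mechanism (this is a simulation of how such data piles up, not a measurement of Firefox itself), here is a sketch that approximates the serialized prefs.js form for a handful of GM_setValue-style string preferences; the pref names and value sizes are made-up examples:

```python
# Sketch: how GM_setValue-style string prefs inflate prefs.js.
# Firefox reads and parses the whole prefs.js file at every startup,
# so any add-on data stored there rides along with every launch.
# The pref names and the ~50 KB value sizes are made-up examples.

stored = {
    f"greasemonkey.scriptvals.example{i}.cache": "x" * 50_000
    for i in range(6)
}

# Approximate the on-disk form: user_pref("key", "value");
lines = [f'user_pref("{key}", "{value}");' for key, value in stored.items()]
prefs_js = "\n".join(lines)

print(f"{len(stored)} prefs, ~{len(prefs_js) // 1024} KB parsed at every startup")
```

Six such values already push the file past 290 KB, all of which must be read and parsed before the browser finishes starting, which is consistent with the slowdown described above.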

Justin is a Product Manager helping experimental projects reach their potential in Mozilla Labs. He previously worked on the Firefox Marketplace for HTML5 apps and led the Firefox Add-ons team, helping millions of users customize Firefox to make it their own.