MemShrink progress, week 37

Add-ons

I filed a bug on testing the top 100 add-ons for memory leaks. The majority of these add-ons are not available on AMO and so are not subject to the leak checking done during AMO reviews. The instructions and current test results are here. Randell Jesup helped identify some add-ons with unclear identities, and Nils Maier and Archaeopteryx have both tested several add-ons. A month ago I said that add-ons that leak are the #1 problem for MemShrink, and this bug represents a huge step towards reducing that problem. We’ll need lots of additional help to test this many add-ons, so please feel free to jump in!

As far as I know, comprehensive testing of the most popular add-ons has never been done before. Although this testing is aimed at finding memory leaks, it’s quite possible it will uncover various other problems with add-ons. Indeed, it’s clear just from looking at the list that quite a few of the non-AMO add-ons are of dubious merit, and this has stimulated some interesting discussion on dev-platform about third-party add-ons, i.e. those that are installed into Firefox by an external program.

Regression testing

John Schoenick’s excellent areweslimyet.com is getting close to being ready for a public unveiling. The site is currently live but password-protected, simply to prevent large numbers of people accessing it before it’s ready. (If you want a sneak peek, contact me for the password.) More importantly, it identified a large regression in memory consumption caused by the landing of the JS engine’s incremental garbage collector on February 19, which Bill McCloskey is investigating.

I guess the title could have been “All browsers suck equally bad now”. Firefox and Chrome have grown into resource hogs – closer to IE. These days I shudder to fire up Firefox because the memory leaks are terrible. Two hours of Firefox with a Facebook tab is enough to gobble up 25% of my memory. With Chrome, I would have 20 rendering processes showing up with just 3 tabs open.

In the first 2 years of Chrome I had probably 2 or 3 crashes. These days, it crashes every week or so. I long for the 2006-era Firefox – clean, simple and lean. It’s been a long time since I have used IE actively, and the less said about Safari the better. As developers, we have to deal with the ultra-slow update cycles of IE and the ultra-fast update cycles of Chrome. In short, things are getting worse and worse in the browser arena. The market is ripe for a breakthrough.

“2006 era Firefox” would have been Firefox 1.5 or Firefox 2. Balaji may think he’s nostalgic for browsers of that era, but really he’s nostalgic for the web of that era, which was massively simpler than the web of today. If Balaji tried Firefox 1.5 or Firefox 2 today — with no HTML5 support, no JavaScript JIT and no hardware acceleration — I’m sure he’d quickly decide that 2012-era browsers aren’t so bad after all. I guess the interesting question is this: why do people blame browsers instead of websites for high resource consumption?

Bug counts

This week’s bug counts:

P1: 28 (-5/+5)

P2: 129 (-12/+7)

P3: 83 (-8/+12)

Unprioritized: 2 (-2/+2)

That’s a net reduction of one bug. But more importantly, there is lots of movement, which is great! It means that problems are being identified and fixed.

(Finally, commenters please note that I’ve turned on the WP-reCAPTCHA plug-in, and this is the first post I’ve written since doing so. In my testing it’s worked fairly well. Hopefully it won’t cause problems.)

46 Responses to MemShrink progress, week 37

Ever since I upgraded to Firefox 10.0.2, I’ve noticed some significant drops in memory usage, especially regarding plugin_container.exe and Flash videos. For me, plugin_container would maintain or increase its memory footprint over time, despite tabs containing Flash applets having been closed. Now, the behavior seems to be much better; plugin_container will drastically decrease its memory usage as tabs with Flash are closed, even to the point of plugin_container dropping to as little as 2 or 3 megabytes of RAM usage as listed in Task Manager.

I agree with you that, for some people, it’s more about nostalgia for the old web rather than the old browsers. As time has passed, more and more dynamic and complex JavaScript and web plugins are deployed, and the demands of the browser grow greater.

I’m looking forward to the soon-to-be-released Firefox 11 for more exciting memory optimizations, as well as the new Firefox UX.

As to why users blame the browsers, rather than the websites, it’s largely because they don’t know any better. They don’t realise it’s all the adverts, “Share”/”Like”/”Tweet”/”+1” buttons and other tracking / analytics, effect scripts, high-res graphics and so on piled upon the websites until they’re at breaking point.

All they see is their browsers getting slower (or claiming massive speed boosts with each release, but not actually getting any faster) and using more resources.

Switch off / block all that stuff and suddenly everything gets faster again! Funny how that works.

Well, to play Devil’s Advocate for a moment (cut to scene at the arcade), part of the issue is that browsers cheer-lead the latest, often experimental, web features, which often cause the most memory issues. I think if browsers made sure that the features they give to developers could be widely deployed without causing memory issues for the average user before they actually release the feature (and in all fairness I realize that’s hard), that would be an improvement.

Now, obviously a lot of issues with memory management come from old features as well, but browsers again must share some of the blame for optimizing for new and faster features at the cost of optimizing for less resource consumption.

Most of the reason websites abuse memory is because they can, at least when the user first visits the site. Later on, when they’ve been browsing a little, the impact of all those allocations and reallocations hits. But this goes back to the types of speed optimizations browsers do. Speed optimizations at the cost of resource usage provide better initial performance (and better performance overall if the web isn’t taxing it, but as you pointed out we live in a web where that can no longer be presumed), but worse performance later in the browsing session.

Hi, thanks for such a list. Yes, for the average user it does not matter whether it’s the browser, an add-on, or a site that is causing lag; it’s plain usability. Without a Task Manager-style interface to monitor what is eating how much (and to fix it by killing an add-on or site), it’s difficult to explain the distinction to the user. And I think that’s only possible if we have a process per tab (not sure, just a guess).

I guess the interesting question is this: why do people blame browsers instead of websites for high resource consumption?

Probably because it’s easy to look at the process list and see that Firefox is using X megabytes. about:memory correlates some of its reporting to URLs, but perhaps a memory report which shows memory used per site would be useful. It could show memory usage per site across all tabs, so if I happen to have, say, Slashdot open in four tabs (main page and three story comment pages) I can easily see that Slashdot’s memory usage is currently Y megabytes.
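For what it’s worth, the rollup itself is straightforward once per-compartment numbers are available. Here is a minimal sketch, assuming a hypothetical list of (URL, bytes) entries like those about:memory attributes to compartments — the input format and function name are my own invention, not anything Firefox exposes:

```python
from collections import defaultdict
from urllib.parse import urlparse

def memory_by_site(reports):
    """Aggregate per-compartment (url, bytes) entries by host, so
    memory for the same site across many tabs is shown as one total."""
    totals = defaultdict(int)
    for url, nbytes in reports:
        host = urlparse(url).netloc or url  # fall back for non-URL labels
        totals[host] += nbytes
    # Largest consumers first, as a "memory per site" view would want.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Example: Slashdot open in a main tab and a story tab.
reports = [
    ("http://slashdot.org/", 40 * 1024 * 1024),
    ("http://slashdot.org/story/1", 15 * 1024 * 1024),
    ("http://example.com/", 8 * 1024 * 1024),
]
for host, nbytes in memory_by_site(reports):
    print(f"{host}: {nbytes // (1024 * 1024)} MB")
```

The only real design decision is the grouping key; keying on the registered domain rather than the full host would merge subdomains too.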

Once memory leaks in add-ons are fixed, a new version of that add-on is released, right?

Do all add-ons out there update automatically? If not, you could have fixed a whole load of issues and made things much better for the users, but if they are unaware of the fact there is a new version of their favourite add-on, they could continue to use the old version for months, cursing at it and threatening to leave Firefox because *IT* is still awful.

It’s not just the add-ons, but how Mozilla communicates to users about the add-ons. It’s of equal importance.

Glad to hear that Memshrink is continuing to improve things. Please keep up the good work.

Third-party add-ons, i.e. those installed into Firefox by another program, are just ordinary programs, so they can have their own update service. For example, I think the anti-virus add-ons get updated when the entire anti-virus package updates.

“I guess the interesting question is this: why do people blame browsers instead of websites for high resource consumption?”

Why? Because I have upgraded my computer from a single-core CPU to a quad-core one with much higher IPC and clock speed, as well as 4 times the memory and better graphics. All in all, I have something like a 10 times more powerful PC.

And how have web pages changed? They may look more beautiful, with more Ajax and auto-loading things, but these are comparatively SMALL improvements.

We have JS engines that are 10–100 times faster than what we had then, but we still don’t feel any difference.

And to be honest, we probably had snappier browsers then than what we have now.

John Schoenick’s excellent areweslimyet.com is getting close to being ready for a public unveiling. The site is currently live but password-protected, simply to prevent large numbers of people accessing it before it’s ready. (If you want a sneak peek, contact me for the password.)

You can 🙂 But the live site has multiple graphs, and it lets you zoom in on subsections of the graphs, and you can see the data and revision ID and contents of about:memory for each data point. It’s pretty powerful.

Add-ons, especially ones never vetted for memory consumption or code quality, can be a severe problem for user experience. Most people I know switched to Chrome because their Firefox profiles were too polluted with bundled add-ons which broke something in Firefox.

But from my experience (as an advanced user), there is also substantial blame for Firefox itself, and vivid memories of 3.6 reinforce it. On the same machine with half the number of add-ons (in 3.6 I had more than 120 enabled), Firefox balks pretty early if I open my news reader and open interesting stories in tabs (blocking Flash, JavaScript, and cross-domain requests). I hope this is fixed in Aurora and Nightly versions because of the cycle collector changes. In numbers: Firefox 10 gets into trouble with 80 or more tabs open, while 3.6 handled 200 tabs better (and even higher numbers were manageable).
Unfortunately, 3.6 doesn’t have tools like about:memory?verbose for comparing, but numbers like 75% for js-gc-heap-unused-fraction, or memory fragmentation blocking 20% of physical RAM, are bad.
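For readers unfamiliar with the metric: js-gc-heap-unused-fraction is roughly the committed-but-unused portion of the JS GC heap divided by its total size. A simplified back-of-envelope version (the real reporter’s accounting is more involved; this sketch is for intuition only):

```python
def gc_heap_unused_fraction(heap_total_bytes, heap_used_bytes):
    """Fraction of the JS GC heap that is committed but not holding
    live data. A value like 0.75 means three quarters of the heap is
    wasted space -- often a sign of heap fragmentation."""
    return (heap_total_bytes - heap_used_bytes) / heap_total_bytes

# A 100 MB heap in which only 25 MB is actually in use:
print(gc_heap_unused_fraction(100 * 2**20, 25 * 2**20))  # → 0.75
```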

Compared to newer builds, Firefox 2 had some advantages, like modal dialogs for adding bookmarks which could easily be extended to show more folders etc.; the current dialog is so tiny. With doorhanger notifications in 10, you can even miss a notification if you switch between applications, because clicking elsewhere will dismiss it. E.g. the ‘restart now’ notification shown after an add-on download, needed to complete installation, will be gone.

Sorry for getting offtopic.

Finally, website owners and designers also have to take their share of the blame. In my humble opinion, the CPU consumption in the etherpad with the top 100 add-ons while typing fast is terrible.

I run the current release (10.0.2) and the nightly side-by-side. I use both about equally, but use them for different things. Today I noticed sluggishness in the release version; checked memory, and it has 1.15 GB (task manager) to 1.25 GB (about:memory) in private bytes. (Nightly is at 300 MB in private bytes.) About:memory also shows a 72 MB zombie compartment in the JS section; probably more, but that one caught my notice at the top of the list.

Given that I know a huge number of bugs have landed that haven’t made it to release status yet, I’m not sure if it’s meaningful to try to track down what may be causing this (if it’s even possible; I’m pretty sure I couldn’t reproduce it on demand, and I’d lose all info about it after a restart, so I’d have to hope someone could point me at what to look for). Is there any use in trying to provide this as info to examine, or do you prefer to skip past things that have possibly already been fixed?

Do you use FF10 and FF13 in similar ways? E.g. browse the same sites, install the same add-ons? If so, then reporting deficiencies in FF10 probably isn’t so useful. Otherwise, reporting may be useful. But note that having reliable steps to reproduce a problem is crucial for Firefox developers to have any chance of fixing it. E.g. if you can identify a particular site that causes high memory consumption, that’s extremely helpful.

If in doubt, file a bug; the worst that can happen is that it’ll be marked as WORKSFORME or similar…

You can get some info if you follow Honza’s about:ccdump howto, which does not require a restart. The root cause might not be obvious even then, however; then again you don’t have to spend a lot of time on it.

One thing that would help is if the tabs would tell the user how much memory each tab is using, or which one is using the most. The BarTab addon apparently could do that, but it’s not maintained for current versions. Firefox (firefox.exe + plugin-container.exe) is still using a large amount of memory. I believe I have it configured to unload all but the most recent three tabs, which are all simple Google searches, plus this page. I’m not using any plugins that I know of on any of those pages. So memory use should be very low. It’s not. But I know I could work around some of the problem myself by closing the tab(s) that Firefox can’t handle, yet for Firefox 10.0.2 at least, I cannot find that information.

I long for the first Firefox I used, which I think was 3.6. The web hasn’t changed much in a year, or at least my use of it hasn’t changed, but Firefox jumped off a cliff and never recovered. I even doubled and maxed out the memory (1 GB to 2 GB) in my PC, with no noticeable improvement. It used to work great, which is why I switched from IE, but all the versions after 3.6 eventually crash, because they want to use more page file than I can permit. I can’t buy a new PC just for Firefox (except maybe that Raspberry thing…)

This comment has nothing to do with this post, to be honest, but I wanted to tell you that I greatly appreciate your efforts. I’ve been using Firefox now for years, how many I don’t know, and it’s been so long, technologically speaking, that I don’t even remember the version. I think it was 2-something.

I know that in the Open Source world there are many reasons people work on OSS but as someone who gives meager contributions back I know how awesome it is when someone compliments your work.

So, thank you. On my work PC, before your efforts, Firefox would freeze randomly for about 3 seconds and then snap back to life. In the past 8 months or so that problem has almost disappeared, and I attribute much of that to you and your team’s efforts.

Websites are always pushing the boundaries. Why? Well you see, Mozilla tells them to! Ok not just Mozilla but in this week of B2G isn’t it more obvious than ever that Mozilla – and in particular Google – think that the browser is infinitely scalable?

In short the open web has never had any limits on developers in terms of performance with the one exception being the “Script taking too long” prompt.

Mozilla says its mission is to promote the open web. Perhaps it’s time Mozilla changed its mission to one that promotes efficient and realistic use of the open web?

As for Google, well, they want everything including the kitchen sink to run via a browser. Developers just keep pace with what browsers allow them to do. If browsers solicit open-slather adoption from developers, and developers then push the boundaries, is it not at least equally the browsers’ fault?

If you want less blame towards browsers, take responsibility for openly soliciting – and more importantly facilitating without qualitative analysis – both bad and excellent developers.

I suggest it’s time for browsers to expose poor websites to users. This is already possible with tools like Google PageSpeed and Yahoo YSlow. There is no reason why browsers cannot give web pages efficiency ratings based on the various criteria that are finally evolving to evaluate bad developer practices. If you want the blame for poor sites slowing down this wondrous ‘open web’ then you will simply have to organise some method by which users can identify the difference between poor sites and poor browser behaviour. This is why it’s critical that MemShrink ensures the underlying memory use and performance characteristics of web pages, add-ons and general browser functionality are measured separately! This is partially happening, isn’t it? But where’s the UI? I’d suggest some sort of red/yellow/green traffic-light style colour coding in the location bar or the favicon’s background colour.

Ironically, some clowns have removed about the only means for users to determine when it’s poor network, server or website behaviour and not browser-based: the status bar! I’ve personally used Extended StatusBar for years to expose the weight, speed and loading time of all websites I load. Additionally, the feedback that displays in the status bar has always been superb for giving anybody who makes an effort to learn the basics of networking full feedback on what their page is doing. It will say “Looking up blah.com…”, which is DNS, and also “Waiting on blah.com…”, which tells the user it is a network-lag slowdown, etc. Rather than the stupidity of removing this status bar, it really could be the home for an increase in user-UI feedback that helps users realise where the problem lies. I can think of a lot of copy:

For HTTPS pages the status bar could add:

“Secure page loading…”
“Decrypting…”

For all pages, after “Looking up…” and “Waiting…”, you could add “Rendering…” and/or “Processing JS…” etc. There are many options. The best thing is that people will see all this text change super-fast when everything is normal, but the step that lags sits there for ages, and the user then has time to read it, notice what is going on, and deal with it.

Here’s a list of how browser UI could evolve to enable users to understand when the perf hits are not browser-related:

1) Progress bars should show speed, not just % completed
2) All substantial stages of loading should be documented in the status bar
3) A visual indicator of a site’s speed should be built into the favicon background
4) Expose render time, JS parsing, CSS parsing, and decryption time
5) Expose the protocol type (SPDY or HTTP)
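The traffic-light rating suggested above could be computed from a handful of page metrics. Here is a minimal sketch; the metrics, thresholds and function name are all invented for illustration, in the spirit of PageSpeed/YSlow scores rather than anything a browser actually implements:

```python
def efficiency_rating(weight_kb, load_time_s, requests):
    """Hypothetical traffic-light rating for a page. Each metric
    contributes 0 (good), 1 (middling) or 2 (bad) penalty points,
    and the total maps to green/yellow/red."""
    score = 0
    score += 0 if weight_kb <= 500 else (1 if weight_kb <= 2000 else 2)
    score += 0 if load_time_s <= 1.0 else (1 if load_time_s <= 3.0 else 2)
    score += 0 if requests <= 30 else (1 if requests <= 100 else 2)
    if score <= 1:
        return "green"
    if score <= 3:
        return "yellow"
    return "red"

# A lean page vs. an advert-laden one:
print(efficiency_rating(300, 0.8, 20))    # → green
print(efficiency_rating(2500, 4.0, 150))  # → red
```

The hard part isn’t the arithmetic, of course; it’s agreeing on thresholds that separate a heavyweight site from a slow network or a slow browser.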

Basically the more the merrier. You want smart users who can differentiate where they choose to lay the blame? Tell those UX clowns to stop dumbing down the interface with tactics like removing the status bar!

Brilliant post. I wonder if this will ever get to the Mozilla developers. It’s true they just keep adding new crap and new features, which is a big reason the browser is slow, yet they keep piling in newer and newer ones, and they spec out new features too, just in case they think the current list might run out. The features I’m talking about are HTML5 and other things that just aren’t that important. And the user isn’t told about any of it.

I think this is just plain business. Don’t forget this is mostly about business and getting people to use your product. But Mozilla was kind of special, putting the user first and being innovative; they need to start being innovative again. I think they need to fire a lot of people in their design teams who are just pushing their single-minded views on how things should be with their mockups and theme enhancements, when it’s nothing more than wasted time, getting the product nowhere, and most importantly not being innovative.

Mozilla should set up a committee that actually analyzes productive features versus features that aren’t. They should also take a more critical approach to the way they interact with the user and present their product, instead of a bunch of cute foxes and other animals. Again, this is about getting people to use their product, and they are losing focus on what’s important. Remember that it was Chrome that gave Firefox a good kick in the ass, and it seems they are not recovering from it. Competition is good for these kinds of things.

I’ve seen dramatic drops in memory usage with the last few versions. Zombie compartments are still a big issue for me because I use Firebug, but Firefox 10 and 11 have definitely brought memory way down. I’m concerned about the regression in 12 but glad to know that’s been pinpointed and is being worked on.

One thing I’d still like to see improved is that my video playback suffers pauses a lot, which has come a long way since 3.5 but still has issues. I believe garbage collection is part of that, so maybe the incremental GC will fix the problem once and for all if its memory use can be fixed.

Some 3.6 users are clinging tightly to their version because they’re convinced that the current version is worse memory-wise (and performance-wise?).
Do we have any hard numbers on this?
EOL is coming up and many will have a decision to make. Perhaps we can create a page with numbers & graphs to illustrate that things are better and put them on the consumer facing Firefox site?
(If we have relevant UI responsiveness numbers we can use those too – but not flat JS performance; that has already failed to convince them. And perhaps it should.)
http://ask.slashdot.org/story/12/03/04/047248/ask-slashdot-life-after-firefox-36x

I came to 8 from 3.5. I had tried 3.6 and found its startup time was so disastrous I had to roll back, and it also seemed worse on memory than 3.5. This was with a fairly large number of windows open and a few tabs (on average) each.

My experience was that 8 was a little higher on memory than 3.5. 9 was similar, and 10 and 11 have been noticeably lower. I’m still having problems with zombie compartments in Firebug, so the memory usage for a regular user is probably quite a bit better.

With the landing of the incremental GC, along with all the memory fixes over the last half year, the Nightly builds have really reached a point where I’m happy using Firefox again. The last week or two has been running really nicely. Firebug I think is the last thing that still causes issues, and its usefulness outweighs its negatives; I can just disable it when not using it, anyway.

The FF10 release is still a bit of a shambling beast. Aside from the 1.25 GB memory usage from a few posts up, it’s still commonly sitting in the 500MB-800MB range, while nightly is at 300MB or less. Just a couple of months ago they’d almost always have very similar amounts of RAM usage before I’d be forced to restart as memory climbed. I only still use FF10 for 3 reasons:

1) Technically nightly can still break stuff; that’s what it’s for. I always want a stable version available as backup. (can’t wait for 13 to hit release)
2) It’s nice to be able to compare stuff across versions in case of regressions or whatever.
3) Going from two programs with 5 windows/30 tabs each to one program with 10 windows/60 tabs makes finding stuff more difficult. Though it may let me stress-test Panorama a bit more, it’s also way too easy to forget about the hidden background groups.

If you open the error console (ctrl-shift-J) you’ll probably see some kind of error. If you hit the “clear” button in the console and then reload about:memory that might make it easier to spot the error. Can you cut+paste the error message?

That’s the one. An assertion within aboutMemory.js is failing, unfortunately until recently some of them were missing the extra parameter that indicated what the problem is. That’s been fixed, but you’ve got an older version. What version of Firefox are you running?

Ok, I see which assertion is failing now. It’s a surprising one… document.title has an unexpected value. Did you enter exactly “about:memory” into the address bar — no extra stuff, no upper-case letters, etc?

If you can find the file called aboutMemory.js in your installed Firefox (I’m not sure exactly where it is on Windows, sorry), if you can change this line:

Would you mind opening a bug report? Then I can commit that patch which gives more info, and then when you update your Nightly you’ll be able to give me more information. That seems the easiest route forward…

“Why do people blame browsers instead of websites for high resource consumption?”

Please tell me that’s rhetorical.

Users cannot properly place the blame for bad performance if they lack the tools and instrumentation required to do so. Google, for example, pointed this out quite explicitly in Chrome’s release-day promotional material back in 2008, and has put a high priority on addressing this problem with Chrome’s task manager.

Firefox seems to be gaining the needed instrumentation, but it is still in ugly “about:” pages designed for nerds and developers. If you want the average user to place the proper blame, the necessary information has to be put into a form that is accessible to the average user.