An excellent tool. Careful using it, though: it attaches to the system through debugging hooks, so certain copy-protection systems detect it and refuse to run, forcing you to reboot... I'm staring at you, SecuROM!

That's a bogus argument. Virtual size is unimportant as far as performance is concerned. You have a fixed amount of memory in the system, and if Firefox takes up a lot of real memory, other parts of the system will feel the pain. The fact that the *real* memory usage of FF 3.0 is low means that it is not being greedy about system resources. Every process has the full VM size to play with, so looking at the VM size doesn't really tell you much about what effect that process will have on the rest of the system.

Is it just me or does it seem like 60MB or even 34MB is a LOT of memory for something that browses Web pages?

I mean, people used to make fun of GNU Emacs, saying it stands for "Eight Megabytes And Constantly Swapping" or "Eventually Malloc()'s All Computer Storage". Emacs takes somewhere around 10MB or so on a RHEL4 box, and that thing is practically an operating system. It reads mail! Firefox doesn't even read mail, and it takes 60MB. Opera reads mail, but 34MB still seems too big, too.

Maybe I'm just getting to be a cranky old man. Now you kids get offa my lawn!

Not for modern webpages. A single Flash ad requires a couple of megs; tack on the capability to have multiple pages open in a single browser (which adds to memory usage a little), a bunch of those ads, and the actual page content, and 60MB starts to look pretty small.

Actually, something of interest I've noticed is that since I got NoScript, my FF RAM usage has dropped considerably. I rarely get above 83MB with FF2 now, because it doesn't have to load the plugins and such.

IANAWBD (I Am Not A Web Browser Developer), but... there's just so much data in web pages now. You've got plugged-in interpreted Flash code, graphics that have to be kept decoded in memory for speed, and the full length and width of the page on an in-memory surface so the window can pan through it.

Even then it still needs a dynamic layout for CSS and scripting on the fly. And even then some scripting is safe, some is not, so there are rules that the code has to implement like pop-up blockers, password managers, warnings on insecure pages, warnings on cross-site scripting, etc. All that and the browsers STILL need to be able to sensibly parse and display completely borked pages with invalid HTML.

The real shame of it is that the Dillo project is on hold now, even though with the tiniest fraction of the resources of the Mozilla project, it could very quickly become an absolutely amazing web browser. It's really the same thing that happened with Links-GUI... Two amazingly promising browsers, going nowhere.

They finally managed to get the code released for the half-finished port to FLTK last month, and there's been a massive flurry of activity on the developers mailing list [wearlab.de] and in CVS. I guess no one...

But Emacs doesn't display images (somebody will probably correct me on this). Just the cached copies of all the images can take up quite a bit of memory. And from what I remember, a browser has to basically uncompress them to bitmaps and keep those in memory, so that can eat up a lot. Also, all the CSS, DOM, and other information that a text editor doesn't have to keep loaded all the time probably uses up a large amount of memory too. Not to mention plugins like Flash and other things that probably...

It's the "interwebs" what's really bloated. My Firefox executable here is taking just 7.3MB, but then you open several web pages that take MBs each, and some more in uncompressed, parsed form. Then some browsers cache other stuff like rendered pages in memory, and you get those figures we're talking about.

Yes, but it's equally inefficient for Firefox to have to swap this sort of thing in and out when the OS is under memory pressure.

What we really need is a mechanism for e.g. Firefox to use large amounts of memory to speed up page loading when there's plenty of memory, but to optimize for a small memory footprint when I've got ten zillion Gimp windows and Picasa open.

Why should Firefox behave in exactly the same way in two totally different situations?

We need a way to tell the operating system that some memory is important and other segments (cached or precalculated data) can be dropped at any time, provided the application is told, so it can rebuild them when necessary.

The OS scheduler would choose which applications are idle from the user's point of view and dump some of those applications' data.
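Real OSes expose pieces of this already (madvise(MADV_FREE) on Linux and purgeable memory on Mac OS X let a process mark pages as droppable), but the rebuild-on-demand contract itself can be sketched in userspace. This is a hypothetical illustration; the class and names are made up, not any real API:

```python
class PurgeableCache:
    """A cache whose entries the OS may drop at any time; the owner
    supplies a rebuild function to recompute dropped entries on demand."""

    def __init__(self, rebuild):
        self._rebuild = rebuild   # recomputes a dropped entry
        self._data = {}

    def get(self, key):
        if key not in self._data:           # entry was purged: a "soft fault"
            self._data[key] = self._rebuild(key)
        return self._data[key]

    def purge(self):
        # What the OS would do under memory pressure.
        self._data.clear()


cache = PurgeableCache(rebuild=lambda k: k * 2)
assert cache.get(21) == 42
cache.purge()                # memory pressure: everything dropped
assert cache.get(21) == 42   # transparently rebuilt on next access
```

The point is the contract: the application never finds out *when* entries were dropped, only that any dropped entry must be recomputable.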

Is it just me or does it seem like 60MB or even 34MB is a LOT of memory for something that browses Web pages?

I mean, people used to make fun of GNU Emacs, saying it stands for "Eight Megabytes And Constantly Swapping" or "Eventually Malloc()'s All Computer Storage". Emacs takes somewhere around 10MB or so on a RHEL4 box, and that thing is practically an operating system. It reads mail! Firefox doesn't even read mail, and it takes 60MB. Opera reads mail, but 34MB still seems too big, too.

Maybe I'm just getting to be a cranky old man. Now you kids get offa my lawn!

I used to browse the web on a machine with 8 MB of RAM. Total, including the OS. At the time, real time decoding of a JPEG was extremely difficult, but my current CPU has 100 times the clock speed and is 64 bit and has vector processing features. Yet, browsers still seem to make the same class of CPU-memory tradeoffs that made sense on a 68030. For example, I may have ten tabs open in a window. I can only see one of them at any given moment, but the fully decoded images are all sitting in memory for all ten web pages, despite the fact that the page could be re-rendered almost instantly on a modern system.
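The arithmetic behind that tradeoff is easy to sketch. The numbers below are illustrative, not measurements of any real browser:

```python
# Rough cost of keeping fully decoded RGBA surfaces resident per tab.
width, height = 1024, 768       # one full-page image surface
bytes_per_pixel = 4             # RGBA
per_tab = width * height * bytes_per_pixel
tabs = 10                       # only one of which is visible

total = per_tab * tabs
print(per_tab // 1024, "KB per tab")              # 3072 KB
print(total // (1024 * 1024), "MB for ten tabs")  # 30 MB
```

Thirty-odd megabytes of decoded pixels for nine invisible tabs, on a CPU that could re-decode any one of them in a fraction of a second.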

Since browsing a few web pages is seldom the only thing I do with my computer, I go and do other stuff in Lightwave, Blender, Photoshop, whatever, then I come back to my web browser, and I wait while the whole working set gets swapped back in. Then, I click on the tab I want, and I wait while the working set for that tab gets swapped back in. If it just rerendered the page from the original bits, rather than using cached decoded images sucking up RAM and whatnot, it'd have almost nothing to reload and worst case performance would be orders of magnitude better. Hooray for "optimisation!"

Oh, and can we get some ninjas to fucking kill Flash. Seriously, I shouldn't need a bunch of script blocking and flash blocking extensions just to be able to browse the fucking intarwebs without having a seizure.

What does it do, you ask? Well, it does everything you want. It disables all Flash by default. You can whitelist a site (like YouTube) to always show Flash. Or you can simply single-click the space where the Flash would normally appear...

Is it just me or does it seem like 60MB or even 34MB is a LOT of memory for something that browses Web pages?

But a browser doesn't just browse web pages. A browser is a limited form of operating system: it has an execution language (JavaScript) and an environment (the DOM). A mail client is relatively simple, as it's just a texty protocol. A browser is HTML + XML + CSS + HTTP/S + JPG/PNG/GIF/etc. renderers + embedded plugins + caches, and in the case of FF, it has XPCOM and various other extensible subsystems.

The raw web data transferred for those pages couldn't be more than 10MB, so why do they need 60MB?

Do you think that DOM editing of the document tree, for example, should be implemented by actually editing the raw data fetched from the web? Or, for another example, when you double-click on a web page, how do you think the word on which the click happened is found in the document? I guess you are imagining that the whole document is reparsed, the placement on screen of every single thing is recomputed and reflowed, and then the word at the click coordinates is found? Have you not considered the...
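A toy illustration of why the parsed form dwarfs the raw bytes, using only Python's stdlib HTML parser. The flat (kind, ...) node list here is a made-up stand-in for a real DOM, which keeps far more per node (styles, layout boxes, event hooks):

```python
import sys
from html.parser import HTMLParser

raw = "<div><p>hello <b>world</b></p><p>again</p></div>"


class NodeCollector(HTMLParser):
    """Collects a flat list of nodes; a real DOM would be a full tree."""

    def __init__(self):
        super().__init__()
        self.nodes = []

    def handle_starttag(self, tag, attrs):
        self.nodes.append(("open", tag, dict(attrs)))

    def handle_endtag(self, tag):
        self.nodes.append(("close", tag))

    def handle_data(self, data):
        self.nodes.append(("text", data))


parser = NodeCollector()
parser.feed(raw)

parsed_size = sum(sys.getsizeof(n) for n in parser.nodes)
print(len(raw), "raw bytes ->", len(parser.nodes), "nodes,",
      parsed_size, "bytes of node objects (shallow)")
assert parsed_size > len(raw)
```

Even this shallow count is several times the raw byte size; that overhead is what buys instant hit-testing and DOM edits without reparsing the page.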

Opera's no saint. After running Opera and Firefox 2 with no plugins for quite some time, they used the same amount of memory on my computer (Linux). I've been running FF w/ no plugins for a very long time on my machine due to its limited memory (512 MB).

While the rendering engine has an obvious need for memory, it's nice to see that they're cutting down on memory usage; it has been one of the biggest drawbacks of using Firefox. There's really no need for a web browser to use more than 100MB of memory. For...

Actually, that 'low memory bug' has already been fixed - I've downloaded the beta and installed it on WinXP - after looking at 2 pages, Firefox 2 memory usage was at about 45MB; Firefox 3's memory usage was up to about 750MB after less than 5 minutes (and the same 2 pages; in two tabs, just as with Firefox 2.0.0.9), completely bringing the machine to a crawl (1GB mem; and apart from Firefox, Outlook, Eclipse and SquirrelSQL were open)...

I'm reverting back to Firefox 2 for the time being, and will file a bug report once I have some more time to find out what's causing the issue...

Really? Because I can do it in one page. I tried out FF3 yesterday. Opened our local search page (about a 4K page with very little other than text) and let it sit for 2-3 minutes while I worked on something else. Suddenly the machine slowed to a crawl, the drive went to 100% activity, and by the time I could finally get Task Manager open a minute or two later, Firefox.exe was at 635MB and climbing.

It's already been reported. It's caused by the anti-phishing stuff built into Firefox. Apparently, despite the fact they could simply copy the 16MB Sqlite file down and use it, they choose to send down the data and then reload it into the local database. That's what burns the time, CPU, Disk, etc. for nearly 2 and a half hours.

Are they using the handy-dandy Task Manager? If so, this is not even remotely accurate. In the age of managed memory, this is an estimate at best. Don't believe me? Open up Internet Explorer, run it a while and look at the memory usage. Now minimize IE. Watch the number drop like a lead balloon.

The Working Set (physical memory) size will drop, but the memory consumption (Private Bytes, Virtual Memory) will be the same. When a window is minimized, Windows marks the memory pages as candidates to be relocated in case of memory shortage. When you restore IE's focus, the Working Set size will return to its previous size.

I don't believe this is only IE--when I listen to music on my computer on an airplane, I minimize the music program because it uses less memory, and therefore less battery (from my tests, Winamp Lite uses the least RAM when minimized of all the players I tested; it was better than Foobar, mplayer, etc.).

In fact, I just tried the same thing with Opera--it dropped from 60,000 to 11,000.

I don't think it's an estimate--I think the program really uses less RAM when minimized.

When you minimize, the working set size is reduced. This causes pages to be swapped out to the pagefile. When you maximize (or restore), the working set size is increased, meaning that the application is *allowed* to use more physical memory, but that doesn't mean it's going to immediately start loading back the same pages it swapped out. It's going to wait for page faults to compel it to do so. That is why #6 is lower than #2.

When you minimize, Win32 sets the working set of the process to its minimum (a few MB). Each of the pages removed is indeed marked as invalid in the page table entries of the process. Physical pages have a reference count: removing a page from the process working set reduces the count by 1. When the reference count reaches 0, the page is moved to the standby list. In the standby list, the page cannot be modified (as it's no longer mapped anywhere). A copy is written to disk lazily, but it still exists...

To echo a previous reply, the reason #6 is less than #2 is that when the application is swapped to disk and swapped back, the OS doesn't fully restore all objects that the application had in memory. It only restores the objects that the application is using, and doesn't restore the rest until the application requests some swapped objects. Given that most browsers are coded in C++, rather than Java or C#, garbage collection is a non-issue.

How does minimizing a program / using less active memory cause less power draw? I would imagine that the power you save would be from not having to render the visualizations, resulting in less CPU work. Memory chips aren't going to be shut off or anything like that.

That behavior drives me crazy about Windows, so much so that I will often resort to forcing the page file down to 2MB just to keep it from swapping my applications out when I Minimize them. I hate having a system with 2GB of memory and having to wait 30 seconds for it to page some application back in (slow laptop HDDs don't help) just because it thought I might want a lot of free memory for some reason.

Loading five pages into the browser - 38,644KB
Loading a single page and leaving the browser for 10 minutes - 63,764KB
Loading 12 pages into the browser and waiting 5 minutes - 62,312KB

I wonder what would have happened had he loaded 12 pages and let the browser sit for 10 minutes -- would the memory usage still be less than the single page/10 mins test?

Seems to me that memory usage must still spiral under 3 beta, otherwise how would the single page/10 min usage be more than the 12pp/5 min test? Sure, it's not as bad, but that number really caught my eye... more testing is in order if I can get some time away from the in-laws over the holiday.

Also, seems like a pretty crappy test to me, especially considering that most of the complaints with Firefox are about memory leaks, not memory usage from opening a few pages. What happens after an entire work day of using the browser? Is there a significant difference in memory at that point? People who open their browser, look at 10 pages, then close it again will rarely ever have a problem with memory usage in Firefox. However, those of us who leave it open for days at a time, doing web development and constantly looking at new pages, are the ones who need to worry. It's like comparing the performance of Visual Studio 2003 to Visual Studio 2005 on a project that only has 5 classes.

From the tests that the developers have been running, most of the memory leaks in Firefox itself seem to be fixed (there are probably still some left). However, memory usage still remains a problem. I think this blog post [pavlov.net] summarizes their findings. They've been using dtrace and other tools to find out exactly what is going on.

Unfortunately, I think the damage to Firefox's reputation is already done. There are many people who have had negative experiences with Firefox who keep on harping about the "memory leaks" and I don't see how Mozilla devs can change this public perception.

I don't see that memory usage remains a problem for most users. It's just the vocal few who are having memory problems. The main problem is that these users assume this is part of the "normal" experience of using Firefox, so they complain that every user must also be seeing the same thing. They take no steps to fix or report their problems, as they consider the problem to be "well-known" and think developers must be idiots for not being able to see it.

If you're still having serious problems with Firefox, try creating a new profile [mozillazine.org] and installing the Firefox 3 Beta [mozilla.com]. If you still have problems, discuss them on the MozillaZine Builds forum [mozillazine.org]. If the problems do not get resolved, just switch to another browser. It's not normal to experience serious problems when browsing, so I don't see why anyone accepts it as part of the "normal" experience.

I agree that the damage to Firefox's reputation is already done. I've found that no matter how many reports come out that Firefox doesn't have a severe and obvious memory problem, the few reports that show a problem are the ones that become popular. If any of them just included instructions to reproduce the problem on other computers, those reports would be productive. Somehow, they always seem to leave that part out.

Have you considered allocating resources to work on the problem of hunting those leaks and fixing them? One of the two browsers you mention provides you with full sources, so you have something to work with. You seem to be one of the many people extracting value from Firefox: maybe you could put some value back...

What about a task manager extension for firefox that shows how much memory each extension is using? Seems like it could be useful. I mean, we know how much memory firefox in general is taking up, but it would be nice to get a breakdown of where that memory is going to.

> last I checked it was plugin writers who were blamed for all the memory issues by Mozilla

Which to me sounds eerily similar to Microsoft blaming 3rd-party software for taking down the operating system.

Except "taking down the operating system" is very different, both in severity and root cause, from leaking memory. If you're going to allow extensions to run as part of the browser, you don't really have any control over what they do with the browser's RAM usage. An OS has the luxury of being able to partition...

You can enable extensions not explicitly marked as compatible with Firefox 3 beta by going to about:config and adding an entry for extensions.checkCompatibility : false. I'm running the same extensions and usage pattern as with Firefox 2 and performance is MUCH improved, especially AJAX performance on Gmail and shutdown/session restore speed. Of course, it has only been one day since my last FF restart. FWIW, I'm running about 8 extensions and have about 50 tabs open across 5 windows; currently on my 2GB machine Task Manager shows Firefox 3 using 235MB, where in the past Firefox 2 would easily consume ~450MB or even 600MB+ under a similar workload. (Of course, in the past I only checked Task Manager once FF's performance became noticeably slow, so this is not necessarily a good comparison.)

Another point regarding your IE7 and Opera9 tests: as far as I know, all modern browsers choose to allocate more or less memory depending on how much memory the OS reports as available (certainly Firefox does), so users on different boxes can show very different results.

It's not just a JS debugger:
It tells you the size and speed of download of all elements on the page.
It gives you a DOM layout.
It lets you modify HTML, JS, and CSS on the fly!
It lets you right-click an element and instantly find its source.
It lets you click a little symbol next to each and every CSS element so you can enable/disable it on the fly to see what is causing what.
It lets you view inheritance of any element in terms of CSS...

I just thought I'd mention this; I have no vested interest in Firebug outside...

Perhaps it's waiting for YOU to create it (why not, if you have the skills, you know?).

Seeing how small projects like Firebug can consume huge amounts of time for their developers, I doubt there are many people who want to dedicate themselves to a single project like Firebug when they could be doing *insert thing they aspire more to doing*.

I have to say, I don't think this attitude of "don't like it? Go make it yourself!" makes a product you are promoting look good.

Come on, this is not a fair test. When I go to bookmarks/open all in tabs in a folder, I usually open anywhere between 18 to 30 tabs. In fact the first thing I do is to open all the editorial cartoons bookmark folder under "open all in tabs". By the time I am done with email, I will have the 21 cartoons ready to be perused.

BTW, I never found old Firefox's memory consumption as annoying as the intransigence of some sites in refusing to support Firefox and the lax/laissez-faire coding for IE only. Maybe because at work I usually have a couple of four-processor 16GB machines for development/testing. I used to have a dedicated 2GB machine exclusively for Firefox, but that old machine's hard disk started squealing with an annoying noise, so I had to throw it away. Even at home with my puny 512MB 4-year-old desktop or the 1GB 2-year-old laptop, I get by without any serious memory issues.

Memory usage really isn't a huge issue for most end-users. Sure, if it were sucking up 800 megs with 2 or 3 tabs open, people would complain, but right now people are just starting to get used to the idea of tabs, much less using 12 of them.
The memory usage now is hardly a system stopper for most people who only run their browser and mail client, and maybe an office suite and picture viewer.

It becomes an issue when the memory grows at the alarming rate shown in the test. Considering the browser is the one app I have open all the time, and keep open for days/weeks at a time, memory usage does become a concern.

Also, considering some other applications I often have open are memory hogs too (Microsoft Word / Excel / Powerpoint, Apple Mail, VMware Fusion) memory efficiency becomes more concerning. Even with 2GB of RAM, I run into problems at times.

Either I got a bad build, or I've got a weird system setup. FF3b1 was using 180 megs (yes, 180 megs) of memory to load my intranet page, and would try and scream upwards from there before my poor IBM laptop (P3 800, 320 megs of ram) ground to a halt. FF 2.0.9 was using 30 megs.

I wish I could have submitted a bug report, but my machine would freeze before firefox actually crashed.

(and no, it doesn't also take me 15 minutes to move a 20 meg file on my mac.....)

Seems like the author is playing up to some feature in FireFox 4 that releases un-viewed pages from memory after a certain amount of time.

I bet if he re-clicked each of the 12 tabs after the 5 minutes were up, memory usage would go back up again.

"Using less memory" isn't always desirable. I have 4 GB of RAM in my system and I'd rather the applications USED THAT RAM to keep application response "instant", rather than un-caching stuff only to pull it back into memory again when I want to see it.

"Using less memory" isn't always desirable. I have 4 GB of RAM in my system and I'd rather the applications USED THAT RAM to keep application response "instant", rather than un-caching stuff only to pull it back into memory again when I want to see it.

A good point! And firefox does seem to take this into account. I am running Firefox 2 on Ubuntu (right now!) on a Thinkpad T20 with 256MB of ram - it works fine.

Don't do that. "But I want to keep my applications in memory!" you might say. That's wrong. Virtual memory systems these days basically use main memory as a cache for the disk. It doesn't matter whether a page came from a file, an anonymous application allocation, or anywhere else. The kernel automatically keeps the most frequently used blocks in RAM and pages everything else out to disk. By using 0 for swappiness, you defeat that automatic management and force the kernel to treat application pages specially. You don't want to do that.

On Ubuntu 7.04 and 7.10, if you install the flashplugin-nonfree package from apt-get, Flash works fine. But whenever you try installing it from Adobe's site or the auto plugin installer, FF grinds to a halt, using around 100% CPU on anything Flash-related like YouTube or Slashdot's ads; disabling Flash solves it. However, on my other computer, which is not much more powerful (slower CPU clock speed but higher bus speed), when I installed it from the auto plugin installer it works fine, using only around 50% CPU max. Firefox or Adobe needs to fix this so Linux people can test the binary that requires you to install the auto-plugin and doesn't work with flashplugin-nonfree. However, Firefox 3 is my preferred browser on my other computer, and it was on Windows even more. My question is, why can't Firefox either produce a sane way to compile it (it's a pain to compile already...) or supply .deb and .rpm builds to make it easier to install? Linux seems to be neglected by Firefox lately, with more of a strategy of stealing IE's market share than making a better browser on Linux. And Konqueror is painfully slow on XFCE or GNOME (or just about anything that's not KDE), but perhaps KDE 4 will fix that....

If I don't shut down Firefox when I leave work for the day my system will be at a dead crawl in the morning - it shouldn't do this. (The only other program that acts like this is MS Streets & Trips). I am annoyed that Firefox is painfully slower to load certain pages - I do a lot of work for an in-house Quickbase application and MSIE blows firefox out of the water performance-wise, to the point where the same page in MSIE will load 3-5 times faster than it will in Firefox.

I am going to guess that you have a couple dozen extensions installed on firefox and most of them you don't ever use (or even think about). Get rid of the extensions you're not actively using and see if that helps both the memory and speed problems you're seeing.

So I started using the beta yesterday, and I can say that I won't be going back to IE or FF2. It runs extremely fast, is stable, and is nice and polished. It seriously reminds me of the early releases of FF, but much, much faster. I've got about 14 tabs open right now, and it's still running screaming fast. The earlier /. article is no lie: it installs in a heartbeat, opens fast, closes fast, even browses fast (as you'd expect given that it uses a smaller memory footprint, though I could be wrong about that). I recommend it.

Betas can be surprisingly stable, but I'd give it more than a cursory glance before recommending it widely. Until you've played with it for at least a week, you won't see its weaknesses. That said, I'm sure responsiveness and memory usage are better, because they were two of the glaring flaws in the package and an obvious target for improvement. As for the rest, well, who cares if other little issues remain; if they did fix those two glaring flaws and kept the rest tight, it should be the de facto browser fairly...

Installed and fired up Firefox 3 Beta 1. Went to visit www.speakeasy.net/speedtest; couldn't even hit enter. The default page wasn't even loading. My system slowed to a crawl. I checked the available RAM, and of the 1GB I have in this system, I had 2 megs free. Firefox was using 707.13 megs of RAM... I don't think the memory leak has been completely fixed (yes, this was a Windows machine...).

I wish I had mod points, because this needs to be brought to people's attention. Everyone seems to be claiming victory over the memory bugs, but for me (and you and many others) there are still random problems.

My system exhibits the exact same problem you describe. My Firefox will spike from around 66 MB of RAM usage to 700 then 800 then 900 and will just sit there chewing up more RAM until I kill it. I'd love to know the cause and even better, the solution to this problem.

It is happening in FF2 and in the 3 Beta. It doesn't happen on the same site every time. It happens most frequently when using JavaScript, but not always. I can't seem to narrow it down unfortunately.

Can someone please tell me what the columns are in English? While it's great to know how much "NI" Firefox has, I'd rather see the memory usage.

That's an ignorable column. Instead, pay attention to "VIRT" (virtual memory used) and "RES" (approximately the physical memory used). In particular, note that both of those figures are relatively lower once a few sites are visited, meaning that FF3.0b1 is both less memory hungry and less inclined to touch pages that it is using (i.e. should be better performance in practice, especially if you're doing other things with the machine too, such as running an IDE.)

Memory leaks are of course always bad and should be fixed. However, I have to say that a much more pressing issue is the tendency for the interface to lock up (especially on less powerful systems) if one tab gets stuck loading or has to deal with poorly coded JavaScript. Mind you, it is perfectly possible that the two issues are related, and since my knowledge of the inner workings of Firefox is, to put it very mildly, limited, I suppose I can't really judge what kind of changes would be hard to implement...

Although I'm sure there are some Slashdotters who run Firefox on a 350 MHz PII with 256MB of memory, that is really not an issue for me. Most people with a recent PC probably have over a gig of memory, and more like 2, onboard.

Cases where the browser all of a sudden is sucking down 100% of your CPU (or of a single core) and/or crashes are just as important, or more so. More than likely the memory leaks have related browser-stability issues that can be addressed with single fixes, but if the browser continues to have runaway CPU issues and crashes, it will not matter HOW small its memory footprint is.

I think we've got to the root of the problem that you and some other Firefox 3 Beta 1 testers are seeing.

Starting yesterday, we began receiving reports, like yours, of a new memory/cpu usage issue that happens shortly after a normal startup and can spike the CPU and chew up hundreds of MB of RAM. This is apparently happening to people with new profiles or in profiles that have a very outdated list of bad sites for the Phishing Protection feature.

What's going on is that soon after Firefox is started, Firefox tries to fetch updates to the site forgery list -- the lists of bad sites that allows Firefox to warn users about suspected Phishing attacks. If the profile has very outdated or no local list, as is the case for a new Firefox profile, Firefox is trying to bring down a complete, rather large, list in one big chunk rather than slowly in small chunks. This causes Firefox to consume large amounts of CPU and memory and can slow the users machine to a crawl.

This problem is due to the change in the "SafeBrowsing Protocol" which only affects Firefox 3 Beta 1 and nightly build users. If you're on Firefox 2, this isn't going to affect you.

The work-around for this problem was for us to throttle it on the server side. We've done that and if you try Firefox 3 Beta 1 again, it should be fine.
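The fix described above -- pull the list down in small pieces and apply each as it arrives, instead of buffering one huge blob -- can be sketched like this. The function name and the 4 KB chunk size are hypothetical illustration, not Firefox's actual code:

```python
import io


def fetch_in_chunks(read_chunk, apply_chunk, chunk_size=4096):
    """Stream a large update list, merging each small piece into the local
    database as it arrives, so peak memory stays near one chunk."""
    total = 0
    while True:
        chunk = read_chunk(chunk_size)
        if not chunk:
            break
        apply_chunk(chunk)   # e.g. merge into the local SQLite database
        total += len(chunk)
    return total


# Simulate a 1 MB server response with an in-memory file-like object.
server = io.BytesIO(b"x" * (1024 * 1024))
received = []
n = fetch_in_chunks(server.read, received.append)
assert n == 1024 * 1024
assert max(len(c) for c in received) <= 4096   # never holds more than one piece
```

The whole list still gets transferred; the difference is that the client never holds more than one small chunk of it in memory at a time.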

Mostly on Windows, most people's RAM is stretched to the limit, so if a program people use every day (Firefox) decreases its memory usage, the developers can then focus on speed; and in the end, if Firefox can be 2X as fast as IE, Konqueror (and by extension Safari), and Opera, people will switch to it. I actually have around 512 MB on both my laptop and desktop, with the desktop currently running Xubuntu and my laptop running Ubuntu 7.10 happily. And Linux can resurrect a "dead" system, like a crashed Windows box someone may give you for $10 that happens to have 256 MB of RAM and a slower but usable processor like a Pentium III; Linux runs fine on it. But if FF runs slowly, most people have little need for a computer, since they can't browse the web with it.

What Firefox needs is an about:config option like "use RAM-saving rendering", set to true by default; if you have loads of RAM, just set it to false. Good for people running Firefox on slower systems, and good for people with 4 gigs of RAM.