You don't always need 64bit, but the frequency with which you do need it is increasing every day. Consumer laptops now frequently come with >4GB of RAM, and a browser with many tabs open can easily consume more than 4GB. And remember, the 4GB limit is address space, not the total RAM usage of a process.
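A quick way to see the address-space point: a 32-bit pointer can only name 2^32 distinct bytes, regardless of how much physical RAM is installed. This sketch just does the arithmetic and checks the pointer width of the running interpreter (nothing here is browser-specific):

```python
import sys

# A 32-bit pointer can address at most 2**32 bytes of *virtual address space*,
# no matter how much physical RAM the machine has.
ADDRESSABLE_32BIT = 2 ** 32
print(ADDRESSABLE_32BIT // (1024 ** 3), "GiB")  # 4 GiB

# CPython derives sys.maxsize from the platform pointer width:
# 2**63 - 1 on a 64-bit build, 2**31 - 1 on a 32-bit build.
is_64bit_interpreter = sys.maxsize > 2 ** 32
print("64-bit interpreter:", is_64bit_interpreter)
```

So a 32-bit process can exhaust its address space (through fragmentation, mappings, etc.) well before the machine runs out of RAM.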

Having both 64bit and 32bit support requires support in the kernel, two sets of userland libraries, etc., and the 32bit libraries will contain support for more legacy features (i.e. anything that was deprecated before 64bit was introduced likely won't have been compiled into the 64bit builds).

So yes, individual 64bit apps may consume more resources than 32bit ones, but having a mix of 32/64 and all the legacy baggage associated with 32bit libs going back 20+ years could actually result in higher resource usage than a pure 64bit system.

Then there are the quirks of amd64, where 64bit mode adds a lot more registers for example (the general-purpose register count doubles from 8 to 16)... The lack of registers in 32bit mode can be a performance bottleneck, which is eliminated by running in 64bit mode. Many programs run faster, even if they don't take advantage of any other 64bit features.

By only supporting 64bit you also raise the lowest common denominator: there are more CPU features you can take for granted (SSE2, for instance, is guaranteed on amd64) and use without having to maintain multiple code paths to support older processors.
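To make the "multiple code paths" cost concrete, here is a hypothetical sketch of the runtime dispatch a library needs when it *cannot* take a CPU feature for granted. The feature names and the `detect_cpu_features` function are illustrative stand-ins, not a real API:

```python
def detect_cpu_features():
    # In real code this would come from CPUID (x86) or /proc/cpuinfo;
    # here we hard-code the amd64 baseline, where SSE2 is guaranteed.
    return {"sse2"}

def sum_squares_scalar(values):
    # Fallback path for CPUs without the vector extension.
    return sum(v * v for v in values)

def sum_squares_sse2(values):
    # Stand-in for a hand-vectorised SSE2 implementation; same result,
    # but a real one would process several elements per instruction.
    return sum(v * v for v in values)

def make_sum_squares(features):
    # Pick the fastest supported implementation once, at startup.
    return sum_squares_sse2 if "sse2" in features else sum_squares_scalar

sum_squares = make_sum_squares(detect_cpu_features())
print(sum_squares([1, 2, 3]))  # 14
```

On a pure 64bit build the dispatcher, the fallback path, and the testing burden for both branches simply disappear: you compile the SSE2 path unconditionally.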

There are many benefits to moving towards pure 64bit... The stupid thing for Apple is that they never should have supported 32bit x86 at all... Microsoft has a long legacy of 32bit x86 support, but Apple moved from PowerPC to x86 *after* amd64 was already established. They could very easily have made OSX 64bit-only right from the very first non-PowerPC version.

I understand your concerns about 64bit performance and the bloat of maintaining 32bit support. However, I have to wonder why browsers need so much memory nowadays; web pages don't feature 4K pictures. Coders should be more frugal about memory consumption.

Apple chose Intel because of the deal, the better overall performance and power efficiency of Intel's 2006 lineup compared to AMD's offerings, and also the integrated WiFi AMD was lacking (the whole Centrino stuff).

Apple made the transition in early 2006, when the Core 2 Duo would only become available later that year, so the first Intel Macs shipped with 32bit CPUs.

High memory consumption is due to aggressive caching strategies that give you faster reload times. Many people visit the same sites throughout the day, and to improve load times it is reasonable to cache whatever is possible and keep it in memory. If you load a web page that is 40MB, the next time you visit it can load as little as 1MB, and no, it won't fall back to the disk cache when enough RAM is available.
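The RAM-for-speed trade-off above can be sketched as a toy in-memory cache (class and resource names are made up for illustration): cached resources are served from memory, so only resources not yet cached cost network traffic.

```python
class MemoryCache:
    """Toy browser-style cache: keeps fetched resources in RAM."""

    def __init__(self):
        self._store = {}        # url -> response body, held in memory
        self.bytes_fetched = 0  # bytes that actually went over the network

    def fetch(self, url, network):
        if url not in self._store:            # cache miss: hit the network
            self._store[url] = network[url]
            self.bytes_fetched += len(network[url])
        return self._store[url]               # cache hit: served from RAM

# Simulate a page made of three resources (sizes are arbitrary).
network = {
    "/index.html": b"x" * 1_000,
    "/app.js":     b"y" * 40_000,
    "/logo.png":   b"z" * 5_000,
}
cache = MemoryCache()

for url in network:                 # first visit: everything is downloaded
    cache.fetch(url, network)
first_visit = cache.bytes_fetched   # 46000 bytes

for url in network:                 # revisit: served entirely from RAM
    cache.fetch(url, network)
print(cache.bytes_fetched - first_visit)  # 0 extra bytes
```

The cost is exactly what the comment describes: those 46000 bytes now sit in the process's memory so the revisit is nearly free, which multiplied over many tabs and sites is where browser RAM usage goes.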