If there are two apps that do the same thing, and one uses less RAM than the other, I usually go with the one that consumes less.
However, if the better app uses more memory, I don't mind much. If it's worth it, I'll use it regardless.

It's more about the quality of the app itself than the memory usage. I have 512 MB, which is mostly enough (although I'm upgrading to 1 GB soon), and I'd still rather spend more memory on better-managed features than use something that consumes less memory but isn't as good.

I’ve got 512 MB of RAM installed (XP Home) and habitually check the swap file for use. I can’t find any application, including AutoCAD 2005 running alongside Bryce, that has touched the swap file for normal memory use in recent months. This suggests to me that 1 GB of memory usage is beyond most commercial software at present, even on large projects. The XP operating system, built on the now-aging NT kernel with plenty of bolt-on bits, and the memory-management model used by most software running on it (much of which has roots in MS-DOS) have a history and culture of conserving memory, so this never seems to be much of an issue. Games are another story, as are professional graphics modellers using multiple CPUs, but that’s a separate discussion.

As memory prices fall, programmers and programming techniques will become sloppier, and so memory usage will increase. To some extent this is already happening; just look at the difference between Office 2000 and Office 2003. Memory use itself has never been the real issue; memory leakage matters more, and it is usually caused by poorly written programs where optimisation of the basic code never took place due to cost.

As memory prices fall programmers and programming techniques will become sloppier and so memory usage will increase


"Sloppier" is not the right word, Dave; that's a little too cynical, no?

When programmers know users have more memory, they'd be poor programmers indeed if they didn't take advantage of the resources available, wouldn't they? Graphics, user interface, everything will improve if the programmer knows more resources are available.

For instance, when a programmer knows a user has dual CPUs, he'd better take advantage of those extra quanta, otherwise his program won't be as good as it could be. Same thing with memory: if I'm a programmer and I know most users have a gig of memory, you'd better believe the graphics and animation are gonna reflect what I know the users' hardware can handle.
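Taking advantage of extra CPUs is straightforward to sketch in code. Here's a minimal Python example (the function names are my own, and it assumes a CPU-bound job that splits into independent chunks) that fans a sum out across however many cores the machine reports:

```python
from concurrent.futures import ProcessPoolExecutor
import os

def partial_sum(bounds):
    # Each worker process sums its own slice of the range.
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=None):
    # One chunk per available core (falls back to 1 if undetectable).
    workers = workers or os.cpu_count() or 1
    step = n // workers + 1
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial version, spread across the cores.
    assert parallel_sum(100_000) == sum(range(100_000))
```

The point stands either way: a program written for one core and 128 MB simply can't look or behave like one written knowing the user has multiple CPUs and a gig of RAM.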

For instance, I have 512 MB, which used to be fine, but now that I'm running a GPS program and leaving it going while I do other things on my box at the same time, my memory comes under pressure all the time.

Well, I’ve been doing mostly this and that, sort of chilling in (as opposed to out) and watching television programmes about the Prohibition era in America and its implications in the rise and politicisation of mobsters, some large corporations, and the Kennedy family, all of whose fortunes were made in that era.

Anyway, I suppose the way forward, as I see it, in the personal computer world must be towards multiple CPUs. This is already happening, with all CPU manufacturers now nearly ready to produce two or more CPUs on a chip. This is not new technology but old, and has been around for more than ten years now. In only a few years’ time, when you buy a CPU it will contain 1 GB of memory on the chip itself and four CPUs, along with all the memory controllers and so on. This is already on the drawing boards at AMD and Intel. Having said that, I remember the single PC-on-a-chip being mooted ten years ago; that never happened, for various reasons, but mostly production tolerances, ever-changing standards, and a lack of flexibility in both the manufacturing process itself and software reliability, which is still an issue today.

Hard disk drives of the mechanical variety, whilst cheap at the moment, are already obsolete as a technology, and have been for some years. What’s the point of having a mechanical device (including CD/DVD writers etc.) if you can just plug a 50 GB solid-state memory pen into whatever port to transport data?

I have 512 MB of memory, and I am constantly looking at the footprint of my programs because I play Neverwinter Nights, and that needs all the memory and CPU time it can get, so I am constantly "trimming the fat" where I can.

Speaking of torrent clients, ABC is by far the best. I've tried Azureus on the recommendation of my peers, but it would bring my computer to a screeching halt.

I have what is considered to be a fairly fast computer:

2600 XP
512MB DDR400 dual channel

Plenty of hard-drive space, yet it often turns into a snail race with just the BitTorrent client running against a freshly booted OS. Oddly enough, the resource consumption looks forgivable, but the performance says otherwise. ABC has been the best of the lot.