Posted by timothy on Saturday October 08, 2011 @09:21AM
from the those-sound-like-good-ideas dept.

bheer writes "Microsoft's Windows 8 blog has a good post about the work being done to reduce Windows 8's memory footprint. The OS will use multiple approaches to do this, including combining RAM pages, re-architecting old bits of code and adding new APIs for more granular memory management. Interestingly, it will also let services start on a trigger and stop when needed instead of running all the time."

Except for the services part, Windows memory management has been improving a lot with each version. It made a huge difference when they let the OS decide more intelligently where to put resources that aren't in use.

Most people who don't really understand memory management will just look at the process list and start bitching about how much memory each program uses, or about how Windows shows there isn't any memory available (when in fact it's just being used for caching). They're only half-informed, which hurts them even more than not knowing at all. The fact is, unused memory is wasted memory. Any time memory sits free, it's doing nothing useful. It's a much better approach for the OS to try to use all of it intelligently.

This same pattern of stupid comments can be seen in browser comparisons too. They're always full of people going "omg Firefox/Opera/IE is using this much memory!", which just shows that they don't understand what is really happening. The browser and OS reserve that memory because it speeds things up. If the memory is needed elsewhere, it can and will be freed. That seems to be really hard for people to understand, as the same thing happens in every browser story or story about memory management.

By what mechanism can a browser know when the memory it has reserved is needed elsewhere in the system? I don't think it works that way.

When people complain about browsers needing excessive amounts of memory they usually refer to memory leaks, not to intelligent use of memory through caching.

The bit about how some people misinterpret the amount of free memory the OS reports is totally true, though.

I recall people complaining that their Vista system with 8GB of RAM had no free memory. This was true of systems running with 2 GB RAM and 16 GB RAM. This tells me that much of that was cache but that didn't stop people from claiming that Vista was a memory hog.

However, you do have a point about the browser. If I leave Firefox running on a page that refreshes itself, like Slashdot, over the weekend, when I come back to the machine Firefox is using up to a GB of RAM and everything else has been swapped out to the HDD. It takes several minutes for the system to become snappy again, and it usually requires a force close of Firefox. Firefox caches pages on its own and the OS knows nothing about it. All it sees is that Firefox.exe "needs" 1 GB of RAM.

Maybe Windows should find a better way to report how much memory is "in use", then? On Linux, there is no "free memory" either. It's always been a truism that Linux will use all available memory, even if it's just for caching. But I keep System Monitor open most of the time, and it reports that I'm using about 60% of my memory. Apparently, System Monitor recognizes that cache memory is actually "available memory".
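The "available" figure the poster describes can be approximated from /proc/meminfo as free + buffers + page cache, which is roughly what tools of that era did (the kernel's own MemAvailable field came later). A minimal sketch, parsing a sample snapshot rather than the live file, so the numbers are purely illustrative:

```python
# Approximate "available" memory as free + buffers + page cache.
# SAMPLE_MEMINFO is made-up sample data, not read from a live system.
SAMPLE_MEMINFO = """\
MemTotal:        8174624 kB
MemFree:          523104 kB
Buffers:          310012 kB
Cached:          2941200 kB
"""

def parse_meminfo(text):
    fields = {}
    for line in text.splitlines():
        key, rest = line.split(":", 1)
        fields[key] = int(rest.split()[0])  # values are in kB
    return fields

info = parse_meminfo(SAMPLE_MEMINFO)
available_kb = info["MemFree"] + info["Buffers"] + info["Cached"]
used_pct = 100 * (info["MemTotal"] - available_kb) / info["MemTotal"]
print(f"available: {available_kb} kB (~{used_pct:.0f}% truly in use)")
```

Note this heuristic over-counts slightly, since some cached pages (dirty pages, tmpfs) can't actually be dropped, which is exactly the caveat raised elsewhere in this thread.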

I'm no memory management expert, so I'm not going to get into an in-depth pissing contest.

Which is true on Windows as well. I'm honestly not sure where people are getting this from; I have never seen a Windows box reporting anywhere close to 100% utilization that didn't also have disk thrashing. Possibly people are using RAM cleaners or something, but either Task Manager does not report the caching as used RAM, or the caching is nowhere near as extensive as reported.

In 7 at least you can go to "Resource Monitor" under the Task Manager Performance pane, and Windows gives a detailed breakdown of the way it is using RAM (Hardware Reserved / In Use / Modified / Standby (i.e. cache) / Free). The last two add up to "Available". It also shows how much on-disk cache you are using. Interestingly, 8 isn't using all my memory for pre-fetching any more (it was when I upgraded from 2 GB to 6 GB).

However, you do have a point about the browser. If I leave Firefox running on a page that refreshes itself, like Slashdot, over the weekend, when I come back to the machine Firefox is using up to a GB of RAM and everything else has been swapped out to the HDD. It takes several minutes for the system to become snappy again, and it usually requires a force close of Firefox. Firefox caches pages on its own and the OS knows nothing about it. All it sees is that Firefox.exe "needs" 1 GB of RAM.

A mechanism would be interesting where a certain process (say firefox.exe) would have a physical memory cap (say 256 MB) - it could acquire more, but everything over the cap would be swapped. Only the "most alive" part of Firefox would be held in RAM. Then the rest of the OS would not be swapped out when Firefox goes a bit crazy. And it would create an automatic side effect where background tabs are "hibernated" when not in use. Then again, swap is starting to be a relic of the old days, so I don't know if it's worth the trouble.
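The cap idea above can be pictured as a toy simulation: a process touches pages, and anything beyond a fixed resident limit is evicted to "swap" in least-recently-used order. The page numbering and the cap of 3 pages are made-up values for illustration, not how any real OS sizes working sets.

```python
from collections import OrderedDict

class CappedResidentSet:
    """Toy model: pages beyond a per-process cap get 'swapped out' (LRU)."""
    def __init__(self, max_resident_pages):
        self.max_resident = max_resident_pages
        self.resident = OrderedDict()   # page -> True, in LRU order
        self.swapped = set()

    def touch(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)       # mark recently used
            return "hit"
        self.swapped.discard(page)                # fault it back in if swapped
        self.resident[page] = True
        if len(self.resident) > self.max_resident:
            victim, _ = self.resident.popitem(last=False)  # evict LRU page
            self.swapped.add(victim)
        return "fault"

proc = CappedResidentSet(max_resident_pages=3)
for page in [1, 2, 3, 1, 4]:        # touching page 4 exceeds the cap
    proc.touch(page)
print(sorted(proc.resident), sorted(proc.swapped))   # [1, 3, 4] [2]
```

Only the "most alive" pages stay resident; page 2, the coldest, is the one pushed out, which is the background-tab-hibernation side effect the comment describes.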

That would absolutely kill performance, and could make certain programs unusable. The current system seems to work fine. Having used Firefox for several years (since the 0.9 days), I have never seen this awful RAM consumption except around 1.5, when I had about 25 addons. These days I tend to use Chrome, but when I do use Firefox it tends to remain open for days, with no issues.

As for the automatic "hibernating" of tabs when not in use, that already happens. Tabs that haven't been used in a while are swapped out.

A mechanism would be interesting where a certain process (say firefox.exe) would have a physical memory cap (say 256MB)

That sort of reminds me of how Mac OS worked in the dinosaur age.

Well, early MacOS and Windows. I had MacOS 6 and Windows 3.1 running on the same system (two motherboards sharing disk and peripherals). When I tried to launch an app whose preallocated memory footprint would bring my total over the cap, MacOS stopped me until I quit something. Windows 3.1 just crashed, which also stopped me. :) Then Windows leapt ahead with Win 95, which crashed some of the time but usually actually handled the memory for me.

More useful would be a message asking tasks to free up memory if they can. Tasks that can't (or that predate the new message) would simply ignore it, and the OS would deal with them just as it does now.

Tasks that are just holding onto memory for caching or other non-immediate uses could potentially free up a lot. Obviously this wouldn't apply to a whole lot of programs, but applying it to a handful of important ones (say, browsers) could make a lot of difference.

Of course the OS would still need to be smart about it - it would be too slow to try that on the fly - but the OS could easily determine when it thinks its own cache is getting too small and start bugging programs to free up theirs.
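The cooperative scheme sketched above could look something like this: applications that can shed cache register a trim callback, and a broker on the OS side invokes them only when system cache dips below a watermark; apps that never register are simply handled the old way. All names and numbers here are hypothetical - this is a concept sketch, not any real OS interface.

```python
class TrimBroker:
    """Hypothetical OS-side broker: asks willing apps to shed cache
    only when system cache pressure crosses a low watermark."""
    def __init__(self, low_watermark_mb):
        self.low_watermark_mb = low_watermark_mb
        self.callbacks = []   # apps that opted in

    def register(self, trim_fn):
        self.callbacks.append(trim_fn)

    def check_pressure(self, os_cache_mb):
        freed = 0
        if os_cache_mb < self.low_watermark_mb:
            for trim in self.callbacks:
                freed += trim()   # each callback returns the MB it released
        return freed

# A browser-like app holding a page cache it can shrink on demand.
browser_cache_mb = 400

def browser_trim():
    global browser_cache_mb
    released = browser_cache_mb // 2    # give back half, keep the hot half
    browser_cache_mb -= released
    return released

broker = TrimBroker(low_watermark_mb=256)
broker.register(browser_trim)
print(broker.check_pressure(os_cache_mb=300))   # above watermark: frees 0
print(broker.check_pressure(os_cache_mb=100))   # below watermark: frees 200
```

A program that predates the mechanism just never registers, so the broker's loop skips it and the OS pages it out the usual way, exactly as the comment proposes.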

I recall people complaining that their Vista system with 8GB of RAM had no free memory. This was true of systems running with 2 GB RAM and 16 GB RAM. This tells me that much of that was cache but that didn't stop people from claiming that Vista was a memory hog.

One thing is cache: leaving things in memory after you've used them. But early Vista had a very aggressive take on SuperFetch, pre-caching applications and other resources it thought you might need, in best Clippy style. So no matter how much RAM you had, Vista would churn your disk. I think most people mistakenly identified this as swapping because Vista was out of memory. From what I understand, later SPs and Win7 dialed it back to far more reasonable levels.

Except in practice, if a Windows [XP | Vista | 7] box is dragging tail and Task Manager shows physical RAM utilization at 94%, you can be darn sure that the cause is RAM exhaustion and the disk is thrashing against the page file.

The ONE example I have seen of all RAM being legitimately used-- no matter how much you have-- is Exchange 2007 / 2010. No matter how much RAM you throw in the box, it will reserve as huge a chunk of it as it can.

By what mechanism can a browser know when the memory it has reserved is needed elsewhere in the system? I don't think it works that way.

I don't think any browsers actually use this, but since Vista, Windows has allowed memory to be prioritized. The various caching algorithms use this to ensure that even though a memory page has been used for caching, it will be given up instantly when the memory is needed by an actual application (allocating at higher priority). When the cache later tries to access the page it will get a page fault. If it then tries to allocate and there is no free or lower-priority memory, the allocation will simply fail.
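The priority scheme described can be modelled as a toy allocator where each page carries a priority and an allocation may only evict pages colder than itself; a cache-priority request that finds nothing colder fails instead of evicting. This is a conceptual sketch of the idea, not the actual Windows memory manager.

```python
import heapq

class PrioritizedMemory:
    """Toy model of prioritized pages: a high-priority allocation may
    evict a lower-priority (cache) page; a low-priority request fails
    rather than evict anything at least as hot as itself."""
    def __init__(self, total_pages):
        self.free = total_pages
        self.pages = []       # min-heap of (priority, page_id)
        self.next_id = 0

    def alloc(self, priority):
        if self.free == 0:
            if not self.pages or self.pages[0][0] >= priority:
                return None                   # nothing colder to evict: fail
            heapq.heappop(self.pages)         # evict the lowest-priority page
            self.free += 1
        self.free -= 1
        self.next_id += 1
        heapq.heappush(self.pages, (priority, self.next_id))
        return self.next_id

mem = PrioritizedMemory(total_pages=2)
mem.alloc(priority=1)               # cache fills memory...
mem.alloc(priority=1)
app_page = mem.alloc(priority=5)    # app allocation evicts a cache page
cache_page = mem.alloc(priority=1)  # cache can't evict anything: fails
print(app_page, cache_page)         # 3 None
```

This captures the asymmetry in the comment: cache pages are given up instantly to real applications, but the cache itself can never push out application memory.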

So the AC points out how people who don't know what they're talking about "form a highly vocal opinion about it anyway!" and then proceeds to do the exact same thing about poverty in Texas, riddled with so many false assumptions it doesn't warrant a point by point response.

"Except for the services part, Windows memory management has been improving a lot with each version."

Are you forgetting Vista? It's only two versions of Windows ago. Windows 7 certainly improved on Vista, but Vista's memory requirements were hugely greater than XP's, for seemingly little benefit (despite all the little tricks they introduced).

"The browser and OS reserves that memory because it speeds up things. If the memory is needed elsewhere, it can and will free it up."

I understand the concept of RAM caching - it's not exactly rocket science. But how does Firefox/Opera/IE free up memory when the OS needs it? What is the mechanism by which the OS tells the browser to free memory?

I hope you're not referring to paging. Excessive paging to and from disk as you switch between applications is not a sign of a well-performing system.

I hope you're not referring to paging. Excessive paging to and from disk as you switch between applications is not a sign of a well-performing system.

Well, yes and no. If I remember my NT 6.x (Vista/7) kernel design correctly, it was basically that every program requests memory, and some of it starts getting paged out asynchronously during idle time. This lets the system cache and ReadyBoost, if enabled, really start working, keeping unused RAM to a bare minimum.

Unused RAM is wasted RAM, but you always need to have a little for the next application you want to run.

I understand the concept of RAM caching - it's not exactly rocket science. But how does Firefox/Opera/IE free up memory when the OS needs it? What is the mechanism by which the OS tells the browser to free memory?

I suggest you don't really understand the concept of RAM caching at all.

OS memory management does not rely on the cooperation of applications to free up memory on request. It simply pages out data pages on a least-recently-used basis. Firefox might be sitting on a mountain of cache, but you can bet that 99% of it is not in active use, and the system will simply write those pages out and let some other program use the storage.

It doesn't ask permission from the application. It asserts control. This is true of every modern OS.

Too many people view 'free' memory as a good thing and would complain if I/O cache were reduced to improve the 'free' number. However, they can find a measure to soothe their worries. I assume it is also the case on Windows, but on Linux, for example, the categorization of memory as disposable cache is clearly delineated (though some cached memory can't be disposed of, and it's hard to tell what *that* value is, which is a problem). If free memory comes under pressure, cache is safely dropped, and it is as if the memory had been free all along.

It's using that much memory because of the incredible size of the page which it has to render and hold in memory as an image.

And for the record, I have it open right now in both Chrome (1 tab) and Firefox 7 (1 tab). Chrome, between its main process (~100 MB) and its Slashdot process (~30-50 MB), weighs in at 151 MB at the moment. Firefox, having a single process, weighs in at 140-150 MB (it varies over time). When I open a second tab with the Slashdot homepage, Firefox shoots to 170 MB, while Chrome shoots to ~

So what you are saying is Microsoft is spinning its wheels, there really is no problem with memory bloat in Windows, and everyone else is wrong. I guess that's why PCs ship with only 4 GB of memory as standard these days. And if Windows has been getting better at memory management with each version, why do the memory requirements go up with each version? Is it really doing that much more than the last version? Don't bother answering, I don't really care. From what I see, this article is more propaganda than news.

Nice try, but you can't simply give apps carte blanche to use gobs of memory by assuming it's being put to good use and speeding things up. True, there's a good case for the OS to use lots of RAM for caching, since it's the 'overseer' making all the programs get along and has the best picture of when one app needs to sacrifice for the others. But when a single application is a memory hog, it can no longer play nicely with others.

Bottom line: most memory-hogging software fails to run well at all on machines with modest amounts of RAM.

I still remember that at one place I worked they had me clean up a memory leak. Unfortunately I couldn't get it past QA, because they didn't understand caching. Basically, when you free memory it goes back to the memory pool for the process, and then the pool decides when to release it to the OS (which may be never). So when I freed my memory in a debug build, the pool immediately returned it to the OS. When QA did the same thing in a release build, the pool held on to the memory and reused it. I even showed them how, if you ran the process through its work multiple times one after the other, you could actually see the app use more and more memory before the fix, while after the fix it would plateau (because it was just re-using the memory it had already allocated). They totally didn't understand; I might as well have explained it to the pavement outside the building. In the end it just got marked as unfixable. After that, if I saw any memory leaks while coding, I fixed them as part of other bugs and didn't mention it to QA.
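The pool behaviour in that story can be modelled directly: freed blocks go back to a per-process free list rather than to the OS, so repeated allocate/free cycles plateau, while a real leak keeps pulling fresh blocks from the OS each cycle. This is a sketch of the concept, not any particular runtime's allocator.

```python
class ProcessPool:
    """Toy allocator: free() returns blocks to a free list, not the OS,
    so the process footprint plateaus once the pool is warm."""
    def __init__(self):
        self.from_os = 0       # blocks ever requested from the OS (footprint)
        self.free_list = []

    def alloc(self):
        if self.free_list:
            return self.free_list.pop()   # reuse: footprint unchanged
        self.from_os += 1                 # footprint grows
        return object()

    def free(self, block):
        self.free_list.append(block)      # kept for reuse, never released

# Fixed code: every block is freed, so the footprint plateaus.
pool = ProcessPool()
footprints = []
for _ in range(5):            # five alloc/free cycles of 3 blocks each
    blocks = [pool.alloc() for _ in range(3)]
    for b in blocks:
        pool.free(b)
    footprints.append(pool.from_os)
print(footprints)             # [3, 3, 3, 3, 3]

# Leaky code: one block per cycle is never freed, so it keeps climbing.
leaky = ProcessPool()
growth = []
for _ in range(5):
    blocks = [leaky.alloc() for _ in range(3)]
    for b in blocks[:-1]:     # "forget" to free one block each cycle
        leaky.free(b)
    growth.append(leaky.from_os)
print(growth)                 # [3, 4, 5, 6, 7]
```

The plateau-versus-climb distinction is exactly the test the commenter showed QA: a warm pool holding freed memory is not a leak, a footprint that grows without bound is.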

Except for the services part, Windows memory management has been improving a lot with each version. It made a huge difference when they let the OS decide more intelligently where to put resources that aren't in use.

I can't really speak to that, but there is a difference between management and footprint. I can speak to footprint. I usually run Windows in a VM for testing purposes or just to run the occasional Windows app. At some point the amount of RAM Windows can access becomes a limitation on usability and more needs to be allocated to make using it anything but an exercise in frustration. For all practical purposes, this is a minimum memory footprint for Windows and a few basic apps.

A) You are using crappy plugins.
B) Your profile has been in use for ages, and some corruption has crept in. Try backing your profile up and creating a new one.
C) Your usage has simply increased and you haven't realized it. How many tabs are we talking here? How many are gigantic scrollable blogs?
D) You're doing something else wrong.

I hate to pull in anecdotal evidence here, but I've dealt with several hundred unique installations of Firefox across several diverse platforms and scenarios, ranging from WinXP-

The problem is that if it only shows up on some machines, then it's not something the developers can fix on their own. And it's not just the web browser that can lead to unreasonable memory consumption; poorly coded or bloated web pages and extensions can also end up driving memory use.

The problem is that if it only shows up on some machines, then it's not something the developers can fix on their own. And it's not just the web browser that can lead to unreasonable memory consumption; poorly coded or bloated web pages and extensions can also end up driving memory use.

Well, that's the thing, isn't it. Go to a site that runs Flash, for example, and you're invoking a large chunk of code over which the browser has little control. If people hadn't decided to turn the World Wide Web into a giant, animated color brochure but had stayed focused on content, we probably wouldn't be having this discussion.

Actually Windows HAS been getting better about memory with each version - except Vista, of course. For example, Win 7 has delayed-start services, and with SuperFetch and ReadyBoost, RAM is used much more efficiently for the things you actually do. As for why so many services are on at once? I can answer that: grandmas. You'd be surprised how many times I've had to deal with someone's PC because they have a "power user" in the family who killed some service like imaging support, and now their scanner won't work.

In a normal program in traditional desktop programming, state information in RAM cannot be disposed of willy-nilly. Notably, Android took the opportunity of a new platform to declare out-of-view applications' RAM content disposable by default, to get this benefit in 'normal' programs. Hard for Windows to realistically do that. On the other hand, as an Android user, it is sometimes painfully obvious when an app I was 'running' in the background was killed by Android, so despite the promise, the tradeoff is visible.

> it will also let services start on a trigger and stop when needed instead of running all the time.

Nice.

Although I have to wonder, why are "services" treated differently than other programs, in this context or any other? Does it have any positive effect?

First of all, it's worth noting that Service trigger events shipped with Windows 7 [microsoft.com].... they're just making better use of this capability in Windows 8. (This is a common flaw with Microsoft's development process for Windows.... they include some really smart new APIs but then take another 5 years to start really using them thoroughly in Windows itself.)

But to your main question -- why are services different from other programs? A service is actually a regular program, with one exception -- it hooks into the operating system to receive events telling it to pause, continue or stop its operation.

Why do this? Management. You don't want 20 different programs with 20 different ways of starting & stopping them.

A feature the Windows Service Control Manager offers is the ability to run your service in a single pooled process alongside other services that require roughly the same privileges on the system. You can see this at work in the Windows 7 Task Manager: go to the Services tab and sort the list by PID. If you ever wondered what "svchost.exe" is on a Windows system, or why there are several copies running, each under a different user account... there you go.
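The pooling described above can be pictured as grouping services by the account (and so roughly the privilege set) they run under, with one host process per group, loosely analogous to `svchost.exe -k <group>`. The service names and account assignments below are illustrative sample data, not a real service configuration.

```python
from collections import defaultdict

# Hypothetical service descriptions: (name, account it must run under).
services = [
    ("EventLog",  "LocalService"),
    ("DHCP",      "LocalService"),
    ("Schedule",  "LocalSystem"),
    ("Winmgmt",   "LocalSystem"),
    ("WebClient", "NetworkService"),
]

# One pooled host process per account, like the svchost groupings
# visible when you sort the Services tab by PID.
hosts = defaultdict(list)
for name, account in services:
    hosts[account].append(name)

for account, grouped in sorted(hosts.items()):
    print(f"svchost[{account}]: {', '.join(grouped)}")
```

Five services end up in three host processes, which is why a Windows box shows several svchost.exe instances rather than one process per service.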

My first self-owned computer was a Kaypro 4-84. The OS was CP/M and the machine came with 64K (yes, K) of RAM. When it booted up, the screen said it had 63K of RAM. I thought I had been ripped off, so I called the company. The tech explained that the other 1K was being used by the OS. So I don't think Windows 8 is going to impress me.

It's been a long time since I've dealt with Windows other than XP in a VM, and even that is rare.

My old Asus netbook recently died, so I was forced to go out and buy another. I bought an Aspire One loaded with W7S. I really wanted to like W7. Really. I liked the interface. But damn, it was really slow and memory hungry. With no programs running, it was taking up about 560-580 MB of memory, compared to Ubuntu (11.04) taking 260-270 MB with no programs running.

I really couldn't have more than two programs running in W7 without hitting 900 MB of memory use. Granted, they were big programs - Thunderbird and Firefox, both latest versions. But contrast that with Ubuntu, where I ran TB, FF, Pidgin, Hotot, Tomboy, LibreOffice and Rhythmbox all at the same time and never got above the 850 MB mark (at least not yet).

This release of Ubuntu has its own set of problems (Compiz, anyone?), but I much prefer it to W7. If MS can get Windows' memory usage down, I'd be more inclined to use the latest version.

I don't understand this way of looking at memory. Unused memory is just that, unused. It's basically wasted. Why is this seen as an advantage? As long as enough memory is freed up when it's needed then I'd rather have the OS find a use for it, hopefully for things like cutting down application loading time. Whether W7 puts it to good use or not is up for debate, I personally have no idea.

Besides, 4 GB is pretty standard these days; I really don't see the problem in an OS taking ~12% of that when it's not needed elsewhere.

Because the gains from having more gigabytes of I/O cache, for instance, are not that big compared to the time taken when an application actually needs that memory and the OS has to flush the cache to disk. So yes, this makes things seem slow, especially when you want to open a program. Remember, most Windows programs are killed when their last window is closed.

Since the parent mentioned VMs, guest OS instances are a good example of where a memory-hogging OS is a bad thing. It would be better to let each guest's memory usage rise and fall with its load; instead, aggressive caching basically forces static memory allocation for VMs, because whatever you set as the maximum, they'll use. This is a problem with Java too: you just guess how much heap the program might need, but don't guess too high, because the JVM tends to use whatever you let it, since garbage collection is deferred while there is still headroom.

That argument comes up frequently and is usually an attempt to justify swap algorithms that are too aggressive at paging RAM out. The problem is that a lot of things go in and out of RAM frequently, and you will notice a significant drop in performance if that's happening regularly.

I remember spending many hours trying to figure out how I could get those last few KB of RAM freed up so that I could run my fancy new DOS game that really had to have either 512 KB or 640 KB of low-memory RAM. When I wasn't g

I remember spending many hours trying to figure out how I could get those last few kb of RAM freed up so that I could run my fancy new DOS game

Me too. It became sort of a game in itself to get everything loaded high. Using QEMM and a lot of fiddling, I got ALL my drivers (network, mouse, whatever) into upper memory, leaving only about 40 KB of conventional memory in use. Yeah, I know, a ridiculous waste of time, but it was entertaining.

The vast majority of people -- even those who think they know how to interpret Windows memory statistics -- don't know how to interpret Windows memory statistics. The common tools (like Task Manager) give meaningless numbers for both process and total system usage. Sysinternals' Process Explorer is better, but you still need to understand how the Windows kernel and memory management work to properly interpret the numbers.

I wouldn't read anything at all into the numbers you were seeing. 900M memory usage for two programs in Task Manager is just fine -- you quite literally *can't* get the real information through Task Manager.

Modern OS memory management is one of the most complicated things an OS does, and unfortunately no one has ever come up with a good way to distill all the information about what is really going on in your physical memory into a single number or statistic that lets people know if something is wrong. The only real statistic that matters is the percentage of pages that the total set of processes is actively using relative to the commit charge... a process with a gigabyte of memory-mapped files, or a hundred megabytes of shared code pages, or hundreds of megabytes of allocated and populated pages that it only infrequently touches, is running just fine.

Reducing memory usage in Windows 8 is more about reducing the churn of pages through the various kernel data structures in the memory manager. As the article says, that involves things like optimizing old code to not trigger page faults all the time, or to suspend threads or otherwise idle background services that aren't being used. (A thread waking up, and going immediately back to sleep because it has nothing to do will still potentially cause a page to be re-loaded from disk.)

The Russinovich/Ionescu book "Windows Internals" has some pretty good sections on how Windows memory management really works, if you're curious about it -- it would likely clear up some of the misunderstandings that people have about Windows.

The Russinovich/Ionescu book "Windows Internals" has some pretty good sections on how Windows memory management really works, if you're curious about it -- it would likely clear up some of the misunderstandings that people have about Windows.

I will fully admit to not knowing the internals of memory management. But I can say without a doubt that W7 definitely takes more of a performance hit than Ubuntu with the same programs. Thunderbird and Firefox bring the machine to a crawl in W7, while they don't in Ubuntu. Memory management is the only reasonable cause I can think of. It's certainly not the processor - it's an Atom 570 dual core running at 1.66 GHz. Add a third largish program (Media Monkey in my case) and W7 becomes unresponsive - REALLY unresponsive.

I don't get this behavior at all under Ubuntu, even with more programs running. Granted, Ubuntu makes it slightly easier for me to see how memory is being used - probably because I'm a bit more familiar with it - by showing me buffers/cache. So as a layperson, I come to the conclusion that it's memory management.

" into a single number or statistic that lets people know if something is wrong."

You don't need numbers to spot memory leaks; applications usually slow down or crash because they are poorly programmed. Firefox's memory leaks become definitely obvious just by observing how the program slows down over time. Interpreting memory data is indeed difficult, but that doesn't mean there aren't obvious giveaways, from the user's perspective, of badly used resources or resources that become hijacked, especially when dealing wi

I find it difficult to believe that people are buying new machines with less than 4 GB of RAM. Memory was cheap by the time Win7 came out - cheap enough to load a new machine with 4 GB, anyway. Maybe I'm something of an asshole, but anyone who invests hundreds of dollars in a new machine and decides to go cheap on the memory deserves to have a shitty-running machine. I don't care if it's an Apple fanboy, a Windows drone, or a Linux nut. BUY MEMORY, or don't complain about performance!

Now, if you had said that you installed all the memory that the mainboard would support, and you were getting 60% to 80% usage before you even started any programs, THEN I would agree that there was a problem, I would sympathize with you, and I would be willing to look for the problem.

A couple of guys have commented on how much memory their browsers use. Well, I've seen FF using around 1.5 gig, while at the same time, Chromium was using in excess of a gig of memory. As someone else commented - the memory is there, why not use it? It's better than waiting for "virtual memory" to thrash the hell out of my hard disks!

but anyone who invests hundreds of dollars in a new machine, and decides to go cheap on the memory deserves to have a shitty running machine. I don't care if it's an Apple fanboy,

Quite relevant for the Apple folks, since extra preinstalled RAM from them is quite expensive - it can be ~10% of the price of the machine to upgrade the RAM. I think they charge $200 to upgrade from 4 GB to 8 GB.

Dell, HP, et al. aren't always that much better either; they often peddle upgrades like that for $100 (when RAM costs about $6 per GB right now).

Uhhhmmmm. My bad, then. I don't do netbooks, and I spoke a bit too hastily. I'm familiar with desktops and laptops. In which case, that's Windows' bad: once again, they are giving minimum and recommended RAM sizes that are too low, if they are recommending Win7 and/or Win8 be installed on netbooks with 1 to 2 GB of memory.

I find it difficult to believe that people are buying new machines with less than 4 gig of ram.

General objection: Just because something is cheap doesn't mean we should spend it wastefully. I'd still rather that RAM be put to better use than code bloat. I'd rather the PC be faster, or do more things, or do new things, or be cheaper still. Maybe if software wasn't bloated, PCs would be less than $100 these days. Then we could put that money towards better support.

Specific examples:

For a business with 100s or 1000s of PCs, spending an extra $50 per machine on more RAM just because it's cheap means thousands of dollars across the fleet.

I find it difficult to believe that people are buying new machines with less than 4 gig of ram. Memory was cheap by the time Win7 came out - cheap enough to load a new machine with 4 gig, anyway. Maybe I'm something of an asshole, but anyone who invests hundreds of dollars in a new machine, and decides to go cheap on the memory deserves to have a shitty running machine.

If you buy a pre-built machine (which 99% of people do), the worst thing you can do is buy it with lots of memory. Desktops and laptops have their factory-installed RAM marked up well above street price.

DSL is kind of cheating; all the software is frozen circa 2003, and a lot of features and code have been added to packages since then. Not to say you can't still make a 200 MB distro fairly easily, but anything under 100 MB is really hard to do unless you make a lot of sacrifices in usability.

I remember an interview with a former Excel developer who said that "re-architecting" was forbidden as it might break things no-one understood at the time; the original developers had left a long time before.

Yeah, sure. This is what always happens: they promise some new and better stuff, then they drop half of it and the OS is just crap. I wouldn't hold my breath for any of this; it's better to wait for the actual release. This is just hype talk.

With Windows 7, memory has become less of an issue for me. I just don't care that much; I have 4 GB, and stuff starts right up when I click on it. As a user, that's all I care about. I could obsess about how much memory is being used at all times, I guess, but what does that metric even mean? I currently have FO:NV, mstsc, 10 tabs in IE and ~20 in Chrome open, and everything is still snappy. What does it matter that the system shows high RAM utilization?

What I'd like to see them focus on instead is the file system, and making searches work at least as well as they did in XP. Vista utterly broke file searching ( which is amazing in and of itself ), and while w7 brought back some of the functionality, it's still a crap shoot.

It shouldn't need to - you quite literally cannot buy new memory in capacities smaller than 512 MB. That would be like saying "I can run Linux on a Motorola 68000, will Windows 8 do that?" - it won't, because there's very, very little market demand for running a new operating system on decade-old hardware.

In the old days, there were two ways for a programmer to optimize code: for speed, or for size. You couldn't afford not to design your code, otherwise you would immediately run into memory and performance issues. 16-bit compilers wouldn't let you allocate more than 64 KB at a time. After your application loaded, there would be less than 128 KB free anyway.

So you had to plan ahead for where and when you were going to use memory. Would a variable be a persistent data block allocated when the module first started (an I/O cache block), something loaded and then discarded (configuration parameters), or kept until the user no longer wanted it (a data file)? For every variable, you had to decide whether it was 8-bit, 16-bit or 32-bit, signed or unsigned, and assign it accordingly.

Floating point was expensive, so you used fixed-point integers whenever possible, at least until the 80486 came out.

A 2D FFT on a large image (512x512) was implemented by loading each row of pixels separately from disk, applying the transform, and writing that row out again. The same thing was then repeated for each column.

Even if you did get everything planned out, there was still the chance you would run out of memory. Then you would have to go back and prune every variable for size. Do name strings really need 128 bytes? Do attribute flags really need to be 16-bit? Do coordinates need to be 16-bit?

These days, there are two ways to write code: for shortest project completion time, or for reusable code. Either deadlines are so tight that everyone just throws code in on top of each other, or there is actually time to design and plan ahead.

No one really bothers with whether structure or class members are 8-bit, 16-bit or 32-bit, whether an array should have an upper limit of 32, 128 or 1024, or whether result codes should be returned to indicate whether the memory was allocated. Just defining variables as 'int' is good enough, and C++ container classes take care of the dynamic allocation of arrays.

It is when you want a lightweight portable device like a netbook, tablet or phone. You don't see big hard disks spinning in the top-selling models of those, do you? Well, you do with netbooks, because many of them still run Windows and so must have a hard disk. You do know that this memory issue is all about trying to fit Windows 8 onto battery-powered portable devices, don't you? The 4-core ARM chips are almost here, so the CPU side of the problem of running Windows will be addressed, but that still eats up battery.

No, but the memory optimisations are clearly a core part of the OS, so they did them first, and they have been working for two years. Vista was a clusterfuck for all sorts of reasons, we know that; however, unless they manage to find a bug in the optimisations that causes your PC to burst into flames, I doubt they'll be removing anything.

Longhorn (and more specifically WinFS) was one of the very few times MSFT ever talked about features that weren't delivered. For Windows 7, I can only think of one announced feature that wasn't actually delivered (Bluetooth audio).

Except for Longhorn features, what Windows features were promoted but not delivered?

Trigger-started services were introduced in Windows 7; this isn't new. I just wish people were taking advantage of them (I'm looking at you, Google Chrome and Adobe, with your long-running processes that do nothing but check whether there's an update).

So - disable the fucking services? Hey, I test-drove Vista, I've driven Win7, and the first thing I did in each case was refer to Black Viper's site. He had already figured out that disabling one list of services was fine for almost everyone, disabling a longer list was fine for some more people, and disabling an even longer list was still fine for some.

Update managers? Disable them, or change their settings to manual. Christ on a crutch, man, isn't this a ge

The Black Viper list is pretty good, but the reality is that from Win7 on, the list of OS services enabled by default is the set that won't break something (if you read Black Viper's list, it points out what breaks with each service disabled).

And I've not yet figured out how to convince Google Chrome to stop auto-updating (and I don't want to stop Flash from auto-updating; Flash and PDF are the two biggest vectors for malware out there). I just wish their auto-updaters respected the user and recognized that

Isn't that a tremendous security vulnerability if the scanning isn't done correctly or frequently enough? I'm sure it does lower memory utilization, but I'm not sure that I'd trust that not to have any bugs or vulnerabilities.

Not if you mark the pages as read-only. I don't see how not merging frequently enough would be a security vulnerability. Not doing it correctly could be, but I'd think it somewhat unlikely that a page could be mistaken for something important and actually contain code that exploits rather than crashes the system. It probably also excludes the kernel and C library address ranges, just to be safe.

This does matter for use on non-PC systems, such as tablets and phones. One of the stated goals for Windows 8 is the desire to have the platform be the same across devices, and while desktop systems with 8+ GB of memory are increasingly common (and will likely be standard by the time 8 is widely adopted), tablets and phones still usually only have a few GB available.

I fear you may have lived in the desktop world for too long. For phones, RAM "usually" isn't "a few GB available", or even a few GB in total. Try 128/256/384/512 MB. The few GB of memory they report is flash memory, i.e. storage.