Barence writes "Windows chief Steven Sinofsky has taken the unusual step of responding in the comments of a blog posting that claimed Windows 7 was suffering from a potential 'showstopper bug'. Stories had been sweeping the Internet that using the chkdsk.exe utility on a second hard disk would lead to a massive memory leak bringing the operating system to its knees in seconds. Responding to a blog post titled 'Critical Bug in Windows 7 RTM,' Sinofsky wrote: 'While we appreciate the drama of "critical bug" and then the pickup of "showstopper" that I've seen, we might take a step back and realize that this might not have that defcon level.' He signs off with the words: 'deep breath.'"

I wonder how this obviously one-sided summary even got posted -- it just sounds like a call for bashing from people who don't read the article. Here's another snippet from Steven's response:

We had one beta report on the memory usage, but that was resolved by design since we actually did design it to use more memory. But the design was to use more memory on purpose to speed things up, but never unbounded -- we request the available memory and operate within that, leaving at least 50MB of physical memory. Our assumption was that using /r means your disk is such that you would prefer to get the repair done and over with rather than keep working.

And it does make sense for two reasons: 1) Windows has to lock the drive anyway, so it's better to get it done fast. 2) You CAN spend RAM. If the whole RAM isn't used, you're just wasting it. In this case chkdsk.exe will dynamically use what is left, making the process faster. How is this a bad thing?

UPDATE: After emailing back and forth with the VP Sinofsky, it was found that the chkdsk /r tool is not at fault here. It was simply a chipset controller issue. Please update your chipset drivers to the current driver from your motherboard manufacturer. I did mine, and this fixed the issue. Yes, it still uses a lot of physical memory, because you're checking for physical damage and errors on the hard drive you're testing. I have now completed the chkdsk scan with no BSODs or computer sluggishness. Feel free to do this and try it for yourselves. Again, there is no bug. Thanks all.

The GP you are responding to never claimed there was no bug at all. What is being said is that the bug is in the chipset controller driver or somewhere else, not in chkdsk like this FUD submission is trying to claim. Maybe next time you should learn some reading comprehension.

Agreed, this is a non-issue, or at worst, a very tiny issue.
For the very tiny number of people out there who will run "chkdsk -r" on a secondary partition, they may see almost all their RAM used up while it is scanning the disk. If they have pre-existing hardware or software glitches, it might blue screen on them.
For the 90% of consumers who would never run chkdsk, and who don't have more than one partition, this is a complete non-issue.

I regularly put customers' hard drives into a different computer as a secondary drive and run chkdsk. Your math sort of makes it seem like 4-5% of a market isn't a lot to account for, yet in the OS market that 4-5% means hundreds of millions of users. Should we let you take those support calls?

Probably so, but how many end users are going to run chkdsk /r? How many would run it on their recovery partition? Hint: most users wouldn't know what chkdsk is.

The article seems to suggest this occurs on a second physical drive. Other data indicates this only affects certain controller chipsets. In either case, what GreenEnvy22 stated is true; it's a non-issue.

Actually, a disk check happens quite frequently for a lot of people. I'd say the affected number is still pretty large. No, not everyone schedules the disk check, but Windows will initiate it itself at times -- and yes, against a second hard drive.

Did you read the article? It stated rather plainly that the problem was with chkdsk /r. When Windows schedules chkdsk, it's usually after a crash or improper shutdown -- and it does not attempt to scan for and fix bad sectors. Never have I seen Windows perform an automatic surface scan.

I don't want the /. eds to be Pulitzer Prize winners. I just don't want them to be so dumb that it makes my head hurt. Eventually I hope /. will just turn it over to the readers via the firehose and save some money on "editors".

2) You CAN spend RAM. If the whole RAM isn't used, you're just wasting it. In this case chkdsk.exe will use dynamically what there is left, making the process faster. How is this a bad thing?

This sounds a lot like the Outlook 2007 discussion on Vista (and some reports on XP). Vista has "advanced memory management" and Outlook "continually asks for RAM, as long as some is available". The result? Outlook allocates ~700M, according to the Task Manager process list, while Physical Memory free (on a 3G system) reports 6% free. Closing Outlook brings the RAM free percentage up to 60%. Some MS MVP said just what you said: "The RAM is available, so Outlook uses it and the program responds faster, that's a good thing", completely disregarding the fact that the computer is near unresponsive to everything else.

Actually, it's not unresponsive; Outlook will give up RAM quite happily, and it's not opening 3 emails, it's keeping your entire PST/OST loaded into RAM so you don't bitch and complain that selecting different emails is "slow to load". Therefore, if you have a big PST/OST, expect it to use a lot of RAM.

However, I'm sitting on Windows 7 Ultimate x64 with 6GB of RAM and Outlook is using 200MB total, including what's committed for use and what it's happily taking because it can. I have a 457MB OST (Exchange cached file), so wanting to load half of it is not unreasonable. Linux uses a similar memory management system and I don't hear a lot of complaining about it.

I missed the part where having all your e-mail in one big file is a good thing. I've never had any problems with "slow to load" e-mails, whether I was using an offline e-mail client or being served e-mails from a webmail address. What exactly is so good about the PST/OST file that it's worth keeping EVERYTHING in RAM for? (I'm not being entirely sarcastic here, if there's a good reason for this, I'd like to know it).

This sounds a lot like the Outlook 2007 discussion on Vista (and some reports on XP). Vista has "advanced memory management" and Outlook "continually asks for RAM, as long as some is available". The result? Outlook allocates ~700M, according to the Task Manager process list, while Physical Memory free (on a 3G system) reports 6% free. Closing Outlook brings the RAM free percentage up to 60%. Some MS MVP said just what you said -- "The RAM is available, so Outlook uses it and the program responds faster, that's a good thing" -- completely disregarding the fact that the computer is near unresponsive to everything else. A program should never take RAM "because it's available", it should take it "because it's needed". Using over 2G of RAM to open 3 emails is absurd; using 1G for texture and sound data is more reasonable.

I'm running Outlook 2007 on Vista right now this instant, and it's using 92MB of RAM. Physical memory free = 57%.

Even if Outlook was using all but 6% of free RAM, why would that necessarily make your system "unresponsive to everything else?" 6% of RAM is plenty to keep your machine responsive, assuming it has a gig or more in it.

And this statement:

A program should never take RAM "because it's available", it should take it "because it's needed".

Is doubly wrong. RAM takes time to fill, yet takes no time to empty. Therefore, all software should fill as much RAM as feasible to make itself more responsive to the user. RAM isn't some physical object you "take away" from something else -- if Outlook allocates RAM that another process needs, the OS just overwrites it as needed.

Why? Because the system already has a disk corruption issue, it could also be related to memory corruption. Also, the thing runs on a journaled volume with huge help from the journal file. One should also admit how cleverly they hide it from

That's fine if the ideal dynamic utilization has zero overhead, so if I need it for something, I get it back without delay. In practice, if it adds any delay to my system's ability to allocate whatever memory I need to open something else, then it sucks.

chkdsk.exe is a disk checking and file-system repair tool. Most users will never know about it.

The chkdsk functionality can also be invoked through Windows Explorer. Some users will find this tool if they are deliberately looking for it.

chkdsk.exe with the /r option (and *only* with the /r option) has been designed to allocate most of the available physical memory, but always leave at least 50MB free. This is not a memory leak. It was a deliberate decision, because using more memory will dramatically speed up the surface verification/repair process. Note that it allocates from available memory, i.e. already-allocated memory will not be forced out into paged/virtual memory. If this were a leak, the allocation would go on and on, causing more and more swapping until the system thrashed itself to death. But it's not. The system remains responsive and the memory is freed when chkdsk ends.
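The bounded-allocation policy described above can be sketched in a few lines (a hypothetical illustration, not Microsoft's actual code; the 50MB floor is the figure from Sinofsky's comment):

```python
MB = 1024 * 1024
RESERVE = 50 * MB  # Sinofsky's stated floor: always leave at least 50MB free


def chkdsk_buffer_size(available_bytes):
    """Size the scan buffer from *currently available* memory.

    Because the request is computed once against free memory (rather than
    growing without bound), already-committed memory is never forced out
    to the page file -- which is why this is a big allocation, not a leak.
    """
    return max(0, available_bytes - RESERVE)


# With 2GB free, the scan buffer would be 1998MB; with only 30MB free,
# nothing is taken and the 50MB floor is respected:
print(chkdsk_buffer_size(2048 * MB) // MB)  # 1998
print(chkdsk_buffer_size(30 * MB))          # 0
```

The key property is that the number is computed against what is free at launch time, so the allocation is bounded up front rather than growing during the scan.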

The crash condition appears to be an unrelated issue with chipset controller drivers. This issue probably becomes more pronounced during periods of intensive disk usage and/or low-memory conditions. It is not caused by chkdsk; it is a driver/controller issue which has been reported to be fixed by updating drivers to the latest version.

No, the real issue is that Microsoft appears to be slated for a massive success with Windows 7. At this point some Microsoft detractors will leap upon any issue in an attempt to spoil the party. In this category you find Randal C. Kennedy of InfoWorld, who leapt onto this issue with blatant disregard for any facts.
Even if the original blogger and Mr. Kennedy were so stupid as to believe this issue was a memory leak and that it caused the crash, by their own account it would only manifest itself under very specific circumstances:

chkdsk.exe must be invoked with the /r option to perform a surface scan/repair (this is the most radical option).

chkdsk.exe must be invoked for a non-system partition (chkdsk must dismount the drive/partition -- using /r on the system drive requires chkdsk to run during boot instead).

So, even if this was a bug, it would only affect users with:

2 or more drives/partitions,

a non-system drive exhibiting behavior suspicious enough to warrant a "surface scan",

and the ability to find and launch the tool.

No, this whole brouhaha has a distinct smell of desperation about it. And kdawson is -- as usual -- all too happy to assist.

That's hardly the case. Unused RAM is used as a disk cache, so that frequently read disk blocks reside in RAM instead of on disk. This makes reading them extremely fast. If applications allocate memory willy-nilly just because it's there, there won't be any memory left for the disk cache, and your system might become very slow. And if even more memory is allocated, the system will start paging stuff in and out of memory, slowing stuff down even more.
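On Linux you can watch this trade-off directly in /proc/meminfo: a box that looks nearly out of memory often has half its RAM sitting in the reclaimable page cache. A small parser sketch (the field names are real /proc/meminfo keys; the sample values are invented):

```python
def parse_meminfo(text):
    """Parse /proc/meminfo-style 'Key:  value kB' lines into a dict of ints (kB)."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        fields = rest.split()
        if fields:
            info[key.strip()] = int(fields[0])
    return info


# Invented sample: "free" RAM looks tiny, but half of the machine's memory
# is really disk cache that the kernel hands back to applications on demand.
sample = """\
MemTotal:        3072000 kB
MemFree:          184320 kB
Cached:          1536000 kB
"""

mem = parse_meminfo(sample)
print(mem["Cached"] * 100 // mem["MemTotal"])  # 50
```

To read your actual machine, pass `open("/proc/meminfo").read()` instead of the sample string.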

It seems that if you install Windows 7 on the second hard drive, it will put its System Reserved boot partition on the first drive. This absolutely boggles my mind. Now I need both hard drives just to boot my system? I discovered this when Windows 7 fucked up my Chameleon installation. Then my Hackintosh wouldn't boot into OS X until I reinstalled Chameleon from the iAtkos disc. Then I had to unplug the OS X drive and reinstall Windows 7 so it would stick to its own goddamned drive and leave the others alone.

Bad, BAD fucking move, Microsoft. Now Windows 7 can easily fuck up unrecognized partitions on other drives during installation. I really hope that gets fixed in the final version.

I think if you use an existing partition instead of making a new one Windows will just put everything on one partition.

Anyway, you could always copy the files and boot sector from the small partition to the Windows 7 one and raze the small one; then you just need to edit the BCD store using EasyBCD or bcdedit.exe to point to the correct partition on boot. But yeah, those are both WINDOWS tools... but bcdedit.exe should be available from Windows 7 Setup on the DVD if you mess up and can't boot into Windows (press SHIFT+F10), and fixboot.exe can install the boot sector onto any partition.

So, what does one do when one has to reinstall Windows? That happens often enough that it becomes a PITA to have to keep unplugging and shuffling drives around to keep their installer from finding and f*cking up every other partition.

Generally, Windows wants to boot from the FIRST partition of the FIRST drive. There are tricks you can use to get around this -- GRUB has methods of remapping devices and partitions so Windows "thinks" it's on the right drive. But, generally, you install Windows first, to the first partition of your first drive, and then install the other OSes afterward. If you have to install Windows again, other OS install discs generally come with some kind of "rescue mode" where you can re-install your bootloader.

Generally, Windows wants to boot from the FIRST partition of the FIRST drive.

Even though I'm a programmer, I don't consider myself particularly apt with regard to installing or setting up an OS (thankfully, XP is drop-dead easy to install and configure in general), but my previous XP machine was running for several years while booting off my H drive. I had a habit for a while of taking whatever drives I had in my previous machine and just throwing them into my new one, so C, D, and E were various hard drives from previous machines (some with Windows directories still on them).

So, what does one do when one has to reinstall Windows? That happens often enough

And you don't see anything wrong with that? Seriously. Why should you have to install the same version of the same OS more than once on a machine? Since I've been using Linux, I've only had to reinstall it twice. Once because just after an upgrade I did something foolish and trashed my Linux partition, and once because an upgrade didn't work out well. (The newer version couldn't find my NIC no matter what I did.)

And then, of course, they reinstall everything they "need," and they're back where they started. Seriously, I'm not talking about people like that, I'm talking about computer geeks who routinely reinstall Windows on their own boxes and think nothing of it. I once knew a man who insisted that NT 4 was completely stable, but he still reapplied the latest service pack every month because if he didn't, his system started crashing.

Well, at least it no longer overwrites GRUB when installing (or at least Win7 RC didn't do that) - while XP always did.

Funny, I just wrote up something about this in my last post. You must've been reading my mind! (Although, I didn't exactly experience it with Windows 7.)

Generally, I install other OSes to their own drives. In the XP days, it'd attempt to overwrite GRUB (or other bootloaders) on drive(s) you weren't installing XP to. Talk about ridiculous!

No. Not Windows 7. If you partition your hard drive and install Windows, it will take all partitions as its own. If the other partitions are already occupied (and it can tell that there's something there, even if it can't read Reiser, Ext3, etc.), it will not overwrite. You'll need to repair GRUB afterward, but it's a lot less painful than discovering the partition is wrong and you need to redo everything, trust me.

yup, and yup to the gentoo/xp dual boot too. microsoft goes to great lengths to not play nicely with other operating systems. i was hoping microsoft would change that attitude, but hope is a cousin to dreams and we know both of those are not real...

looks like i won't be buying an OEM with windows7 on it later this year. the more unfriendly microsoft is to other OSes, the more newegg gets my business (building my next desktop)

You have obviously not installed many OSes yourself, and if you really believe what you are writing, you should probably stop installing the ones you already do. You can control exactly where and how you want any partitions to be, even with Windows 7. It has a certain default, which is to install a 100MB, let's call it, rescue partition.

Just pre-partition the disk the way you want it and you won't have that extra partition. So perhaps the bad move is on you for not knowing what you are doing and still posting as if you did.

My point is that the user shouldn't have to bloody worry about it. Why should I have to prepartition my drive just to keep Windows from messing with other drives? It should stick to the installation drive by default, not require extra steps to keep it from messing with other drives in the system. Plopping the 100MB system reserved partition on another drive by default means I need BOTH drives to boot, which is stupid. But yes, my bad for assuming Microsoft would do things in a logical fashion.

Plopping the 100MB system reserved partition on another drive by default means I need BOTH drives to boot, which is stupid. But yes, my bad for assuming Microsoft would do things in a logical fashion.

To be fair to MS, if a user knows enough to know they have two hard disks and enough to know how to install a second hard disk, they should know how to pop the case open and unplug the one they don't want anything to happen to. Frankly, it'd be a good habit to have.

My point is that Windows should automatically put the system reserved partition ON THE DRIVE TO WHICH IT IS BEING INSTALLED. I cannot think of a conceivable reason for a single installation of Windows to spread itself across two different hard drives by default, thus requiring both drives to be present and functioning to boot the system. I should simply be able to select the drive, tell it to go, and not worry about it. Please explain to me why it's necessary for Windows to do this by default.

You're missing the point. Even if you pre-partition the second drive, Windows still installs its boot loader on the first. This is not just true of 7; it's been doing this since NT 4.

Brushing aside your "you should just know how to do it" BS (I thought stuff "just works" in Windows, it's teh easy!), it goes beyond understanding the partitioning. It's about behaving in a counter-intuitive way that requires discovery on the user's part. I cannot naturally assume that I'll be better off partitioning my own drive. It takes a real WTF moment to realize you have to rip out one of your drives before you install Windows if you don't want the unexpected behavior of your master boot record being on a different drive than the OS. Another poster said "install Windows first, that's the rule". Fine, I get that, but it's still f'ng stupid.

That's also BS. I can set my BIOS to boot from any drive I want. Windows always picks the first drive's master boot record to install its loader, no matter what the BIOS settings are. I've been through this dozens of times, including with Windows 7, and you're wrong. Try it: set your BIOS to boot from your second hard drive and then throw in the Windows install disk. It will overwrite the master boot record on your first drive without giving you the option to change drives or even skip that step.

I have three hard drives in my machine, one IDE and two SATA. I change the order of the drives from my BIOS and put Windows 7 on one of the drives.

When I want to boot to a different drive, I flip the drive order in the BIOS, and that way no OS sees any other. I have Linux on one drive, Windows Vista on another and Windows 7 on the third, and each has its own little world.

Why even worry about boot loaders and the like, when it's so easy to pick a boot drive in the BIOS?

When I want to boot to a different drive, I flip the drive order in the BIOS, and that way no OS sees any other. I have Linux on one drive, Windows Vista on another and Windows 7 on the third, and each has its own little world.

Same here, except I use GRUB to boot between the different drives (yes, it's possible). If you were a little more keen on bootloaders, I'd suggest you give it a try, as it'll save having to screw with the BIOS -- if your BIOS isn't terribly old. I used to do the BIOS flip ages ago.

Don't you have a "Press F12 to select boot device" prompt somewhere between the memory test and the bootloader that would save a bit of time? Are you doing more in the BIOS than just change the boot order to prevent the Windowses from finding each others' drives through hardware enumeration at runtime? Are you elsewise abusing the way that XP and Vista bless secondary hard drives?

Not all machines are like Dells, nor are all BIOSes created equal. For example, I tend to use Intel's reference boards at home.

Windows assumes that if you don't manually create partitions and instead tell it to do it automatically, it can put things wherever it wants. You basically told Windows, "hey, just drop your stuff anywhere, thanks." On the other hand, if you had manually created a partition and then told Windows to use it, it would have.

The behavior might be slightly sketchy, but it's not some horrible conspiracy.

It's due to a limitation in how the BIOS in your machine works. Virtually none can boot off any drive other than the 1st IDE device. So if you want to boot off a 2nd drive, you really boot the first sector and loader (sectors 1-62, LILO/GRUB style) off the 1st IDE device and then continue off the other drive.

To do it any other way wouldn't work with any machine out there except perhaps EFI machines.
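The chainloading dance described above is what a legacy GRUB (0.9x) menu.lst entry does for a Windows install on a second disk: GRUB itself lives on the first BIOS drive, remaps the drive numbers, and hands off. (The device names here are illustrative; adjust (hd1,0) to your actual disk and partition.)

```
title Windows 7 (second disk)
# Swap BIOS drive numbers so the loader on the second disk
# believes it is booting from the first.
map (hd0) (hd1)
map (hd1) (hd0)
rootnoverify (hd1,0)
chainloader +1
```

The map commands are what make Windows "think" it's on the right drive; without them, the chainloaded boot sector would look for its files on the wrong disk.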

Don't you think that while using a Hackintosh, trying to dual-boot a beta OS, and probably doing some other crap you didn't mention, you might run into a few problems? And yeah, I am sure your dual-boot Hackintosh is at the top of the list for a fix.

Hi.

At the top of your browser, there's an address bar. After the http:// and before the next /, does the word 'slashdot.org' appear?

I'm assuming yes, so seriously, what did you expect?

I actually chuckled at this. While you're jesting, I should confess an analogous story of a friend of mine.

For as long as I've known him, he's had the unusual capacity of being able to break nearly anything he comes in touch with. Ubuntu install? Lasted 10 minutes. Gentoo install? After about 15 tries just to install it.

It's true that some people are more problem-prone. My Dad could break ANYTHING. Lawn mowers, about every year and a half. Vacuum cleaners, about every year (although now he has had a Dyson for about 3 years and only repaired it twice). Not to mention can openers, dishwashers (my mom wouldn't let him do dishes anymore unless he did them by hand).

No dipshit, Hackintoshing has very little to do with it. As far as Windows 7 is concerned, it was simply another drive. That's all. The point of the matter is that it fucked up a partition that it didn't properly recognize. The same thing could happen to Linux installations as well. It's an ugly oversight that is NOT specific to Hackintoshes, so pull your head out of your ass.

You raise really good points. There is one unfortunate thing in the behavior of Windows' installation process when it comes to drives it doesn't understand. But first, an example:

Let's assume that you're installing Windows on a system with two hard disks. On the first disk exists Linux or BSD. You plan on using the bootloader to boot Windows (off the second disk). The second disk is blank. When you attempt to install Windows to the second disk, it will alert you that it needs to make changes (i.e. wipe the bootloader) on the first disk. It's possible a situation like that might result in unexpected changes, but it's not difficult to resolve--simply load a live CD and replace the bootloader. (At least, this prompt would occur with Windows XP--I have no idea with Windows 7 because of a habit I've acquired. Keep reading.)

However, I've never actually had Windows make any unexpected alterations to anything other than the disk I was installing to. Perhaps it's partially thanks to a healthy dose of paranoia; whenever I install Windows to a dedicated disk -- really, whenever I install any OS -- I have a habit of unplugging all the drives I don't want it to touch. As you alluded to, since the OP clearly didn't take such precautions, he sort of got what was coming to him.

Maybe my measures are a little excessive, but when I'm dealing with the prospect of having to reinstall several OSes just because of a stupid late-night mistake, a typo, or maybe a software bug, I'd rather take the time to make sure it can't happen. Not that this method isn't fraught with complications--it's possible to unplug the wrong drive. But, that's why you check it first to make sure it is the one you want to wipe!

So yes, you're exactly right. The OP really should have taken greater precautions with his data. It would've saved him a reinstall.

Microsoft doesn't sell hardware, they sell software. A person who buys a license and runs it on a Mac means just as much to Microsoft as a person who runs it on a Dell. I would say they have all the incentive they'll ever get.

Sure they do. Linux is the only operating system on my home computer. Someday a game may come out that I think is worthwhile, I'm open to installing Windows 7 then. If I can't do that without damaging my Linux install, they don't get my money.

I didn't want to write a complete script to make that joke; it could have lost non-bash-speaking readers. It leads to the question: is there an easy way to detect a Windows partition? Is checking for NTFS enough?
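A minimal sketch of that heuristic (in Python rather than bash, and the blkid output below is invented): checking for TYPE="ntfs" alone is not enough, since a pure data drive can be NTFS too, but it's a reasonable first filter before, say, mounting the partition and looking for a Windows directory.

```python
import re


def ntfs_partitions(blkid_output):
    """Return the devices that a `blkid` dump reports as NTFS.

    NTFS alone does not prove a Windows install is present -- data-only
    drives are frequently NTFS as well -- so treat this as a first-pass
    filter, not a definitive detector.
    """
    found = []
    for line in blkid_output.splitlines():
        dev, _, attrs = line.partition(":")
        if re.search(r'TYPE="ntfs"', attrs):
            found.append(dev)
    return found


# Invented sample of `blkid` output:
sample = '''/dev/sda1: UUID="1234-ABCD" TYPE="ntfs"
/dev/sda2: UUID="dead-beef" TYPE="ext3"
/dev/sdb1: UUID="5678-EF01" TYPE="ntfs"'''

print(ntfs_partitions(sample))  # ['/dev/sda1', '/dev/sdb1']
```

On a real system you would feed it the output of `blkid` run as root instead of the sample string.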

I am not sure you will crash the system by filling the whole disk. Sure, most applications won't like it, but the system should stay alive.

If it is really such a serious bug, then it will be fixed by the first Windows Update after installation (or OEM patches).

No sane person runs a vanilla installation of windows.

Actually, in the first months after Windows 7 gets released, a lot of even more serious bugs will surface (because of the wide exposure). They will also be fixed and integrated into the update service. It's known that the first months after release are always the test-and-fix cycle.

There's no doubt in my mind more bugs will surface with the much wider install, but if this site [hitslink.com] is correct Windows 7 is already close to 1% of the OS market, a respectable install base for an unreleased OS.

the current user base is technically inclined folks (you have to make some effort to get it before official release) who manage to avoid specific bugs that will show up when millions of monkeys start to bash the system (OEMs install it and sell it to regular Joe/Jane).

You really want to imply that those two testing environments have anything to do with each other?

Obviously they are not perfectly analogous. But current (more technical) users are also more likely to correctly report bugs. In addition, not all of the current users are techies. Speaking only for myself, I have migrated a number of friends/family to Windows 7 RC because they were running Vista. *shudder*

While I'm on the subject of W7 bugs, I've said this before and I'll say it again: be wary of Homegroup. It's great when it works and it usually works, but it can destroy all it touches if it gets upset.

Errrrr, yes they do. Organisations the world over do not install lots of individual Windows updates unless one or two are absolutely necessary. They always (if they're sane) create a build from the known vanilla install and then add service pack increments as they become available, so they always know exactly what is installed on all their systems at any given time.

However, since it will be many, many years before most organisations upgrade again...

"chkdsk" isn't an arcane process. "chkdsk -r" on this particular chipset triggers an arcane process while doing an in-depth check for physical problems on the drive. In other words, this bug only affects people running "chkdsk -r" on a secondary hard drive, with a particular chipset, who have not updated their chipset driver, and is caused by an arcane process within the un-updated driver. I'm hardly a Microsoft apologist, but this seems like a hell of a tempest in a teapot to me.

I just don't understand why you can't post correct, factual posts. Is that so hard?

On my machine with 12GB of memory, it uses up 10GB. I still have over 1GB of free memory (10%), and the computer is not sluggish and working fine.

If you get a BSOD from this, you should know that it most likely comes from a driver that has not been verified under low-memory scenarios, which is a prerequisite for being WHQL certified. It is also part of the Driver Verifier supplied by MS.

UPDATE:
After emailing back and forth with the VP Sinofsky, it was found that the chkdsk /r tool is not at fault here. It was simply a chipset controller issue. Please update your chipset drivers to the current driver from your motherboard manufacturer. I did mine, and this fixed the issue. Yes, it still uses a lot of physical memory, because you're checking for physical damage and errors on the hard drive you're testing. I have now completed the chkdsk scan with no BSODs or computer sluggishness. Feel free to do this and try it for yourselves. Again, there is no bug.
Thanks all.