
Stoobalou writes "In a chat with fellow CEOs at Microsoft's 14th annual CEO Summit, Microsoft boss Steve Ballmer came close to admitting Vista was a dog. 'How do you get your product right? How do you help the customer? How do you be patient?' he asked, as if he knew the answer. What he did know was that Microsoft spent too many years building Windows Vista. 'We tried too big a task and in the process wound up losing thousands of man hours of innovation,' he said." You can also watch
video of the speech, but 31 minutes of Ballmer is a lot of Ballmer.

"We tried too big a task and in the process wound up losing thousands of man hours of innovation,"

Boy that word sure doesn't mean jackshit when it just gets thrown around and abused like that, huh? Like watching the word 'fuck' get detoothed in Scorsese's Goodfellas, there's this sort of desensitization toward 'innovation' that leaves me confused as to how I should describe people like Tesla, Turing and Shannon. If Ballmer considers all of his workers as 'innovators' and has "thousands of man hours of innovation" at his disposal then surely there must be some new word to apply to the real innovators. I guess there might be something to the theory that innovation diffuses with time [wikipedia.org] but this is downright ridiculous.

Innovation requires risk and not the kind of risks Microsoft took with their Vista debacle. It requires that you do things entirely differently than everyone else. This is not Microsoft. This is not Windows Vista nor Windows 7 nor IE anything.

Boy that word sure doesn't mean jackshit when it just gets thrown around and abused like that, huh? Like watching the word 'fuck' get detoothed in Scorsese's Goodfellas, there's this sort of desensitization toward 'innovation' that leaves me confused as to how I should describe people like Tesla, Turing and Shannon

Innovation requires risk and not the kind of risks Microsoft took with their Vista debacle. It requires that you do things entirely differently than everyone else. This is not Microsoft. This is not Windows Vista nor Windows 7 nor IE anything.

Microsoft took a big risk with Longhorn and tried to write pretty much the whole OS in managed code (entirely different to everyone else) and it didn't pay off. Most of the delay came from throwing most of that work away and starting again back in native code.

Longhorn was never a managed-code effort; a fully managed OS is still a lofty research goal (and may still be brewing behind the scenes at Microsoft Research through Midori, Barrelfish, and Singularity).

Longhorn did, however, try to incorporate a bunch of other research projects right from the get-go, most of which were later spun off into individual projects or folded into existing products. Avalon was supposed to replace WinForms, WinFS was supposed to replace NTFS, Palladium was supposed to be incorporated, etc. The development team was spinning its wheels trying to adapt to the latest demand to use the latest research products instead of developing along a stable path. By the time the "reset" came, Microsoft had already missed its three-year OS schedule and it was going to take another three years to turn Longhorn into a releasable product. While many user applications (Explorer, for example) were partially rewritten in .NET, they represented only a small portion of the total code.

Windows 7 by comparison was released with teams focusing on milestones internally and not releasing or demonstrating any not-done-yet feature. Essentially each feature that a team proposed was a patchset on the Windows build and they would test it but if it did not make the cut, they didn't apply the patch to the milestone build. The Engineering Windows 7 blog goes into great detail about the development process that was vastly improved over Windows Vista's.
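The milestone gating described above can be sketched as a toy model (the feature names, quality scores, and function here are invented for illustration, not Microsoft's actual tooling):

```python
# Toy model of milestone gating: a proposed feature patch is only
# applied to the milestone build if it meets the quality bar;
# otherwise it waits for a later milestone.

def gate_features(milestone, proposed_patches, quality_bar):
    """Apply each proposed patch only if its measured quality meets the bar."""
    build = list(milestone)  # start from the last known-good build
    rejected = []
    for name, quality in proposed_patches:
        if quality >= quality_bar:
            build.append(name)     # patch makes the cut
        else:
            rejected.append(name)  # feature is held back, not half-shipped
    return build, rejected

build, rejected = gate_features(
    milestone=["kernel", "shell"],
    proposed_patches=[("new-taskbar", 0.9), ("half-done-feature", 0.4)],
    quality_bar=0.7,
)
print(build)     # ['kernel', 'shell', 'new-taskbar']
print(rejected)  # ['half-done-feature']
```

The point of the scheme is that the milestone build never contains a not-done-yet feature, which is the opposite of the Longhorn approach described above.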

Perhaps I exaggerated a little, but there was a big push to try to focus user space as primarily managed code. Singularity et al are doing crazy stuff with managed code in the kernel amongst other things, which is interesting but not what I was alluding to in my original post.

The Engineering Windows 7 blog goes into great detail about the development process that was vastly improved over Windows Vista's.

I'm aware, I followed the blog while it was still active. I particularly found the GDI concurrency post interesting. I wonder if having a similar blog for Vista would have allowed them to realise earlier on that it was getting out of control.

While many user applications (Explorer, for example) were partially rewritten in .NET

... and I'm still waiting for the patch that allows me to hide the "Organize bar" and allows me to turn back on treeview lines, get rid of the "locations" crap and pretty much make it look like it used to.

Longhorn did however try to incorporate a bunch of other research projects right from the get-go, most of which were spun off into individual projects or into existing products. Avalon was supposed to replace winforms

I'm not sure what your sources are, but I dare say they are rather suspect, given that WinForms was never a part of Windows proper (it's a .NET library, a fairly straightforward OO wrapper on top of the Win32 API, nothing more). It has shipped with Windows since Vista, in the sense that it comes as part of .NET and the OS ships with .NET. But it's not something that affects the development of the OS as such.

Microsoft took a big risk with Longhorn and tried to write pretty much the whole OS in managed code (entirely different to everyone else) and it didn't pay off. Most of the delay came from throwing most of that work away and starting again back in native code.

Or, perhaps more accurately, the cost of throwing away your whole codebase halfway through and restarting while still expecting to meet your original deadline. If you expected it to take 4 years (for example), and find out your first year accomplished nothing, you're now trying to complete a 4-year project in 3. Is it any wonder Vista had such difficulty?

It's amazing how programmed the top brass at Microsoft are to including this word "innovation" in every speech. I've hardly heard a pronouncement over the last ten years, particularly from Ballmer, and before him Bill Gates, that doesn't feature this word prominently.

I think it all kicked off when they were being hauled over the coals by the EU and threatened with anti-trust action in the US. They then decided that they had to give a better image of actually doing something worthwhile.

Of course, as you note, they are (given their R&D resources) about the most un-innovative company you could imagine.

It's amazing how programmed the top brass at Microsoft are to including this word "innovation" in every speech. I've hardly heard a pronouncement over the last ten years, particularly from Ballmer, and before him Bill Gates, that doesn't feature this word prominently.

Remember when innovating actually meant "taking something good and making it a little bit better"? Not massively better, just a little bit. Now the term innovation gets thrown around to mean everything from re-releasing old software to creating entire new forms of human endeavor.

"Our new human teleporter is an innovation like the world has never seen before." "What is it innovating on?" "...Paradigms!"

windows 7 is nice, but the cool things now are cell phones and tablets. for that you need a mobile OS with a footprint of under 1GB. Windows Phone 7 is still months away and a few years behind iPhone OS and Android.

Supposedly they designed Windows 7 with tablets in mind and added multi-touch support. However the only company I know that was working on a Windows 7 tablet (HP) has since dropped Windows 7, and instead bought out Palm so they could get WebOS.

windows 7 is nice, but the cool things now are cell phones and tablets. for that you need a mobile OS with a footprint of under 1GB. Windows Phone 7 is still months away and a few years behind iPhone OS and Android.

Are you saying they should stop making Windows 7 and PC's just because cell phones and tablets are somehow "cool" things now? I'd like to keep my computer, if you don't mind.

Microsoft became big by starting in the cheapo PC market and working its way up. PCs were cheap crap in the 1980s compared to the cool workstations and mainframes. Same thing with tablets and phones: for now they don't do as much, but in 10-15 years the technologies will improve, and whoever gains the market share today will rule in the future. I personally prefer Apple's fat-client model over Google's cloud model, but both are way ahead of MS in the mobile space.

Cheap is relative though. That $8000 Compaq would be mighty expensive next to a Commodore 64 from that era which would run more around $400. Sure it was a lot less powerful, but for many users (myself included), we made do and did a lot of interesting things with those machines, which were even further down the cheaper side of the spectrum.

I think what honestly made IBM's machines take off originally was the fact that (after the BIOS was reverse engineered) you had tons of companies building them. Just more options.

And they only sort of cleaned things up with 7. Keep solidly in mind that 7 is nothing more than what Vista probably ought to have shipped with in the first place. Keep solidly in mind that it's NOT any more secure than XP (if you tell yourself that it is, keep deluding yourself...helps all the botnets...). If Ballmer was honestly interested in "innovation", he should have risked quite a bit more than he did with Vista- for all the issues, etc. they had, they could have gotten further along by taking a *

It is much more secure than XP. UAC allows old applications that required admin rights under XP to run under a user's account. The firewall can now filter outbound connections and offers better configuration capabilities. Protected Mode for IE mitigates most exploits; holes in IE are really only exploitable on Windows XP. Address space layout randomization. SEH is now hardened; under XP it was possible to overwrite an exception handler's address (if the application itself had an exploitable buffer overflow, of course).
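The address space randomization point can be illustrated with a toy model. Everything below is illustrative only: real Windows ASLR picks from a much more constrained set of slots, and the range and function name are made up.

```python
import random

PAGE = 0x1000  # 4 KiB page granularity

def randomized_base(lo=0x10000000, hi=0x70000000, rng=random):
    """Toy model of ASLR: pick a page-aligned load address in [lo, hi).
    The point is only that the base differs from run to run, so an
    attacker can no longer hard-code the address of a loaded module."""
    pages = (hi - lo) // PAGE
    return lo + rng.randrange(pages) * PAGE

base = randomized_base()
assert base % PAGE == 0                 # always page-aligned
assert 0x10000000 <= base < 0x70000000  # always inside the chosen range
```

An exploit that worked against a fixed load address under XP has to guess under this scheme, which is the mitigation the parent is describing.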

Innovation? Part of the big problem was that there weren't killer features worth upgrading for. You could cite Aero, but it was a massive resource hog and is chasing the tail of Mac OS X and Linux. It wasn't innovation.

In so many areas Vista made needless changes that weren't improvements or innovations. It seems like they had no direction and just needed to shuffle things around enough to convince people this was a new Windows release.

Windows Repair Install is gone, for no apparent reason.

Every major configuration dialog was moved to another location. You need more clicks to accomplish the same tasks. This was a major usability regression with no apparent justification.

Vista's failure was because Microsoft had no idea what it wanted Vista to be. It is a failing of leadership. Leadership also failed in not reaching out to hardware manufacturers and working closer with them. ATI and NVidia had trouble working with the new Vista driver API (which was a mess). OEMs had trouble figuring out what exactly constituted "Vista capable" hardware.

It isn't because you spent too much innovating. It is because you spent too much time running around in circles.

I was just about to reply with the same comment. There were clear goals expounded throughout almost a decade of vaporware announcements of NT, Chicago, and then Longhorn. The problem was that they couldn't get most of it to work properly, while the landscape of real innovation kept changing around them. To "adapt", they kept adding more and more items to their extensive promised-features list, and it all came crumbling down eventually when they realized that six years had gone by since their last major release and the world was not holding its breath anymore.

Then Vista was put together by salvaging some parts and adding some shiny chrome, just to fill up the gaping void in their product line. No wonder it seems inconsistent and lacking a coherent vision or direction--it barely had either.

Win95 was a leap ahead from DOS and Win3.11. Sure, it was still kinda-sorta DOS-with-some-GUI under the hood, but it was the first time the whole "DOS stuff" was neatly tucked away, not to be seen by the average user.

Win98 was the next big leap, a stable Win95, plus a few goodies, better networking, more out-of-the-box support for more hardware, more of everything.

W2k was the fusion of the NT line with the 9x line, the combination of the "office" and "game" areas, stability and compatibility. Plus USB support for the NT line.

XP was... well, mostly flashy and gadget-y, but also much easier networking, better (and out of the box) WiFi support, smoother installation and better security (no, really. Not perfect, but certainly better).

Vista was... well, new. And... well, slower. And... well, why the heck would I wanna use it? Even if I'm just in for the eye candy, Aero is not the big leap ahead in that area (and only available in the more expensive variants no Joe Randomuser ever buys).

A lot of "Vista ready" PCs didn't support Aero. It was a bit of a debacle, because in most people's eyes Vista=Aero. The common person has no idea what other differences there are, just that everything was clear black instead of bright blue.

Initially, Microsoft had a grand vision of a new operating system, built on managed technologies, declarative UI, a semantic filesystem, transparent integration of different services, etc. It was a grand plan and quite innovative. Unfortunately, the technology just wasn't there yet. .NET was in its infancy, and the staggering number of completely new, interdependent modules was just too much to swallow.

So MS had to scale back everything, and quite quickly. So Vista came out very unpolished and raw. Windows

Touting Aero was a clear sign that the developers didn't understand usability. You don't get an easier-to-use system by making it prettier. You restructure your information in a way that is clear and intuitive to make an easier-to-use system. If the user still has to go to a control panel to set a preference for their e-mail client, it isn't an easy-to-use system.

Vista broke compatibility with a lot of applications and hardware drivers, and ran slower. In exchange, the user got... another layer of confusion.

We tried too big a task and in the process wound up losing thousands of man hours of innovation

You wasted thousands of man-hours of innovation, but not for the reasons you think. You run a company with a long history and well-known culture of quashing real innovation (because, let's all be honest, Microsoft is big enough, with enough smart people working there, that genuinely good ideas do see development - they just never seem to see release...). The development teams are so political (with the Office team at the top of the heap, as I understand it) that corporate politics determine which ideas actually make it to market, rather than the merits of the ideas themselves. How many innovative ideas have been canned by internal politics and infighting?

Vista was a dog but let's not blame Vista for lost man-hours of innovation - look at your corporate culture and you'll find the problem.

Vista was a dog but let's not blame Vista for lost man-hours of innovation - look at your corporate culture and you'll find the problem.

Not a chance without major upheaval in Redmond.

As long as Microsoft is still seen as a good investment (in other words: the stock is either growing or remaining fairly static but returning a good dividend), the investors won't make any serious effort to get Ballmer kicked out - and when was the last time you saw the incumbent CEO who presided over corporate culture going to hell making a serious effort to re-appraise something as fundamental as that? It'd mean admitting that everything he'd stood for for

Longhorn, as it was called during development, scrapped some functionality along the way. (It even got redefined so much that it was renamed from Blackcomb to Longhorn.)

One very noteworthy example is that everything was supposed to run on top of WinFS, a database instead of a file system; for a lot of tools this was never completed. There was also supposed to be more diversification between server and client versions, but as you know, the client/server split in Vista/Server 2008 turned out the same as in XP/Server 2003.
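The WinFS pitch (file metadata in a relational store you can query, instead of a directory tree you have to walk) can be sketched with an in-memory SQLite table. The schema and records below are made up purely for illustration; WinFS itself was built on SQL Server technology, not SQLite.

```python
import sqlite3

# Toy sketch of the "database instead of a file system" idea:
# store attributes alongside paths, then answer questions with
# queries rather than by crawling directories.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, author TEXT, kind TEXT, size INT)")
db.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", [
    ("/docs/report.doc",  "alice", "document", 120000),
    ("/pics/holiday.jpg", "alice", "photo",    2400000),
    ("/docs/notes.txt",   "bob",   "document", 3000),
])
# "Show me every document Alice wrote" -- one query, no directory crawl.
rows = db.execute(
    "SELECT path FROM files WHERE author = ? AND kind = ?",
    ("alice", "document"),
).fetchall()
print(rows)  # [('/docs/report.doc',)]
```

Keeping an index like this consistent with the actual on-disk files, across every application that writes them, is a large part of why the idea was never completed.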

But this just seems normal in any development process. In Ubuntu you also see software tools that are no longer in the main package after a couple of years. If you knew what would be important in 4 or 5 years you could do optimal development, but the reality is that nobody can see that far into the future.

For those who've forgotten, the project that resulted in the Vista release was reset at least once. Remember Longhorn? From Wikipedia [wikipedia.org]:

Faced with ongoing delays and concerns about feature creep, Microsoft announced on August 27, 2004 that it was making significant changes. "Longhorn" development basically started afresh, building on the Windows Server 2003 codebase, and re-incorporating only the features that would be intended for an actual operating system release.

I don't think Windows 7 is any better than Windows Vista. It's marginally faster than Vista, but being faster than Vista is like winning the Special Olympics: you're still a retard.

Microsoft has no connection whatsoever with its users, and that's where its real problem lies. Users want their OS to run their applications as well as possible and to make managing the computer easy. Microsoft wants the OS to be the user's primary application. Jumping up and down in your users' faces screaming for attention, when their primary goal is using their apps, isn't productive.

Until Microsoft's leadership realizes its customers are its end users, Windows will continue to suck as badly as ever.

I'm not sure what your definition of "easy" is, of course. But pressing the winkey, typing a name or command, and pressing enter to launch just about anything you can think of in Win7 is "easy" in my book.

Yes there are shitloads of configuration options but for most users Win7 is ready to go right out of the box. They've done a really good job with that.

Microsoft wants the OS to be the user's primary application. Jumping up and down in your users' faces screaming for attention, when their primary goal is using their apps, isn't productive.

Alas I've already commented in this thread or I'd mod you insightful. But this is exactly the point - it's something Apple fully understands, something that Linux vendors don't seem too sure about and something that Microsoft completely fails to understand.

The job of the operating system is to set everything up so it works then get the hell out of the way so the user can get on with doing what they want. As soon as the OS gets in the way, it's Doing It Wrong.

Somehow or other Microsoft's Office team does seem to have broadly figured that one out - while the new interface to Office does tend to engender feelings of "love it or hate it", at least it was developed with an understanding that people don't buy software in order to spend all day wrestling with the user interface. I would say Win7 is heading in that direction (I actually think there are quite a few significant improvements over XP, though they still haven't grasped that if you can't be sure everything will JFW, about the worst thing you can do is pretend it does and provide no hint anywhere as to why it patently doesn't), but it still has a way to go.

Microsoft is faced with the necessity of branding their OS as 'new' and 'sexy', in the face of Apple's OS, which is heavily advertised as 'new' and 'sexy'. Usability-wise, the Apple desktop is actually a step backward from the Windows interface (but at least it has Unix underneath for gearheads). Windows 7 removes some of the more egregious intrusions, but Slashdot isn't the primary customer of Windows -- the 100 million+ retail consumers deciding between a Macintosh or a PC are the primary focus of the bells and whistles.

Through all the marketing hype around Vista, you heard the voice of the few reviewers that MS forgot to buy: Vista? Why bother?

Vista was, next to Win95, maybe the most hyped OS ever. Even Apple, with all its ability to hype and market its products, could not hold a candle to the amount of time and money pumped into advertising Vista. But while the hype for Win95 came from the users, from people who had never used or owned a computer but still just "had to have it", and while Apple manages to motivate its die-hard users to act as its mouthpiece, Vista's hype was a lonely cry from MS alone. Partly, of course, this is due to MS being held in fairly low esteem by geeks around the globe (compared to Apple, who have a fair number of fans in the geek community, especially the very outspoken part of it that fills blogs and review pages with their experiences and the joy they get from their latest Apple tool), but mostly it is simply due to Vista not performing well.

First, it did not offer anything really genuinely "new". There was no "wow, look at that! Never seen that before!" part of Vista. Every piece of Apple hard- or software so far always came with something "new". Some trick, some gadget, or maybe just some neat toy that was something to talk about in your review. Even if you never used it again after the novelty factor wore off. But it was something you could talk about. Something you could write about. Something you could review and say "hey, they invented something again". No such thing for Vista. You could basically just say "Well... it looks different... and some of the menus are different... oh, and hey, you can now simply search for your program instead of having to look for it in the program manag... oh, wait, no, Apple did that first... Umm.. yeah, but it's new on Windows!"

That doesn't pull people in. That's not attractive. And neither is offering the only eye candy feature (i.e. Aero) only to the upper price segments. Eye candy is what could have convinced Joe Randomuser, but he WILL NOT buy an "Enterprise" or "ultimate" edition! Talking about segmented systems, how many were there? 10 different versions? More? I don't remember, to be honest, but how should anyone but the most interested enthusiast know what version he needs? People, there's a reason why a car manufacturer only offers a handful of models per year and some extras to tack on (just to get a car analogy into the diatribe here). Because people do not want to spend hours trying to figure out what version they wanna buy! It's nice of MS to offer its users that choice, but the users don't even WANT that many choices. Even most Linux distros noticed that by now and offer a standard package that fits most users who don't want to bother sifting through the hundreds of options. Take a standard package, tack a few things you might want additionally to it and off you go!

Vista was more a marketing blunder than a "bad" OS. OK, granted, it wasn't the best OS or the most "expected" OS MS ever built, but it was not the worst either; that spot is still occupied by ME. If MS should learn anything from Vista, it's that pumping a few million bucks into the PR and marketing machine is not enough to make people want an OS.

"We tried too big a task and in the process wound up losing thousands of man hours of innovation,"


Since when has Microsoft started to innovate? Outside of innovation in pushing the legalities of leveraging its monopoly, that is.

Every time I read Ballmer talking about Microsoft innovation, I come away with the opinion that he is trying more to convince himself that Microsoft actually innovates (it doesn't) than he is trying to convince others.

Microsoft wasted time on Vista and Ballmer. The fact that Apple's market cap is so close to Microsoft's now is the ultimate embarrassment. Shareholders shouldn't be happy about the lack of "innovation" through his tenure.

And it affects us all. Even if you don't own MSFT directly, you probably have skin in the game through your 401-k, mutual funds, etc.

He's like that nasty fart in an elevator that you really, truly want to get away from but just can't. Shareholders need to pry the door open and let in some fresh air.

Wake me up when anything useful actually *changes* about any Microsoft OS. Last time was back in 2001 (possibly 2004 if you count XP SP2). The interface changes, the "hidden internals" change (i.e. upgrade your drivers to WDM drivers), but the way you use the damn thing doesn't. And each time it gets slower - slower to run, more demanding on resources AND, somehow, slower to navigate and use in everyday life. It also has useful features ripped out, customisability thrown out of the window, old features limited and junk thrown in.

(Why can't I make 7 look like 2000 / XP Classic? Hell, I can move EVERY individual button, widget, dropdown and toolbar on my browser, I can change every hotkey and have it load it up in any number of different configurations at a click. I used to be able to have a good level of similar control over XP's basic interface, and even Office's, but now I can't even get rid of that stupid Start Menu at all, or put the Control Panel back how it used to be, or (now) turn off the stupid Ribbon bar? I don't *CARE* if it's faster, more efficient, etc. for some people - it isn't for me, and I'm the one using this particular computer).

What happened to WinFS, for example? It seemed like a good idea, was the only thing that *really* got people interested in Vista and then failed to make any appearance whatsoever ever since.

Seriously, give me a call around Service Pack 2 of the "next big OS". The one with features that I feel I could use and which would speed up my use of my computer. In the meantime, I think I'll just "struggle" along being able to boot up really quickly, customise heavily and not need a super-machine to run things that have always run fine. Until then, Microsoft's offerings are completely irrelevant to me and have been since 2001/2004.

To be fair though, Vista laid the groundwork for Windows 7, which I have (almost) nothing but praise for...so maybe it was worth it. Besides, just like XP, as Vista got on in age it became much better.

>>>There were a lot of jokes about Vista being a beta for Windows 7. It turns out that Vista inadvertently filled that role.

Vista isn't merely a beta of Windows 7. It's the same product. Win7 is identical to Vista, but with optimized code so it can fit inside 512 megabytes* (like Vista was supposed to in the first place). Vista (NT 6.0) is to Seven (NT 6.1) as 98 is to 98SE, or 2000 is to XP, or Mac OS X 10.6.0 is to 10.6.1.

* I've even seen Seven running on a 256 megabyte machine - Microsoft did

This all misses the point though because there were a lot of features they spent years working on that never made it into Vista let alone Windows 7. Microsoft aimed too high with Vista and fell short and the process wasted far too much developer time.

In my own opinion (and I've seen others state it, too), Windows 7 is just Windows Vista SP3. Microsoft had to break from the Vista brand because everyone (including the lay user) "knew" that Vista was a broken pile of junk. If they had heard Vista was bad and got a new computer with Vista on it, their mindset was to find all the little nuances that didn't seem just right and complain about it. Granted, there were many legitimate gripes, but even if Microsoft had fixed those, a user would still have the preconceived notion.

Alternatively, there's this new and improved Windows 7! It's great, it's flashy! It fixes everything that was wrong with Vista. And so the general user, having no preconceived ideas, walks in feeling good about the purchase and looks for the good in the OS.

Microsoft probably streamlined a lot of code, background services, and process flow so that the user experience would be improved. Plus, they could fix their underestimated minimum requirement (I think), sell a brand new OS (instead of giving the fixes for free), and improve their brand name.

For myself, I still haven't migrated. Something about DRM running in the background, not wanting to support companies that treat their customers like criminals, etc. /me dons tinfoil hat.

There are many things to like about 7, but it retains all the usability regressions of Vista. Microsoft wasn't willing to admit Vista was a mistake, so they weren't willing to fix these issues.

UAC is still annoying to the point that I disable it completely. It still takes me longer to accomplish the same tasks. Aero is nice, but still a pale imitation of Compiz/Kwin. DirectX 11 has been completely ignored by the game industry.

Windows 7 has barfed on my RAID twice.

Once again, Microsoft's latest release claims it can support patching without reboots, but literally every Patch Tuesday since the first beta has still required a reboot.

I run Windows 7 because it is the latest release, but I wouldn't say I have nothing but praise for it.

You know, I don't know what you do on a daily basis that UAC is an issue.

I like UAC for the simple reason that 99% of the time I'm not doing anything admin related, and like knowing that I'm not executing in a privileged mode. Occasionally, the UAC thing will pop up because Java or something has decided it wants to update itself and I get to choose when it updates and not it. Without it, I suspect that some bits of software would just update

I want to delete a shortcut on MY desktop, which prompts a UAC dialog, which I must address, despite the fact that I'm not changing the desktop for other users. After I confirm that, Windows prompts me yet again, asking if this is something I really want to do.

How can you defend that design?

Unnecessary prompts like that just convince people to either turn it off or just confirm everything.

A lot of installers actually install their desktop icons to the All Users desktop, which is irritating as all fuck. Then you need admin privilege to delete it which invokes UAC. The problem is that the All Users desktop mechanism is opaque to end users, which is just shitty Microsoft standard practice.
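The parent's complaint is concrete enough to sketch. The paths below are the Vista/7 defaults and the user name is hypothetical; the point is only that two different locations feed the same desktop, and only one of them is user-writable.

```python
import ntpath  # Windows path semantics, usable on any platform

# Sketch of why deleting some desktop icons trips UAC: an installer can
# drop its shortcut either on the per-user desktop or on the shared
# ("public", formerly "All Users") desktop, and the shell merges both
# into the one desktop the user sees.
def shortcut_location(shared, user="alice"):
    if shared:
        # The shared desktop is writable only with admin rights,
        # so deleting a shortcut from it triggers a UAC prompt.
        return r"C:\Users\Public\Desktop"
    return ntpath.join(r"C:\Users", user, "Desktop")

print(shortcut_location(shared=True))   # C:\Users\Public\Desktop
print(shortcut_location(shared=False))  # C:\Users\alice\Desktop
```

Since the merge is invisible in Explorer, the user has no way to tell which of the two locations a given icon lives in until the UAC prompt appears.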

I think you're being harsh. I'm hardly a Microsoft fanboi. If anything the more typical charge against me would be Apple fanboi, but I own computers with MacOS, Windows 7, and Ubuntu as their primary OS; and use all three pretty extensively.

UAC is still annoying to the point that I disable it completely.

It's much improved over Vista ("You have moved the mouse. Cancel or Allow?"). At this point it's no more annoying than Ubuntu or OS X prompting for a user password before installing software. In some ways it's less annoying, since you don't actually have to type your password.

UAC is nearly useless. It tells you something is about to do something exceptional, but it doesn't tell you what it is trying to do, or even the exact executable.

As for the Windows 7 UI, it doesn't speed things up for me. With XP I can close windows faster (right-click on the task button, press C; in contrast, Windows 7 requires additional mouse movement to close the appropriate thumbnailed window - this is slower). I can easily set things up to launch programs or tools by creating folders[1] and shortcuts in the start menu (and using Windows Classic Mode).

I use both Windows 7 and XP daily, and Windows 7 isn't more stable, it's actually a disappointment (not as big a disappointment as Vista).

The advantages of Windows 7 appear to be:
1) The per-app volume control
2) Better alignment on 4K boundaries (but it's not really XP's fault that new hardware has such issues)
3) Better sandboxing (not that useful to me, since I don't use IE that much, and I run multiple browsers, some as different accounts)
4) Going to be supported for more years
5) Supports the latest DirectX stuff and graphics goodies
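The 4K-alignment point is easy to make concrete: a partition is aligned when its starting byte offset is a multiple of 4096, the physical sector size of "advanced format" drives. XP's old default first-partition start at logical sector 63 fails that test, while the 1 MiB (sector 2048) default introduced with Vista passes:

```python
SECTOR = 512     # logical sector size reported by most drives
PHYSICAL = 4096  # physical sector size of "advanced format" drives

def partition_aligned(start_sector):
    """True if a partition starting at this logical sector sits on a
    4 KiB physical boundary (no read-modify-write penalty on writes)."""
    return (start_sector * SECTOR) % PHYSICAL == 0

print(partition_aligned(63))    # False -- XP's classic default
print(partition_aligned(2048))  # True  -- Vista/7's 1 MiB default
```

A misaligned partition forces the drive to rewrite two physical sectors for many single logical-sector writes, which is where the performance penalty on XP-partitioned 4K drives comes from.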

The rest of the stuff just gets in the way of an "advanced" user willing to learn how best to use the system - I haven't seen any features which actually help such users (the "god mode" folder is cool, but it's more of a workaround for Windows 7's "sorry, you need more clicks to do stuff now" UI).

Note: _www_username is the name of the user account which my normal browser runs under (this way I already have my own sandboxing) - so even if my browser is pwned the malware cannot access my documents and other stuff.
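The separate-account trick described above doesn't need any third-party software; the built-in runas command is enough. A minimal sketch, where the account name (borrowed from the note above) and the browser path are purely illustrative assumptions:

```shell
:: Sketch of per-account browser sandboxing on Windows (assumed paths/names).
:: "_www_username" is a limited local account created beforehand; the Firefox
:: path is just an example. runas prompts for that account's password and then
:: runs the browser with only that account's rights, so even a pwned browser
:: cannot read the main user's documents.
runas /user:%COMPUTERNAME%\_www_username "C:\Program Files\Mozilla Firefox\firefox.exe"
```

The /savecred switch can cache the password so you aren't prompted every launch, though that weakens the isolation a bit, since anything running as your main account can then relaunch into the sandbox account silently.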

Once you do this, you can press winkey, 1, 3 to explore My Documents (and you should set up the folder view so that you see details rather than some useless icons; this way you can sort by date, size, etc.).

winkey, 1, F will open Explorer on the F: drive.

I've also set winkey, 4 to launch the command prompt.
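For anyone wanting to reproduce this on XP: with the Classic Start menu, pressing winkey opens the menu, and a digit or letter then jumps to the item whose name starts with that character, so naming folders and shortcuts with leading digits turns them into two- or three-keystroke launchers. A rough sketch of the idea - the folder names here are an illustrative guess at the scheme, not necessarily the exact one used above:

```shell
:: Illustrative sketch: numbered folders in the Classic Start menu become
:: keyboard accelerators. After this, "winkey, 1, 3" opens folder "1 Tools"
:: and activates the entry starting with "3". Shortcuts (.lnk files) to
:: drives, documents, or programs would go inside these folders.
mkdir "%USERPROFILE%\Start Menu\1 Tools"
mkdir "%USERPROFILE%\Start Menu\1 Tools\3 My Documents"
```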

In contrast, on Windows 7, winkey+<number> will just launch/foreground the relevant pinned or open app. That limits you to 9 (or 10?) items, and there appears to be no way to set up your Windows system to do what I normally do anymore without resorting to a 3rd-party app. Thus Windows 7 is worse for me.

Seriously? It's much improved over Vista, and there have been two times where it actually has caught some bad joojoo that otherwise may have caused trouble. I don't mind it at all.

Seriously? You think that security isn't intrusive? Man, talk about naive. When it comes down to it I will always be far, far more surprised at UAC actually stopping something malicious rather than the fact that users complain about it.
That's not to say that UAC can't still be the RightThing(tm), but that's a completely different argument.

I have a hardware RAID that Windows 7 took a crap on. The RAID couldn't repair itself for some crazy reason, and these were brand new hard drives that I had been using less than a month. This is when I discovered that you couldn't do a repair install anymore. This was in the beta days.

I have a copy of Windows 7 Home and Windows 7 Ultimate at home. I run Ultimate on my gaming desktop, and the RAID took a crap once again, which it can't repair, for some reason.

Really the Vista analogue is Win2k. I think the Win2k-to-XP transition and the Vista-to-Win7 transition are very parallel. I don't think people remember how truly awful Win2k was on day one. I installed it the week it was released, and it was incompatible with so much of my hardware that I was offline for three weeks until I just went back to 98SE (which I used until XP came out).

I also think that XP was just about MS's best OS out of the gate. Yes, it was vulnerable like swiss cheese, but even before SP1 it was otherwise very stable and polished if you could keep the malware at bay.

Vista was utter crap on an unimagined scale. One update screwed my system so bad that every 24-48 hours it would stop handling HTTP, POP, and IMAP, but IRC would still work, as would ICMP. The computer was also being used as a gateway at that time and HTTP requests would work THROUGH it from other computers, but not FROM it. No amount of releasing/renewing the IP, updating drivers/firmware, or bouncing services around had any effect. It had to be restarted a minimum of every two days. This behavior persisted until SP1 came out. Like I said, utter crap.

I still haven't had a chance to try Win7, though from all the positive feedback I definitely will when I get around to my next system overhaul.

You can thank SuperFetch for that. If you have any decent amount of RAM, SuperFetch will take the unused RAM and preload the programs you use the most, and if you use programs at a specific time it will make sure to load them before that time. Really gives it a kick in the pants. If you have a spare flash stick lying around I'd try ReadyBoost as well, as I've found that can also give a pretty decent speed boost.

The only thing that irritates the shit out of me about W7 is that damned Devices and Printers window.

I'm not sure I'd classify 2k as the beta for xp. 2000 was definitely the successor to NT 4, and the last version with a distinct workstation variant. I remember being delighted with 2000 server in comparison to NT 4.

Windows ME fills the XP beta position, though. Nearly everyone hated it. It was released after 2000... kind of like how 98 was released after NT 4, which was released after 95.
The big difference I see, though, is that it was not NT based.
Anyway, people complained about XP for quite a while, too.

XP is basically 2000 SP5 with a new user interface. That's what makes 2000 the beta run for XP... Windows ME was based on the 9X kernel, meaning that it was really more of a successor to 95/98 than a predecessor to anything that came after it. It can't really be a beta run when it's a completely different UI and kernel...

And like 2000, Vista has gotten a *lot* better with the service packs that've come out since its initial release.

2000 was supposed to be XP; everyone was supposed to migrate from 9x to 2000. That didn't happen, so they released ME, whose sole purpose seemed to be to make the 9x series look as terrible as possible in order to convince people to move to 2000 or XP.

Windows ME was really MS-DOS v7.3 (or whatever that would be with the numbering system). It was the final operating system from the DOS legacy that started back in the days when Bill Gates was actually contributing code to the OS. That was ultimately the problem with it: it had to deal with all of that legacy code base, and they tried to make it sort of like Windows 2000 but deliberately crippled it so it wouldn't compete against their other products, and introduced a few features that actually backfired.

I was using Windows 2000 in October 1999, months before it was released. I saw several BSODs, but it was still better than Windows 98SE. After Microsoft released IE 5.5, the BSODs disappeared, and for the first time I was able to run Windows for weeks at a time without a reboot or crash. Perhaps Windows NT was more stable, but I never used it.

Some of the beta versions of Win2k were really unusable, but I had one of the RC releases and it ran very well on my Thinkpad 600E... When I updated it to the full version, it never seemed to work as well as the RC...

Actually, having used Win2K back in the day, it wasn't half bad if you put it into perspective. Win2K wasn't an upgrade to Windows 98; WinME was the upgrade to Windows 98. Win2K was the upgrade to Win NT 4.0.

And, really, I can't think of many things that worked worse in Win2K than on NT, other than the fact that Win2K needed more RAM. And speaking of devices and drivers, it was compatible with almost everything that used to work under NT (though not with anything that used to only work in the DOS prompt).

Really? In my personal experience on numerous machines, Windows 2000 was the most stable, least crash-prone version of Windows, excepting possibly XP SP2. The jury is still out on 7 in my experience, but it is hugely better than Vista and I've had few problems. While I agree that Win2K had a lot of missing drivers on day 1, that got fixed rapidly and was really the only major problem with 2K, whereas missing drivers were only one of Vista's myriad problems. The primary reason that XP was better "out of the gate" was that it inherited the driver base built up over 2K's lifetime.

For myself, I would call Win 2k actually superior to Win XP. While XP does more things and has more whistles and gadgets, Win 2k is good at what it does. The largest problem with Win 2k is that Microsoft has stopped supporting it and has deliberately thrown in a monkey wrench to kick people off that (now mindshare-competing) OS.

The jump to Win 2k from Win NT 4 and the abomination called Windows ME (or, better yet, Windows 98) was huge, and it was a clear step in the correct direction. If you say that Windows 2000 was awful on day one, I take it you never tried "Windows 286" (aka Windows version 2). From day one, my experience with Win 2k was substantially increased stability and complete compatibility with NT 4: if it ran on NT 4, it would run on Win 2k, and usually do better. There were a few problems with old DOS-era legacy apps and stuff that used obscure (aka "undocumented") API hooks from the Windows 95/98/ME line that failed on Windows 2000, but that should be expected too if you understood the operating system. XP was OK and does some things well, but Vista was actually a step backward.

Windows 7 was finally a chance to fix what was wrong and get back on the right track.

MS always takes two goes to make a new OS - but apparently, this is somehow news.

I keep hearing that Win2k was crap. Win2k was awesome! It not only took all the GUI from the more modern 98 line (over the basic 95 look that NT4 used), but it also incorporated the "home user" experience that was sorely lacking from NT4. USB, DirectX (for games) and PnP were all part of Win2k, and it still had the stability and relative security of NT4 (which, while lacking, was still light years ahead of the 9x line. Need I say NTFS?).

NT 3.51 was MS's fork of OS/2. While it sucked compared to modern OSes, it was still much better than any MS OS to that date. Keep in mind that the top MS OS at this point was Windows 3.1 on DOS. NT's only server competition was NetWare, which was stable, but nowhere near as scalable as OS/2 or NT 3.51.

Win98 was the best 9x OS released. It offered working PnP, USB and even FAT32 (second edition). Again, a very nice OS for its day.

I agree that you shouldn't have been modded "Troll" just for voicing your opinion, even though I somewhat disagree with it. You were right that MS usually takes two tries to make a good OS, but I believe you were wrong on the versions you called crap. Win98 was the second go of the 9x line (SE was the true release of 98, IMHO). Win2k was the second attempt at using the NT kernel with the 9x GUI. Windows 7 is the second attempt at the NT kernel with the Vista GUI. All the second attempts were fine OSes.