Renegade334 writes "The Inquirer has a story about MS Longhorn and its need for better-than-entry-level graphics cards. This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and 3D menus and interfaces that require at least Shader 2.0-compliant cards. Supposedly it will really affect the performance of the new Microsoft OS." This has been noted before in the system requirements for Longhorn, but it would seem the full impact is slowly being realized.

Mac OS X uses the graphics card heavily for much of its interface. All Macs sport at least a Radeon 9200 (Mobility in the iBook G4), and Apple takes advantage of those cards in plenty of apps... note the multi-person video chat layout & details in iChat AV, or the compositing

That's not a knock on Windows - just an aside, really. The consumer graphics of PCs have been steadily improving, and there's little reason not to make use of that power. The only problem could be the low-end motherboards offering cheap integrated video. Inevitably, some people are left out in the cold. Time to start moving to nForce or Radeon IGP, PCChips!

Making use of the available graphics power just makes sense, and Apple was smart to be the first to realize this. After all, window compositing is something you're going to have to do at some point anyway; why not offload that task onto the part of the hardware that's actually designed to composite things?

But when you step into the realm of "hey, we've got this power -- let's waste it on something!", then you're doing something really bad. Using pixel shaders to draw drop shadows on semitransparent textured menus or somesuch begins to fall into this territory.

In the first case you're taking the advantages offered by the hardware and leveraging them to improve the consumer experience. In the second case you're taking the advantages offered by your hardware and eliminating them -- removing the power of your 3D hardware (which technically is there for the applications, not the OS, to use) by making sure that the 3D hardware is continually tied up running the particle engine floating around the talking paper clip or Enlightenment logo or whatever. This degrades the potential consumer experience because it means consumers don't get to use the hardware they paid for; the OS is too busy using it.

The difference between these two situations may be a little bit subtle and a larger bit subjective, but do you see the distinction here? Because given the curve of resource usage their OSes have followed in the past, I kind of doubt Microsoft does...

Making use of the available graphics power just makes sense, and Apple was smart to be the first to realize this.

Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available in PC hardware (like multiple resolutions onscreen). The OS used them, and used them well. When a more advanced Amiga came along with more graphics capabilities, the OS automatically configured and used those as well. Apple was "me too," much later. :)

But that's OK. Apple knows how to market -- that more than makes up for arriving expensive, late, and/or weak with a number of things. Plus they provide a really nice end-user experience.

Mmm, no. Commodore was the first to really do this. The original Amiga had native graphics capabilities that still aren't available (like multiple resolutions onscreen) in PC hardware.

In the interest of historical accuracy, the Atari 400 and 800, first publicly available in 1979 (six years before the Amiga), allowed mixing multiple resolutions on screen. You built a display list of modes and the hardware interpreted them. You could mix text, graphics, and various resolutions of each. You could also trigger interrupts to occur on a specific display list command.
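For the curious, here's roughly what one of those ANTIC display lists looked like, written out as a C byte array. This is from memory and old docs, so treat the exact opcode values as illustrative rather than gospel; the point is that the "screen" was literally a little program of mode lines that the video hardware walked through (a real list would have enough mode lines to fill the full frame).

/* Rough sketch of an Atari 8-bit ANTIC display list mixing text and
 * hi-res graphics on one screen.  Opcode values are recalled from old
 * ANTIC documentation -- illustrative, not gospel. */
unsigned char display_list[] = {
    0x70, 0x70, 0x70,       /* 3 x "8 blank scan lines" (top margin)          */
    0x42, 0x00, 0x40,       /* mode 2 (40-col text) + LMS: screen data $4000  */
    0x02, 0x02, 0x02,       /* three more mode-2 text lines                   */
    0x0F, 0x0F, 0x0F, 0x0F, /* a band of mode F (320-pixel hi-res) lines      */
    0x8F,                   /* mode F line with the DLI bit set: fires a
                               display-list interrupt right here              */
    0x41, 0x00, 0x30        /* JVB: jump back to the list at $3000 and wait
                               for vertical blank                             */
};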

IIRC, what differentiated the Amiga was that you could not only mix multiple resolutions onscreen, but multiple resolutions with different bit depths, palettes, and even mouse cursors, all drawn by hardware. This is off the top of my head, as I (sadly) never owned an Amiga and only fiddled with one at friends' houses, but I recall reading about that and being hugely impressed. It was truly a machine ahead of its time.

Hey, I *like* drop shadows and semi-transparency on menus and the like; they provide a "rich" environment and also help to prioritize open windows. Perhaps you are a command-line guru; I work with CAD software a lot and I appreciate the eye candy as a visual indicator. Then again, if it were up to me we'd toss all the CAD software and hardware and go back to board drafting - less "it's easy to revise because it's on the computer, so let's do it a lot" attitude and more forethought required when designing.

"Keeping up to speed" these days has more to do with updating one's computer knowledge quotient and not enough to do with actually doing real-world stuff and improving skills in the disciplines that we use computers to help us with in the first place.

Yes, XP ran a hell of a lot better than Panther. The systems in comparison, btw, were a 400MHz Celeron with 128MB ram and a shitty ATI 8MB video card, and an iBook G3 500 with 128MB ram and whatever shitty video card was in that thing (I think it actually was a 16MB ATI). And specifically, the problems I would run into were directly tied to Quartz rendering.

With its crippled 66MHz system bus (even the iMac 350 was 100MHz) and its woeful 8MB ATI Rage 128, it is quite a poor performer under OS X. You can overclock them to 600MHz on a 100MHz system bus with no issues and they perform far, far better.

Look at x.org [x.org]. Look at what they want to do with switching everything over to OpenGL rendering. I think you might find quite a few similarities between Longhorn, OS X, and x.org. It's the trend, and I think it's a smart decision.

So what if you won't be able to use the windowing system unless you have an accelerated graphics card? Nearly all new(er) computers have graphics acceleration capability. It opens up a WHOLE lot more possibilities for what can be done within the windowing environment. PLUS it make

This is definitely true; however, to the best of my knowledge, people who specialize in and are only familiar with server OSes like Windows rely more heavily on being able to click around than on knowing what to type. Kinda like Windows' user base, but to a lesser degree, since your average Windows network admin probably knows a bit more than your average Windows user.

It would definitely make Windows look a lot better in that market if they did in fact have a purely command-line mode, just like Unix/Linux, which you could

"...to mimic the job that was being done with a mechanical typewriter a hundred years ago?"

Fucking Luddite. LaTeX for secretaries is stupid. Computers are getting faster. Software grows to take advantage of it. Passing rendering to the GPU is inevitable, and it would be stupid _not_ to do this.

Personally I find LaTeX much simpler than any word processor, including Word or OpenOffice. The fact is that even full professors or IT managers can't master Word well enough to produce consistent fonts and reasonable tables across a whole document.

> Yeah, but today's high end will be low-end by the time [Longhorn] actually gets released.

Yeah, but the Open Source and Free Software drivers for video cards will still be stuck at the level of the Radeon 7500 when it comes to 3D acceleration, due to the (unfortunately, for valid competitive-analysis-type business reasons) concerns of video hardware manufacturers (namely ATI vs. nVIDIA) when it comes to disclosing specifications.

This is due to the WGF (Windows Graphics Foundation), which will merge 2D and 3D graphics operations into one, and 3D menus and interfaces that require at least Shader 2.0-compliant cards.

That's just plain stupid. Grandpa & Grandma want to check their email and pics of the grandkids; why on earth should they require a Radeon MegaXP293823-XtremeSLI+ to do that? I hope there's an option to disable all that cycle-wasting crud or MS may be shooting itself in the foot: how many offices will spend a few hundred dollars on individual video cards just to upgrade the OS? What about those machines with onboard video (a la Dell)?

First, the GPU is the processing unit; the framebuffer is the memory where the bits are stored. Both are involved in any kind of rendering operation, 2D or 3D. The GPU operates on the bits in the framebuffer.

Second, modern graphics devices don't have any dedicated 2D hardware left in them. They all just use their 3D cores to do basic blit operations. Why waste silicon on specialist 2D blitting when you've got a gajillion megapixels of fillrate sitting right there in the 3D core?

Third, you are obviously unaware of how modern shader technology works. If I want to stream down 2D coordinates then I can do that just fine. In fact, shaders don't really care what all the numbers are, they just know that they are getting a certain number of inputs. If you choose to write a shader program that interprets them as coordinates to be transformed, then that's merely the common convention. Heck, I could just stream down 1D coordinates if I wanted to (actually, this is genuinely useful, if the coordinate is time and the shader is computing, say, a particle system). So there is really no inefficiency in using the 3D core to do 2D operations, because I can just transmit the minimum amount of data necessary by means of a suitably chosen shader.
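To make that concrete, here's a tiny sketch (in old fixed-function OpenGL, just because it's short; the same idea applies to DirectX) of blitting a window image as a flat quad using purely 2D coordinates. It assumes a current GL context and a texture that's already been uploaded; the function name and parameters are made up for illustration.

#include <GL/gl.h>

/* Blit a window's cached image (already uploaded as a GL texture) as a
 * flat 2D quad.  The "3D" pipeline never sees a third coordinate: an
 * orthographic projection maps pixels straight to the framebuffer.
 * Assumes a current GL context; 'tex' and the rectangle are caller-supplied. */
void blit_window(GLuint tex, int x, int y, int w, int h,
                 int screen_w, int screen_h)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, screen_w, screen_h, 0, -1, 1);   /* 1:1 pixel mapping */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);

    glBegin(GL_QUADS);                          /* just four 2D vertices */
    glTexCoord2f(0, 0); glVertex2i(x,     y);
    glTexCoord2f(1, 0); glVertex2i(x + w, y);
    glTexCoord2f(1, 1); glVertex2i(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2i(x,     y + h);
    glEnd();
}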

But who says Grandpa and Grandma need to move to Longhorn as soon as it comes out, when MS is just now ending support for Windows NT 4.0, as reported recently here on /.?

Grandpa and grandma will be just fine on 2000 or xp, or...and here's the crazy part...even 98. My father in law still uses win3.freaking-1 on a 486, for Christ's sake. Grandpa and grandma will be just fine.

That's just plain stupid. Grandpa & Grandma want to check their email and pics of the grandkids; why on earth should they require a Radeon MegaXP293823-XtremeSLI+ to do that?

I think you've touched on one of the more hilarious parts of the computer industry. It's not about what people NEED, it's what you can require them to need. Want the new security features of Longhorn? Want to do email faster? You'll need a better graphics card.

I've seen 98Lite, 2000Lite and XPLite. Perhaps the guys will make a LonghornLite that will enable you to use low-end graphics cards. Man, Microsoft should HIRE these guys. No. Put them on the TOP of the devel. team.

Having 10-20% of the price of your PC being in a bare minimum graphics card just seems ridiculous. What's next? Requiring 5.1 digital sound with multichannel reverb so Longhorn can tell the user "You've got mail!" ?

By the time Longhorn gets released (2006 or later), onboard graphics controllers will easily meet the requirements. We are talking about 2 to 3 years; right now midrange cards ($150 to $250) implement vertex and pixel shader version 3.0. Sub-$75 cards are mostly at 1.1, but by this summer I bet most of them will be at 2.0 or even 3.0.

Well, eye candy may or may not be bad in OSX - if I had the money to waste, I'd get a Mac and dual boot with Linux. As far as KDE goes, you may be able to turn this "eye candy" off, and KDE isn't forced on you if you just want to use Linux.

Having any themes on in Windows XP on my Athlon XP 1900+, 512 megs of RAM, and a Radeon AIW 7500 nearly kills it just scrolling down in the Start menu (considering ALL sound and all other windows freeze, I'd say it's close enough to killing it).

You, my friend, have some other problem with your system. Or you're flat out trolling. I use themes on XP on a 667MHz P3 w/ 384 megs of RAM with absolutely no trouble.

And don't even start with Apple. Their users are pre-conditioned to apply for a mortgage every time they have a product launch, so they're used to paying for a system with an over-spec video card.

Sheesh. You'd think that two days after Apple releases a $499 computer this kind of statement wouldn't still be popping up on Slashdot. Then again, relying on 10-year-old stereotypes that no longer apply seems to be something of a requirement for Slashdot posters.

Spending $480 over at Dell (even though I don't like them much either) gets you:

80 GB HDD
512 MB RAM
Celeron 2.6
CDRW drive
17" monitor!

So the processor might be a bit pokier than the G4, but you get twice the storage, twice the memory, a burner, AND a display. And it's still $19 less than the Apple offering. So tell me again how this is competitive?

I really fail to see how this will be useful and help productivity. Personally, I don't think an operating system needs to be that fancy. Just like those who use the console now: "back in my day, we had to use 2D interfaces."

As I've experienced it, having an accelerator render your windows is really very helpful for usability. Rather than having things pop into place, you animate them. You run your animations quickly, so it's not annoying -- but a bit of motion can do several things:
- Draw your eye towards whatever is moving. Your peripheral vision can see something moving better than it can see a sudden pop.
- Give you a better sense of what is happening. If I press Minimize and the window disappears, I sometimes have to go hunting around my screen for where it disappeared to. If it animatedly shrinks, it helps your spatial memory to find it again. Having a decent graphics card to render the shrinking effect makes the transition smooth and nice.

Having a graphics card for your windowing system also allows for reflection, transparency, and other effects like that. I haven't seen a good use for those effects in user interface yet, but I think they could turn out useful.

That would improve KDE so it will only be like 50 times slower than Linux then.

See, that's the brilliance of it. KDE's strategy was, rather than go to all the trouble of writing a fast GUI, just start out slow as mud and then just wait around until the Mac and Windows GUIs get even slower!

Just in case you actually cared, KDE 4 will be able to use a pixel shader for rendering the menu. And, assuming both KDE 4 and Longhorn are on time, KDE 4 will come out first. And, seeing as KDE has made its last several releases to within a few weeks, it seems likely that at least KDE will be on time.

So, overall, I quite agree with you. Those slackers over at MS have some real explaining to do about why they'll be the last OS to have any real hardware acceleration.

No - no, you're not the only one. How the largest software company in the world managed to produce a GUI as clunky and chunky as XP's astounds me.
And it's more than just the colours, too - the new 'Start' menu is a disaster in usability. It's OK once you know your way around, but try asking a novice user to find the 'Programs' button. It takes maybe a minute for them to scan the confusing mess of buttons to find the programs - you know, those things that make the computer usable....

Honestly, do we NEED a 3D-accelerated interface? I'm sorry, but the "cute" factor vanishes rapidly, and if it's gonna cost me a $200 video card, I'll pass my turn. So basically, we will be required to buy a 3D card if we want to upgrade past Windows XP?

Anyone else think that Nvidia and ATI might have lobbied aggressively for this? I can't justify this... if it was an option, sure, no problem, but a necessity...

I don't know if you'd noticed, but you can't buy anything BUT a 3D card new these days. By the time Longhorn is out, if you don't have a 3D card with PS2.0 support, your PC would be about 5 years old.

Guess what: things change. Back in the '80s when the Mac was released, people said the same thing. Why do you need a GUI when we can get everything we need done in text mode? GUIs are only for games and cute apps. Then by the mid-'90s the GUI became necessary for most modern computing needs. Besides enabling things like WYSIWYG word processing, the windowing interface made it common to have multiple apps open at the same time, where you can see information in one app and another. Yes, DESQview could do that too in text mode, but it was difficult to get the data you needed without the resolution. Back then you were paying $200 or more just for a card that could do "ultra high resolution" 640x480 at 16 colors. Shortly after, all the computers needed them and their production price went down to match the competition.

The same will happen with 3D cards after Longhorn is released sometime in the distant future. The prices will go straight down, because there will be more than just two companies making Longhorn-compatible video cards.

I can't justify this... if it was an option, sure, no problem, but a necessity... Nobody is forcing you to upgrade; you will not be put in jail if you use your 8088 XT with MS-DOS 2.0, 256K of RAM, and a CGA video card (2D: 4 colors at 320x200, 2 colors at 640x200, 16-color text mode). But honestly, as time goes on, the system requirements for new systems increase. It is the same for most Linux distributions, Mac OS, BSD, Solaris... It happens; deal with it.

I frequent several different gaming forums. I have noticed that there are always people trying to play games on Intel integrated graphics. Since Intel just barely supports standards, it's not a surprise that many games don't run at all, or barely run, on those cards.

Hopefully it will encourage Intel and other integrated graphics makers to make decent video chipsets or get replaced by demand. On the other hand, Intel might make it just good enough for Longhorn but not games.

there are always people trying to play games on Intel integrated graphics

But... but... the sticker on the front of the case says 'Intel Extreme Graphics'! How can anything beat 'Extreme'? But don't worry, by the time Longhorn hits the market, I bet we'll have 'Intel Excessive Graphics' and be all set!

I've used Windows since 3.0. I'm a Windows (.Net) developer. And I agree that the gee-whiz factor will be great. Animations, depth to menus... it'll be gorgeous.

But... It doesn't matter how fast computers get, Windows Explorer Shell always seems to become less snappy, even on fresh installs. XP made the start menu slower than ever as it retrieves nonessential metadata on the shortcuts. Myriad Shell extensions, over time, bring the Explorer UI to a crawl.

Sexy is great, but I have to use it every day. It's just not worth making the UI dog even worse.

That is the $64 question, isn't it? Can Microsoft learn to make an OS that doesn't slow down massively over time? I just did a fresh install on my one machine that runs XP and it's night and day. Over time XP just gets slower and slower. Of course the battle cry for MS defenders is "it's the fault of 3rd party drivers and apps." Well, then make a freaking OS that doesn't let "3rd party" apps run it into the ground. Why do I even need to use an app's uninstaller? Why, by default, doesn't XP know exactly how to remove every last bit of registry crap that got shoved in there in the first place? How come it takes 10 minutes for the Start menu to come up after I've been using the OS for a while? How come many Explorer operations still lock up the OS and stop whatever work you're doing cold? When will MS make an OS that you can actually multitask on no matter what's going on in the background? MS has a lot of work to do, and somehow I get the feeling that they haven't learned their lessons yet.

... Really. How much 3D are you going to stuff on a display that, except in a very few rare cases, isn't able to display more than two dimensions?

Mac OS X makes use of some 3D hardware for slight tricks when the hardware is there (on a G4 or G5 it will use a rotating cube effect when logging in or switching users; on a G3 it won't), and I'm sure there's some acceleration used in Exposé to move windows around, although that works on all the Macs I've tried it on, but what exactly could they possibly do 3D-wise t

No, not 3D interfaces in the way you're thinking. Think of it this way: every window is now a DirectX object. No need for redrawing by an app. Since every window is now a 3D object (one with only one pixel of depth), you can do simple things like moving all the maintenance of a window's DC from the app itself to the OS.

That's what Quartz Extreme does on OS X. This is just Quartz Extreme on PC.
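For anyone who hasn't looked at how this kind of compositing hangs together, here's a toy model in C (sketched with OpenGL only because it's short; Longhorn would presumably use DirectX, and none of these names come from either system). Each window's contents are cached in a texture on the card, so dragging, overlapping, and fading windows never forces the app to repaint; the compositor just redraws some quads. It reuses the blit_window() helper sketched earlier in the thread.

#include <GL/gl.h>

/* Toy model of a composited desktop: each window's contents live in a
 * texture the app rendered once; moving or overlapping windows never
 * asks the app to repaint.  Struct and function names are made up. */
struct win {
    GLuint tex;          /* the window's cached image on the video card */
    int    x, y, w, h;   /* position and size on the desktop            */
    float  alpha;        /* 1.0 = opaque, < 1.0 = translucent           */
};

/* blit_window() is the 2D-quad helper sketched earlier. */
void blit_window(GLuint tex, int x, int y, int w, int h, int sw, int sh);

void compose_desktop(struct win *wins, int n, int screen_w, int screen_h)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    for (int i = 0; i < n; i++) {               /* draw back-to-front */
        glColor4f(1, 1, 1, wins[i].alpha);      /* per-window translucency */
        blit_window(wins[i].tex, wins[i].x, wins[i].y,
                    wins[i].w, wins[i].h, screen_w, screen_h);
    }
}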

Back at WinHEC in May (and before, I believe), Microsoft gave out some more specific details about what the graphical requirements for Longhorn would be. Here's a summary [neoseeker.com] of what they were expecting hardware requirements to look like. There is a more detailed version buried on their site somewhere, but I'm too lazy to dig it up.

There are these things called video games. They look real sweet on your PC. They need good video 'cards' (get it? video games, video cards, it's all video related), and if MS wants to give me a pretty desktop in pseudo-reward for blowing $300+ on an NVideo Deathbringer 5k, then I'm happy about that.

I completely agree. Having a desktop that is not great on eye candy but usable is far preferable to an eye-candy-filled desktop that is even a bit less usable. I'd much rather see the desktop get more responsive and feel quicker as hardware speed increases, rather than have the look improve but the "snappiness" stay the same or go down. It is always nice to be able to have an eye-candy-filled desktop, that's for sure, but there needs to be a way to get rid of it when not needed. Microsoft did it in XP, a

If you have PowerPoint installed, check this out [microsoft.com]. It's a fairly in-depth discussion of Longhorn with emphasis on the new Windows Graphics Foundation.

If not, I'll summarize. Or you can Google for essentially the same info, but this PowerPoint file is well done.

One of the goals of longhorn is to further the requirements of signed drivers, and to offload the complexity of drivers into the new WGF. The idea being that it's better to have MS write the code once well, than to have lots of third party vendors

Let us all not forget that many years ago the video requirements of modern interfaces were substantially different than now. Things must progress and evolve. Interfaces will become heavier on some levels but easier on others, but you can clearly count on the advancements of technology to help OFFLOAD the strain to new devices and components. By Longhorn doing this, my guess is that my CPU will actually get less of a load on most things by making the graphics board do what it does better than a general purpose CPU.

You can't stop evolution simply because you can't keep up or you get comfortable.

I am consistently blown away by people who make comments like this:

"Am I one of the only ones who prefers usability, stability, and performance... to eye candy?"

Do you watch TV? Do you look at magazines? Style is here to stay, my good friend. I don't know about you, but I DO care about what my OS looks like. If I wanted my OS to look and feel like a windowless brick room with flickering fluorescent lighting, I'd skin it that way myself.

Do you even use modern software? Almost all of it is skinnable. Why do you think that's popular? Because people are bored? No, because modern software is generally an extension of your personality. My guess is yours is like vanilla ice cream.

On top of that, you are CLEARLY in the minority.

A couple scenarios:
Do you drive an old beater for a car because it "does the job"?
Do you live in a tiny room with an integrated flip-down bed and sit on the floor to eat because it's a more efficient use of space?
Do you wear burlap clothes because it seems more practical?

I'm sure you talk tough on computer crap, but you probably are wasteful in other areas. People like me DO care. I care about my car having the latest features. I care about my house being more than just a few walls with a ceiling. I care about personality and enjoying what I'm working with and where I live.

"But do I really need to get new hardware... for eye candy?"

Mr. Vanilla: Do you realize that every game id and Valve release sells new hardware? Oh, that's right, you wouldn't know because you're too busy with your CGA graphics board playing pong so you're not forced to "upgrade".

How is this different from Apple's Quartz Extreme or the soon-to-be-released Core Image? It's not. It's the natural evolution of things. While naysayers will shout "I don't need this" and "it's not productive," when you have several CPU-intensive apps open and running, wouldn't it be nice to know that your otherwise unused GPU is taking care of your windowing?

What about the high end audio cards so my computer can say "DooWeeeeeeeeeeooooooooOOOOO! BOOP!" as the cool 3-D Start Menu pops up when I hit the Start button and then another "BOOP! OOOOOooooooooeeeeeeewwwwDooo..." when I close the Start Menu?

A command line is good enough for me, but I suppose that when you think about it, a 2-dimensional graphical desktop display is not really the be-all and end-all of interfacing with a computer. It was as good as it was going to get when hardware was more expensive and slower... but to think that the GUI desktop is the end of computer evolution is very premature, IMHO. We have the technology....

There are so many ways that the technology can be pushed, that Microsoft actually *may* end up innovating somethin

Is this going to be another case where Microsoft tries to copy Apple, but misses the point?

Mac OS X 10.2 introduced "Quartz Extreme", which uses your graphics card to composite your screen. This meant that dragging windows around now required almost no CPU power at all. In 10.3, they introduced several 3-D effects to enhance the interface - most notably a rotating cube when you switch users.

There are two key points that Microsoft seems to be missing, though:

* Mac OS X looks exactly the same if you don't have a powerful enough graphics card, and screen redrawing is not too slow. Having a graphics card just makes the system more responsive because the CPU is doing less of the work.

* The system degrades gracefully - if you don't have a powerful enough graphics card or run out of video RAM, certain 3D transitions may be skipped. But everything will still function, and everything will look the same.

It's too early to tell, but it is starting to sound like Microsoft may be creating a new interface that requires a super graphics card, leaving those with only cheap integrated video with a completely different interface. To me that sounds like a recipe for tech support hell - novice users won't understand why their screen doesn't look like someone else's.

If requiring a graphics accelerator card is an unchangeable part of the operating system, the system is obviously badly designed. Longhorn should separate the graphics modules from the interface. If the kernel doesn't detect an accelerated graphics card, it should use the 2D system.
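A crude sketch of what that detect-and-fall-back idea could look like, just for the sake of argument (OpenGL extension strings here because they're easy to show; the extension names are real ARB ones, but the function and the numeric "paths" are hypothetical, and a current GL context is assumed):

#include <string.h>
#include <stdio.h>
#include <GL/gl.h>

/* Probe the card's extension string and pick a rendering path.
 * Sketch only: real systems would check driver versions, VRAM, etc. */
int pick_render_path(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    if (ext && strstr(ext, "GL_ARB_fragment_program")) {
        printf("pixel shaders available: using the fancy accelerated path\n");
        return 2;
    }
    if (ext && strstr(ext, "GL_ARB_texture_rectangle")) {
        printf("basic acceleration only: composited 2D path\n");
        return 1;
    }
    printf("no useful acceleration: falling back to plain 2D drawing\n");
    return 0;
}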

I'm anticipating that a lot of people are going to bitch and moan about how it's pointless eyecandy, but if Microsoft is able to do what Apple has been doing, then it could really add to the UI.
Things like Exposé and translucent windows can come in amazingly handy in OS X (I've never found anything quite as useful as transparent terminal windows in OS X, allowing me to have code open in one window and documentation in the window behind it, and to look through the code window to read the documentation, especially when working with an API you're not familiar with).
I think that as 3D-accelerated UIs become more common, we'll see even more useful features popping up. It's not like there is any good reason for a new computer to have a video card that won't run this, and the type of person who would upgrade would probably already have a newer video card anyway.
I just wish this would make it into X, but alas I suspect that it's the sort of thing that might take a while to get properly implemented and supported.

Things like Exposé and translucent windows can come in amazingly handy in OS X (I've never found anything quite as useful as transparent terminal windows in OS X, allowing me to have code open in one window and documentation in the window behind it, and to look through the code window to read the documentation, especially when working with an API you're not familiar with).

I just wish this would make it into X, but alas I suspect that it's the sort of thing that might take a while to get properly implemented and supported.

I'd say that 3D acceleration is a Good Thing. After using Quartz Extreme on multiple Macs, I have to say it makes a massive difference in most apps. It *does* speed up even moderately easy 2D things, like word processing apps. Also, where you notice the most difference is when switching between programs. Basically you've already got the images loaded in video RAM, so a lot of stuff is instantaneous. And yeah, iChat AV wouldn't be quite as pretty on Win XP.

But the real question is: why are pixel shaders needed? Unless you're doing strange reflections or simulating bumps or playing around with reflectivity in realtime, I can't imagine a use for them. I certainly can't see why you'd need anything more than simple textured quads or triangles. Oh, and some sort of alpha support for shadows. All of that sounds like a TNT2-era card, like the one I used to use to do Quake II.

What this really feels like is Microsoft pushing hardware adoption again. Ever notice how new motherboards don't come with USB drivers for Windows XP? How you have to upgrade to the latest service pack to get USB support? Partly piracy curbing, and partly I think to keep a hold by forcing people to use approved hardware.

This is a very good thing, if only because it will force developers to think in terms of arbitrary units (like "inches on the screen") as opposed to hard-coding pixel dimensions into their software*. Recent high-resolution monitors have exposed painful problems of hard-coded pixel interfaces - like text that becomes virtually unreadable at 3840x2160.

As a side benefit, this move towards a more vector-oriented display architecture means anti-aliasing will be easy to perform. Imagine dragging a window around with sub-pixel precision, and having the window contents and edges anti-aliased with a high-quality filter.

Not to knock Apple, but from what I have heard, Microsoft's implementation goes further in making the graphics API completely resolution-independent.

* and if you still want to use bitmaps for certain things, go right ahead, just let the graphics card re-size them to the appropriate pixel dimensions with high-quality filtering.
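A trivial illustration of the "arbitrary units" point: specify the layout in inches and convert to pixels only at draw time, using whatever DPI the display reports. All the names and numbers here are made up for illustration.

#include <stdio.h>

/* Convert a physical measurement to pixels for a given display density. */
static int inches_to_px(double inches, double dpi)
{
    return (int)(inches * dpi + 0.5);
}

int main(void)
{
    double sample_dpi[] = { 96.0, 200.0 };   /* a typical and a high-DPI display */
    double button_in = 1.0;                  /* the button is "one inch wide"    */

    for (int i = 0; i < 2; i++) {
        printf("at %.0f dpi, a %.1f\" button is %d px wide\n",
               sample_dpi[i], button_in, inches_to_px(button_in, sample_dpi[i]));
    }
    return 0;
}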

I'm a computer animation/FX guy and I need every little bit of speed out of my GPU... in many cases my GPU ends up holding me back, not my CPU. I don't really need menus and windows taking up video RAM either.

I wish MS would work to make computers cheaper and more a part of everybody's life instead of trying to make companies spend $1000 to upgrade each system so they can continue to use Office (on top of the already unbelievable MS Office tax.)

When Longhorn finally ships, you get to spend money and time upgrading your video card and buying more RAM - or you can just buy a new machine ready to run, virus-free, and which requires only an upfront investment in a keyboard and mouse. Everyone has a TV - and the Mac mini connects to a TV out of the box.

And do you really think even a midrange PC today will be capable of running any decent video editing app in Longhorn?

Now remember, these people already have monitors, keyboards, and mice. The mini comes with none of these. Just replace your old, decrepit PC with a Mac mini.

Apple is introducing this new idea and expression of the home computer now because it gives them time to gradually inform the market, generate buzz, and work up to a condition similar to what we see with the iPod today.

They will learn from this first, good product, and make something even better. The iMac was the first example of this thinking; iPod was the most successful. Start with only the best ideas and build upon them. Kill the bad ideas quickly. Drop the size, drop the cost. Apple is innovating at hyperspeed, catching up for years lost wandering in the wilderness.

If you're going to spend $500.00 on a new machine so you can run a new OS, what's to keep you from getting one of these Mac mini things anyway? Especially when you can just hook it to the TV, put it in Simple Finder, and give one to granny for e-mailing pictures of her fancy dog to her friends with fancy dogs?

Just my two cents. Everyone in the PC business has been secretly afraid Apple would do this for years now. Now they're left to squeeze their margins even further, remaining at the sole mercy of Microsoft - who appear to be displaying an incredible ability to screw up nearly everything they've touched over the past couple of years.

I remember back in the day when DOS moved into Windows 3.0, and it was a question of whether the device manufacturer [mostly sound cards] supported Windows, not whether Windows supported the device. It was understood that hardware alone isn't the only responsibility.

But let's not forget that Windows barely supports any recent hardware [graphics/sound/tv tu

Most people are probably using Ethernet, with a Realtek 8139 chipset, which is well supported. If not that, then the one onboard the motherboard (I don't know what nForce support is like, but the Rhine driver works for me).

There's been a slump in the computer sector due to the massive roll out around 2000. Not too many people buy a new computer within a couple years. It wouldn't surprise me if most people were still using the systems they bought 4 years ago. If they're using XP, it's a software upgrade only.

When XP came out my dad, a programmer for a large corporation, eventually bought a new computer from Dell with XP on it about a year ago. His previous system was a 350Mhz Dell. A programmer myself, my top system is a 1.2Ghz Duron running Win2K. I've had it for a couple years.

When Longhorn comes out it's time for an upgrade anyway and most people are going to buy prebuilt systems. Those prebuilt systems will have a (barely) sufficient graphics card.

GeForce FX 5500's are well under $100 already. In a couple years when Windows needs that kind of card to run, they'll be dirt cheap and onboard.

And it'll be just in time for when people are looking to upgrade their computer hardware anyway.

Complaining that MS is forcing upgrades is as silly as claiming id Software forces hardware upgrades. I still use 2000, and could use 98 if I wanted. I could also play Wolfenstein 3D and stick to a 386. Something needs to drive the market. If there was no need for better hardware, there'd be no better hardware. It's all artificially driven anyway. There's no objective reason why we need fancy-pants graphics in any software. There's no objective reason we need high-quality, drive-space/CPU/memory-eating audio/video.

In short, who cares that MS is making greater graphics demands for its OS? They've done this with every release. Even Linux is making greater and greater demands. If you want only the graphics pizzazz of Windows 3.11, use Windows 3.11. Some of us like an OS that looks "pretty."

If you want a plain text OS, then use DOS or ditch the GUI of Linux and have fun.

This is a broken window fallacy. You say that the OS requiring a 3D graphics card will cause people to buy more 3D graphics cards and expensive computers, and then you say, "aha, more money being spent, that is good for the economy." Not necessarily. The money on 3D graphics cards has to be spent just to get your computer to do what it already did well without them (draw a GUI). Unless the new UI adds a lot to the experience we have no net gain; we have just spent money to get back to where we originally were (a "usabl

My first thought was: "Gee how original! Hadn't heard of a good idea like that since.... Mac OS X maybe."

I'd be surprised if they really went wild with 3D interfaces like the 'Jurassic Park' file browser, or the cube with web pages mapped onto it that was posted here a while ago. I think they are just going to do what Apple has already done and what Keith Packard is working on for X-Windows.

You are probably right. Microsoft will only use it for flashy effects. At least Apple eventually got to arguably us