
mikemuch writes "While ATI still hasn't released a DX10-capable graphics card, Nvidia today released its affordable SKUs: in descending price and performance order, the GeForce 8600 GTS, GeForce 8600 GT, and GeForce 8500 GT. The GTS costs $200-230, the GT $150-170, and the 8500 reaches down to the $90 range. The architecture for the new GPUs is the same as for the 8800 line, but with lower clocks and fewer stream processors."

The Radeon X1950 beats the NVidia cards in every single test save for the "synthetic" crapmark test that has nothing to do with reality.

Interesting, that's not what I've been seeing in tests. In fact, in most tests it seems the 8800 GTX beats the X1950 XTX.

In context, it's clear the GP was referring to the NVidia cards that were reviewed by the article. And he's mostly right. In only one (of many) actual gaming benchmark did any of the Nvidia cards reviewed outperform the X1950.

Where I believe the GP is mistaken is in his conclusions about the article. The article itself says, in conclusion:

The 256MB version of the Radeon X1950 Pro is faster in most games, and by a pretty good margin, too.

The article notes, correctly I think, that the X1650XT is not a good card for gamers to buy. It notes that the 1950 won't do DirectX10, and the budget NVidia cards may not be fast enough to do it well either.

However, it's also instructive to have a look at this review at Hard OCP [hardocp.com]. There, in two demanding games (Oblivion and STALKER), the 8600 GTS appears to win handily over the 1950XT. If those benchmarks are accurate, it suggests the ExtremeTech article may draw conclusions that are too favorable to the X1950.

I also got the impression that sane people compare apples to apples, and oranges to oranges. If you have $400 and give them to ATI, you get a 1950XTX. If you have $400 and give them to nVidia, you get an 8800, possibly the lower-RAM (320MB) version.

You're much better off with the 8800, it tears the ATI card a new one.

Now if you have $200 to spend, that's a whole different ballpark. Giving them to nVidia will net you something that is, in most benchmarks, ALMOST on par with the $400 ATI card, *AND* is DX10-capable.

Which, for gamers, doesn't usually matter: in most cases, more powerful hardware beats weaker hardware with newer tech. However, with the way M$ is pushing Vista upgrades, how long will it be before there are less impressive games that require DX10 to run, and potentially DX10 hardware? Or what about DX10 games like Crysis? Maybe that will push performance past non-DX10 cards. It's hard to say until we can test things like that.

I must say, I will *always* buy nVidia until ATI shapes up their Linux drivers. TwinView makes dual monitors as easy in Linux as anywhere else, and that is something valuable to me.
But still, ATI cards *are* important - hopefully competition from them will push nVidia to drop prices.
But until then I am happy with my passive 7600GS.

The biggest reason to get these cards over other existing ones is for DirectX 10.

I disagree completely. DX10 is a non-issue right now because there are no games to take advantage of it. However, the new processing unit design (no separate pixel and vertex processors, but a unified processor) improves performance in current games, sometimes significantly. That's a pretty damn good reason for getting these cards (at least from Nvidia right now).

I disagree completely. There are many choices available from both AMD and nVidia that would otherwise be good upgrades - except for DirectX 10. That's a pretty big issue. Yeah, you'll be fine playing all the current games, and maybe new games for a year or so. But in a year and a half to two years, it won't matter how good the card you bought is, and it is possible to spend $350+ on a card that doesn't support DX10, because you not only won't be able to play that game at full settings in two years (hey, that'

I agree - DX10 is still on the horizon. Once it's here (i.e., there are games you can buy that you want to play that use DX10), the graphics cards will be better and cheaper. Maybe the drivers will get fixed by then too.

Those links are about 5 months old. I bought Vista with the BFG 8800GTS OC 320MB [bfgtech.com] last weekend, and everything is running smoothly. Got the Aero effects and ran some DX10-only demos. So far so good.

Yes, look at the benchmarks. But (unfortunately) benchmarks typically only look at a few cards, not the entire lineup. How much faster are these than my NVidia 6200?

Is there a site that lists every single NVidia card in the various form factors (AGP 8x, PCI-E, etc.) and runs the same benchmarks on all of them? Why can't they do this as part of their naming scheme? (i.e., a 6600 being on average 10% faster than a 6000 across a combination of all the benchmarks.)

While that sounds great in theory, there are too many factors that go into the performance of a card to really roll it all into one number. Differences in DirectX and OpenGL performance alone would confuse things, not to mention the differences in feature support.
Even if they were to do that, in a few generations you would have incredibly unwieldy product numbers. Just looking at the performance difference between a GeForce 256 and a GeForce 8800GTX should show how crazy large the numbers would get.

Assuming your CPU is around the same vintage as your graphics card, upgrading to a (mythical at this point) AGP version of an 8600 may not be the best idea. Chances are the card would starve waiting for the processor, and you'd basically be wasting money. You'd probably be better off upgrading the CPU, memory, and motherboard to switch to PCI Express before you upgrade to a modern video card. This isn't as expensive as it sounds; you could easily get a decent performer for $600 or so (I

The first number is the major generation of hardware. So these are the 8000 series cards, the 8th generation of GeForce hardware. All other things being equal, a new generation card of a similar number performs better than an older one. So a 7600GT should outperform a 6600GT and an 8600GT should outperform a 7600GT. However the primary reason to look at new major version numbers is new features. In this case, 8 series cards support DirectX 10, 7 series are DirectX 9.0c.

The second number is the minor version, and generally increasing numbers indicate increasing speed. Usually they indicate the amount of processing hardware, so an 8800 has more pixel pipelines and shaders and such than an 8600. Then there are the letters: GTX > GTS > GT; not sure how it goes after that. Again, speed related.

What it really comes down to though is you need to look at benchmarks. There's no one magic metric for cards, they'll be better at some things worse at others. You need to see how it performs on the stuff you are doing to make the determination.
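For what it's worth, the decode rule above is mechanical enough to write down. Here's a toy sketch in C (my own illustration, nothing official from nVidia): rank by generation first, then minor number, then suffix, assuming the GTX > GTS > GT ordering described above. Note it naively lets generation trump everything, which the thread itself shows is wrong (a 7900 can beat an 8600), so benchmarks still get the last word.

```c
/* Toy decoder for the GeForce naming heuristic described above.
 * The suffix ranks below GT are an assumption for illustration. */
#include <stdio.h>
#include <string.h>

static int suffix_rank(const char *s)
{
    /* Higher return value = (roughly) faster variant. */
    if (strcmp(s, "GTX") == 0) return 3;
    if (strcmp(s, "GTS") == 0) return 2;
    if (strcmp(s, "GT")  == 0) return 1;
    return 0;                          /* GS, LE, unknown... */
}

/* Returns >0 if a likely outranks b, <0 if b outranks a, 0 if equal. */
static int compare_cards(int num_a, const char *suf_a,
                         int num_b, const char *suf_b)
{
    if (num_a / 1000 != num_b / 1000)  /* generation: 8600 -> 8 */
        return num_a / 1000 - num_b / 1000;
    if (num_a != num_b)                /* minor number: 8800 vs 8600 */
        return num_a - num_b;
    return suffix_rank(suf_a) - suffix_rank(suf_b);
}

int main(void)
{
    printf("8600GT vs 7600GT:  %d\n", compare_cards(8600, "GT", 7600, "GT"));
    printf("8800GTS vs 8800GTX: %d\n", compare_cards(8800, "GTS", 8800, "GTX"));
    return 0;
}
```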

http://www.gpureview.com/show_cards.php [gpureview.com] is your friend. It allows you to select any known ATI or nVidia card and compare them side by side. Somebody here on Slashdot pointed me in the right direction to that site about four years ago, and I've been using it since. :)

I think, when used correctly, price can be a fairly good indicator of performance. Look at the manufacturer's retail price for different cards. The highest priced cards offer the most performance; likewise, the lower priced cards offer less performance.
In some cases this works between manufacturers. NVIDIA and ATI typically offer the same performance for about the same price. They are obviously competing, so it is never exact, but I have never seen one severely undercut the other.
I guess it makes sense, we w

There is a way to tell. All you have to know about the numbers is: the higher the number, the better the performance. The 7800 cards are the 7-series video chips. The 8800 cards are the 8-series chips. The next generation of Nvidia cards will be the 9800 9-series chips.

Inside each of those series the same rule applies: The higher the number, generally the better performing card. The 7900 line performs better than the 7800 line; the 8800 line performs better than the 8600 line; and so on.

Something that I've wondered about for a while is when the graphics card manufacturers will wake up and realize that there are more markets for GPUs out there besides gamers. Sure, that may be a big market, but I think they could make a bundle selling semi-specialized cards for other niche markets. In particular, it wouldn't take much to produce home-theater/video versions of cards: basically take a medium-range card with hardware MPEG-2/4 decompression, scaling, and deinterlace, and put some standard video outputs on it -- say HDMI, Component, and VGA (that way you could get to DVI easily using an HDMI adapter, and you can get to S-Video or Composite by combining the component signals).

In particular, it wouldn't take much to produce home-theater/video versions of cards: basically take a medium-range card with hardware MPEG-2/4 decompression, scaling, and deinterlace, and put some standard video outputs on it -- say HDMI, Component, and VGA (that way you could get to DVI easily using an HDMI adapter, and you can get to S-Video or Composite by combining the component signals).

That stuff all largely exists. It's just that gamer gear gets the marketing hype.

That said, while I'm not sure how these cards will perform, I have been using their big brother for a while. I've had a Leadtek 8800GTS (640MB) for a few months now, and it runs great. It would probably run better if I were using WinXP instead of Vista, but I'm happy with it.

Compared to their big brothers, these new cards are heavily castrated. These cards have 32 shader units versus the 96 and 128 found on the 8800 GTS and GTX respectively, and the memory interface is only 128-bit, while it's 320-bit and 384-bit on the GTS 640 and GTX. From the benchmarks I've been seeing, these cards aren't anything to get excited about. By the time DX10 games see the light of day we'll probably see far more powerful cards for less money. Anybody buying to play right now would be better off sticking with th

I've never paid more than $40 for a graphics card, and I've never found a game I couldn't play comfortably (high res, good frame rates). I buy a new $30-40 card every couple of years. Is there really any benefit to spending $500 on a card? I doubt I spent that much on my entire current PC.

Notice I'm not defending my purchase at all. I was only marginally impressed with that card over my previous $150 card. That's not a real good feeling, I can tell ya. I'll probably be buying one of the $200 8600's shortly after they come out. And only that because I'm going to be putting together a new system.

The 8800 is a new architecture. You can't do a straight shader-to-shader comparison. If you could, then the 8800 GTX with its 128 units should be six times as powerful as your 24-shader card. It's not. When comparing apples to apples, the 8800 to the 8600/8500, you can see that the hardware simply isn't going to hold up to increasingly demanding games when it has 1/3 the memory throughput and 1/4 or 1/8 the number of shaders. Look at the benchmarks that are coming out; they don't lie. The 7900s are as powerf

I usually find their reviews to be the best around. Always very detailed, and from what I've seen, always right on the money. (They seem impressed, but their bottom line seems to be that, for now, you're better off sticking with a 7600GT, 7900GS or X1950XT if you already have one.)

I don't know if this is exactly true, but this was the case for the GeForce 6 series:
Ultra - Fully working and enabled gpu, fastest clockrate.
GT - Fully working and enabled gpu, not as high clock as Ultra.
GS - Partly disabled gpu.
GTO - Partly disabled gpu, lower clockrate than GS.
/NU - The normal version is also called NU, partly disabled and slower than GTO.
LE - Even more disabled gpu.
XT - As disabled as LE but with 128-bit memory.

XT was higher clocked than LE though, but I think LE is still better.

Should have been:
Ultra - Fully working and enabled gpu, fastest clockrate.
GT - Fully working and enabled gpu, not as high clock as Ultra.
GS - Partly disabled gpu.
GTO - Partly disabled gpu, lower clockrate than GS.
NU/nothing - The normal version is also called NU, partly disabled and slower than GTO.
LE - Even more disabled gpu.
XT - As disabled as LE but with 128-bit memory.

And by the time games are coming out that require DX10, these cards will be so out of date it won't even be funny... unless you're the guy laughing at your friend who went out and spent a chunk of change on a card that doesn't really have any support just because he wanted to have the biggest ePeen on the block.

This is kinda the spot that I'm in. I'm running a 6200 AGP with 128MB right now, so pretty much any card on the market will be an improvement. I don't run a lot of graphically demanding stuff, but, then, if you'd asked me five years ago, I didn't have Google Earth and I'd have said I didn't need a GPU at all. The 8600GT looks like a nice card to grab to segue my way into PCI-E cards with something that will probably passably handle most things I throw at it for the next couple of years. I'm not expecting mo

Yes, there is no compelling reason to upgrade to DX10 today just to get DX10. But to paraphrase your comment: as long as you are buying a new video card anyway, there's no point buying a card that doesn't support it.

If you're still using a 5000 or 6000 series unit, an upgrade might well be in order. If you are buying a new mobo to upgrade CPUs and your existing card is AGP, a new card is mandatory... for me, I think the sweet spot is the 320MB 8800GTS, but for someone on a tighter budget the new 8600 might be a better v

Well, I've been letting them go, but this one is too obviously done by some Microsoft fanboy. This is not a troll. I never troll. If I said something, it was either sarcastic and said for effect (and I generally provide plenty of context) or I fucking meant it. Clearly no one is motivated to go to Vista unless Microsoft has made some kind of sleazy deal with them.

Not taken as a trolling comment. I have to agree; I noticed the same. I am in contact with a good portion of the local dealers of computer hardware and software. The general consensus is that people do anything but break down their doors for Vista. If they buy it, it's usually bought with a new computer. Actually the campaign did more ill than good. It made it very "uncool" to go Vista, 'cause such a huge hype has been created around it while the reports are pretty bland. The hefty price tag and not being able

The current ForceWare drivers for Vista are the buggiest, worst-performing drivers that Nvidia has ever put out. Take a look at their forums sometime; they are trying very hard to alienate their customers.

It doesn't matter how great these cards sound on paper. Without at least decent drivers they are worthless.

You apparently haven't worked retail. Yes, it's a Stock Keeping Unit. When a manager wants to talk about the variety on his shelf, he talks about the number of SKUs on it. Each SKU is a different item in the computer, but may be VERY close to another product in actuality. Yellow rubber bands vs. red rubber bands, for instance.

Like it or not, sometimes the real world carries over into our little tech paradise and we have to understand their terms. Even worse, sometimes we start using them ourselves! Oh noes

Parent is 100 percent right; however, marketing and business tend to think in SKU terms for a variety of reasons. Apparently, like all other acronyms, someone outside of the original world heard it, thought it was cool and made them sound like they were part of the industry, and it propagates from there.

Personally, I find SKU has a lot of useful connotations; the best I can think of is "total versions of a product," which in most businesses is important. But like I said, the parent is right; this is part of the over us

I wouldn't worry about ATI/AMD not having DX10 hardware until there is content and a significant number of users that can use it.

1. You need a game that supports DirectX 10 - how many have been released so far?
2. You need the user to be running Windows Vista to have support for DirectX 10.
3. The user needs to have also purchased a DirectX 10 graphics card to complete the loop.

It is the chicken and the egg, and history hasn't been kind to the early adopters of graphics cards that are the FIRST to implement a new API.

I'm perfectly happy sticking with XP until...
1) There's a moderately priced, high performance DX10 video card available ($200-$250).
2) There's a way to address the DRM-laden Vista (either a hack/patch/new version).
3) There's a DX10 game that I have to have that doesn't include spyware, adware, or malware.

The only way to convince big corporations that their new direction sucks is to vote with your wallet. Don't buy whatever crap they want to shove in your face. I play bf2 a lot, but didn't buy bf2142. Why?

I, among others, have yet to see a convincing argument to buy a DX10-capable video card. I'm not upgrading to Vista until they remove their DRM-supporting crap and their awful driver signing nonsense. I'll switch to an over-priced Mac first.
I don't play FPS, which is probably the biggest genre that actually thinks it needs DX10.
My next logical upgrade will be to dual SLI, unless I can't use dual monitors with it (I know some people who said they've had trouble with SLI and dual monitors, but I haven't researched it much because I'm not upgrading right now).

Now that they have switched to Intel hardware, Macs shouldn't cost $1000 more than the equivalent Windows machine, but they still do. Apple has always overpriced their hardware, and they continue to do so because they can get away with it, as their core user base is used to paying that much for their hardware. In truth, if companies would make more Linux versions of software, I would only buy a Mac if I needed to do certain kinds of video editing (because it is still several thousand dollars cheaper than an Av

I am pretty happy with my Radeon 9550. It has a fairly small passive cooler, so I guess it doesn't produce so much heat, and I have Open Source 3D Drivers.

I was/am looking at a 7600GT (the version from MSI has a passive cooler that covers the whole front of the card, which was tested to be more effective than the heat pipe solutions), but open source is so damn convenient, since you don't have to compile extra proprietary modules (it worked pretty well back when I used Nvidia with Debian, but it was always a pit

DX9 support: [ ]
DX10 support: [ ]
Wattage: ______
Wattage: ______ (the real number this time)
Plays a game my 4-year-old card cannot play just fine: [ ]

Since there is currently no game you can check off that last box for, and the wattages are complete shit, we can all just completely IGNORE ATI and nVidia until they get a F'ing clue about making cards for anything but 40-year-old virgins still living in their parents' basements.

Everyone keeps calling these "DX10" cards, despite that being a misnomer. They are SM4 cards, and DX10 happens to be the first version of DX to support SM4. OpenGL also supports the new shaders (and has for longer). When are we going to start hearing about developers switching to OGL to get geometry shaders (which produce some sick effects) in WinXP, still the most popular gaming OS?

nVidia would be fools not to, as many games need GL and gaming is probably their biggest market. What they mean by "DirectX 10" is basically the feature set. OpenGL doesn't really keep up to date with cards very well, so features are usually expressed in terms of DX versions. For example, DX 7 means you have at least fixed-function T&L, DX 8 means semi-programmable shaders, DX 9 fully programmable, and things like that. DX 10 specifies a bunch of new stuff; the Wikipedia entry on it is pretty good if you are interested.

As a practical matter it isn't real useful for end users at this point as nothing really supports it. However it may be of interest to programmers since DX 10 cards take shader programmability to a whole new level. It specifies a unified shader interface, and nVidia has chosen to unify the shader hardware as well (ATi says they have done the same). Thus effectively a DX10 card can be looked at as a stream processor, with a whole lot of units. Various things, like folding, are likely to be able to be designed to run in part on the GPU for massive speed gains. nVidia has a whole deal for helping that called CUDA.
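To make the stream-processor picture concrete, here's a plain-C sketch (my own illustration, not actual CUDA code): a "kernel" is just a small function with no dependencies between elements, which is exactly what lets a unified-shader GPU run it on all of its units at once. The CPU version below has to loop instead.

```c
/* Rough sketch of the "GPU as stream processor" idea: one small
 * function applied independently to every element of a stream. */
#include <stddef.h>

/* The "kernel": no element depends on any other, so every invocation
 * could in principle run on a different stream processor. */
static float kernel(float x)
{
    return 2.0f * x + 1.0f;   /* stand-in for real per-element work */
}

/* CPU version: sequential. A CUDA-style launch would replace this
 * loop with thousands of concurrent kernel invocations. */
void run_stream(const float *in, float *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = kernel(in[i]);
}
```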

But yes, GL support is there, I can confirm it. I have an 8800 and I play GL games all the time. They work great.
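If you're curious whether your own card exposes the new shaders under GL, here's roughly what the check looks like. This is a minimal sketch assuming a current GL context already exists; GL_EXT_geometry_shader4 is the geometry-shader extension the 8 series exposes. The tokenized comparison avoids the classic pitfall of matching one extension name inside a longer one.

```c
/* Minimal sketch: detect geometry-shader support from a GL app
 * by scanning the extension string (needs a current GL context). */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Tokenized search avoids false positives from substring matches. */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);

    while (exts && *exts) {
        const char *end = strchr(exts, ' ');
        size_t toklen = end ? (size_t)(end - exts) : strlen(exts);
        if (toklen == len && memcmp(exts, name, len) == 0)
            return 1;
        exts = end ? end + 1 : exts + toklen;
    }
    return 0;
}

int main(void)
{
    /* A real program would create a GL context first (GLUT, SDL, etc.). */
    if (has_extension("GL_EXT_geometry_shader4"))
        printf("Geometry shaders available under XP via OpenGL.\n");
    else
        printf("No geometry shader extension found.\n");
    return 0;
}
```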

Just some enlightening information:
D3D and OpenGL are APIs that allow programmers to use the features that a graphics card is capable of. When a graphics card gains a new feature, it is made available by extending the OpenGL and D3D APIs. OpenGL keeps up to date with these new features via extensions, whereas Direct3D has regular (annual?) full releases.
D3D10 features such as geometry shaders ARE available in OpenGL via extensions. These extensions are normally first created by a member of

If developers, even a few, thought they could make more money on Linux than Windows, or even turn a hefty enough profit by supporting both, they'd do it.

Thanks to companies like Introversion, Transgaming, and Codeweavers, and of course all the developers of Wine, Linux gaming is more popular than ever. Thanks to people like the folks behind Ogre3D, Newton, ODE, OpenAL, etc., cross-OS gaming is easier than ever.

I think this puts us right on the cusp of seeing a big change in Linux gaming. (And Mac OSX gaming, too.) But until then, Windows is -the- PC gaming OS and that's where hardware and software creators will be making their investments.

Now, I know the usual argument is that OpenGL is already cross-platform and should be supported. And I agree to a point... But ATI's OpenGL support has apparently always sucked, and you don't create a game that will suck for half the market if there's an easy alternative. (DirectX.) (Disclaimer: I have no first-hand experience with ATI cards. I've stuck with nVidia since Voodoo died.)

ATI's OpenGL support has apparently always sucked, and you don't create a game that will suck for half the market if there's an easy alternative. (DirectX.) (Disclaimer: I have no first-hand experience with ATI cards. I've stuck with nVidia since Voodoo died.)

I don't think you really understand. ATI's everything support has always sucked. It's not just OpenGL. ATI can't write a stable driver for any amount of money. But then, they don't have to, because people keep buying their crap.

I had similar issues, but the conflict was the ATI driver vs. the motherboard's drivers... it was a SFF PC, so out went the ATI. My son got a nice ATI 9600Pro, and I haven't bought any ATI-based video cards since... oops, I did get one for my G4, but ATI was the only one available and in stock... Just the same, I try to avoid them...

And, by the way, your ridiculous inclusion of the curse "fuck" so many times does nothing but highlight the fact you are an immature child (whether in reality, or at heart).

Actually, that would be my military background showing through, you stupid fucking wanker.

Dual core, couple a gig - and I'm sure that the forensics folks have you beat hands down.

I for one have tried it on a dual core athlon 64, with 4GB of RAM, and approximately 6 terabytes of disk, with hundreds of thousands of files (web and med

"Seriously, an operating system release is just not worth invoking such strong emotions."

I have to disagree with this. Many, many people's lives and careers are intimately tied to their ability to use a computer. Given that MS is a monopoly, and for all intents and purposes hundreds of millions of people will have no other choice but to spend many hours a day interacting with MS's OS, it is absolutely fair to have it invoke very strong emotions. If you had said the same about a game console, where the

We're not consumers, but customers. But to keep us on the edge, and because we love TLAs, let's call the next thing we're expected to buy SKUs. Just to keep us on the bleeding edge, and to alienate the other guys.