
An anonymous reader writes "The optimization troubles in GPU land aren't over yet. Yesterday Futuremark released a new anti-optimization patch for 3DMark03. Gainward (who sell graphics cards using NVIDIA hardware) today made a comment that Futuremark had disabled certain features in their ForceWare 52.16 drivers, thus resulting in huge performance drops. A few hours ago Futuremark made an official statement about this: 'The accusation is totally wrong because what it suggests is not even feasible technically. 3DMark03 does not talk to graphics driver, it talks to the DirectX API, which then talks to the driver. Thus, it is impossible for the application to disable GPU compiler...'"
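Futuremark's argument rests on the layering of the graphics stack: the application links against the DirectX runtime, and only the runtime talks to the vendor driver. A minimal sketch of that layering, with purely illustrative class names standing in for the real DirectX and driver interfaces, looks like this:

```python
# Hypothetical sketch of the layering Futuremark describes: the benchmark
# calls only the API layer; the API in turn calls the vendor driver.
# All names here are illustrative, not real DirectX or driver interfaces.

class Driver:
    """Stands in for a vendor graphics driver (e.g. ForceWare)."""
    def compile_shader(self, source: str) -> str:
        # The driver's internal shader compiler is free to optimize here;
        # the application above has no handle to reach in and disable it.
        return f"native({source})"

class DirectXAPI:
    """Stands in for the DirectX runtime the application links against."""
    def __init__(self, driver: Driver):
        self._driver = driver  # private: never exposed to applications
    def create_shader(self, source: str) -> str:
        return self._driver.compile_shader(source)

class Benchmark:
    """The application (e.g. 3DMark03) only ever sees the API object."""
    def __init__(self, api: DirectXAPI):
        self.api = api
    def run(self) -> str:
        return self.api.create_shader("ps_2_0 ...")

shader = Benchmark(DirectXAPI(Driver())).run()
print(shader)  # the app receives a compiled shader, never the compiler
```

The point of the sketch: since the driver object is hidden behind the API layer, nothing the benchmark does can reach the driver's compiler directly, which is exactly the technical impossibility Futuremark is claiming.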

Personally, I think benchmarking programs of this kind are outdated. I couldn't care less when the ATI Radeon 12500XTS gets 100 more 3DMarks than the GeForce TXP xi5000 Extreme. What does that matter?

What matters to me is real-world performance. And by all means, let the companies 'cheat' in that area. If it costs little or no quality and makes the games I play faster, more power to 'em.

I wish people would stop fighting over which stupid video card is better than which other stupid video card. All these people who care about benchmarks and all this other shit are the same people who think that an audio cable made of gold and platinum will make their music sound better. All that matters is

a) whether your video card is good enough to do the things you do, and

b) the price/performance ratio, unless you have tons of money.

That's all that matters. If your video card can run with vertical sync on and not lose frames in your favorite games, then you're all good. I've noticed ATI does a little better with DirectX and Nvidia does a little better with OpenGL. What it comes down to is that they are both good cards that both work. Choose the one that suits your needs. Things like VIVO or DVI out are much more important features than 5 fps more than the other guy.

I've got a GeForce FX5900, from Gainward in fact. And you know what, it's not the fastest thing there is. It's fast enough for me, though. I can play all my games. I will be able to play Doom 3 and Half-Life 2. The card will last me 5 years. My TNT2 lasted me 5 years; there's no reason this one won't. It has VIVO, it has DVI, and I can plug two monitors into it. That's all that matters. If I lose some fps here or gain some over there, it wouldn't bother me a bit. I run vsync anyway, so my fps is usually locked at 85.

So, what does your video card do different than mine? Well there's this one program that makes numbers show up on the screen. Your video card makes bigger numbers show up in this program and mine makes smaller numbers show up. Lucky you!

Who gives a fig if drivers are 'creatively optimised' for real games, so long as the quality doesn't suffer?

And how would you check that quality hasn't suffered - would you actually buy two cards and run them up side by side on two monitors to check?

Benchmarking apps provide "the rest of us" with a reliable way to compare cards. Any "secret optimizations" or other attempts to silently manipulate benchmarks or game performance must be seen as the dishonest tactics they are.

> And how would you check that quality hasn't suffered - would you actually buy two cards and run them up side by side on two monitors to check?

Actually, that's basically what the major review sites do. When you see the IQ comparisons at a site like [H]ard|OCP [hardocp.com], they put the screenshots output by the game side by side with those from another card at the same spot in the game. Some sites also compare this to the DX reference renderer and the game's software mode.
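A rough sketch of the kind of check those side-by-side comparisons automate: given two screenshots captured at the same spot in a game (here just tiny lists of RGB tuples), report how far the renderings diverge. Real review sites eyeball full screenshots or diff against the DX reference rasterizer; this stdlib-only function is purely an illustration.

```python
# Illustrative image-quality check: compare two screenshots pixel by pixel
# and report the largest per-channel difference. Images are represented as
# flat lists of (R, G, B) tuples for simplicity.

def max_channel_diff(img_a, img_b):
    """Largest per-channel difference between two equally sized images."""
    assert len(img_a) == len(img_b), "screenshots must match in size"
    return max(
        abs(ca - cb)
        for pixel_a, pixel_b in zip(img_a, img_b)
        for ca, cb in zip(pixel_a, pixel_b)
    )

# Two 2-pixel "screenshots": identical except for a slightly darker red.
card_a = [(255, 0, 0), (10, 20, 30)]
card_b = [(250, 0, 0), (10, 20, 30)]
print(max_channel_diff(card_a, card_b))  # → 5
```

A result of 0 would mean the two cards rendered that frame identically; a small nonzero value might be an acceptable optimization, while a large one suggests the driver is trading visible quality for speed.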

... make me realize how stupid I've been for so long. I used to buy thousands of dollars of computer equipment to be able to play the latest games until a friend of mine asked me, "Why don't you spend $200 and get a console?" You know what, there is no good reason not to. I'm a moron.

I agree with you, especially at the moment. For the price of the latest, greatest video card alone you can get all three consoles brand new plus a game or, if you like the Nintendo line-up, the Gamecube plus eight games (up to double that if you go for the budget titles/used games). Not too shabby for a lot of hours of fun. Right now, the only games I play on PC are Ho

> For the price of the latest, greatest video card alone you can get all three consoles brand new plus a game...

And, for the price of a console, you can buy two video cards as powerful as the one inside it - or just one mid-range card, which will already produce better graphics than the console at a higher resolution. The point about the latest, greatest video card is that it has as much power as the next generation of consoles will in a few years' time. I'll put it this way, I'm not holding my breath

I understand. First-person shooters are the light of the PC gamer's life, and if they can't run their FPS at 1600x1200 with full detail then the game sucks. I don't think people who enjoy their PS2s are crying rivers over the fact that Doom 3 isn't going to be on their console. More likely, they're looking forward to playing one of the 20+ potentially good games coming out for their system just in the rest of this month (Battlestar Galactica, Final Fantasy X-2, Fatal Frame 2, Monster Rancher 4, Metal Arm

I think you've made a good point there. All the people I know with a Radeon 8000/Ti4200 (or any other second-of-the-line-when-it-came-out video card ever since) are serious gamers, price/performance aware, and in general true game lovers who own at least one of the current-generation consoles (but not three; I don't know any serious PC gamer who also owns all three consoles).

I know there should be lots of FPS-only gamers that are always buying the latest state-of-the-art card, overclocking it as if 280 fps ar

You don't suggest in your post that one abandon the PC in favor of the console, but that's the general impression it gives. There is no good reason not to get a console, but I'll give you a few good reasons not to give up on the PC:

Warcraft 3, StarCraft, Half-Life 2 (may come to consoles within a year of release), Massively Multiplayer Game X (minus SWG), Battlefield 1942 and its mods, hell, mods in general (they may have had Half-Life for the PlayStation 2 but you couldn't play Counter-Strike with it), hi

I know you said you didn't want to troubleshoot it extensively, but SOMETHING must have been screwed up. The ATI 9100 came out before the GF4, right? There's no way it could result in double the points of a GF5.

Now, I've owned both ATI and Nvidia, so I hope no one will call me an nvidia fan-boy, but that just doesn't seem right.

The 9100 came out around a year ago, long after the GF4 series (it was on the low end of ATI's lineup at the time). The FX5200, which came out a few months later at the bottom of NVidia's line, does not perform particularly well (it falls behind the GF4 MX in several cases). While the numbers (3100 vs. 7100) seem a little extreme to me, the basic scenario is perfectly plausible.