
169 Comments

You guys don't seem to understand how software works, I guess. ATI and NVIDIA cards aren't made to be the same, and the coding has a lot to do with performance.

NVIDIA can run Doom III better than ATI cards? But then why would Half-Life 2, a game with less graphics and shading, run differently? It's all done with coding: good for ATI, who is better with Half-Life, and good for NVIDIA for running Doom III best. It doesn't matter anyway; your ATI and NVIDIA cards will soon be obsolete. So who cares?

Silly peeps, the Ti4200 ($76) costs a fair bit less than $100, usually almost half the price of the entry-level ATI card at that performance standard (the 9500, $113). It's not as good a card, agreed, but if you're gloating because you paid 2x-6x to get a better card, and lo and behold, you did... well, DUH! Yeah, the FX class of card from NVIDIA is (ahem) disappointing, okay, sadly disappointing, and NVIDIA does suck for pushing a card priced way over its performance, but come on! Some people pay up to $400 so they can 'better' play a few $50 games! You gotta love this market, right? In 6-9 months, the market will drive the technology to $99 or less (it always does), and then I buy. Maybe you early adopters who get 'cut' by the 'bleeding edge' will learn a little patience so supply & demand has a chance to work, instead of getting suckered by this hype-driven marketing. Don't get me wrong, I have always liked ATI, just not to the point where I'd give them twice the price without getting twice the performance. Price <$100, then I'll be an ATI customer again.

Won't be long actually.

Maybe I'm 'out' $80 for buying this 'crappy' card. Oh well, at least it's not $150-$400 due to inadequate research or leapfrogging technology. (Come on, how many of you have a $200-$400 gfx card that you had to salvage on eBay, or that is now in your little bro's machine?) The technology evolves too quickly for me to waste my money, but that's just me; to those of you who can afford to pay for performance, I tip my hat.

This is the nature of technology, after all: it's good when new technology makes old technology crap, and competition makes better pricing for everyone. The system ain't perfect, but learn from it. All this "nyah nyah your card sucks" stuff is juvie.

Great review! It puts in perspective what Valve did with their presentation in telling the truth about the current line of NV hardware. One thing I have been reading everywhere is that the Det 50s will be great for NV cards and DX9 performance, but nobody mentions that the Cat 3.8s will do the same for ATI hardware.

How dare some of you criticize Valve's work. Have you actually read or seen some of the demonstrations? The physics alone are far superior to those of Doom III. As for the graphics? Where can you honestly say D3 is better? Sure, the game looks great, but they only have to model one or two kinds of environments: a freaking spaceship and maybe some outside terrain! The HL2 folks have to make at least 8 different environments. And as for the gameplay, HA, don't even bring that up. Weeee, I'm shooting zombies on a spaceship, weeeee, no strategy or AI, weeee. About the only thing D3 offers is per-poly collision.

Anand might have journalism skills, but if you're going to tell people to wait until 12 midnight for an article, then at least be prepared to deliver, not sit around and upload your article at 12:25am or some shit like that. That said, I'm going to get something to eat; I'll be back at 3pm, that's 1am your time.

Anand's done this before ("next week we will...", "we will look at X 2 weeks from now...", etc.) without any results. Not that I'm having a go; his journalism is fantastic. But it's better to *mean* what you say when you say these things. It comes across as rather unprofessional otherwise.

I'm poster #126, but as OpenGL seems able to encompass and work with NVIDIA's hardware 'limitations', why couldn't MS have made DX9 more flexible, given that NVIDIA would have made any issues known beforehand? MS seems to have let NVIDIA down rather dramatically.

First off, I'm no fanboy; I only buy hardware on price/performance. As such, I must say my only reaction to (recent) nVidia customers is sympathy. I would certainly be upset. I can't think it's entirely (perhaps even mostly) nVidia's fault, given that their R&D took place before the DX9 standard came out, the point at which their hand was already played. Is it MS's fault, then, having been informed of nVidia's hardware specs? Perhaps. My hope is that future games will come out with OpenGL support, so that nVidia customers have an alternative. That would make such unreasonably poor framerates a temporary problem with a few soon-to-be-released DX9-only games. Most people already expect that the Doom 3 engine will become the most widely adopted engine; perhaps this will ensure it even more so. As someone else pointed out, OpenGL adoption currently isn't so widespread that everyone is looking to implement it in their games; Doom 3 should definitely change that. In any case, don't berate nVidia too much; the optimisations are for their customers' benefit, given that they have arguably been failed by DX9. Nonetheless, these benches are going to scare a lot of people who don't understand the technical reasons. The current nVidia cards *still are* good cards under everything except DX9.

To cut it short, please don't gloat at or deride each other. A lot of people have been unlucky. And surely, all being consumers, aren't we all in the same boat? I don't understand the gloating or fanboyism.

I think people with FXs shouldn't worry too much. OpenGL may become more of a standard than DX9; what game developer wants to alienate a large segment of the market? If you own an FX and HL2 was one of your most anticipated games, then you won't be able to run it in DX9 in its full glory. Instead you'll have to run it at a very high resolution with 4xAA, etc. IMHO that's not bad, not bad at all. If that's not enough, then buy ATI; it's your money.

To anyone who has said that Valve has no coding skills: I'd LOVE to see you make a game as good-looking and as fast as HL2. If you can't, then you have NO RIGHT to judge the competence of Valve's programmers, as they are obviously better than you. I'd say HL2 proves they are extremely talented, and with everything that HL2 is throwing at the CPU and GPU, those numbers seem fine. Keep in mind, the game isn't out yet.

(Personally, I'm very happy to see that my AIW 9700 Pro will run HL2 just fine :D )

Valve didn't make CS; they merely bought it when it became extremely popular, then proceeded to ruin it by making it so noob-friendly that it has lost all its depth and strategy. CS is now a shitty DM game.

HL2 will be good, but I think most people are more interested in HL2 mods than in the actual game, and most of those will be retail only. Making a mod for HL2, with its advanced gfx etc., will be a lot more difficult than it was for HL1. Making maps for HL2 might even require a team, let alone an entire new mod.

The hype around HL2 is getting so ridiculous that it can only be a disappointment if people aren't careful. I didn't think HL1 was amazing; it was very good, but not amazing. If you expect the world from HL2 then you will probably be disappointed, whereas someone who expects an OK game will be impressed if it turns out to be merely a good game.

I'm just wondering, does anyone remember when the first sneak peeks of Doom 3 came out and the Catalyst drivers were broken for them? The 9800 Pro got like 10fps; shortly after, the situation was resolved. I have an AIW 9800 Pro, but I also have a BFG FX 5600 256 that overclocks nicely to above Ultra specs. Just wait until after the game ships to decide what is best and where it fits into the big picture.

NV is saying that the Det 50s will be "THEIR BEST DRIVERS EVER". I guess they will be great at running normal DX9 code without optimizations. HELL NO, what they are going to do is even more: no fog, 16-bit precision, crap IQ, no AF+AA at high res. But ATI also has "THEIR BEST DRIVERS EVER", and I know theirs will be real DX9 with no "optimizations": 24-bit precision, awesome IQ, AF+AA at high res. Too bad, NV, the shit hit the fan; this generation is crap. And the 9600 is coming to the sub-$100 market in a few months. NV has lost the high- and mid-range markets to ATI, and DX9 in the low end will be ATI's domain too. If you own NVIDIA stock, sell it and wait for NV40; this generation is crap!!

Basically, the nVidia performance stinks either way, IMHO. If the new 5x.xx drivers fix it, then so be it; that will be great for those cards, and then they can run future Valve games. The game runs fine on a Ti4600 using DX8.

However, the new ATI cards only have 24-bit shaders! So would that leave ALL current ATI cards without any way to run future Valve titles?

Perhaps I do not understand the technology fully; can someone elaborate on this?

Here is how your performance will be, depending on the card you have: If you have a DX9 ATI card, your performance will be OK. If you have a current DX9 nVidia card, your performance will not be near as good as an equivalently priced ATI card. Of course nVidia will release a "fixed" card within about 12 months; it would be suicide not to.

If you're using a DX8.x part, like a Ti4200 or an equivalent generation Radeon, then the performance will be roughly the same.

Likewise, Doom 3 is rendered using OpenGL, and therefore whatever card you own will run as well as it can run OpenGL; DirectX compliance will have no effect on your FPS in Doom. Some websites have benchmarks for OpenGL games; you can review these to get a good impression of how each kind of card will perform.

Well I'm still running on my Ti4400 - will wait to see how "horrible" it is before I make any changes.

I think it is funny though. I've got some Radeon and nVidia cards here. (I'm never cutting edge though; my fastest PC is only a P4 1.8.)

What a silly thing to gloat or argue about. I was never fond of ATI because I was never satisfied with the timeliness or reliability of their drivers in the past (maybe that's changed now, I'm not sure). When I upgrade, I just buy whatever the best card is in the $150-$175 range.

The question of who is conspiring with whom is silly as well. There is absolutely nothing wrong with a company partnering with another, or even making their product work better with another's, even if that's not what is actually going on here. There isn't anything illegal or nefarious about it; it's called the free market. So your silly talk about a class-action lawsuit against nVidia is meritless. They sold you a card that does work with (is compatible with) DirectX 9. Since the card came out BEFORE the standard, and DX9 games came out AFTER the card, it's your own choice to purchase a card that may not be the BEST card.

Some of you need a serious lesson in market economics. The power you have is to vote with your wallets. Now that can start with a healthy debate of the facts, but this is a useless debate of speculation and conspiracy theories.

Valve's biggest interest is to provide a game to the largest audience possible, and to make gaming folks happy. That nets them more profits. And that's the wonderful thing about a free market economy. I highly doubt Valve would want to intentionally do anything to alienate the nvidia market, since there is a gigantic install base of nvidia based GPUs.

I'll play HL2 on my P4 1.8 with a GF4 Ti4400 128MB card. If I want to turn on ALL the eye candy and run at more FPS, then I'll have to spend some cash and build myself a new gaming rig. My guess is that it will probably run pretty well and I'll be perfectly satisfied on my current machine. After the initial release, I'm guessing that future patches of HL2 and the Det 5s will eke out a couple extra FPS, and that will just be an added bonus.

I will buy HL2 for a couple of reasons. The first is rewarding Valve's loyalty to their customers. I spent probably $50 buying HL1 back in 1998, and I got 5 good years of gaming enjoyment. They maintained patches and support WAY BEYOND any reasonable expectations for a game. I got some REALLY good extended play out of their game with FREE mods like CS and DoD. I will buy HL2 to show that their loyalty to me will be rewarded in turn. I'd like to send a message to other game developers that open-platform games and long-term support are what we insist on and reward with our money. The other reason is Valve showing that they will accept nothing less than cutting edge. HL1 was groundbreaking at the time, and HL2 looks like it will be the same.

Dude, the problem with nVidia is not that they intended to build a crappy card based on 3dfx technologies. They went with the .13 micron process at TSMC, which was rather new and too bleeding-edge at the time. ATI is using the more conservative .15 micron process, which is easier to get good yields out of, so they are fine right now.

From what I heard, if nVidia had been more conservative back when they designed the NV30, they would have been OK. ATI decided to wait and let everyone else at TSMC (including nVidia) work out all the process problems before jumping aboard.

It had taken Jen-Hsun Huang many a market cycle and an enormous amount of money, but on that fateful Friday in December, 3dfx was finally his. Yes, the ancient Voodoo technology was his at last... but, unbeknownst to him, so was its curse.

"There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. "

Not at all. All you had to do was pay attention to the supposedly "useless" synthetic Pixel Shader 2.0 benchmarks that have been around for MONTHS. They told exactly the same story: the GeForce FX family of cards has abominably bad PS2.0 performance unless you tailor-make a partial-precision path specifically for them. That the partial-precision path results in lower image quality isn't as important... when your drivers detect SCREEN CAPTURING and then MANIPULATE THE OUTPUT QUALITY to make it appear better!

If nVidia had designed a part that ran DX9 code properly and with decent speed, there would be no discussion here. The fact is they didn't. And their only recourse until NV40 has been to manipulate their drivers to an almost unbelievable extent, reducing quality and introducing arbitrary clip planes at will.

I don't own an ATI PS2.0-capable part, but it has been obvious since the introduction of the 9700 that it is the one to buy.
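The partial-precision point is easy to illustrate numerically. Below is a minimal Python sketch, my own toy model rather than how real GPUs round: it quantizes a float to roughly FP16/FP24/FP32 widths (10/16/23 stored mantissa bits, ignoring exponent range and denormals) and shows how much representation error each format carries.

```python
import math

def quantize(x, mantissa_bits):
    """Round x to a float with the given number of stored mantissa bits:
    10 ~ FP16, 16 ~ FP24, 23 ~ FP32. Toy model: ignores exponent range."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 1 << mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0
err_fp16 = abs(quantize(x, 10) - x)   # on the order of 1e-4
err_fp24 = abs(quantize(x, 16) - x)   # on the order of 1e-6
err_fp32 = abs(quantize(x, 23) - x)   # on the order of 1e-8
```

Each dropped mantissa bit doubles the worst-case rounding error per operation, and a long shader compounds those errors across many operations, which is roughly why an FP16 partial-precision path can show visible banding where FP24/FP32 paths do not.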

#106) The original HL is based on the original Quake engine, which was built on OpenGL. If you notice, the DX version of HL sucks, it sucks a lot: it looks terrible, it's buggy, and it isn't supported as fully as the OpenGL version. There are a bunch of lighting effects in the OpenGL version that aren't in DX mode. The only time I use DX mode is to debug. Now imagine porting the vastly more complex HL2 over.

To get a good overview of this thread minus most of the fanboyism, use this index with my opinions:

#24 right on! Too bad this thread didn't end here though.
#33 ... um... no
#39 320x240 (just like in Quake on 486 days)
#42 agree
#44 1993
#62 because of IPC (efficiency)
#71 correct!
#72 LOL
#76 BINGO
#80 correct
#81 it is ALWAYS a marketing war, silly.
#86 heavy denial (and when the fairy godmother came, she turned him into a real DX9 card...)
#93 e x a c t l y
#103 DON'T READ THIS ONE... I actually did, and always fell asleep and fell out of my chair. Nah, just kidding... but it's the best "summary" of this thread.
#106 Actually, if you've ever done Fortran/VB/C/C++/Java/Perl, then you would know that "programming" isn't just fun-time... it's a lot of work, and it sucks if you have to do it twice.

#104, they did find a way to run nVidia cards smoothly: DX8. They can't guarantee full support with all cards, but they have done their best. I don't think HL2 was geared towards ATI cards; instead, that's just how it worked out.

I wonder if Doom 3 will have any similar problems like this... with nVidia cards.

Well, IMHO I think Valve will suffer more because they aren't reaching the whole market of nVidia users out there. For those guys who want to spend the extra money on hardware just to play HL2, good for you, but I personally don't have the money to throw around like that every time I want to play the latest game.

Yes, they are. I doubt saying this is going to affect anything, but seriously, these flames are getting WAY out of hand.

People with GeForce FXes who are sore about their cards being slow in HL2: them's the breaks, it's not your fault. Who knew? But don't accuse Valve of going Futuremark on the community until there's at least a grain of evidence. And don't tell me "Valve and ATi working together is evidence enough," because IIRC, the partnership happened after Valve saw how the FXes did with HL2, and thus after NV30. Ahem.

As for you ATi folks, yes, it's nice. There is no need to gloat to nVidia card owners, because you'd have to have a LOT of contacts to know that this was coming. The FXes do perfectly fine in normal benchmarks, with the obvious exception of NV30. (The 5200 Ultras and 5600 regular are pretty bad cards too, in my opinion, since the 5200U is as expensive as the Ti4200 and a lot slower, and the 5600 regular is pricier than the Ti4200 and slightly slower. You know, the Ti4200 really is a good card. .. uh oh. I've gotten sidetracked.) The FX 5600 Ultra 400/800 was the best midrange card around (well, since nobody knew about HL2 performance), even if it was tricky to find, and the FX 5900 Ultra dominated the high end. (The 9800 Pro came close, but unless one of them cheated I'd say it was a win for nVidia, albeit a small one.) They didn't make a stupid choice, they probably decided on an nVidia card because of benchmarks or because of good experiences with them. Okay? No more of this. It's stupid and immature.

And to people on both sides of the line, just because someone says something stupid is no reason to flame them. Maybe they're trolling, probably not, but either way you'll do better just politely explaining why what they said was incorrect and/or illogical. Name-calling just makes you look worse. And if there's an all-out flame, ignore it.

Why am I putting this in a comment thread? Hmm. I guess I have too much time on my hands. OTOH, this HAS gotten sort of ridiculous... well, whatever. It's not as if anyone's going to pay attention to any of what I typed, they'll just skip over it and say something about how stupid those goddamned blind fanATIcs have to be if they don't realize that Valve is totally being bribed by ATi and the Evil Moon People to cripple FXes in HL2 or how stupid those goddamned blind nVidiots are to buy GeForce FXes when they obviously should have a tech demo of HL2 on hand. Eh, I tried.

Whoever made the comment about OpenGL and DirectX was very right; Doom III is a very different game, and the FXes seem to fail only under heavy DX9 usage. They certainly seem to perform well in OpenGL, though.

God, I remember all the reviews saying the FX 5200 Ultra was decent because while it was slower than the comparably priced Ti4200, it was DX9. Ha. =(

Since this is a video card thread-thingy, I guess I should end by stating what sort of video card I have and either insinuating that my use of it makes me unbiased (if I use a card from the company that I just explained my problems with, or if I use some cruddy aging integrated thing) or explaining that just because I use it doesn't mean I'm biased (if I use a card from the company I backed up). (You're supposed to include that in posts on these things, usually at the end, just like how in CPU-related posts you have to make a joke about cooking food on a Prescott, or how in heatsink-related posts you have to mention that your current [insert cooling solution here] does just fine for you-- and if it's a non-air-cooled system, you are required to make a happy emoticon afterwards, possibly one with sunglasses. If you don't do these things your opinion is automatically invalidated.) Well, I'm not going to, because then someone would almost certainly call me a fanboy.

I'd actually put some value in what Gabe says if he weren't on the payroll of ATI. ATI and Valve have been working together for quite some time now. Now let's really think about this: Gabe = former Microsoft worker. Microsoft = known for making BS claims and undercutting the competition. Valve = makes more money if ATI cards sell better. Hmmm, should I really trust this guy? Probably not. I'll wait until a non-biased third party says nVidia fucked up DX9. Till then, I put Gabe right next to the stats on AMD's site comparing the P4 and Athlon, and the Microsoft-sponsored study showing Windows 2003 is faster than Linux.

As for the 5-times-longer development, I have very little respect for the staff at Valve in that area. They have repeatedly shown that they aren't competent when it comes to coding.

I think the chances of successfully suing nVidia for misrepresentation and fraud will go over about as well as that suit against Intel for the P4's lack of performance. Anyone notice how it just sort of faded from the media spotlight and then dried up? Never sue a big corporation under a Republican government: nobody gives a shit about the people. A kinder, gentler America: step over the little guys, not on them.

Vis-a-vis Doom III, I remember reading just recently on HardOCP or ShackNews that TeH Carmack wasn't too impressed with the NV3x cards either, and had to write a special mixed-mode path to address performance issues too.

Something somewhere went wrong, and that something happened to be nVidia AGAIN. There should be a class-action lawsuit for fraud against nVidia for selling supposed DX9 cards when the game publisher and Microsoft say that nVidia wasn't following the DX9 coding standard. And to boot, Gabe says that it took them 5 times longer to code for nVidia cards, which pushes the cost of the game higher. So here you have the game being written in DX8 code just so the fake-DX9 nVidia cards are playable. As Gabe said, "I'd be pissed" if I owned an nVidia card. So, for all the people who paid such a high price for a 5900 Ultra: get together and sue nVidia. These companies have to be stopped sometime and held accountable for their actions. You have Gabe with all the proof you need to show fraud. Stand up and be counted and tell them you're not going to take it anymore.

I feel fairly unbiased in saying this (as my current card of choice is an integrated Intel 810 graphics chip *wooo*), but I think it *isn't* fair to start abusing nVidia over this surprising lack of performance. My hat goes off to ATI, because if you look at what they've done as a company with their line of products, from the Mach64 in the past to being an industry leader today (a position they share with nVidia), there has been an amazing amount of growth. The flip side is that nVidia has also achieved great things and is actually a younger company. It's already been mentioned that nVidia pioneered (*awaits flames*) 32-bit rendering depth when the industry was focused on 16-bit (and back in those days I owned and endorsed 3dfx stuff, but once you saw games like Quake 3 and UT running in 32-bit, the difference was noticeable), and the reason nVidia did well was because they made good products. All I can say is wait for the final product, and let us all remember that 3dfx dominated the 3D hardware market in the gaming community, and then they bet it all on some so-so hardware and lost.

One last thing: if you look at the full range of current and upcoming games, then nVidia and ATI share the benchmark leads; in a lot of reviews I've seen, the 5900 Ultra wins over the 9800 Pro, then vice versa. You can almost compare it to the CPU field, where Intel dominates performance but loses on price (IMHO).

With all that said, I will probably buy a 9600 Pro and an Athlon because of the price/performance ratio, and I'm confident I'll at least get playable performance in HL2 and Doom 3. Thanks, etherboy

Seems like nVidia fooled buyers by selling DX9 cards, especially in the low end of the market. I just bought a 5200 card after seeing some benchmarks and seeing nothing really faster on the ATI side (9000, 9100, 9200). The point was: at equal price and performance, I take the DX9 card for the future. Now I feel fooled :(

A lot of people here are comparing Doom III to HL2. That comparison just doesn't work: you have to remember that Doom III was coded using the OpenGL API and HL2 was coded using the Direct3D API.

With that said, OpenGL is a bit more flexible with hardware support, since it's an open standard. Direct3D, on the other hand, is a bit more rigid in its design. It takes 2 years for a GPU to be designed, and these chips were being designed long before MS nailed down the DX9 spec. Nobody is really to blame; nVidia just picked the wrong way to design their card for DX9 compatibility.

It seems to me, as a 9500 Pro owner, that nVidia has gotten themselves into fairly hot water. Absorbing 3dfx was not a smart move, as it really hasn't brought anything to the table that nVidia did not already have, either in the marketplace or in development. They need to think "lean & mean" like they did in the old days, when they addressed the issue of the day (32-bit colour back then), rather than hedging around it, breaking benchmarks, and generally carrying on like a spoiled and over-indulged child.

Let me see... ATI-only conference, ATI footing the bill, ATI is a business partner of Valve, ATI wins! What a surprise! Of course, it wouldn't be any different if it were an nVidia-only conference, would it? BTW, I'll buy no game that won't run on the video card I have.

Valve actually spent 5x more time on the NV3x path than they did on the default DX9 path, and the FX still got owned. So anyone accusing Valve of not taking the time to code their game for the FX series needs to have their head checked.

This is just the first DX9 game (well, there was also Tomb Raider, which showed the same difference in performance) which confirms what 3DMark03 showed us, back when cheating at it wasn't allowed.

I think this is a marketing war: mainstream cards are the bulk of sales, and whoever dominates that sector almost FORCES game producers to make products for THOSE cards, regardless of implementation...

"What's more, he [Newell] said, smaller developers are not likely to have the resources Valve was able to bring to bear on the problem."

"Half-Life 2 has a special NV3x codepath - even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2."

"ATI didn't need these special optimizations to perform well and Valve insists that they have not optimized the game specifically for any vendor."

What's with "not optimized for any vendor" and "NV3x codepath"??? Valve is contradicting themselves!!!

"The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;"

"the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra"

Geez, pit the Radeon 9600 Pro against the 5900 Ultra in the current crop of games and we all know that's not true.

1) Valve's stories don't exactly gel very well. I suspect they built Half-Life 2 from the ground up with ATi-class hardware in mind, as per the standard DirectX 9 specifications. (nVidia has only itself to blame, because they didn't follow the specifications.) So, in the end, Valve had to extend the development time to include the NV3x codepath, which obviously isn't working well enough.

2) Valve and ATi are probably in bed together, as shown by HL2 being bundled with ATi Radeons. However, ATi did an excellent job with the 9700/9800 Pro, besting everything nVidia can conjure up in every market segment, not only in terms of performance but price too.

3) nVidia stumbled big time with the GeForce FX; it's overpriced compared to any equivalent ATi card as of now. (The merging of 3dfx and nVidia technologies just didn't make the cut.)

Conclusion:

1) The Radeon 9xxx cards are the best buys as of now. They run the current crop of games, and also HL2, very well.

2) The GeForce FXs are overpriced and still can't beat the Radeon 9xxx convincingly. For people with GeForce FXs now, I guess you made the wrong purchase decision for this generation of graphics cards.

3) For the time being, ATi owners are doing the laughing, though not the final laugh just yet. Speaking as a consumer, I hope that the competition continues to heat up, for I will buy whichever has the best price/performance.

Hehe, I got a 9800 Pro 256MB (I know, I know, the extra 128MB makes no difference, but I'm a sucker for the numbers!) and I nearly fainted when I saw the price of the equivalent FX card. Now I'm LMAO about this, but I feel really sorry for the guys who paid a fortune for an FX and are probably in denial at the moment. It's a real pain that games are now fighting graphics card technology instead of being able to enhance their software with it. I think we will see the reverse of this FX situation when Doom III comes out, though!

Note that when anyone says their DX9 code is "not vendor-specific", the reason NVIDIA has been having so much trouble is that MS basically sold the DX9 spec to ATI, in no small part because of its constant squabbles with NVIDIA. Contrary to popular opinion, these hardware architectures were actually far along in development well before DX9 was nailed down. In a reversal of the DX8 development, DX9 was basically a software/API description of the R300. People bitch about NVIDIA using 16- and 32-bit FP and "not being to-spec," but you must realize that these major architectural decisions (all-FP24 vs. fixed+FP16+FP32; 64 instructions and multiple outputs vs. 4k instructions and one 128-bit output supporting packing of multiple smaller-precision elements; etc.) were being weighed and worked out well before MS came down with the DX9 spec. The spec was developed, as usual, with the involvement of all the players in the hardware world, but ALL of the bitchy specifics were handed to ATI. Admittedly, this has happened in the past with NVIDIA, but it's particularly problematic once the DX spec starts defining code paths and internal representations for the immensely complex stream programs in today's vertex and fragment units. As such, though it's clearly an important target which NVIDIA bungled, largely in its business relationship with Microsoft, DX9 could easily be considered an ATI-specific codepath, as sticking to spec forces a very non-optimal code path for the way the NV3x pixel shader is architected.
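The instruction-count contrast above can be made concrete with a toy Python sketch. The 64-slot and 4k-slot budgets are the figures quoted in the comment; the 6-op, 16-light lighting loop is purely hypothetical, and real PS2.0 additionally has separate texture-instruction limits this model ignores.

```python
# Arithmetic-instruction budgets as quoted in the comment above.
R300_STYLE_SLOTS = 64     # "64 instructions and multiple outputs"
NV3X_STYLE_SLOTS = 4096   # "4k instructions"

def fits(op_count, budget):
    """A shader with no flow control must be fully unrolled, so it
    fits only if its total instruction count is within the budget."""
    return op_count <= budget

# Hypothetical per-pixel lighting loop: 6 ops per light, 16 lights.
ops = 6 * 16                          # 96 instructions when unrolled
print(fits(ops, R300_STYLE_SLOTS))    # False: must be split into passes
print(fits(ops, NV3X_STYLE_SLOTS))    # True: fits in a single pass
```

The trade-off this sketches is real enough: a tight slot budget forces multipass rendering for long effects, while a huge budget permits single-pass shaders at the cost of a very different internal architecture.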

By new gfx, I just mean that they finally figured out how to make the Quake engine load textures that are > 8-bit, and then they read some soft-shadowing/bump-mapping tutorials and cut/pasted that code in there as well.

Concerning the confusingly low system requirements #39 was referring to:

If you're running a TNT/GF2 or possibly (?) a GF3, you'll probably have to turn OFF the fancy gfx that have gotten HL2 half its hype, just so you can play it instead of watching it play (i.e., a slideshow). So you'll basically be playing a physics-upgrade mod for HL, along with all the new content for HL2 (maps/textures/models/story, etc.).

------------------------

As for the comparison between HL2 and Doom 3 that you lamers can't give up on:

HL2 undoubtedly has more dynamic gameplay than Doom 3. Doom 3 definitely has more atmospheric, mood-driven gameplay than HL2.

IMO, there is no such thing as better gameplay, just as there is no such thing as a more fun game. It's just a matter of preference.

A product is what a product is. If you prefer apples, eat apples; if you prefer oranges, eat oranges.

If it's so important to you to argue about why one is better than the other, then you're a politician.

terrorists are politicians too y'know.

To all of you who bought new hardware to play the game before it comes out: I just avoid gambling altogether and wait for the game to come out first.

Also, R3x0 hardware renders 8 textures per pass, while nVidia renders 4 or 8 textures per pass depending on the code. Using single texturing and advanced DX code (i.e. DX9), the engine works at 4 textures per cycle, even when using smaller-precision shader code. The problem is the hardware, not the drivers.
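A rough back-of-the-envelope model of what 8-vs-4 textures per pass means for pass counts. The per-pass figures are the ones claimed in the comment above; the 12-lookup surface is a hypothetical example of mine.

```python
import math

def passes_needed(texture_lookups, textures_per_pass):
    """Each pass can sample at most textures_per_pass textures, so a
    surface needing more lookups must be rendered in multiple passes
    (re-rasterizing the geometry each time)."""
    return math.ceil(texture_lookups / textures_per_pass)

# Hypothetical surface using 12 texture lookups:
print(passes_needed(12, 8))  # 2 passes on 8-per-pass hardware
print(passes_needed(12, 4))  # 3 passes on a 4-per-pass path
```

Every extra pass re-runs vertex work and eats fill rate, which is one way a per-pass texture limit turns directly into a frame-rate gap.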

I find this all quite fascinating. Half-Life was the first game I played on the first computer I owned. I was running nVidia then and have been since (current = Ti4200). I am about to upgrade and have been researching for hours a day about the latest DX9 cards, and I must say that without question, ATI will be getting my cash this time... and from everything I've read/seen/heard, they have produced a superior product... PERIOD. (Please, no "in the future..." posts, because I could be dead before nVidia catches up... I care about NOW.)

I've never owned an ATI card and I've owned more than a few nVidia cards (currently a 4600 in my main rig).. so I think I can make this statement without bias:

Some of you guys are desperate to make yourselves feel better about your ultra expensive nVidia FX cards. It's pathetic and sad.

Personally, I am going to wait and see how my 4600 will run HL2 before deciding if I need to upgrade (I can live without max eye candy)... and if I do, I will probably buy a 9600 Pro, simply because it seems it might be the best price/performance value for HL2.

I have LITTLE patience for buggy drivers though, and find my 4600 w/40.72 Dets to be ultra stable, so ATI's drivers better not piss me off. :D

God, I love all the nvidiots out there saying Valve's programmers suck. Not everything is software; these are mostly hardware problems. Valve came up with a workaround for nVidia not following spec. Be happy with that. Just because nVidia made "wonder" drivers a couple of years ago doesn't mean it always works out that way. DX9 calls for 24-bit precision. That's what ATi uses and what Valve decided to use. nVidia decided to use 16 and 32 for some reason. This is why nVidia doesn't like 3DMark03: because they didn't follow spec, they are mad about it.

Geezus, I'm not sure which fanbois are worse, the nvidiots or the fanATIcs (DigitalWanderer et al). Right now I'm leaning towards the fanATIcs, but only because the nvidiots are more or less hushed up these days.

It is pretty remarkable - Valve have come out and said "Well, ATI cards are great and they work properly, but nVidia cards don't run properly. Also, we can't make FSAA work in our game; we don't know how. But we're really good programmers, and it's the nVidia card that is at fault."

Obviously Valve has a different team of developers than when they did the original Half-Life.

#31: If making a game run properly on NV3x hardware entails not using proper DX9 high-precision rendering, then yes, Valve is guilty of optimizing for ATi. Otherwise, nobody is to blame but Nvidia (for designing the NV3x) and Microsoft (for designing the DX9 specifications).

Figures - Nvidia's "code it crappy now, fix it later" mentality finally bit them in the ass. They shove out crappy code that barely works in the minor games, and then release optimizations... errrmm... patches... for every major game.

Stick with good ol' ATI. Sure, their drivers might be slightly buggy, but at least they actually fix the problems.

Remember, we still know very little about performance, even though he said it is getting 60 FPS at 1024x768 on the 9800 Pro. Does that mean with anti-aliasing on or off? How about anisotropic filtering? You can't just forget those when you argue the 9800 performs better.

Until we see exactly what is being used in the tests, we can't accurately judge the performance.

Also, the comment about it running in DX8 mode on 5200 and 5600 cards means that they understand the market and are making it playable. When they commented some time back about it running on a 4600, you have to remember that was when there was lots of debate regarding AA issues and they weren't very sure. That was one problem they attributed more to Nvidia, and they said they could try to make a workaround for ATI cards.

Also, we've seen the wonders Nvidia can do with drivers some time back, so we can expect something to change the situation now. I'm sure Nvidia has known about these problems and is working on fixes or enhancements for its speed.

Let me also say that I'm not claiming ATI isn't doing well - they are doing great, and this has fallen at a very good time for them if they can get their next-generation cards to outdo Nvidia, because the 9700 and 9800 have given them the performance lead for a long time now.

What I'm looking forward to is seeing the AA and AF results, as well as whether they made any other optimizations, such as for Hyper-Threading...
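As a back-of-the-envelope check on fps claims like the ones being debated above, a frame rate converts directly into a per-frame time budget, which is where AA and AF costs get paid:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds of render time available per frame at a given frame rate."""
    return 1000.0 / fps

print(f"60 fps -> {frame_budget_ms(60):.1f} ms per frame")
print(f"30 fps -> {frame_budget_ms(30):.1f} ms per frame")
```

So "60 fps at 1024x768" means every frame fits in about 16.7 ms; turning on AA/AF adds work to each frame, and the headline number says nothing about whether that extra cost was included.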

I could have sworn I mentioned somewhere here before that the truth would come out in the fullness of time...I guess the time is full now!

To any nVidia enthusiasts saying that the Det 50s will fix this, you might want to check out our story on the Det 50s over at www.elitebastards.com... they're comparable in performance to the 45.23 set! :lol:

"Gabe Newell: Valve and NVIDIA both know that we have a lot of shared customers, and we've invested a lot more time optimizing that rendering path to ensure the best experience for the most customers." - Makes you wonder what would happen if they hadn't. 20fps anyone?

It makes sense if you think of it in terms of what kind of detail you'll be able to milk out of the card. The GeForce FX will play HL2 better than a GF2, but not at the level you would expect from a premium-priced card.

I don't quite understand how it allegedly ran "fine" on a 4600 (they claimed they ran HL2 with that card and it ran fine) but it runs like shit on a 5600 and even on a 5900. So running it in DX8 mode on a 5600 will only make it "playable" (30+ fps)? Then how is a TNT card in the min requirements? Won't a GF2 run it in DX8 mode also? How can it be playable on that and on a GF4, but not on an FX card? That doesn't make sense to me.

Also, someone mentioned STALKER. Well, they once showed STALKER being played on an FX 5200, and it seemed to run pretty well to me. It had a lot of grass too, so I don't think the details were on "low." I own a 9800 Pro, but that still doesn't make me think it's normal to claim HL2 runs well on a 4600, but very slowly on a 5900 and only at 60fps on a 9800 Pro. It doesn't make sense.

HL2 will beat the crap out of Doom3 any day. Just look at the gameplay. Why is GTA: Vice City so great? Certainly not for the graphics - not that HL has anything to be ashamed of in that department. I have a feeling that Doom 3 will look great but play like crap, while HL2 will have it all :)

BTW: the leaked Doom3 demo works surprisingly well on my R9700, and if I recall correctly it was first shown off running on a Radeon 9700. I know because I had just bought a GeForce4 Ti4400 for $450 and was shocked that they didn't use a Ti4600.

Hell, why did they even call it DX9? Why did nVidia and ATI make video cards specifically for DX9? &lt;/sarcasm off&gt; Performance-wise, I'd say the difference isn't enough to give you a stroke, but it's certainly noticeable.

Why can't people make the distinction between DirectX and OpenGL? Every time a DX9 game comes out and performs way better on R3xx-based than NV3x-based GPUs, are they going to keep on citing Doom III? It's not even a DX9 or DX9-class game! The NV3x still needs a special pathway in that game, one that runs at lower precision, to beat the R3xx. I doubt even the guys developing Stalker can make the NV3x outshine the R3xx in a full-blown DX9 game.

What would you prefer to play, based on the previews/videos/etc.: Doom3 or HL2? I vote HL2 hands-down. id haven't made a compelling game since Doom2. Just an opinion, don't take offense - of course Carmack & Co. are extremely gifted.

Anand, change the title text! When I read this I get this image of you jumping up and down with an ATi flag, advertising their hardware or something. Shitloads can change before HL2 is released in November, and maybe it "rocks on nVidia" too on release.

I don't want to take this into fanboy territory, but Valve weren't "crappy coding guys" when HL1 came out several years ago, were they? In fact, one could argue that if HL2 was designed to work on systems that are 2+ years old, they can't be all that bad, can they?

Is it just me, or is everyone ignoring the situation with Doom 3 benchmarks running noticeably faster on nVidia's hardware?

Okay, let's try something: I'll restate what Anand said, but swap nVidia for ATi and HL2 for Doom3:

- with the NV3x codepath, nVidia is the clear performance leader under Doom3, with the FX5900U hitting around 60 fps at 10x7; the R9800 is noticeably slower;
- the FX5700 Ultra performs well - it is a good competitor to the R9800;
- nVidia didn't need these special optimizations to perform well, and Carmack insists that they have not optimized the game specifically for any vendor.

Okay, just to make sure, guys, go read Anand's Doom3 article. You'll see all the above holds (the FX5700 bit too, if you extrapolate what we know).

You've also got to remember that regardless of designs/pipes/whatever, *both* the FX59 and R98 are about 120M transistors, and nVidia is actually clocking those transistors faster (though ATi can make up for this by using 24-bit instead of 32-bit per-unit precision). Unless nV's engineers were plain stupid designing the chip (unlikely), they have hardware that's easily on a level with the R98, and later drivers are likely to exploit this further. ATi's basically been optimising their R300/350 for years now, so there's probably not nearly as much headroom for compiler optimisations in later drivers.

Basically I think when the Det.50's come out, the FX59 will even up with the R98 in HL2, and totally trounce it in Doom3. Maybe not, but just thought I'd restore some balance here :)

nV basically overshot a bit with their NV3x hardware generation while ATi stuck to DX9 fundamentals, so they've been sorta screwed from the start. I reckon they'll kick ass with NV40 though. I hope ATi do as well!

#21: Doom III's looks with Half-Life 2's interaction ("great gameplay") could be possible, with great framerates on both ATi and Nvidia cards. But NOT with Valve's crappy coding guys behind the wheel. That takes a Carmack or a Sweeney.

Oh well, at least I only have a Ti4200, so it doesn't come as such a shock that performance will be sucky! It'll be ATI for me next time for sure. If the 9800 Pro is pulling 60fps at 1024x768 and is way ahead of the FX... wow, I feel for you owners of $400 FX5900s.

People who bought Radeon 9500 Pros before prices went sky high, people who bought Radeon 9500s and soft-modded them, you may now spend the next week laughing at people with GeForce FX 5600s. Ready.. set..

No, it softmods just fine. Runs hella fast and all. Just some driver issues. For like a week, no OpenGL programs would work. I did everything I could think of short of reformatting and couldn't get it to work. Then one day it just started working at random, go figure.