I'm not so sure I agree that the NV cards are worse at implementing this engine's HDR effects, because the NV cards tested are all at one of two extremes. They're either the highest-end cards, which crush the entire ATI lineup and don't provide a good comparison, or the lowest-tier card in the roundup with no real direct competition. Just look at the memory bandwidth and pixel pipelines of the 6600GT and you'll find it has no real counterpart among the ATI cards benched. For that matter, I don't see what the ATI cards have to compare against either. It seems the top NV card took the smallest hit from enabling HDR, and the impact just scaled down the list fairly evenly. Maybe the conclusion was based on numbers from cards that weren't posted, but from the numbers here I'm not seeing any real reason to state that NV cards do worse than ATI cards in this comparison. Also, look at the X800 going from bloom to full HDR: the 6600GT took a smaller percentage hit than the X800 did. Maybe I missed something, but does anyone else see this? Please correct me if I'm wrong.

I caught that too. The GTX dropped 2.5 FPS from none to full. The X850 XT dropped 15.8. For those who like percentages, the NVidia card dropped 3.5% while the ATI card dropped 22.3%. How exactly is that better? I like how the GT went from second to last with no HDR to second best with full. I'll concede the 6600 GT dropped over 50% with the full effect, but when FSAA first hit the scene, the thought of running a low-to-mid-range card with AA was ABSURD.
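For anyone checking the math, those percentages are just (FPS lost / baseline FPS). A quick sketch; the baseline frame rates below are back-calculated from the drops and percentages quoted above, so treat them as illustrative rather than measured:

```python
def pct_drop(baseline, with_hdr):
    """Percentage of frame rate lost when enabling the effect."""
    return (baseline - with_hdr) / baseline * 100.0

# Baselines implied by the quoted figures (2.5 FPS ~ 3.5%, 15.8 FPS ~ 22.3%):
gtx_base = 2.5 / 0.035     # ~71.4 FPS before HDR
x850_base = 15.8 / 0.223   # ~70.9 FPS before HDR

print(round(pct_drop(gtx_base, gtx_base - 2.5), 1))     # 3.5
print(round(pct_drop(x850_base, x850_base - 15.8), 1))  # 22.3
```

Similar absolute drops can be very different relative hits, which is why quoting both numbers matters.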

I know why... factors, factors, factors. Anyway, it's still not nice to only release it with DoD and the Lost Coast. Too bad it couldn't be released for everything :-{ even as an update, which I'm sure will arrive eventually.

Well, I'm running a 2405 with an X800 XT, and HDR made the difference between smooth as butter and less than that. I liked the effect, but the fast pace of things in DoD made it more annoying than anything else. Given time I could get used to it, of course, but I'd rather do so in single-player (come on, Lost Coast); then I can try my hand at MP again.

I can't understand the argument that it's an aid to players, though.
More realistic, I'll grant that, but how exactly it helps anyone other than campers, I don't know. Besides the fact that people can just turn it off, it's an act of faith to assume that someone's going to be blinded coming around a corner (as I was, multiple times, when it was still enabled).

Altogether, though, with HDR on I thought this game was bliss for the eyes.

I love the HDR; it does look more realistic, and moreover BF2 looks crappy after playing DoD:S. BF2 was my favorite game until DoD:S; now it's a tough call, because BF2's gameplay is better but it just doesn't look that great anymore.

Yes. Most of the people complaining about HDR haven't even tried it, it seems. See Wilson's post above: you have to see it in motion. Frankly, the screenshots don't look anything like the game in motion. Has anyone here played GT4 on the PS2? It implemented some sort of advanced lighting like this to great effect. The whole point is that something (a wet road surface at sunrise) will look totally different depending on the angle you're viewing it from: blinding white one second, black the next, just like in real life.

Has anyone else tried HL2 after they added HDR to some of the maps? It looks good. And as they said, *if you read up on it*, this is really a partial implementation of HDR worked into the Source engine. Of course, it's really only worth it if you have a high-end video card; otherwise you'll have to disable AA to get playable frame rates. In a game as aliased as HL2, it's a tough call which I'd rather have :)

Quote: "In other words, in a real battle, the sun in your eyes will affect your aim, thus adding to the realism of the game."

Yet when the goal is to win, the last thing I'd do is enable something optional that impairs MY ability without equally affecting everyone else's.

So maybe in a single-player game, or maybe in a couple of years if it's defaulted to on and can't easily be turned off, it would be useful. But really, graphical settings should never be optional if they directly impact the player's ability like that.

That would be like returning to the days when people figured out you could turn off the fog of war or smoke effects so you could see the full draw distance and snipe people before they could even see you coming.

Valve says HDR gives an advantage because the contrast differences make it easier to spot players moving through the environment, or something like that: a player who steps in front of a dark tunnel mouth occludes the light outside and really stands out.

Personally, I don't think it's that useful or detrimental. I do think it adds some pretty nice realism to the scene in many places. This HDR implementation isn't perfect, but it's better than many others out there.

More like: too bad they didn't do this with Half-Life 2 from the beginning, and it didn't look any better to me from the start anyway. I wonder when people will ever get it into their heads that graphics != gameplay. A game can have all these fancy-schmancy effects, but if it still plays badly, the graphics do NOT make up for it. Read: Doom 3.

Nope, there's nothing wrong with you. Doom 3 was just as fun as HL2 to me, only in a different way. As a matter of fact, D3 was the first game since the original Doom that actually had some scary moments for me. THAT is fun.

I pity those who can't appreciate that, probably because they didn't play Doom back in the day when it was the ne plus ultra.

Like what? I thought bloom looked like crap, but seriously, what else is there to do? With PPUs coming and dual-core drivers handling some of the GPU load, I just hope they're not merely limited by DirectX.

Meh. To me, from the screenshots, it looks better without it.
Bloom alone really blows; the sand is all washed out.
Full HDR is nice, but you can tell the textures on the buildings weren't created with HDR in mind; they wash out quite a bit with it on.
I was reading about this in Game Developer, and IIRC, you have to modify your textures to really get a bang out of HDR.

I'm pretty unimpressed with this technology if this is all it will ever look like. Not to mention: is this some great step forward from Doom 3's lighting effects, or is Valve just a year behind?

As far as the coverage goes, I know you can only test so many graphics cards, but why both the X800 XT and the X850 XT? They're so similar in market position and performance. I would have substituted an X800 XL or X800 Pro for the X800 XT.

The real advantages can't be shown in a screenshot; it's moving between dark and light areas that really shows off the capabilities of the engine. Bloom is nice and adds a subtle effect to lights and reflections, but the adaptive exposure has the potential to change the way games are designed and played at a fundamental level. Stealth games would change the most, with shadows and bloom helping to naturally conceal enemies and players.

Even shining a flashlight in someone's face could become a gameplay mechanic: in a dark room, a flashlight would effectively blind the target if used correctly.
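That adaptive-exposure idea works roughly like a camera's auto-exposure: the engine measures the scene's average luminance and eases the exposure toward a target over a second or so, which is why stepping out of a tunnel into sunlight briefly blinds you. A minimal sketch of the idea; the adaptation rate, "key" value, and luminance numbers are made up for illustration, not Valve's actual parameters:

```python
import math

def adapt_exposure(exposure, avg_luminance, dt, rate=1.5, key=0.18):
    """Ease exposure toward the value that maps the scene's average
    luminance to a mid-grey 'key' value, like the eye adapting to light."""
    target = key / max(avg_luminance, 1e-4)
    # exponential smoothing: the gap to the target shrinks by e^(-rate*dt)
    # each frame, so adaptation is fast at first and then settles
    return target + (exposure - target) * math.exp(-rate * dt)

# Fully adapted to a dark tunnel (avg luminance 0.02), then step into
# daylight (avg luminance 2.0) and simulate one second at 60 fps:
e = 0.18 / 0.02                      # tunnel exposure: 9.0
for _ in range(60):
    e = adapt_exposure(e, 2.0, 1.0 / 60.0)
# e has fallen most of the way from 9.0 toward the daylight target of 0.09,
# so the view is briefly blinding before it settles
```

The flashlight trick falls out of the same mechanism: a bright source in a dark scene drags the average luminance up, crushing everything else toward black until the exposure re-adapts.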

The very bottom image looks the best to me. Compare it to the top one, which has a lot of jaggies; with HDR the jaggies are missing and it looks a lot smoother. Wish I had a better GPU than a 9700 Pro.

Interesting article, though like many others I was distinctly unimpressed by the static screenshots showing the "benefits" of HDR. Maybe it works a lot better while actually playing the game...

The choice of cards tested seemed a bit strange to me, though. Either a 7800GT or GTX would have been enough to represent top-end nVidia performance, as would a single good card from the X800/X850 lineup to show how ATI's current generation compares (ideally figures from an X1800 would be thrown in, but NDAs currently prevent that). The omission of a 6800GT or similar was the main problem with the benchmarks, though, as many of us have one and would like to know how well they fare.

Along with the 6600GT for current mid-range performance, ideally you'd also include an FX5900/5950-series card and a 9800Pro, as not everyone buys a new card when a new generation of hardware is released. The 9800Pro is still very capable and should be included in all reviews, and an FX5900/5950 should be included too for reference, even if it does suffer badly in modern pixel-shader-intensive games, so that people can decide whether an upgrade is worthwhile. Anything below those cards would probably be a waste of time for this review, as they'd be too slow.

In fact, I'd say a 9800Pro and FX5900/5950 should be included in *all* graphics-card / game-performance reviews, in addition to the usual 7800, 6800, 6600, and X800/850. You must have them lying around somewhere, ready to drop into a suitable box, I'm sure :)

I'm looking forward to the updated/follow-up article with additional benchmarks; I understand that if time was pressing, you could only test a limited number of cards.

The move to PCIe does make everything harder for them, though, as they would have to build a second box that's as identical as possible apart from the motherboard, introducing a few potential inconsistencies. Other than that, my thoughts exactly.

Although, frankly, this preview told me what I wanted to know. Great job, guys!!!

Humorously enough, if you go to a beach on a bright day, the sand really will look 'washed out', especially if you're viewing it through a TV camera (which limits the dynamic range of the image in a similar way that your monitor limits the dynamic range of the rendered scene).
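That camera analogy is essentially what a tone-mapping operator does: it compresses an unbounded HDR luminance down into the 0–1 range a monitor can actually display. A minimal sketch using the well-known Reinhard operator (one common choice; not necessarily what the Source engine uses, and the luminance values are made up):

```python
def reinhard(luminance):
    """Compress HDR luminance [0, inf) into the displayable [0, 1) range."""
    return luminance / (1.0 + luminance)

# Bright sunlit sand (say L = 20) and a dim doorway (L = 0.05) both fit
# on screen, but the sand lands very near white, i.e. 'washed out':
print(round(reinhard(20.0), 3))   # 0.952
print(round(reinhard(0.05), 3))   # 0.048
```

Very bright values all get squeezed toward 1.0, which is exactly the washed-out look people are describing in the screenshots.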

Plus, this is generation-one real-time HDR (sort of), so don't be TOO hard on them. ;P Anti-aliasing was poo-pooed early on because it 'made everything blurry', and now I can't live without it in most games (as long as I'm playing at 1024x768 or above).