DX10: Crysis Warhead

Crysis Warhead is an expansion pack based on the original Crysis video game. It is set in the near future, where an ancient alien spacecraft has been discovered beneath the Earth on an island east of the Philippines. Crysis Warhead uses a refined version of the CryENGINE2 graphics engine and, like Crysis, renders through the Microsoft Direct3D 10 (DirectX-10) API.

Benchmark Reviews uses the HOC Crysis Warhead benchmark tool to test and measure graphics performance using the Airfield 1 demo scene. This short test places a high amount of stress on a graphics card, not only because of its detailed terrain and textures but also because of the test settings used. Using the DirectX-10 test with Very High quality settings, the Airfield 1 demo scene receives 4x anti-aliasing and 16x anisotropic filtering to create maximum graphics load and separate the products according to their performance.

Using the highest-quality DirectX-10 settings with 4x AA and 16x AF, only the most powerful graphics cards are expected to perform well in our Crysis Warhead benchmark tests. DirectX-11 extensions are not supported in Crysis Warhead, and SSAO is not an available option.
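For readers curious how a timedemo pass like Airfield 1 is reduced to a single score, here is a minimal sketch of turning per-frame render times into average and minimum frame rates. The frame times are invented for illustration; this is not actual HOC tool output.

```python
def fps_stats(frame_times_ms):
    """Reduce a list of per-frame render times (ms) to benchmark-style numbers."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s    # frames divided by elapsed seconds
    min_fps = 1000.0 / max(frame_times_ms)     # rate implied by the slowest frame
    return round(avg_fps, 1), round(min_fps, 1)

# Hypothetical trace from one demo pass: mostly ~25 ms frames, one 60 ms spike.
sample = [25.0] * 99 + [60.0]
print(fps_stats(sample))  # (39.4, 16.7)
```

Note how a single slow frame barely moves the average but dominates the minimum, which is why both numbers matter in the charts that follow.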

Well, you can think two 680s are better all you want, but at least a 690 won't put out twice the heat and require more power. And if I liquid-cool, I don't have to spend $300 on waterblocks for two 680s. As always, everything for the dual-GPU card is better: the waterblocks, the perks, everything. I'm not saying SLI 680s aren't good, I'm just saying they aren't a better solution than a 690. You probably just already own one and want to feel like you got the better buy.

The 690 actually manages to hold up quite well compared with two 680s; it's usually only off by 2 FPS or so. In my opinion, a single GTX 690 is the better choice all things considered, including power draw, heat, etc. Still, for multi-monitor gaming I would go with two 680s over a single 690... or, if I had the money, two 690s.

I'm not worried about my power supply, I'm worried about my electric bill. And you're completely out of your mind if you honestly think two 680s completely destroy a 690; that obviously shows you didn't read the benchmarks and don't know what you're saying.

SLI/CrossFire two-GPU setups add delay because the cards need to communicate with each other before making computations.

Everyone knows that SLI or CrossFire with two GPUs doesn't give you twice the performance. Some games don't fully support SLI, so you usually have to wait for game updates. You also need to wait for GPU driver updates to make SLI/CrossFire work efficiently in new games, especially with newly released cards.

And by the way, an SLI of GTX 680s needs to be plugged into two PCIe slots, which in many cases causes a bottleneck, because some motherboards mix PCIe Gen2 and Gen3 slots, and sometimes even with two Gen3 slots in SLI the board will run them at Gen2, which is the bottleneck... too much hassle.
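The sub-2x scaling the comments above describe is easy to quantify. A small sketch, using illustrative numbers rather than measured results, that turns single- and dual-GPU frame rates into a scaling figure:

```python
def sli_scaling(single_fps, dual_fps):
    """Return the effective speedup and scaling efficiency of a 2-GPU setup."""
    speedup = dual_fps / single_fps    # a perfect second GPU would give 2.0
    efficiency = speedup / 2.0         # fraction of ideal 2x scaling achieved
    return speedup, efficiency

# Illustrative case: a game that scales well in SLI, but not perfectly.
speedup, eff = sli_scaling(single_fps=60.0, dual_fps=105.0)
print(f"{speedup:.2f}x speedup, {eff * 100:.1f}% of ideal")
# 1.75x speedup, 87.5% of ideal
```

Inter-GPU synchronization, driver maturity, and PCIe link speed all eat into that efficiency figure, which is why real-world SLI numbers land below 2.0x.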

I think it is more than worth it. There is more power draw and heat from a 680 SLI setup than from a single 690! The 690 matches 680 SLI in performance and is quieter as well. And it looks way sexier with the LED-lit logo, perfect for windowed cases. I currently run 480s in SLI and this 690 blows them away in all regards; I think these space heaters of mine are about to be replaced with a 690! ;)

I'll buy a 690, and in less than a year I'll buy another to SLI for half the price. They've already stated that it'll clock over 1,100 on stock cooling, so as far as you being "really sure", I'm not sure what you're smoking.

I had the 7950 GX2 back in the day and it ran as well as the 8800, and it was cheaper, minus DX10. A couple of months later I bought another for half the price because I HAD ROOM. It lasted another 3-4 years and still kicked ass.

If you're going to buy anything and they're the same price, why would you buy two cards with less performance instead of one at the same price, using the same chips, with more room to expand?

Two 680s = a little bit more power, $200 more you have to spend, and more power usage. So if you're getting two 680s you're wasting money and may as well just get one 690. It's not worth the extra money to get two 680s.

Well, you spend MORE buying two 680s: $1,250 vs. $999.99. I'd go with the 690, due to better power efficiency and less heat. Think before you post. And don't correct me on price; that is the price for two actually useful 680s.

Prices have changed in the seven months since this article was published, and they were different a month ago when the comment you responded to was published. Also, consider this your warning and drop the attitude.

Almost bought a 680 the other day but held off. Glad I did now. This will be my next card; the results of this review justify the price tag. Great review, and I wouldn't mind seeing how two of these cards perform in SLI.

The GTX 690 is great. Maybe you're able to overclock two 680s more, but why? Only for a few percent more? There's no need to at the moment...

I've read an article about GeForce GTX 680 3-way SLI and that didn't impress me much... I think a 690 or a 2-way SLI of 680s is a gamer's dream for now. Maybe NVIDIA has some driver work to do to get more power out of more cards, dunno...

But hey, maybe they will benchmark a GTX 690 Quad-SLI pack here... that would be more than great...

Too bad they focus on being the king, showing off that they have the fastest card. That's not what people want to buy. I mean, yeah, maybe a minority, but they should focus on the mid-range, where most of us are waiting to upgrade!

Sorry dude, I love mid-range because I want everyone to enjoy games at what they can afford, but mid-range doesn't push technology. Only cards like the 690, 680, 590, and 6990 push technology forward. I love mid-range GPUs, but the high-end stuff will forever be where it's at. That said, I hope they release a great 670 or 660 for you to enjoy.

They really don't focus on it, but people with money want the greatest card out there, regardless of price. There's a flagship product in every manufacturer's garage, regardless of what they make. This card is top dog right now, but it keeps competition from getting stale and makes things better for us in the long run. All the lower-end cards will now drop in price, and if someone ups the ante, they'll drop the price on this one too. It's good... not "too bad".

I have been thinking of upgrading since GTA V was announced. My GTX 295 can handle all current demanding games with surprising ease, but it is at its limits when I want to push the GTA IV graphics. I hope the GTA V engine is better optimized for PC.

I will try to hold back my eagerness to buy a new video card until I feel it is really necessary.

I have a Westmere setup with PCIe 2.0 on a Rampage III Extreme mobo. I, and most others who can afford a $1,000 GPU, will want to know if there is a benefit to upgrading when most games are GPU-bound. I'm personally waiting for Haswell to come out in a few quarters.

I'm thinking the price might drop once the 7990 shows up, depending on its price and performance. But not by a lot, if anything; just like the article concludes, two 7970s in CrossFire will not beat the GTX 690, so the 7990 will probably be the same. Wishful thinking.

A single 680 gets you over 75 FPS in Arkham City. The 690 gives you 127. But it makes no difference, because human eyes are INCAPABLE of processing anything above 60 FPS. It could give you 1,000,000,000 FPS and it would look exactly the same as 60. Not to mention, most monitors run at 60 Hz anyway, so they're incapable of displaying anything above that. And if you have a 120 Hz monitor, you wasted your money, because like I said, your eyes can't tell the difference between the two.

Sure, if you're running a multiple-monitor setup, higher FPS is what you want, but STILL, dual 680s in SLI are way better than a single 690.

That's the average frame rate, not the minimum. My tests show the minimum dropped to 13-28 across five tests. Having played Batman, I can tell you there are points in the game where the FPS drops and gets you killed.

Also, I disagree that SLI 680s are better than one GTX 690. It's your money, so buy what you want, but there's no 'right' answer.
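The average-versus-minimum point is easy to illustrate with numbers. A sketch with made-up frame-time traces, showing how two cards can post nearly identical average FPS while one of them stutters badly:

```python
# Two hypothetical traces, 100 frames each (times in milliseconds).
steady = [16.0] * 100                # consistent ~62 FPS, every frame
spiky  = [12.0] * 90 + [56.0] * 10   # fast most of the time, periodic stalls

def avg_fps(times_ms):
    """Average frame rate over the whole trace."""
    return len(times_ms) * 1000.0 / sum(times_ms)

def worst_fps(times_ms):
    """Frame rate implied by the single slowest frame."""
    return 1000.0 / max(times_ms)

for name, trace in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(trace):.1f} FPS, worst frame {worst_fps(trace):.1f} FPS")
```

Both traces average around 61-62 FPS, but the spiky one bottoms out near 18 FPS on its worst frame, which is exactly the kind of dip that gets you killed in Batman while the average still looks healthy.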

You, sir, are incorrect; you've obviously never used a 120 Hz monitor before. When there are a lot of fast-moving images on screen, hell, even just looking around with the mouse in a game, there is definitely an increase in smoothness on a 120 Hz panel; the motion just looks far more fluid. It's like going from 30-40 Hz to 60 Hz. Even just moving the mouse cursor quickly across the screen, your eyes can detect less jerky cursor movement.

You're kidding me, right? Obviously you have never owned a 120 Hz monitor, because you would not be saying you couldn't see a difference past 60 Hz. I mean, that's like saying you couldn't tell the difference between 1080p and 1600p. Go buy a 120 Hz monitor and you will know you're wrong.

There is no such thing as 1600p? Well, you'd better go tell Dell that, because they seem to think those 30-inch monitors they sell are 1600p. It's called 2560x1600. I own the Dell U3011; look it up. I'm kinda surprised that someone who is looking up the GTX 690 knows nothing about its max resolution. Strange, FWIW...

Yes, the "p" stands for progressive. The U3011 is definitely not interlaced, so it is progressive. Monitors don't always use the terms 1600p, 1200p, or 1080p; they normally just list their real resolution, such as 1920x1080, 1920x1200, 2560x1600, and so on. Just because they don't throw the "p" into the description doesn't mean it isn't progressive scan. The resolutions are definitely 1080p, 1200p, and 1600p, and there's even a new, higher resolution now at around 4K.
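For reference, the raw pixel counts behind the resolution names argued over above; a quick sketch:

```python
# Common desktop resolutions and the pixel counts the GPU must fill each frame.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1200p (1920x1200)": (1920, 1200),
    "1600p (2560x1600)": (2560, 1600),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
# 1080p (1920x1080): 2,073,600 pixels
# 1200p (1920x1200): 2,304,000 pixels
# 1600p (2560x1600): 4,096,000 pixels
```

A 2560x1600 panel pushes almost twice the pixels of 1080p, which is why a dual-GPU card like the GTX 690 is aimed at exactly these 30-inch displays.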

I want my next laptop to have a video card as fast as this, for CUDA work mostly. Maybe in 10 years, when we reach the end of transistor shrinkage, AMD integrated graphics will manage it in a lightweight laptop.

Obviously there would be a lot more heat when overclocking two GTX 680s, and let's be realistic: almost anyone who gets two GTX 680s is likely to overclock, which in turn produces more heat. With reference coolers, overclocking two reference GTX 680s would not be a good decision. As you know, the GTX 690 does not have cut-down chips in it; it is two fully working GTX 680s in one solution for around the same price. And obviously two GTX 680s would NOT obliterate a GTX 690; all that's really missing is a few hundred MHz, which is not really noticeable in the long run.

Now, down the line, when GTX 680s come highly overclocked from Zotac and cheaper than a GTX 690, that might be a great alternative to a single GTX 690. I mention Zotac because I shopped for many GTX and GT cards back in 2010 and 2011, and although I purchased a PNY GTX 470, Zotac still had the better designs and the fastest overclocks I have honestly seen.

What this comes down to is: is the customer angry that they just bought a reference-design GTX 680 and are stuck with it because they cannot afford a GTX 690 (maybe you should have waited and saved some more cash?), and does the customer prefer two graphics cards over one, for a few hundred dollars less than a GTX 690, with the two GTX 680s being slightly more powerful than a single GTX 690 solution?

Damn, I just got a 580 Classified, then the 680 came out, and now the 690 is out. Can I SLI different models of graphics cards, e.g. a 580 with a 560, or a 680 with a 580? I'd like to get a new 690, but you know, I just got this 580 Classified, and I don't want a $600 paperweight.

I used to have a few ATI/AMD Radeon cards, but I always had horrible driver problems and in-game graphics bugs. I used to think these problems were normal until I switched to Nvidia. So I am afraid to buy Radeon now.

I have tried a few generations of Nvidia cards and all my driver problems have gone away.

So you can see why I would be hard-pressed to ever go back to the video cards that gave me so much trouble. I am not saying I will never try an AMD card again, but at the moment I decline.

Just made the switch from an HD 6990 to a GTX 690, the first time in nearly a decade that I've chosen Nvidia over AMD. I didn't really need an upgrade, but I was sick and tired of buying games and actually completing them weeks before AMD shipped their CrossFire optimisations... So far it feels like AMD ships more powerful hardware, but that extra power is wasted by poor driver quality.