AMD and its add-in-board (AIB) partners have decided to kick off yet another price war by lowering the price of the Radeon HD 7990 Malta dual-GPU graphics card from around US $1,200 down to US $699.99.

The price cut has already been implemented at Newegg.com in the US, and we are seeing the same thing all around Europe. While it was previously priced at anywhere between US $1,100 and US $1,200 at Newegg.com, the HD 7990 is currently listed at anywhere between US $699.99 and US $789.99. In Europe, the HD 7990 was previously selling at around €950+ depending on the region, while most listings now put it at around €700. German-based Mindfactory.de even has the XFX Radeon HD 7990 listed as low as €599.75.

The AMD Radeon HD 7990 features two fully enabled 28nm Tahiti GPUs for a total of 4096 stream processors and 6GB of GDDR5 memory (3GB per GPU). The GPUs run at a 950MHz base clock and a 1000MHz Boost clock, while the 6GB of GDDR5 memory, paired with a 2x384-bit memory interface, is clocked at 6000MHz effective.
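As a quick sanity check on those numbers (a back-of-the-envelope sketch derived from the quoted specs, not official AMD figures), the per-GPU memory bandwidth follows from the effective clock and bus width:

```python
# Rough per-GPU memory bandwidth estimate for the HD 7990 (illustrative only).
effective_clock_mhz = 6000   # 6000MHz effective GDDR5, as quoted above
bus_width_bits = 384         # 384-bit interface per GPU

# bandwidth (GB/s) = effective clock (MHz) * bus width in bytes / 1000
bandwidth_gbs = effective_clock_mhz * (bus_width_bits / 8) / 1000
print(f"Per-GPU bandwidth: {bandwidth_gbs:.0f} GB/s")   # 288 GB/s
```

That works out to 288GB/s per GPU, matching the single-GPU HD 7970 GHz Edition.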

AMD has already released a new Catalyst beta driver that fixes some of the micro-stuttering issues, and with the right Crossfire profiles, which are released occasionally, the card certainly puts up a decent fight against the now much higher-priced GTX Titan and GTX 690 graphics cards. Of course, the Never Settle bundle, with a total of eight AAA titles valued at around US $100, gives it an even better edge against the competition.

The current price puts the Radeon HD 7990 directly against the slightly cheaper Nvidia Geforce GTX 780, and it looks like AMD has decided to make the first move in a new price war.

You can check out Newegg.com prices here and some European listings here.

Strangely enough, although we expected this from AMD first, it’s actually PowerColor that was first to put two Tahiti XT GPUs on a single PCB. In fact, the company offers two versions of the card, one of which is today’s test card - the Devil 13, whereas the other is clocked lower and simply named the HD 7990.

Club 3D and VTX 3D followed up with their own HD 7990 cards, but it’s actually the same card with slightly lower clocks. It appears that some other AMD partners will jump aboard the HD 7990 train, which is always a good thing. We know HIS is working on the card, but they call it the HD 7970 X2.

PowerColor’s HD 7990, the slower one, is clocked at 900MHz (925MHz with the turbo BIOS), while the Devil 13’s GPUs run at 925MHz (1000MHz with the turbo BIOS). Note that the company didn’t use AMD’s recently launched Tahiti XT 1GHz Edition chips. Apart from higher clocks, the new Tahiti XT chips come with PowerTune with Boost, which is similar to Nvidia’s auto-overclocking when there’s thermal and power headroom. Luckily, the Devil 13’s turbo BIOS means 1000MHz, so users won’t really notice a meaningful difference.

The Devil 13 HD 7990 card comes in special packaging with a few gifts thrown in, which we’ll discuss in more detail next.

GPU-Z confirmed the Devil 13 runs at 925MHz and that loading the turbo BIOS ups it to 1000MHz. Unfortunately, switching BIOSes can’t be done on the fly; although the turbo switch can be flipped whenever you wish, the new BIOS will load only after you restart your computer. The turbo BIOS switch is placed on the I/O panel for accessibility.

Inside the pretty large box are a few standard items you’ll need, as well as a few surprises.

The package has no pictures of the card, but does it really need any? It’s sleek and modern, while hinting at the beast inside. The Devil 13 HD 7990 comes with a three-year warranty.

The Devil 13 HD 7990 looks powerful. It’s huge though, measuring 31.75cm (12.5 inches) in length and 13.8cm in height. It takes up three slots and weighs in at 1.77kg, believe it or not. It goes without saying that the card will have to be secured well, which is why the company bundles the PowerJack.

The fans take up one slot, while the heatsink takes up two. In between the two 85mm fans is a single 75mm one. Fan RPM can be regulated via Catalyst Overdrive. All the fans are connected via a single 4-pin connector.

The PCB carries two Tahiti XT graphics processors. We’re talking about first generation chips that run at 925MHz, rather than the 1GHz Edition versions. The internal bridge (PEX 8747 from PLX) is in charge of communication between GPUs and comes with its own little heatsink.

Close to the CrossFire connector is another that resembles a fan connector, but it’s actually the BIOS connector that’s routed to the turbo switch on the I/O panel. The turbo switch will overclock the GPU by 8 percent, i.e. from 925MHz to 1000MHz, and will glow once pressed. Note that the new BIOS will not load until you’ve restarted the machine.

The card packs 6GB of GDDR5, so each GPU has 3GB at its disposal. The memory is Hynix’s and runs at 1375MHz. However, it’s specified to run at 1500MHz, so we expect some overclocking headroom.

Considering it’s a three-slot cooler, we expected the heatsink to be bigger. Taking the cooler apart requires a lot of screwdriver work and patience. The screws that keep the cooler attached are on the back of the card, and each of them has four washers for better grip.

The Devil 13 HD 7990 needs three 8-pin power connectors, which, coupled with PCI-Express power, provides up to 525W. Both GPUs have 6-phase power. On the back of the card are 6 LEDs that display power status.
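The 525W figure lines up with the nominal PCI-Express power budgets (150W per 8-pin connector plus 75W from the slot) – a quick sketch, assuming those standard spec limits rather than measured draw:

```python
# PCI-Express power budget sanity check (nominal spec limits, not measured draw).
EIGHT_PIN_W = 150   # nominal limit per 8-pin PCIe power connector
SLOT_W = 75         # nominal limit for the PCI-Express x16 slot

total_w = 3 * EIGHT_PIN_W + SLOT_W
print(f"Maximum power budget: {total_w}W")   # 525W
```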

The rest of the HD 7900 series cards have a similar connector layout, i.e. two DVIs, two mini-DisplayPorts and one HDMI. The Devil 13, however, comes with an active mini-DisplayPort-to-DVI dongle, which means the card can drive three displays via DVI, and all the outputs can be used at the same time.

The backplate covers the back of the card and its main functions are to secure the card and make it stable, while looking good in the process. Some of the memory chips ended up on the back as well.

The dual-GPU Devil 13 HD 7990 comes with a turbo switch that overclocks the GPUs from 925MHz to 1000MHz. The switch doesn’t affect memory, which remains at 1375MHz (5500MHz GDDR5 effective).
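The numbers check out: the turbo switch amounts to roughly an 8 percent GPU bump, while GDDR5’s quad data rate turns 1375MHz into the quoted 5500MHz effective figure. A quick sketch of the arithmetic:

```python
# Verifying the turbo bump and effective memory clock from the quoted specs.
base_gpu_mhz = 925
turbo_gpu_mhz = 1000
mem_mhz = 1375

gpu_boost_pct = (turbo_gpu_mhz - base_gpu_mhz) / base_gpu_mhz * 100
effective_mem_mhz = mem_mhz * 4   # GDDR5 transfers four bits per pin per clock

print(f"Turbo bump: {gpu_boost_pct:.1f}%")            # 8.1%
print(f"Effective memory: {effective_mem_mhz}MHz")    # 5500MHz
```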

Note that the memory is specified at 1500MHz and we had no trouble upping the clock from 1375MHz to this number. The GPU cooperated quite well too, and we easily overclocked it to 1125MHz without any voltage or fan RPM changes.

PowerColor has its own overclocking tool dubbed the Powerup Tuner, which offers basic options to overclock the GPU, memory or manage fan RPM.

The Devil 13 HD 7990 we received had no issues with GPU thermals, unlike the first batch of these cards that was sent out to reviewers. It’s strange how the problem went largely unnoticed at first, but thankfully it didn’t require redesigning the cooler. Namely, the GPU overheated because the cooler wasn’t tightened enough, but the company solved this by adding extra washers, so retail versions of the card have normal thermals. You can check out the washers below.

When idle, the Devil 13 isn’t fully inaudible, but you won’t hear it much either. Once the GPUs are loaded, the fans get noticeably loud, but still far from unpleasant because the noise isn’t high pitched. The GTX 690 is quieter, although it has dual-slot cooling with a single fan, while the Devil 13’s cooler takes up three slots and has three fans. You can manage fan RPM via most popular tools.

If there’s anything good about loud fans, it’s quality thermals: GPU temperatures didn’t exceed 76°C. After we overclocked the card to 1125MHz on the GPU and 1500MHz on the memory, the figure increased by two degrees (78°C).

The PowerColor Devil 13 HD 7990 quickly took the top spot when it comes to power consumption. Nvidia’s GTX 690 ended up on average some 50W better than the Devil 13. Once the turbo BIOS is activated, the card consumes 10 to 15W more, but this is to be expected as it also involves an 8 percent GPU clock boost.

The PowerColor HD 7990 Devil 13 is the first card to combine two Tahiti XT graphics processors on a single PCB. We must say we expected AMD to come up with such a card first, but that wasn’t the case, and many partners are already preparing to launch their own HD 7990 cards. In fact, HIS’ HD 7970 X2 is almost done, so PowerColor’s Devil 13 will soon get a worthy opponent from its own family.

The Devil 13’s fiercest adversary though is the GTX 690, which turned out superior in quite a few fields. When it comes to performance, there are no clear cut winners, at least not as decisively as we used to see. The Devil 13 HD 7990 will wipe the floor with the GTX 690 in some games, but the GTX 690 repays the “favor” in others. Nvidia’s champ leads when it comes to silence, consumption and driver stability.

We got the impression that the GTX 690’s SLI runs better than the HD 7990’s CrossFire. However, the latest application profile solves certain CrossFire issues in some games, the most notable in our case being Crysis 2.

The Devil 13 HD 7990 runs at 925MHz for the GPU and 1375MHz for GDDR5 memory, but switching on turbo BIOS will overclock the GPU to 1000MHz. We didn’t expect much when it comes to further overclocking potential, but overclocking the GPU to 1125MHz was a breeze.

Of course, all that muscle comes at a price, and the Devil 13 HD 7990 is the hungriest card we’ve had in our tests lately. Due to its high power demands, the Devil 13 has three 8-pin power connectors.

The triple-slot cooler looks powerful and does a pretty good job, but we were slightly disappointed with noise levels. The fans, three of them to be exact, are noticeably louder than the GTX 690’s single fan.

The Devil 13 HD 7990 can drive five monitors (up to three DVI displays thanks to bundled active mini-DP-to-DVI dongle).

PowerColor’s HD 7990 that runs at 900MHz (slower than the Devil 13) costs around €880, which is what the cheapest GTX 690 goes for. The Devil 13 HD 7990, with its turbo overclocking, is priced €50 higher.

PowerColor made and launched the HD 7990 on its own, and this effort should not go unrewarded. It is thanks to this company that we got to see a Radeon with two Tahiti XT graphics processors, and the GTX 690 got itself a worthy opponent.

Certified Windows 8 drivers are already available for graphics cards from both sides, AMD and Nvidia. We decided to use the latest beta drivers that, although currently lacking certificates, promise better performance. We used dual GPU graphics cards because they put greater demands on the driver.

The dual-GPU HD 7990 ended up significantly slower in some tests in Windows 8 than in Windows 7, so we checked the results with a single-GPU HD 7950 as well. Note, however, that AMD has not yet certified any drivers for its dual-GPU HD 7990, because the card hasn’t been officially introduced yet. PowerColor is the first AMD partner that went for the HD 7990, dubbing it the Devil 13 and showing just what two Tahiti GPUs are capable of. We’ve heard of a few other partners that introduced the card, but apparently it is a PowerColor card under another name.

If you’re planning on strapping your rig with an HD 7990, you should know it has certain issues in PCI-E 3.0 slots with some motherboards. This can be fixed with a BIOS update (if one is available for your motherboard). We used the HD 7990 with EVGA’s X79 FTW motherboard, but not before we disabled GEN 3 support in the motherboard BIOS.

We had no visible microstuttering while gaming, but the HD 7990 wouldn’t run Crysis 2 in CrossFire under Windows 7. There is an application profile that’s supposed to fix this, but by the time we got down to it, we had already exceeded the number of allowed installations for the game. EA support is currently offline (as it has been for the previous two days), so we cannot investigate the matter at this time. We tried our luck with EA’s German support team, but gave up after half an hour of waiting on hold. Unfortunately, this isn’t the first time we’ve had to call tech support over authentication, and we really feel for gamers who have to go through this.

As you can see from the results below, gaming results in Win 7 and Win 8 are pretty much the same.

Gaming under Win 7 or Win 8 largely depends on graphics drivers, so Nvidia and AMD made sure to provide quality drivers for Windows 8. In general, gaming with Win 8 isn’t much different, although we’re sure many will stick to the lucky 7.

Gainward’s GTX 690 graphics card is based on Nvidia’s reference design, just like the rest of the air-cooled GTX 690 cards out there. Nvidia openly stated that they don’t want the card’s air cooler replaced, although they did allow EVGA to strap it with water cooling. You can already find many different water blocks, as well as a few air coolers such as the Arctic Twin Turbo GTX 690, which launched a few days ago.

We’ve already tested the GTX 690, but the card performed slightly better today. However, it is not due to overclocking but rather due to the new drivers.

The GTX 690 uses two GK104 GPUs, part of the Kepler family. The base GPU clock stands at 915MHz, which is 9% lower than the GTX 680’s base clock (1006MHz). In fact, this is the only difference compared to the GTX 680’s GPU – the rest of the specs reveal that the GK104 used on the GTX 690 is identical to the one on GTX 680 cards. There’s no difference in memory either, as the bandwidth is identical: the memory runs at 6008MHz (GDDR5 effective) and each GPU has 2048MB of GDDR5 at its disposal.
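Both claims – the 9 percent clock deficit and the identical per-GPU bandwidth – are easy to verify from the published clocks. A hedged sketch using only the figures quoted above:

```python
# Sanity-checking the GTX 690 vs GTX 680 clock and bandwidth claims.
gtx690_base_mhz = 915
gtx680_base_mhz = 1006

clock_deficit_pct = (gtx680_base_mhz - gtx690_base_mhz) / gtx680_base_mhz * 100
print(f"GTX 690 base clock deficit: {clock_deficit_pct:.0f}%")   # 9%

# Per-GPU bandwidth from a 256-bit bus at 6008MHz effective
bandwidth_gbs = 6008 * (256 / 8) / 1000
print(f"Per-GPU bandwidth: {bandwidth_gbs:.2f} GB/s")            # 192.26 GB/s
```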

The packaging that Gainward used for GTX 680/670 cards is already quite bulky, so there was no need to make a bigger one. The design is solid and will keep the card safe. Additionally, Nvidia’s decision to put the cards’ names on the sides made shopping for a new card much easier.

The box holds a single 2x6-pin to 8-pin power cable, driver CD and Quick Start Installation Guide. The CD includes EXPERTool graphics card management utility and a good part of the Quick Start Manual is reserved for it. Naturally, we’ll discuss this in more detail in our OC section.

The Gainward GTX 690 is a spitting image of the reference card. When it comes to this particular cooler, Nvidia has every reason to be proud, and if there were a graphics card beauty pageant, it would definitely rank highly. However, the GTX 690’s cooling deserves praise for much more than its looks – it’s efficient and isn’t loud. No wonder Nvidia likes to show this card off so much.

The cooler is two slots wide. On top of it you’ll find the “GEFORCE GTX” logo, with an LED underneath that glows when the PC is on.

The fan is enclosed in a magnesium frame. Not only does this improve the looks, it also offers better thermal and acoustic properties than the more commonly used plastics and aluminum.

It’s obvious that the cooler received special treatment and every single inch of it breathes style. The side panels are made of polycarbonate and let users peek into the heatsink. The cooler actually has two separate heatsinks. The fan is placed in the middle, so as to cool both heatsinks at the same time.

Both heatsinks sit on nickel-plated vapor chambers. Vapor chamber technology enables compact cooler designs, and the GTX 690 is a perfect example of this.

Two GK104 graphics chips are linked via a PCI-Express 3.0 bridge chip (a PLX design with 48 lanes). As for power supply, Nvidia used 10-phase power for the two GPUs and two phases for the memory. The PCB is made of ten layers with 2oz of copper per layer. All the components are low-profile, so that the cooler can sit on them comfortably and reduce turbulence and noise.

The card uses Samsung GDDR5 memory. The chips (model K4G20325FD-FC03) are specified to run at 1500 MHz (6000MHz GDDR5 effectively).

The complexity of the design is evident from the densely populated back of the PCB.

The GTX 690 has four video connectors, all of which can be used simultaneously, and the card has plenty of power to run 3D on three displays. The DVI connectors, all three of them, are dual-link capable. Two of them are DVI-I, which means they can also drive an analog VGA display, while the third is DVI-D and cannot. The HDMI audio controller is built into the GPU, and the card comes with a DVI-to-HDMI dongle.

The GTX 690 comes with a single SLI connector, meaning two cards can be paired for quad SLI.

The Gainward GTX 690 is based on the reference design and although it won’t allow for extreme overclocking, you can get some extra performance with little to no effort. Our review sample allowed us to up the GPU base clock from the reference 915MHz to 1015MHz, an 11 percent overclock. The highest clock we hit after overclocking was 1176MHz. The memory cooperated as well, and we upped the clock by 250MHz (1000MHz GDDR5 effective). Together, this translated to up to 16 percent higher gaming performance.
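For the record, the percentages above follow directly from the clocks – this is just arithmetic on the figures quoted, not new measurements:

```python
# Checking the overclocking figures quoted for the Gainward GTX 690.
ref_base_mhz = 915
oc_base_mhz = 1015
mem_bump_mhz = 250

oc_pct = (oc_base_mhz - ref_base_mhz) / ref_base_mhz * 100
effective_mem_bump = mem_bump_mhz * 4   # GDDR5 quad data rate

print(f"Base clock overclock: {oc_pct:.0f}%")              # 11%
print(f"Effective memory gain: {effective_mem_bump}MHz")   # 1000MHz
```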

Of course, we took Gainward’s EXPERTool out for a spin, as it now supports Kepler chips. Unfortunately, we could not get readings from both GPUs at the same time, but overclocking is easily done from the panel of the same name.

Nvidia is pleased with reference air cooling and so are we. In short, it’s better than any high end dual-GPU reference cooling solution we’ve tried. At the same time, the cooler doesn’t get too loud, not even when the card works full throttle, which is always a plus, especially for a card of this caliber.

The fans won’t get loud in AUTO mode, as their RPM does not exceed 2300. Only when you go beyond 2600RPM will you clearly hear them from inside your rig. The card remains almost inaudible when idle. To put things in perspective – the GTX 690 is about as quiet as a single GTX 680, and definitely quieter than two GTX 680s in SLI.

We didn’t have to speed up the fans manually for our overclocking, but we did it anyway to see how much the temperature would drop. Running at 1170MHz in AUTO fan mode, the GPU went up to 84°C. We noticed that the fans weren’t spinning significantly faster than at reference clocks, which explained the higher GPU temperature.

Once we maxed out the fans, we measured 77°C. Note that the maximum allowed RPM, which is 2900, made the fans quite loud, but not what we’d call unbearable.

Power Consumption

Consumption is significantly lower – as much as 90W lower than what a GTX 680 SLI setup would draw.

Nvidia did a great job, and Kepler has proven its dominance over the Fermi generation in both power consumption and thermals. These two advantages in particular are what made it possible to build a dual-GPU card that’s quieter than all the earlier dual-GPU cards, while being faster at the same time.

Gainward could perhaps have personalized the cooler, but we won’t split hairs, and besides – if it ain’t broke, don’t fix it. The reference cooling is, dare we say it, sexy and offers good performance at decent noise levels. In fact, considering this is a dual-GPU card, the thermals-to-noise ratio is quite nice. Additionally, the GTX 690 is quieter than two GTX 680 cards in SLI.

As far as performance goes, the GTX 690 is slightly behind two GTX 680s in SLI, mostly due to lower GPU clocks. In fact, the GTX 690’s GPUs run 9 percent slower than the ones on GTX 680 cards, which is practically the only difference between these GPUs.

Overclocking potential isn’t as good as the GTX 680’s, but we managed to squeeze out up to 16 percent better performance with very little effort. When it comes to consumption, the GTX 690 beats a GTX 680 SLI system by almost 90W.

The GTX 690’s performance is mouth-watering, and although some may say it’s overkill for gaming, remember that 1920x1080 is a joke for what this card can do. If you want to play at 2560x1600 or are planning some surround gaming, then you’ll want only the best. Two overclocked GTX 670 cards may be more affordable, and the same goes for two GTX 680s in SLI, but the GTX 690 offers a serious punch in a single package.

Gainward GTX 690 currently goes for about €900, which is a big plus as it cost €1000 only a few months ago. Not your average pocket money for sure, but not your average budget card either. Besides, we’re talking about the fastest graphics card on the market and that sort of performance never came cheap. In conclusion, if you’re looking for the leanest and meanest card on the market, Gainward GTX 690 has your name written all over it.

According to a post over at Hardware.fr and other industry sources, PLX and its PEX8747 PCI-Express Gen 3.0 bridge may be the reason for delays seen in GPU and motherboard markets.

PEX8747 is currently the only PCI-Express Gen 3.0 bridge chip on the market. It has been used on most high-end motherboards, Nvidia's dual-GPU GTX 690 graphics card and most probably the recently pictured dual-GPU HD 7970 X2 graphics card as well.

According to available info, PLX has experienced high demand for the chip. While some companies managed to get the initial batch, others were not so lucky.

The same issue could be behind the HD 7990 delay. As you already know, there has been a lot of talk about that card, and although we heard rumors it could show up at Computex, those rumors quickly died down.

Of course, AMD could use this delay to pair up some Tahiti XT2 chips for the company's HD 7990 cards, bringing higher clocks and lower power consumption to the table. Nvidia, on the other hand, is stuck with the launched GTX 690 which is really hard to find.

Of course, PLX is probably ramping up production and things should be better eventually.

Nvidia may have given the red light to custom GTX 690 graphics cards, but that didn’t stop Asus from coming up with its own Mars III dual-GK104 solution, which features no less than 8GB of memory on a single PCB.

As far as the specs go, the new Asus Republic of Gamers Mars III graphics card features two 28nm Kepler GK104 GPUs that, at least according to various sources, work at clocks that match if not exceed those of the GTX 680. In order to keep those two GPUs well fed with power, the Mars III needs three 8-pin PCI-Express power connectors, something we are also seeing on custom HD 7970 X2 graphics cards. Not to disappoint on the memory side, Asus managed to squeeze in a total of 8GB of memory, 4GB per GPU.

The new ROG Mars III graphics card is cooled by a hefty, but still dual-slot, cooler with no less than three fans. It features three DVI outputs and one mini-DisplayPort.

As was the case with previous ROG Mars graphics cards, this one will also be a limited edition produced in limited quantities. Asus ROG Mars cards have been known for their insane price tags, and most probably this one will not come cheap either. It will most likely end up much more expensive than your average GTX 690 and is expected to show up sometime later this year.

The Geforce GTX 690 launched on May 3, but it wasn’t until recently that the first batches of cards were shipped to stores. So far you’ll only find reference design cards, but we’re sure that water blocks and special coolers will crop up soon. Nvidia’s GTX 690 launched at $999, which is quite saucy, especially to the average gamer. However, the golden rule applies – no pay, no play. Well, at least not at these framerates.

The GTX 690 is quite scarce in Europe, with prices starting from €934.92. To make matters worse, the cards actually in stock are priced at €958.50 and higher. Today, we’ll preview EVGA’s GTX 690 4GB, based on the reference design.

We’re not sure any aesthetic effort, especially at such short notice, would’ve surpassed Nvidia’s design – the picture below speaks louder than words.

The only difference from the reference design is that EVGA made a LED controller utility to control the lighting. Please note that this utility is ONLY compatible with the EVGA GeForce GTX 690.

EVGA currently offers two GTX 690 cards. One of them is the standard EVGA GTX 690, which we’re testing, while the other is EVGA’s Signature Edition. Naturally, EVGA has more aces up its sleeve and hinted that a special model is also in the works.

Both the standard and the Signature Edition share the same specs, including a total of 3072 CUDA cores, 915MHz base and 1019MHz boost clocks, and a total of 4GB of GDDR5 memory (2GB per GPU) clocked at 6008MHz and paired with a 256-bit memory interface for each GPU. If you recall, the memory’s reference clock is the same as on the GTX 680; the GTX 680’s GPU, however, runs at 1006MHz.

The Signature Edition doesn’t change anything spec-wise, but throws in a special bundle that includes a limited edition EVGA gaming poster, a GTX 690 Signature t-shirt and an EVGA 690 Signature mousepad. The plain version is currently listed at EVGA’s web shop for US $999.99, while the Signature Edition goes for US $1049.99. We can’t help but wonder whether the bundle justifies the $50 premium, even for those who make EVGA their brand of choice.

GPU Boost works on GTX 690 as well, so you can easily expect GPU clocks beyond 1GHz while gaming. Below you’ll find GPU clocks we got playing Metro 2033 at 2560x1600. As you can see, clocks went up to 1058MHz.

The LED under the “GEFORCE GTX” logo normally shines constantly, but EVGA allows users to play around a bit. The picture below shows the LED controller.

The first drop-down lets us select the card logo we want to adjust (an option for quad-SLI systems), while the second offers various lighting modes. IMPORTANT NOTE: Anything other than manual control requires EVGA Precision X to be running in the background. For FPS monitoring, make sure that you have FPS monitoring enabled in EVGA Precision X.

We were very pleased with noise levels; while you’ll hear the fan from a closed case, it’s not particularly loud. Thermals are on par with the single-GPU GTX 680. Bear in mind, though, that the GTX 680’s GPU runs at higher clocks, while the GTX 690 carries two GPUs clocked lower. All in all, thermals and noise levels are good.

The card has two dual-link DVI-I outs, one dual-link DVI-D out and one mini DisplayPort out. EVGA used protective dust caps for the connectors.

We played Metro 2033 at 2560x1600 with maxed out effects. Compared to GTX 680, we see a whopping 76 percent better performance. Once we’re done with the full review, we’ll have GTX 680 SLI results that will paint an even better picture of the GTX 690, so stay tuned.

All GTX 690 cards are clocked at 915MHz per GPU with 1502MHz memory, and come with a 256-bit memory interface, 1536 stream processors and 128 texture units per GPU. This card is way too expensive for our taste, but it is at least a neater solution than two GTX 680 or two Radeon HD 7970 cards.

We did a quick check of GTX 690 pricing using our favourite price search engine and, although not actually available, the GTX 690 is listed all over Europe, with the lowest price standing at €927.76.

The cheapest one comes from MSI and is listed at the Austrian e-tailer Mylemon.at. Alternate is currently listing a total of four cards, with prices ranging from €979 to €999. The situation is the same with every major retailer/e-tailer around, and the most common price is €999.

The UK-based sellers Overclockers and Scan are pretty much the same, and the card is still not available. Overclockers.co.uk lists its own branded card priced at £839.99 incl. VAT, but most cards are listed somewhere between £869.99 and £899.99 incl. VAT.

Of course, all currently listed GTX 690 graphics cards are based on Nvidia’s reference design, and we are quite sure it will stay that way for quite some time. We already wrote that this one might show up together with the GTX 670 or on the 10th of May, despite the fact that we had a chance to see reviews a few days ago. There is no doubt that it will be the scarcest graphics card around, due to its high price, low demand, and the fact that the availability situation with GK104 and GTX 680 cards is also not that great.

Following yesterday’s GTX 690 launch, EVGA has announced its own lineup based on this dual-GPU beast, including the standard EVGA GTX 690 and a Signature Edition card, as well as a hint that a special model is also in the works.

Both the standard and the Signature Edition share the same specs, including a total of 3072 CUDA cores, 915MHz base and 1019MHz boost clocks, and a total of 4GB of GDDR5 memory (2GB per GPU) clocked at 6008MHz and paired with a 256-bit memory interface for each GPU.

As expected, considering Nvidia certainly made quite an effort in designing this graphics card, the Signature Edition doesn’t change anything but throws in a special bundle that includes a limited edition EVGA gaming poster, a GTX 690 Signature t-shirt and an EVGA 690 Signature mousepad.

On its GTX 690 product page, EVGA also hinted that there is some sort of "special edition" coming, and we must say we are certainly looking forward to it, although Nvidia was pretty keen on saying that we won’t see that many custom cards.

The plain version is currently listed at EVGA's web shop for US $999.99 while the Signature Edition goes for US $1049.99. You can check out the EVGA product page here.