Currently available only from the Nvidia Store, the new SLI LED bridges are based on a rigid design, so the graphics cards need to sit in specific slots in order for the bridges to work. Nvidia has launched three of them: a 2-way, a 2-way spaced and a 3-way SLI LED bridge.

The same Nvidia SLI LED bridges were spotted in systems during the Nvidia Game24 event and are now available to consumers, at least in the US and Canada.

According to Nvidia, the new SLI LED bridges will work on the Geforce GTX 770, GTX 780, GTX 780 Ti, GTX Titan and GTX Titan Black, as well as the new Geforce GTX 970 and GTX 980 graphics cards. Of course, bear in mind that these will most likely only work with reference-design graphics cards, while compatibility with custom designs is less certain.

Unfortunately, the new SLI LED bridges are currently only available at the US/Canada Nvidia Store; hopefully they will be available to the rest of the world soon. The price is set at US $29.99 for the 2-way SLI LED bridges and US $39.99 for the 3-way SLI LED bridge.

ASUS has launched a new member of the Mars lineup under its premium Republic of Gamers brand, the ASUS ROG Mars 760. Although it is not the Mars everyone expected, the new ROG Mars 760 still packs enough punch to be faster than the mighty Nvidia GTX Titan.

Based on two GK104 GPUs with 1152 CUDA cores each, for a total of 2304, the new ROG Mars 760 is essentially a GTX 760 SLI setup on a single PCB. These specifications, paired with base/Boost GPU clocks of 1006MHz/1072MHz on both GPUs, 4GB of GDDR5 memory (2GB per GPU) clocked at 6000MHz and a dual 256-bit memory interface, were enough to make it faster than the GTX Titan and, in some cases, even faster than the GTX 780 Ti. The GPUs are interconnected via the PLX PEX8747 PCI-Express 3.0 bridge.

With an impressive 12-phase DIGI+ VRM, Super Alloy Power components and Nichicon GT-series Black Metallic capacitors, the new ROG Mars 760 should also offer good stability as well as enhanced overclocking potential. It is paired with a dual-slot DirectCU II cooler featuring two separate heatsinks (one for each GPU), two fans that should keep it well cooled even under load, and a glowing MARS LED logo.

It supports SLI, in case you want to run a pair in a quad-SLI configuration, and offers two dual-link DVI-I outputs, one dual-link DVI-D output, one Mini DisplayPort output and an HDMI output via a dongle.

According to ASUS, and unlike previous ROG Mars products, the Mars 760 will not be a limited 1,000-piece run, but rather a regular graphics card aiming to bring the Mars down to a much more reasonable level when it comes to price and affordability. Unfortunately, ASUS has not yet released any details regarding the actual price or availability date.

As we demonstrated in our previous GTX 650 Ti Boost reviews, from Gainward’s and EVGA’s stables, Nvidia’s new card seems to tick all the right boxes and hit the sweet spot. It can cope with 1080p gaming at high detail settings, it can deliver reasonable frame rates and it doesn’t cost an arm and a leg.

Performance is adequate for most titles at 1080p, and even when the GTX 650 Ti Boost runs out of steam, it is relatively easy to tinker with detail levels and get playable frame rates. It is quite simply a very good choice for gamers on a budget who don't want to sacrifice much visual quality, but don't want to break the bank on a high-end card either.

On top of that, the GTX 650 Ti Boost is the cheapest SLI-capable card on the market. That doesn't sound like much of a selling point in this market segment, but the Boost is a relatively capable card, so the decision to include SLI support is interesting. True, it might be better to invest €300 in a single high-end Radeon or Geforce rather than go for an SLI setup with two GTX 650 Ti Boost cards, but then again SLI could come in handy when it's time to upgrade.

Before we take a look at the SLI results, let's examine the GTX 650 Ti Boost itself. Based on the same GK106 silicon used in the plain GTX 650 Ti and the GTX 660, the GTX 650 Ti Boost combines the best of both worlds. It has 768 CUDA cores, somewhat fewer than the GTX 660's 960, because one SMX module is disabled on both the GTX 650 Ti and the GTX 650 Ti Boost.

The GTX 650 Ti Boost features 64 texture units, while the GTX 660 has 80. However, the Boost card shares some features with the GTX 660 as well, namely Nvidia’s GPU Boost technology, the 192-bit memory bus, 2GB of GDDR5 memory and 24 ROPs.

As the name suggests, GPU Boost allows the card to boost GPU clocks depending on load and thermals. Thanks to the new bus and faster memory chips, the new card has 66 percent more bandwidth compared to the plain GTX 650 Ti, which uses a 128-bit memory bus. This is clearly a huge difference.
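A quick back-of-the-envelope check confirms the figure. The sketch below is a rough illustration in Python, assuming the reference GDDR5 data rates of roughly 5400MT/s for the plain GTX 650 Ti and 6008MT/s for the Boost:

```python
# Peak GDDR5 bandwidth: bus width in bytes times effective data rate.
def bandwidth_gbps(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_mtps * 1e6 / 1e9

plain = bandwidth_gbps(128, 5400)  # GTX 650 Ti: ~86.4 GB/s
boost = bandwidth_gbps(192, 6008)  # GTX 650 Ti Boost: ~144.2 GB/s
print(f"{plain:.1f} GB/s vs {boost:.1f} GB/s -> +{(boost / plain - 1) * 100:.0f}%")
# ~86.4 GB/s vs ~144.2 GB/s -> +67%, within rounding of Nvidia's 66 percent
```

The wider bus alone accounts for a 50 percent gain; the rest comes from the faster memory chips.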

The GTX 650 Ti Boost SLI wasn't too loud, but it was clearly audible. For comparison, the XFX HD 7950 Black Edition was significantly quieter and it used up less power. Of course, the level of noise varies from card to card, depending on the AIB. In this case, XFX did a very good job with the 7950 Black Edition. The dual fan cooler makes XFX's card one of the quietest HD 7950 cards on the market. We used two GTX 650 Ti Boost cards from two different AIBs, EVGA and Gainward. Neither was too loud, but of course, an SLI setup is a lot louder than a single card.

In terms of power consumption, two cards tend to be a lot less power efficient than a single, more powerful card. Fortunately the GTX 650 Ti Boost isn't much of a power hog and the results were acceptable, provided you have a 650W PSU or better.

Judging by the benchmarks, two GTX 650 Ti Boost cards in SLI can deliver some top notch results at 1080p. The cheapest GTX 650 Ti Boost sells for about €150-160, which means you should be able to get two for €310 plus shipping, of course. An HD 7950 costs €260, while the GTX 670 goes for €310. The GTX 650 Ti Boost SLI setup clearly outpaces the HD 7950, but then again the HD 7950 costs about €40-50 less.

The dilemma is whether there is any point in getting two weaker cards chained in SLI rather than a single, more powerful card. Unsurprisingly, most users will go for the latter option.

However, the addition of SLI on an affordable mid-range card might be a boon for gamers on a budget. It basically means that you can get a €150 card and upgrade your system down the road with a second card, probably at a much lower price. That’s why the decision to include SLI on the Boost makes sense. In case you decide you need more performance after all, upgrading your rig a year from now with a second card should be easy. In contrast, getting a second €250-300 card is a bit more painful for the old credit card.

On the whole, SLI and CrossFire drivers work very well, much better than a few years ago. Driver profiles for new games are being constantly updated, so you shouldn’t have any issues with upcoming titles. In other words, consumers should not avoid SLI or CrossFire. They are no longer reserved for ultra high end gaming rigs and with practically no driver issues they are a perfectly sound choice for the average gamer.

EVGA and Vince "k|ngp|n" Lucido have set a high bar in the new 3DMark Fire Strike benchmark, claiming a new world record in every category: Fire Strike and Fire Strike Extreme in SLI mode, as well as Fire Strike and Fire Strike Extreme on a single card.

EVGA noted that its Precision X 4.0 utility was used to overclock the graphics cards and that the records were obtained using the upcoming EVGA X79 Dark motherboard and two EVGA Geforce GTX Titan graphics cards. The results are certainly impressive: according to the 3DMark Fire Strike Hall of Fame, Kingpin was running an Intel Core i7-3930K CPU and managed to score 22,054 points in 3DMark Fire Strike with two GTX Titan graphics cards and 11,559 points in Fire Strike Extreme, again in SLI.

With a single GPU, Kingpin and EVGA's hardware scored 14,600 in 3DMark Fire Strike and 7,308 in 3DMark Fire Strike Extreme. Currently, Kingpin is firmly holding first place, followed closely by Shammy with Asus hardware.

So far we've had a few GTX 670 graphics cards in our tests. Throughout the tests, we've seen that overclocked GTX 670 cards can score comparably to the GTX 680, and what's even better, many of Nvidia's partners offer factory-overclocked GTX 670s running at GTX 680 clocks. One such card is Gainward's GTX 670 Phantom 2GB.

The Phantom's GPU runs at 1006MHz while the memory is at 1527MHz. Note that the reference clocks are 915MHz for the GPU and 1502MHz for memory.

An important difference compared to the GTX 680 is the SMX count: the GTX 680 comes with eight SMX units and 1536 CUDA cores (each unit contains 192 CUDA cores), while the GTX 670 has seven SMX units and 1344 CUDA cores. Nvidia kept the memory subsystem identical to the GTX 680's, meaning four 64-bit memory controllers (a 256-bit memory interface) and 2GB of GDDR5 memory.

We already tested the GTX 670 Phantom, here, and today we'll show you what it's capable of when running in SLI. What we're particularly interested in is the price-performance ratio. We're hoping that two GTX 670 Phantom cards will score similarly to two GTX 680s, which would mean they outscore a single GTX 690. Note that buying two GTX 670 Phantom cards can save you up to 250 euro compared to two GTX 680 cards or a single GTX 690.

With the launch of the GTX 670, Gainward released a new version of its ExperTool utility, now in version II. ExperTool allows for overclocking Kepler-based graphics cards, displaying sensor readouts and simple fan RPM control, and it looks much better than previous versions. You can find it here. We must say it would have been great if Gainward had thrown in sensor readouts for the second card in an SLI chain as well.

The card comes in a really neat-looking box with a carrying handle.

The Phantom cooler has a style of its own. The fans are hidden and can only be seen when looking straight at the card.

Three heatpipes are in charge of transferring heat from the cooler's base to the heatsink, while two 8cm fans take care of cooling the heatsink.

The GTX 670 Phantom comes with two dual-link DVI ports, plus standard HDMI and DisplayPort connectors. The card can drive up to four displays simultaneously.

The cooler base is made of aluminum instead of the more commonly used copper.

The GTX 670 Phantom is about 24.7cm long, which is about the same as the reference GTX 670, but the Phantom cooler takes up three slots, while the reference cooler is only two slots wide. This can prove to be a problem if you are planning on 3-way SLI.

Nvidia opted for a minimalistic PCB, which is only 17.2cm long; the cooler is actually to blame for the GTX 670's 24.7cm length. As you can see from the pictures below, the Phantom's PCB is slightly changed: all the memory chips are placed on the GPU side, whereas the reference design places odd and even memory chips on opposite sides of the PCB.

Apart from the different distribution of the memory chips, Gainward's PCB looks similar to Nvidia's reference PCB, shown in the picture below.

Nvidia GTX 670 2GB

The GTX 670 comes with 2GB of memory spread across eight memory chips, just like the reference card. Gainward's GTX 670 Phantom uses Hynix memory chips (model No. H5GQ2H24AFR-R0C), which are specified to run at 1500MHz (6000MHz effective GDDR5).
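To make sense of those two figures: GDDR5 transfers four bits per pin per clock, which is why the rated clock is usually quoted at four times its value. A minimal sketch of the arithmetic, in Python:

```python
# GDDR5 moves 4 bits per pin per clock, hence the "effective" figure
# being four times the rated clock.
MEMORY_CLOCK_MHZ = 1500                # Hynix H5GQ2H24AFR-R0C rated clock
effective_mtps = MEMORY_CLOCK_MHZ * 4  # 6000 MT/s, i.e. "6000MHz effective"

# Over the GTX 670's 256-bit (32-byte) bus, that works out to:
bandwidth_gbps = 256 / 8 * effective_mtps * 1e6 / 1e9
print(f"{effective_mtps} MT/s -> {bandwidth_gbps:.0f} GB/s")  # 6000 MT/s -> 192 GB/s
```

The Phantom's factory memory clock of 1527MHz nudges that to roughly 195GB/s.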

Gainward GTX 670 Phantom’s cooler has to deal with the factory overclock but it manages to do its job well, at least when it comes to keeping thermals in check. Our test with a single GTX 670 Phantom graphics card revealed temperatures up to 79 degrees Celsius, but SLI drove the first card to 85 degrees Celsius.

When it comes to noise levels, a single GTX 670 Phantom is about the same as the reference GTX 670, and cooling performance and noise look even better once you consider the Phantom's high factory overclock, although we have indeed seen better. Unfortunately, our GTX 670 Phantom SLI setup ran a bit loud after some gaming. The fans weren't too loud in absolute terms, but they are louder than two GTX 680s in SLI.

The fans are quiet in idle mode.

For overclocking we left the fans in AUTO mode, since manual settings didn't affect overclocking results much. Thermals were good even after our overclock, but the fans get really loud.


With two GTX 670 graphics cards you can save more than 200 euros and get performance similar to GTX 680 SLI. You would need to overclock those GTX 670 cards of course, but for those who do not want to deal with overclocking, two GTX 670 Phantom cards are a viable option: the GTX 670 Phantom sports a factory-overclocked GPU set at 1006MHz, which is exactly the same clock used on the GTX 680.

Performance of a single GTX 670 Phantom is close to that of the GTX 680 but not quite equal, mainly because the GTX 680 has 1536 CUDA cores to the GTX 670's 1344. We haven't noticed any significant difference in games, except in tessellation-heavy tests. The memory subsystems on both cards are the same 256-bit ones, and each card has 2GB of GDDR5 memory. As expected, GTX 670 SLI power consumption is a bit lower than that of GTX 680 SLI.

The performance boost we got with SLI is great; we could play any game at 2560x1600. Additional overclocking headroom is similar to what we saw with a single GTX 670 Phantom card.

The only thing we did not like about the GTX 670 Phantom SLI is the fan noise. The fans are not too loud, but they are not exactly comfortable either, and two GTX 680 cards in SLI are a bit quieter.

If you value quiet operation and low power consumption, the best decision would be to go for the GTX 690. However, 1000 euro buys two GTX 680 cards or a single GTX 690, while 760 euro for two GTX 670 Phantom cards sounds like a much more reasonable choice for most of us.

In short, we have just shown that, performance-wise, two factory-overclocked GTX 670 Phantom cards can hold their own against GTX 680 SLI, and they are certainly a more affordable option. Bear in mind, though, that the Phantom cooler is three slots wide.

Nvidia has leaked a video showing one of its latest mobile GPUs in some Crysis 2 SLI action.

Although the company has not revealed the details, we are apparently looking at a dual GTX 580M setup in a beastly Clevo 17-incher. The GTX 580M packs 384 cores clocked at 620MHz/1240MHz. The memory clock stands at 1500MHz on a 256-bit bus, and bandwidth is rated at 96GB/s. Basically, we are looking at the fastest mobile gaming GPU around.
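Those figures are self-consistent, though note that the mobile part's memory clock is quoted at a different multiplier than desktop GDDR5 specs. Working backwards from the rated bandwidth (a quick sketch; the x2 interpretation is our reading of the numbers, not something Nvidia spells out):

```python
# 96 GB/s over a 256-bit (32-byte) bus implies a 3000 MT/s data rate,
# i.e. the quoted 1500MHz counted at a DDR (x2) rate rather than the
# x4 "effective" convention used for desktop GDDR5.
BANDWIDTH_GBPS = 96
BUS_BYTES = 256 // 8
data_rate_mtps = BANDWIDTH_GBPS * 1e9 / BUS_BYTES / 1e6
print(f"{data_rate_mtps:.0f} MT/s")  # 3000 MT/s = 1500MHz x 2
```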

In addition, Nvidia will also introduce the GTX 570M, which boasts impressive performance as well. It has 336 cores and it’s clocked at 535MHz/1070MHz. Memory speed, bus and bandwidth are the same as on the GTX 580M.

There is still no word on TDPs or pricing, but these GPUs are clearly aimed at high-end gaming laptops, i.e. very pricey and heavy designs.

Nvidia is expected to launch the GTX 580M and GTX 570M shortly, the rumor mill says sometime next week. You can check out the video after the break.

Zotac has launched yet another card that will be a part of its Multiview series of graphics cards, the GTX 550 Ti Multiview.

Multiview series graphics cards are pretty much identical to the regular ones, except that they support up to three displays connected at the same time. The new GTX 550 Ti features 192 CUDA cores and ticks at 900MHz for the GPU, 1800MHz for the shaders and 4100MHz for 1GB of GDDR5 memory paired with a 192-bit memory interface. It comes with two DVI, one DisplayPort and two HDMI outputs; three displays are connected via the two HDMI ports and either the DVI or DP port.

The card still uses the dual-slot cooler seen on other Zotac cards and supports 2-way SLI, DirectX 11, Nvidia CUDA and PhysX. Unfortunately, Zotac didn't reveal any details regarding the card's availability or its actual price, but you can expect it to sell for about €130.

Although it was initially announced and released as an OEM-only card, Nvidia's Geforce GT 545 will find its way to retail/e-tail thanks to EVGA.

The card is based on Nvidia's reference design and features 144 CUDA cores, with the GPU working at 720MHz and 1.5GB of DDR3 memory at 1800MHz paired with a 192-bit memory interface. It supports DirectX 11, Nvidia PhysX, CUDA and 3D Vision, and can be paired with an identical card for some 2-way SLI fun. As noted, the card retains the reference look, so we are looking at a single-slot card with an active cooler and a small fan.

The new GT 545 has HDMI, D-Sub and DVI outputs, and it should start to appear in retail/e-tail shops pretty soon with a price tag of around US $150.

Although it was already rumoured back in March, Nvidia has now officially confirmed that it is licensing SLI technology to motherboard makers for use with AMD CPUs, so that AMD's upcoming, performance-oriented Bulldozer chip can be used with a pair of Green Goblin GPUs.

On its blog, Nvidia's Tom Petersen confirmed the rumours that have been around for quite some time and announced that motherboards based on AMD's upcoming 990FX, 990X and 970 chipsets will have SLI support, in case you want to go that way. Petersen mentioned Asus, MSI, Gigabyte and ASRock as the first manufacturers that will offer SLI-enabled boards, but other vendors are also expected to join.

Petersen also showed a rather peculiar picture of an Asus Crosshair V Formula motherboard box, which sports an Nvidia SLI sticker among all the other red ones, including a Phenom II sticker, something that honestly just looks weird.

It looks like you won't have to abandon your Nvidia card after all if you are waiting for some AM3+ fun. You can find Tom Petersen's blog post here.

Confirming earlier rumours, Dell's Alienware division has now rolled out the updated M11x "gaming netbook" and the new M14x and M18x notebooks. The M11x and the M14x should be available as of today, while the biggest of the bunch, the M18x, should be available sometime in May.

As rumoured earlier, the updated M11x (R3) packs the same 11.6-inch 1366x768 screen but comes with either a 1.4GHz Core i5-2537M or the more powerful 2.6GHz Core i7-2617M CPU. The graphics have been updated to a Geforce GT 540M with up to 2GB of dedicated memory and Optimus graphics-switching technology, while other specs include up to 16GB of memory, up to 750GB of HDD space (or a 256GB SSD), Gigabit Ethernet, 802.11bgn WiFi, Bluetooth 3.0, 3G/4G connectivity, a USB 3.0 port, a card reader, a 2-megapixel webcam, HDMI output and an 8-cell 63Whr battery that should be good for up to 8 hours of work time.

The M14x packs a larger 14-inch 1366x768 or 1600x900 screen, a 2nd-gen Core i7 quad-core CPU, up to 8GB of memory, up to 750GB of HDD space and pretty much the same specs as the smaller M11x. The graphics choice is a GT 555M GPU with 1.5GB or 3GB of memory. It has the same 8-cell battery, here good for 6 hours of work time. Alienware also threw 60GHz WirelessHD tech into this one, in case you want to push your content to another screen.

The largest of the bunch, the M18x, features an 18.4-inch full HD screen, Intel's Core i7 Extreme quad-core CPUs (including one that is overclocked to 4GHz), Nvidia or AMD graphics in SLI or CrossfireX, up to 32GB of memory, and all the other features you would expect from a high-end Alienware desktop-replacement notebook. This one also features WirelessHD tech.

The base price for the M11x starts at US $999, while the bigger M14x has a base price of US $1,199. Both can already be found on Dell's site, while the M18x should show its face and price (rumoured to start at US $999) sometime in May.