To give you the full name, the MSI N650Ti TwinFrozr 2GD5/OC Boost Edition is $170 after MIR, whereas the HD 7850 that [H]ard|OCP chose as its comparison card can be picked up for a mere $130 after rebate. That price difference means the NVIDIA card has to perform quite a bit better than the AMD card to win from a performance-per-dollar perspective. The numbers in the review clearly show that the 650 Ti BOOST is the better-performing card, especially with the respectable overclock that [H] managed, which does make it the best card under $200. On the other hand, if your budget is tight, the performance gap is not as large as the price gap, which might make that HD 7850 the better choice.
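To put that value comparison in concrete terms, here is a quick back-of-the-envelope calculation using the prices above (the break-even figure is simple arithmetic, not review data):

```python
# Performance-per-dollar break-even for the two cards in this article.
# Prices are the after-rebate figures quoted above.
price_650ti_boost = 170  # MSI N650Ti TwinFrozr 2GD5/OC BE, after MIR
price_hd7850 = 130       # AMD Radeon HD 7850, after rebate

# For the pricier card to match the cheaper one on performance per dollar,
# it must deliver proportionally higher frame rates:
required_speedup = price_650ti_boost / price_hd7850
print(f"Break-even speedup: {required_speedup:.0%}")  # prints "Break-even speedup: 131%"
```

In other words, the 650 Ti BOOST needs to be roughly 31% faster than the HD 7850 just to break even on value; anything less and the cheaper AMD card wins the perf-per-dollar fight.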

By the way, that NVIDIA card has a Boost clock, which means it might steal some of your megahertz away when it gets too hot; apparently that is a horrible experience, and if you somehow disable that feature and cook your GPU ... obviously that is not your fault.

"Today we evaluate MSI's high-end GeForce GTX 650 Ti BOOST line with the flagship overclocked Gaming Edition MSI N650Ti TF 2GD5/OC BE. With falling prices on AMD Radeon video cards we will compare it to the AMD Radeon HD 7850 to see which will emerge as the victor in the sub-$200 price range."

Over the past couple of months there have been several leaks about a potential NVIDIA-branded tablet based on the Tegra 4 SoC. Most speculated that NVIDIA had decided to enter the hardware market directly with a "Tegra Tab" in a similar vein to the release of NVIDIA SHIELD. As it turns out, though, NVIDIA has created an Android tablet platform that other companies can rebrand and resell.

According to NVIDIA, the Tegra Note platform will enable partners to bring 7-in tablets to market packed with the feature set NVIDIA has been promising since the launch of the Tegra 4 SoC. These include stylus support, high-quality audio, HDR camera capabilities and a 100% native Android operating system.

Maybe more interesting are the partners NVIDIA is teaming with for this launch. While companies like ASUS have already done the development work to prepare Tegra-based tablets of various sizes in the past, NVIDIA is going to introduce a couple of its graphics card partners to the mobility ecosystem: EVGA and PNY in North America.

We have questions about the ability of either of these companies to truly support a tablet in today's market, but the truth is likely that NVIDIA is handling most, if not all, of the logistics on this project. What is not in question is the potential for high value: these tablets will start at a suggested retail price of $199.

We already know most of the technical details of the Tegra 4 SoC, including the 4+1 Cortex-A15 CPU cores and the 72-core GPU. NVIDIA claims the platform will manage 10 hours of video playback, but I would like data on the weight and battery size before calling that a win. The display resolution is a bit lower than competing high-end options on the market today, but at a sub-$200 price point some corners had to be cut.

UPDATE: I asked NVIDIA for more information on the size, weight and battery capacity and got a quick answer. The battery capacity is 4100 mAh and the entire device weighs 320g. Compared to the Google Nexus 7, the current strongest 7-in tablet in my opinion, that is a 4% larger battery (vs 3950 mAh) and 10% heavier device (vs 290g). The Tegra Note reference is also a bit thicker at 9.6mm compared to the 8.65mm of the Nexus 7.
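For reference, the rounded percentages quoted above work out as follows (all numbers taken directly from the update):

```python
# Checking the Tegra Note vs. Nexus 7 comparison figures from the update.
tegra_note = {"battery_mah": 4100, "weight_g": 320, "thickness_mm": 9.6}
nexus_7    = {"battery_mah": 3950, "weight_g": 290, "thickness_mm": 8.65}

battery_delta = tegra_note["battery_mah"] / nexus_7["battery_mah"] - 1
weight_delta  = tegra_note["weight_g"] / nexus_7["weight_g"] - 1

# ~+3.8% battery (rounded to 4% in the text) and ~+10.3% weight (rounded to 10%)
print(f"Battery: {battery_delta:+.1%}, weight: {weight_delta:+.1%}")
```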

Summary of Events

In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating. At the time I was only able to talk about the process, which uses capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months I started to release data and information gathered with this technology. I followed up that story with a collection of videos that showed some of the captured footage and the kinds of performance issues and anomalies we were able to easily find.

Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was. Also, we showed that other testing tools like FRAPS were inadequate in showcasing this problem. If you are at all unfamiliar with this testing process or the results it showed, please check out the Frame Rating Dissected story above.
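For readers unfamiliar with the method, the core idea can be sketched in a few lines. This is an illustrative toy, not our actual capture pipeline; the scanline heights, the 5% runt cutoff and the sample values are hypothetical:

```python
# Toy sketch of the Frame Rating idea: in captured video, each rendered frame
# occupies some number of scanlines on screen. Frames that occupy only a few
# scanlines ("runts") inflate FRAPS-style FPS counts without actually
# improving the experience the gamer sees.
def classify_frames(scanline_heights, screen_height=1080, runt_cutoff=0.05):
    """Split frames into fully displayed frames and runts (illustrative cutoff)."""
    runts, displayed = [], []
    for h in scanline_heights:
        (runts if h < screen_height * runt_cutoff else displayed).append(h)
    return {"displayed": len(displayed), "runts": len(runts)}

# Hypothetical capture of six frames; two barely reach the screen at all.
print(classify_frames([540, 20, 530, 15, 545, 510]))  # {'displayed': 4, 'runts': 2}
```

A frame-time tool counting all six frames would report a much higher frame rate than the four frames a viewer actually perceives, which is exactly the discrepancy the capture-based approach exposed.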

At the time, we tested the 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround, but found there were too many issues with our scripts and the results they were producing to present reasonably assured performance metrics. Running AMD with Eyefinity was obviously causing some problems, but I wasn't quite able to pinpoint what they were or how severe they might be. Instead I posted graphs like this:

We were able to show GTX 680 performance and SLI scaling at 5760x1080, but we were only able to give results for the Radeon HD 7970 GHz Edition in a single-GPU configuration.

Since those stories were released, AMD has been very active. At first the company was hesitant to believe our results, calling into question our processes and the ability of gamers to really see the frame rate issues we were describing. However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver that offers a Frame Pacing option in its 3D controls, which evenly spaces out frames in multi-GPU configurations to produce a smoother gaming experience.

The results were great! The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA's SLI technology. There were limitations, though: the driver only fixed DX10/DX11 games and only addressed resolutions of 2560x1440 and below.
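To illustrate what frame pacing does conceptually, here is a toy model. It is not AMD's actual driver logic, which has not been published; it is just a minimal sketch of evening out presentation intervals in alternate-frame rendering:

```python
# Toy illustration of frame pacing: hold back frames that would otherwise be
# presented almost immediately after the previous one, so that presentation
# intervals approach the average frame time. Unpaced CrossFire-style AFR often
# alternates long and very short ("runt") intervals, which FRAPS averages away.
def pace_frames(intervals_ms):
    """Given raw present intervals in ms, return paced intervals (sketch only)."""
    avg = sum(intervals_ms) / len(intervals_ms)
    # Never present earlier than the average interval; long frames pass through.
    return [max(t, avg) for t in intervals_ms]

# Hypothetical unpaced intervals alternating ~28 ms and ~5 ms:
unpaced = [28, 5, 27, 6, 29, 4]
print(pace_frames(unpaced))  # [28, 16.5, 27, 16.5, 29, 16.5]
```

The average frame rate is similar before and after, but the paced sequence no longer whipsaws between long and short intervals, which is what the eye perceives as stutter.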

But the story doesn't end there. CrossFire and Eyefinity are still very important in a lot of gamers' minds, and with the constant price drops on 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround. As it turns out, there are more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) cropping up that deserve discussion.

If you weren't a fan of NVIDIA's last offer of in-game currency for pay-to-win free-to-play online games, then how about Batman: Arkham Origins for free? If you pick up a 600 or 700 series GPU before the end of the year, you will be picking up a copy at no charge. The TITAN and GTX 690 are not named specifically, nor is the rumoured GTX 790, but it is unlikely you would be singled out. NVIDIA will also be showing a sneak peek of the game at PAX Prime in September.

SANTA CLARA, Calif.—Aug. 30, 2013—NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman™: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.

Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.

Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City’s most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.

Batman has immense power, strength and speed—the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham’s dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA™ antialiasing, soft shadows and various NVIDIA PhysX® engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.

“The Batman: Arkham games are visually stunning and it’s great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins,” said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. “With NVIDIA’s continued support, we are able to deliver an incredibly immersive gameplay experience."

NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.

Additionally, any PAX attendees that purchase a qualified bundle from the special kiosk at the NVIDIA booth on the show floor will receive for free a limited edition Batman lithograph — one of only 1,000 being produced.

Remember NVIDIA's SHIELD, that game streaming device Ryan was playing with at QuakeCon, which doesn't seem to fit the role of just a gaming device since it can harness the power of other nearby NVIDIA GPUs? The Register is proposing a rather interesting usage scenario for the SHIELD using GRID VCA technology, which is the basis of communications with NVIDIA's servers and virtualized GPUs, and which also happens to work well with many of the virtualization programs currently in use.

When they saw Windows games being played on a SHIELD at VMworld, they realized there would be nothing stopping you from providing Office 365 as a service if you were running Server 2012 with RemoteFX installed. With HDMI out you can have the monitor of your choice, and the Bluetooth capability means you can attach a keyboard and mouse; suddenly you have the coolest thin client on the block. In fact, you might even be able to sit near a server with several Tesla cards installed and run CAD programs, if someone could figure out how to stream a CAD program to the SHIELD.

Or you could just game at work.

"Some grumble that the Bring Your Own Device (BYOD) concept deserves to be called Spend Your Own Money in recognition of the cost of providing a computer hitting workers' hip pockets instead of employers'.

Such grumbles may be less sustainable now that NVIDIA's $US299 SHIELD portable gaming console can run Windows applications."

A New TriFrozr Cooler

Graphics cards are by far the most interesting topic we cover at PC Perspective. Between the battles of NVIDIA and AMD, as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is rarely a moment when we don't have a different GPU product of some kind on an active test bed. Both NVIDIA and AMD release reference cards (for the most part) with each and every new product launch, and it then takes some time for board partners to really put their own stamp on the designs, beyond the figurative stamp that is the sticker on the fan.

One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand. From as far back as the MSI GTX 260 Lightning to as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design and a good amount of over-engineering to produce cards with few rivals.

Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May. Based on the same GK110 GPU as the GTX TITAN, but with two fewer SMX units, the GTX 780 is easily the second-fastest single-GPU card on the market. MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that both basic users and LN2 junkies can take advantage of. Just what DO you get for $750 these days?

Activision recently announced a technical partnership with NVIDIA at GamesCom. The two companies are "working hand in hand" on the development of the PC version of Call of Duty: Ghosts to implement the kinds of graphical features and technologies that PC gamers expect of a new triple-A title.

According to an NVIDIA GeForce blog post, NVIDIA developers are working on-site at Infinity Ward, helping to enhance the Sub-D tessellation, displacement mapping and HDR lighting. Additionally, NVIDIA engineers are working to integrate support for the company's TXAA (temporal anti-aliasing) and PhysX technologies. The Infinity Ward developers are also taking advantage of the APEX Turbulence PhysX toolkit to enable realistic, physics-based smoke clouds that react to the environment and player actions.

Activision and Infinity Ward are also enabling the use of dedicated multiplayer servers for Call of Duty: Ghosts. In addition, Call of Duty Elite will be available for the PC version of the game including a smartphone app that allows stat tracking and profile management from a mobile device.

The GeForce blog claims that the PC version is intended to be the definitive CoD: Ghosts version, which is always nice to see. More graphical effects and features are being worked on, but IW and NVIDIA are keeping them under wraps for now.

The PC is in a really good place right now between console cycles, as developers are finally starting to realize the power of the PC and what it can offer in terms of graphical performance and control options. PC-first development, building for the PC and porting to consoles rather than the other way around, is something I have wanted to see for a long time. Now that PC versions are once again getting due credit, development attention and resources, and with the upcoming consoles based on x86 hardware, these technical partnerships that position the PC version as the best version are hopefully the start of a trend that will bring a new surge in PC gaming!

NVIDIA announced on Wednesday that it had formed an alliance with Ubisoft to collaborate on Ubisoft's upcoming PC game titles (coming this fall). The alliance involves the NVIDIA Developer Technology Team "working closely" with the Ubisoft development studios on several new PC titles. The list of NVIDIA-enhanced PC games covered by this new alliance includes Tom Clancy's Splinter Cell: Blacklist, Assassin's Creed IV Black Flag, and Watch Dogs.

NVIDIA Senior VP of Content and Technology Tony Tamasi stated in a press release that "Ubisoft understands that PC gamers demand a truly elite experience -- the best resolutions, the smoothest frame rates and the latest gaming breakthroughs." NVIDIA has reportedly worked with the Ubisoft game developers throughout the entire development process to incorporate the company's graphics technologies.

Tom Clancy's Splinter Cell: Blacklist is the first game to come out of the alliance. It features PC gaming graphics technologies such as DirectX 11 effects, parallax mapping, ambient occlusion, tessellation, HBAO+ (horizon-based ambient occlusion), and NVIDIA's own TXAA and Surround support. The latest Splinter Cell game also comes bundled with NVIDIA graphics cards.

NVIDIA did not go into details on what sort of extra PC-centric graphics features the other Ubisoft games will have, but it should be similar to those in Splinter Cell: Blacklist. Curiously, the press release makes no mention of NVIDIA's The Way It's Meant To Be Played program, though it seems that this alliance may even go a step further than that in terms of development team interaction and shared resources.

The new GeForce 326.80 beta driver is now available to download. An essential update for gamers sneaking into Tom Clancy’s Splinter Cell Blacklist, today’s driver ensures maximum performance and system compatibility in the brand new stealth title, which is jam-packed with PC-exclusive features and technology, including NVIDIA HBAO+ Ambient Occlusion, NVIDIA TXAA Temporal Anti-Aliasing, out-of-the-box NVIDIA SLI support, and much much more. For a full rundown, head on back to GeForce.com tomorrow when we’ll detail all of Blacklist’s impressive tech.

New in GeForce R326 Drivers: Performance Boost

Increases performance by up to 19% for GeForce 400/500/600/700 series GPUs in several PC games vs. GeForce 320.49 WHQL-certified drivers. Results will vary depending on your GPU and system configuration.