Yesterday, Nvidia announced a new line of laptop GPUs for 2013, the GeForce 700M series. Don't get too excited, though—these GPUs are identical to their 600M series predecessors in most ways. As CEO Jen-Hsun Huang said in his keynote at the 2013 GPU Technology Conference, the company is sticking with its current "Kepler" architecture until 2014, so these chips use the same architecture as last year's models. They also use the same 28nm process and GDDR5 (or DDR3) memory as the 600M series. So what's the difference?

The answer is "not a whole lot," though there are some tidbits here and there. AnandTech has done a characteristically thorough job of wrangling the specifications of the chips, showing a group of chips with marginally higher clock speeds than their predecessors—the GeForce GT 745M, for example, has a GPU clock speed of "up to" 837MHz and memory clock speed of "up to" 5GHz, compared to 710MHz and 4GHz for the GeForce GT 645M. These increases should be good for gains of about "15-25 percent relative to the previous models," provided that PC makers don't lower these clock speeds too much to save power. This is a respectable boost over current chips given that the architecture and manufacturing processes are both the same, but not enough to warrant upgrading that laptop you bought a couple of months ago.
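The advertised gains line up with the raw clock numbers. As a quick sanity check (a sketch using only the "up to" figures quoted above, which real laptops may not sustain):

```python
# Compare the peak "up to" clocks of the GT 745M against the GT 645M.
def percent_gain(new, old):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

core_gain = percent_gain(837, 710)      # GPU core clock, MHz
memory_gain = percent_gain(5000, 4000)  # effective GDDR5 rate, MHz

print(f"Core clock gain:   {core_gain:.1f}%")   # ~17.9%
print(f"Memory clock gain: {memory_gain:.1f}%") # 25.0%
```

Real-world gains land below those raw numbers, since games are rarely bottlenecked by a single clock and OEMs may ship lower clocks to fit their thermal budgets.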

The specific GPUs being announced are all aimed at mainstream laptops rather than high-end gaming laptops; the GeForce GT 750M, 745M, and 740M GPUs cover the middle of the market, while the GeForce GT 735M and 720M are aimed at lower-end laptops that need something a bit faster than Intel's integrated graphics. High-end gaming laptops will continue to be served by "last-generation" products like the GeForce GTX 675MX and 680MX, though since these chips are also based on the Kepler architecture, new buyers won't be missing out on much.

A few other, smaller innovations are also on the table: the 700M series supports GPU Boost 2.0, a tweaked version of the technology that allows GeForce GPUs to exceed their stated clock speed as long as there's enough power to do it and the chips aren't running too hot (similar in concept to Intel's Turbo Boost feature on the CPU side). Version 2.0's main innovation is that it can adjust speeds based on the temperature of the GPU and the surrounding system; version 1.0 made its decisions based only on power usage.
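The difference between the two versions can be sketched as a toy model. This is illustrative only; Nvidia's actual boost algorithm is proprietary, and the clocks, limits, and function names here are hypothetical:

```python
# Toy model of GPU Boost decision logic. All numbers are hypothetical;
# real GPU Boost adjusts clocks in fine-grained steps, not one jump.
BASE_CLOCK_MHZ = 837
MAX_BOOST_MHZ = 967

def boost_clock_v1(power_w, power_limit_w):
    """GPU Boost 1.0: boost whenever the board is under its power limit."""
    if power_w < power_limit_w:
        return MAX_BOOST_MHZ
    return BASE_CLOCK_MHZ

def boost_clock_v2(power_w, power_limit_w, gpu_temp_c, temp_target_c):
    """GPU Boost 2.0: additionally require thermal headroom."""
    if power_w < power_limit_w and gpu_temp_c < temp_target_c:
        return MAX_BOOST_MHZ
    return BASE_CLOCK_MHZ
```

In a cramped laptop chassis, the version 1.0 logic would keep boosting a GPU that is under its power limit but already running hot; the version 2.0 logic backs off once the temperature target is reached, which is exactly the case that matters in thin machines.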

Nvidia says that games can get a 10-15 percent bump through GPU Boost 2.0, but that figure depends on the laptop in question. Even with integrated graphics, laptops can get quite hot while gaming, and I wouldn't expect there to be a whole lot of thermal headroom available for big, sustained clock speed boosts. This feature makes more sense in a desktop with ample airflow and cooling, and indeed GPU Boost 2.0 was originally introduced in Nvidia's ultra high-end Titan graphics card.

Other advertised features are already supported by current 600-series GPUs, including Nvidia's Optimus technology (which seamlessly switches between the dedicated Nvidia chip and integrated Intel chip to save power) and the "GeForce Experience" software that tries to automatically enable the best graphics settings in a given game that will run smoothly on a given GPU.

Rebranding GPUs is by no means a new phenomenon, and Nvidia is by no means the only company doing it. OEMs in particular like their new products to look like upgrades in every way over last year's models, and bumping the model number of a GPU (regardless of the underlying architecture) is one of the easiest ways to do this. As TSMC's 28nm process grows more reliable and more mature, raising clock speeds beyond what was possible earlier in the process's lifespan is also an easy way to pass the benefits of that maturity on to the customer. Buyers should simply be aware that these "new" 700-series chips don't offer much in the way of increased performance or decreased power consumption over last year's 600-series chips. If you're waiting for something truly new, both Nvidia and AMD are due to release all-new GPU architectures next year.


Andrew Cunningham
Andrew has a B.A. in Classics from Kenyon College and has over five years of experience in IT. His work has appeared on Charge Shot!!! and AnandTech, and he records a weekly book podcast called Overdue. Twitter: @AndrewWrites