Posted
by
Soulskill
on Tuesday February 09, 2010 @12:49PM
from the that's-some-prime-namespace dept.

Vigile writes "Transformers jokes aside, NVIDIA's newest technology offering hopes to radically change the way notebook computers are built and how customers use them. The promise of both extended battery life and high performance mobile computing has seemed like a pipe dream, and even the most recent updates to 'switchable graphics' left much to be desired in terms of the user experience. Having both an integrated and discrete graphics chip in your notebook does little good if you never switch between the two. Optimus allows the system to seamlessly and instantly change between IGP and discrete NVIDIA GPUs based on the task being run, including games, GPU encoding or Flash video playback. Using new software and hardware technology, notebooks using Optimus can power on and pass control to the GPU in a matter of 300ms and power both the GPU and PCIe lanes completely off when not in use. This can be done without being forced to reboot or even close out your applications, making it a hands-free solution for the customer."
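As described, the switching is driven per-application rather than by a manual toggle: the driver decides which chip handles a workload, powers the discrete GPU (and its PCIe lanes) up in roughly 300ms when needed, and powers it fully off otherwise. A minimal sketch of that routing logic, with the profile table and class names entirely my own invention rather than anything from NVIDIA's actual driver:

```python
# Hypothetical sketch of Optimus-style per-application GPU routing.
# The profile table and GpuSwitcher class are invented for illustration;
# the real driver consults application profiles shipped by NVIDIA.

POWER_UP_MS = 300  # claimed discrete-GPU power-on latency

PROFILES = {
    "game.exe": "dgpu",       # 3D games go to the discrete GPU
    "flash_player": "dgpu",   # so does Flash video playback
    "video_encode": "dgpu",   # and GPU encoding
    "web_browser": "igp",     # everything else stays on the IGP
}

class GpuSwitcher:
    def __init__(self):
        self.dgpu_on = False  # discrete GPU and its PCIe lanes start powered off

    def route(self, app):
        target = PROFILES.get(app, "igp")  # unknown apps default to the IGP
        if target == "dgpu" and not self.dgpu_on:
            self.dgpu_on = True   # power up the GPU + PCIe lanes (~300 ms)
        elif target == "igp" and self.dgpu_on:
            self.dgpu_on = False  # power the GPU and lanes completely off
        return target
```

The point of the sketch is just the shape of the decision: no reboot, no user action, the running application determines which chip is live.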

...I'm all for it. But by how much will it extend the battery life? And when they say it will "drastically" change the notebook market, I doubt it: netbook folks won't care about 3D, and desktop-replacement folks don't care whether their machine is plugged in. Maybe this will make a difference in the smaller segment of mobile gamers.

I would have thought that, instead of switching between a 'low power' video chip and a 'high power' GPU, they would have concentrated on just making NVIDIA graphics cards use less power when not doing things like rendering 3D graphics or decoding video. I mean, mobile CPUs have smarts built in that let them vary how much power they consume; can't they do that with GPUs?

The problem with GPU throttling is that it's far more visible (pun intended). If your CPU is rapidly switching between 3.0 GHz and, say, 1.2 GHz, you probably won't notice at all, but if your game or video app has uneven framerates or the dreaded micro-stutter, you will feel the overwhelming urge to smash your laptop against the nearest brick wall.

GPUs typically have two power modes: power-saving (idle) and full-blast (gaming). Your device drivers kick the card into high-power mode whenever you launch a 3D app, so the stutter of switching speeds happens before any animation takes place, and it stays that way until you exit the game. This matches typical GPU usage: you're either using it to the max, or not at all. I don't know anyone who runs their games at lower quality settings just to "save power on the GPU"; you'll push the flashiest pixels your hardware can handle.

What would be quite appreciated is if the high-end GPUs had a true low-power mode that shuts off all the excess pwnage, but that's just my bias. I tend to buy the fastest GPU I can afford and stick with it for a few years until it starts bothering me. My latest acquisition, the GTX 295, is a power hog. Even sitting idle at the desktop, my PC chugs a hearty 400 watts to do nothing: roughly 300W to the two GPUs and the remainder to the CPU and motherboard. While gaming, that swells to around 800W, and again 3/4 of it goes to the GPUs.

I'm fine with the 800W active consumption; it's the idle power draw that bothers me, because I only game for an hour or two a night, 3-4 nights a week. If I replaced those two GPUs with a low-end card, my 2D performance would be unaffected, yet power usage would drop to a much cozier 100W. Why the big GPUs need 300 more watts to do absolutely nothing defies even the most generous logic.

Now, given that high-end desktop GPUs vastly outnumber laptop GPUs, I think they should figure out how to shut down parts of the desktop GPU when not in use, rather than investing in some never-gonna-sell IGP+GPU trickery. The ~$25 drop in my monthly hydro bill would more than justify the expense of a higher-efficiency device. Hell, that's enough to buy the latest GPU every year!
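For what it's worth, that bill estimate roughly checks out, under two assumptions of mine (not stated above): the machine idles around the clock, and electricity costs about $0.11/kWh.

```python
# Back-of-envelope check on the idle-power complaint. The machine's
# uptime and the electricity rate are my assumptions; "hydro" rates
# vary by utility.

idle_excess_w = 300       # extra idle draw attributed to the two GPUs (from above)
hours_on_per_day = 24     # assume the PC is left running all day
rate_per_kwh = 0.11       # assumed electricity price, $/kWh

monthly_kwh = idle_excess_w / 1000 * hours_on_per_day * 30
monthly_cost = monthly_kwh * rate_per_kwh  # ~216 kWh, roughly $24/month
```

So 300 wasted watts, running continuously, lands in the $20-25/month ballpark, which is consistent with the complaint.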

The new thing here seems to be that you can switch between the onboard and 'real' GPU on the fly, quickly, while everything is still running.

Previous laptops with switchable graphics, such as my Sony Vaio with a GeForce and an Intel chip, had to at least restart the graphics subsystem (on OS X) or reboot the whole machine (on Windows) in order to drop into power-saving mode.

In my experience, I was usually too lazy, or didn't want to close my work, so I kept using the good GPU all the time. The only time I'd work up the enthusiasm to actually switch over was before a flight or somewhere else I knew I wouldn't need the power.