GPUs aren’t limited to just 3D gaming and video decoding anymore. Senior Technical Editor Jason Saundalkar talks about what new GPUs can do and why you should care…

Graphics cards have evolved tremendously since their humble beginnings. At first they were basic components capable of tackling only simple 2D and 3D work, and even then they leaned heavily on the system's central processing unit (CPU). Now, though, they can deliver spectacular visuals without troubling the CPU much and, in a reversal of roles, can even off-load application processing work from the CPU.

The first massive step forward for graphics cards came when 3DFX introduced its Voodoo graphics accelerator. This was a dedicated 3D card that needed to be paired with an existing 2D card. Although this wasn't the most elegant of designs, it worked, and worked well. A few years on, dedicated 3D-only cards disappeared as manufacturers produced combined 2D/3D cards. A year later came the introduction of graphics processing units (GPUs), which actually off-loaded a chunk of graphics processing work from the CPU (up to that point the CPU still processed a hefty amount of graphics code even when a 3D card was present).

Moving forward a little, these GPUs then began off-loading video decoding tasks from the CPU. This really took off with the High Definition (HD) craze, and with new cards sporting High Definition Multimedia Interface (HDMI) ports, they began carrying sound as well. Today, graphics cards have evolved further still and can now actually ‘accelerate' everyday, GPU-compatible applications, taking yet more of the burden off the computer's CPU.

Both AMD and nVidia offer GPUs that can handle tasks previously handled only by the processor. Now you can use your GPU, instead of your CPU, to contribute to distributed computing projects such as ‘Folding@home'. This particular project focuses on the study of protein folding in an effort to understand, and eventually cure, diseases such as Alzheimer's, Huntington's and Parkinson's. Previously the project was CPU-exclusive, but today there are Folding clients that allow your GPU to ‘fold' at a much faster rate than even a top-end CPU can. Beyond this, GPUs are also able to encode video and, by year's end, there will be several other applications that tap into the power of GPUs.

Now, if you're thinking, ‘Today's CPUs run at much higher frequencies than GPUs, so how could they possibly be faster at application work than a processor?', you have to remember just one thing - strength in numbers. You see, while today's top consumer-level chips sport four cores running at 3.2GHz each, today's high-end GPUs, such as nVidia's GeForce GTX 280, feature as many as 240 distinct cores! That's a huge difference, and when it comes to complex software that's multi-threaded, GPUs really shine.
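To make the ‘strength in numbers' idea concrete, here's a minimal Python sketch (my own illustration, not code from either GPU vendor): a per-pixel brightness tweak is ‘embarrassingly parallel', because every output pixel depends only on the matching input pixel, so the work splits cleanly across four CPU cores or 240 GPU cores with no coordination needed between them.

```python
def brighten(pixels, amount):
    # Each pixel is computed independently of every other pixel.
    # A CPU walks this loop one element at a time; a GPU would assign
    # one of its many cores (threads) per pixel and do them all at once,
    # with the hardware supplying the loop index.
    return [min(255, p + amount) for p in pixels]

# A toy four-pixel 'frame' (8-bit values, so results are capped at 255)
frame = [10, 120, 250, 90]
print(brighten(frame, 20))  # -> [30, 140, 255, 110]
```

The code itself doesn't get any faster in plain Python, of course; the point is the shape of the problem - no pixel waits on any other, which is exactly the kind of workload where 240 slower cores trounce four fast ones.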

Using CUDA-compatible demo software provided by nVidia, a video encode that took a little over 20 minutes on a 3.2GHz quad-core processor took under four minutes on XFX's GeForce GTX 280. That's an astonishing difference and, although this isn't an apples-to-apples test (the CUDA software wouldn't run on the CPU, forcing me to use a different encoder there), it does give you a firsthand look at the sort of performance improvement you can expect. Besides the raw performance benefit, your CPU is also left free for other tasks, so you could easily surf the net and listen to music without slowing down the video encode.
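For a sense of scale, those figures work out to roughly a five-fold speedup. A quick sketch of the arithmetic (using the article's approximate round numbers, since the exact timings were ‘a little over 20 minutes' and ‘under four minutes'):

```python
# Back-of-the-envelope speedup from the encode test described above.
cpu_minutes = 20.0  # approx. time on the 3.2GHz quad-core CPU
gpu_minutes = 4.0   # approx. time on the GeForce GTX 280
speedup = cpu_minutes / gpu_minutes
print(f"GPU encode was roughly {speedup:.0f}x faster")  # roughly 5x
```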

But as good as GPUs are at handling complex, multi-threaded software, they are equally inept when it comes to single-threaded software. For example, Microsoft Word could be modified to run on a GPU, but since it isn't multi-threaded and instead needs a single, very fast core for speedy performance, even an old 1.5GHz CPU would outperform the GTX 280 without breaking a sweat.
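This single-threaded limitation is captured neatly by Amdahl's law, which the article doesn't name but which formalises the same point: if a job has no parallel portion, extra cores buy you nothing at all. A small sketch (the parallel fractions chosen here are illustrative assumptions, not measured figures):

```python
def max_speedup(parallel_fraction, cores):
    # Amdahl's law: if a fraction p of a job can run in parallel,
    # n cores give at most 1 / ((1 - p) + p / n) overall speedup.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A purely serial job (like the hypothetical single-threaded Word example):
# 240 cores help not at all - only per-core speed matters.
print(round(max_speedup(0.0, 240), 2))   # -> 1.0
# A mostly parallel job (e.g. 95% parallelisable) fares far better:
print(round(max_speedup(0.95, 240), 2))  # -> 18.53
```

Note that even at 95% parallel, the serial 5% caps the gain well below 240x - which is why GPUs shine on near-fully-parallel work like pixels and protein folding, and flop on everything else.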

At present, most of the software that you and I use is still very much designed for CPUs, but keep in mind that in the near future you'll see more and more applications that are GPU-accelerated. So, if you're building a new PC or upgrading your existing one, don't pour your entire budget into a very fast CPU - you could otherwise miss out on some amazing performance improvements later.