Monday, May 09, 2011

Why I'm wary of GPUs

One of the things about getting older is that you start to notice patterns; it seems like some ideas just keep coming around:

I might have missed the 3D movies of the '50s, but I did see the horror that was Jaws 3-D in the '80s.

The new Push Pop Press interactive book is impressive, with HD video, faster effects, and its touch-screen interactivity, but it really doesn't add any ideas that weren't already present in interactive books produced in the mid-'90s.

So sometimes it's hard to get excited about an idea that looks like something you've seen before...

When Adobe CS5 came out, I was intrigued by the Mercury Playback Engine and its support for GPU acceleration.

But the one thing that made me hesitate about diving into GPUs is good old Moore's Law. That, and the fact that I remember a period in the late '80s when a variety of add-on acceleration cards appeared for Macs. We were told the add-on cards would provide a way to improve computer performance independent of the processor. Except that within a year or two, CPU performance caught up, and the add-on cards mostly disappeared.

Sure, if you need the fastest system and have the money, investing in GPU cards right now makes sense, but for those on a budget, will Moore's Law do in your GPU card before you know it? That's what I wondered.

Because of this, I was more than a little intrigued to see this article at Ars Technica, which suggests that history may repeat itself:

The simple fact is that, with performance from integrated GPUs rising at a rapid pace, the discrete GPU market is about to start shrinking right out from under NVIDIA. Intel's upcoming Ivy Bridge platform will feature an on-die GPU that begins to threaten the mid-range of the discrete market the way that Sandy Bridge threatens the bottom end; and the on-die GPU with AMD's Llano is rumored to be some three times the performance of Intel's Sandy Bridge.

"Those who cannot learn from history are doomed to repeat it." - George Santayana

3 comments:

And yet, you can get a decent nVidia GPU for under $100 that's just enough to take advantage of the GPU acceleration mode in the Mercury Playback Engine -- and that small investment, lasting through the long time it will take for nVidia to get its ass handed to it, will represent one hell of a lot of added productivity. In an ideal world, none of us should ever have to wait for "rendering." (I hope that term is obsolete someday soon.)

$100? I looked at Adobe's list of approved GPUs, and the cheapest I could find was $300. I didn't check every one on the list, but I checked at least half, and most were $700 to $1,500... with two under $400.

I might have presumed you were versed and vested in that now-cooled debacle. Adobe cut a nasty, secretive deal with nVidia to engineer this functionality in a way that would mostly benefit Adobe, in exchange for limiting the approved list to "high-end" versions of nearly identical, same-spec'ed GPUs -- the so-called Quadro cards, which have slightly more stringent quality assurance and deluxe tech support. Yet they cost "quadro" the money of their equivalent counterparts, also from nVidia.

The most hilarious part was that Adobe officially claimed, forever and still, that they had no idea what we were talking about. Yet it's so obvious.

Scratch that, the most hilarious part is that the work-around for the approved hardware list is a simple edit to a text file that takes about five seconds:
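A sketch of what that edit looked like, for the record (assuming the CS5-era setup, where the whitelist was a plain-text file named cuda_supported_cards.txt in the Premiere Pro application folder, and where the bundled GPUSniffer utility prints the exact name your card reports):

    GeForce GTX 285
    GeForce GTX 470

Add your card's name, exactly as GPUSniffer reports it, on its own line, save, and relaunch; the "Mercury Playback Engine GPU Accelerated" option should then be selectable in the project settings.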