NVIDIA brings Fermi Quadro 4000 kicking, mostly screaming, to Mac

The NVIDIA Fermi Quadro 4000 is now available for the Mac. Unfortunately, OpenGL …

After some false starts, the Fermi Quadro 4000 Mac Edition was officially announced today. Since this is a Mac version of a card that was launched at SIGGRAPH 2010, not much about it is secret. The card is almost identical to the PC version: 256 CUDA cores, a 256-bit memory interface, 2GB of memory, and the same suggested retail price of $1,199.

The only physical difference is that the PC version has two DisplayPort outputs to the Mac Edition's one.

For programs like Maya that support quad-buffered stereo, a 3D output is available via an optional adapter. Despite the card's DisplayPort output, 10-bit-per-channel color output is not currently supported, according to NVIDIA. It should be doable with a future driver update, since Mac OS X is capable of it with qualifying displays. The 4000 also lacks the optional ECC memory mode of the Quadro 5000 and above, so it's of debatable value to scientific users, since results could contain errors.
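
For the curious, quad-buffered stereo just means the driver exposes separate left- and right-eye back buffers that the app renders in turn. A minimal sketch of how a program drives them (generic OpenGL/GLUT code, not anything Maya- or NVIDIA-specific):

```cpp
// Minimal quad-buffered stereo sketch. Requesting GLUT_STEREO only
// succeeds when the driver exposes a stereo pixel format, as Quadro
// cards do; consumer cards will typically fail to create the window.
#include <GLUT/glut.h>   // <GL/glut.h> on non-Apple platforms

void display() {
    glDrawBuffer(GL_BACK_LEFT);    // render the left eye's view
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the scene from the left-eye camera ...

    glDrawBuffer(GL_BACK_RIGHT);   // then the right eye's view
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the scene from the right-eye camera ...

    glutSwapBuffers();             // presents both eye buffers at once
}

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO);
    glutCreateWindow("stereo test");   // fails if no stereo format exists
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```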

CUDA, not games

This is the only Mac card available from NVIDIA, marking a full exit from the consumer Mac space (if you could ever have called a Mac Pro a consumer machine). NVIDIA is clearly hoping to combine the popularity of the Mac Pro in video with Fermi's impressive GPGPU power to move these cards. CUDA has made some remarkable gains in professional video over the last couple of years, thanks in part to Adobe's use of CUDA in programs like Premiere Pro CS5's Mercury Playback Engine, and to high-end video plug-ins like GenArts Sapphire and The Foundry's Kronos.
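
To make "CUDA acceleration" concrete: the wins in these apps come from per-pixel work that parallelizes trivially across hundreds of cores. A rough, hypothetical sketch of that kind of kernel in CUDA C++ (our own illustration; the names and the gain/bias math are not Adobe's actual code):

```cpp
// Hypothetical per-sample brightness/contrast pass; one thread handles
// one color sample, which is why this class of filter maps so well to
// a 256-core card.
__global__ void brightnessContrast(const float* in, float* out,
                                   int n, float gain, float bias)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = gain * in[i] + bias;
}

// Launch with one thread per sample, e.g. n = 1920 * 1080 * 4 for an
// RGBA float frame:
//   brightnessContrast<<<(n + 255) / 256, 256>>>(d_in, d_out, n, 1.1f, 0.02f);
```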

The proliferation of GPU-based 3D renderers should also make this card appealing to users of CUDA applications like Octane Render and Bunkspeed Shot. Now that 64-bit versions of CUDA are available for OS X, there's little reason to doubt that these cards are going to scream in these types of GPU applications.

36 Reader Comments

What is the technical limitation that stops vendors from supporting whatever OpenGL version they want with their own driver, like they do on Windows or Linux? Is there no Apple GL function for accessing extension function pointers?
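
For context on that second question: OS X has no wglGetProcAddress/glXGetProcAddress equivalent; Apple's developer docs instead suggest resolving entry points straight out of the OpenGL framework. A hedged sketch of that approach (assuming the standard framework path):

```cpp
// Look up a GL entry point by name on OS X by dlopen()ing the OpenGL
// framework itself; there is no dedicated "GetProcAddress" call.
#include <dlfcn.h>

void* getGLProcAddress(const char* name) {
    static void* gl = dlopen(
        "/System/Library/Frameworks/OpenGL.framework/Versions/Current/OpenGL",
        RTLD_LAZY);
    return gl ? dlsym(gl, name) : nullptr;
}
```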

What the hell is Apple's problem with current OpenGL implementations? Is it not exotic or expensive or polished enough?

It's always fascinated me the way Apple would stack in badass core after core on top of powerhouse buses and super-fast memory, only to shove in whatever crap video card they could find on sale in yesteryear's gaming card store. Unless you pay for it, that is; then you get a _decent_ workstation card and a prayer that your software will somehow support it.

Apple doesn't really care about its Macs (and the pros who use them) anymore - it's all about the iOS toys now.

No kidding, what was that, GeForce 120 or something like that, that shipped with Mac Pros? That was a freaking laptop graphics chipset; those cards sold for $50 on Newegg. So yeah, you get a decked-out Xeon workstation with a $50 video card in it.

That makes a lot of sense. Not everyone needs a high-end GPU for their workstation.

Having the option to go to a professional card, with full OS support, should be the other side of that equation, but Apple failed on that front.

It's not just the cards: it's the OpenGL implementation and the drivers too. Consistently, they have been way behind the curve. Stick a top-end card in a top-end machine with a rat's a** for a driver and you have rat's a** output. No excuse.

Basically this (2008 Mac Pro here). I don't care that Apple defaulted to a low-power card in the past, any more than I care about their mice or keyboards; it basically goes without saying that I'm going to get those myself, same with memory and drives. But the lack of an option for modern high performance (and good OS support) at any price has always been galling. I don't even mind too much at this point if the GTX 580 or whatever is there and 20% more than the PC version. The irritation comes in when there's just nothing available, period, and when something finally does show up it's:

A) At least one generation old, at the same price as when it was freshly introduced
B) A single, niche, ultra-high-end card
C) Saddled with miserable OS support

Or usually some combination of the above. At this point who knows exactly where the explanation lies. I think it's probably fair to speculate that it comes from the top: that Steve just doesn't get graphics applications/games, doesn't care about the mid/high end, and never will. Clearly there is some understanding that Apple can't quite dump that model entirely yet, but at the same time it's the red-headed stepchild, even at its ultra pricing.

Since I'm not interested in rendering images on the monitor, just raw compute power for parallel loads, wouldn't I be better off just buying an nVidia Tesla card and using the existing video card for display purposes?

Quote:

What is the technical limitation that stops vendors from supporting whatever OpenGL version they want with their own driver, like they do on Windows or Linux? Is there no Apple GL function for accessing extension function pointers?

What's missing is mainly the newer versions of GLSL. Once that is taken care of, adding the few missing extensions for 3.2 compatibility would be a fairly quick job (they did more between 10.6.2 and the first graphics update to 10.6.2). I don't think nVidia or ATI can just add GLSL functions; they have to replicate the entire thing, which likely isn't trivial.
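
A quick way to see what the installed driver actually exposes is to ask it directly. A minimal sketch, assuming a GLUT-created context (nothing Apple-specific beyond the header path):

```cpp
// Print the GL, GLSL, and renderer strings the current driver reports.
// A context must exist before glGetString() returns anything useful.
#include <GLUT/glut.h>   // <GL/glut.h> on non-Apple platforms
#include <cstdio>

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
    glutCreateWindow("gl-version-check");
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    printf("GLSL:        %s\n",
           (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    return 0;
}
```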

Zak wrote:

No kidding, what was that, GeForce 120 or something like that, that shipped with Mac Pros? That was a freaking laptop graphics chipset; those cards sold for $50 on Newegg. So yeah, you get a decked-out Xeon workstation with a $50 video card in it.

No, it was actually a desktop GT120 (aka 9500 GT). The iMacs are the ones that have laptop cards. It did sell for $50 though, you're right there. HP and Dell do the same with their workstations - not everyone wants a big, hot, expensive GPU, so they put something cheap in the base model.

Quote:

Since I'm not interested in rendering images on the monitor, just raw compute power for parallel loads, wouldn't I be better off just buying an nVidia Tesla card and using the existing video card for display purposes?

Yes, the best way to keep your system responsive while GPU rendering is to use a dedicated card for the screen and another for the processing. When I run renders with Octane, it completely lags the displays unless I run it on a second card that isn't driving a screen; after that, it's not even noticeable.
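
One heuristic for automating that split in CUDA: display-attached GPUs normally run with the kernel-execution watchdog enabled, so a render app can prefer a device without it. A sketch of the idea (our own illustration, not Octane's actual logic):

```cpp
// Prefer a CUDA device that is NOT driving a display, using the
// kernel-execution watchdog flag as a heuristic.
#include <cuda_runtime.h>
#include <cstdio>

int pickComputeDevice() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("device %d: %s (watchdog: %s)\n",
               i, prop.name, prop.kernelExecTimeoutEnabled ? "yes" : "no");
        if (!prop.kernelExecTimeoutEnabled)
            return i;   // likely headless: long kernels won't lag a display
    }
    return 0;           // fall back to the first device
}
```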

nVidia's spec page actually reports that OpenGL 3.1 is supported on OS X. Is this a typo, or are things finally moving forward in OS X? Maybe someone can confirm this with nVidia? Full OpenGL 3.1 support would be a good place for Snow Leopard to end, with OpenGL 4.1 hopefully coming in Lion.

It looks like NVIDIA is still pursuing its external Quadro Plex solution as well, though I haven't seen much about it since launch a few years back (2006, I think?). However, it also appears that there is still no Mac support. I wonder if that has some chance of happening in the future, though, if NVIDIA is serious about pushing GPU acceleration for applications beyond gaming. It might be a lot easier to add support for an external solution as a pure compute boost while counting on an internal graphics card to handle the actual display.

The quality of Ars articles has seriously declined. This is basically a press release copy. You should write what core is the card based on, what are the clocks, comparable to what other cards it is. Secondly gaming cards loose money, could you tell us how have you arrived onto this?

Quote:

nVidia's spec page actually reports that OpenGL 3.1 is supported on OS X. Is this a typo, or are things finally moving forward in OS X? Maybe someone can confirm this with nVidia? Full OpenGL 3.1 support would be a good place for Snow Leopard to end, with OpenGL 4.1 hopefully coming in Lion.

This is wrong. I just spoke to a rep at NVIDIA and confirmed that OS X currently has most of the 3.0 extensions but no support for GLSL 1.3, which is the important part.

I'm running a Radeon 5870, but it's the same for the Quadro 4000, since both are stuck with Apple's lagging GL implementation.
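
An easy way to confirm this on your own machine is to try compiling a shader that declares #version 130 and read back the info log; a GLSL 1.2 implementation rejects the version directive itself. A sketch, assuming a current GL context (e.g. from the GLUT setup shown earlier):

```cpp
// Probe for GLSL 1.3 support by compiling a trivial #version 130 shader.
#include <OpenGL/gl.h>   // <GL/gl.h> plus an extension loader elsewhere
#include <cstdio>

bool hasGLSL130() {
    const char* src =
        "#version 130\n"
        "out vec4 color;\n"                    // 'out' here needs GLSL 1.3
        "void main() { color = vec4(1.0); }\n";
    GLuint sh = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(sh, 1, &src, 0);
    glCompileShader(sh);

    GLint ok = GL_FALSE;
    glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(sh, sizeof log, 0, log);
        printf("GLSL 1.3 rejected: %s\n", log);
    }
    glDeleteShader(sh);
    return ok == GL_TRUE;
}
```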

Quote:

The quality of Ars articles has seriously declined. This is basically a press release copy. You should write what core is the card based on, what are the clocks, comparable to what other cards it is. Secondly gaming cards loose money, could you tell us how have you arrived onto this?

Hey look, everyone – a crusty video card specs guy! You should team up with the angry bike couriers and maybe the negativity will cancel itself out. Read my post for the links on NVIDIA's gaming sector performing badly.

If gaming cards, or any card, loosed money that would be awesome! Or did you mean "LOSE money," Mr. Quality of Ars Has Declined? Here's a little mental image that might help everyone remember: Goatse man has a loose (rhymes with moose) asshole. You could lose (sounds like "looz") your arm in that wet, gaping maw.

Quote:

The company's first quarter revenue reached more than $1 billion with a profit of $137.6 million, but the second saw a lower total revenue of $811.2 million. In the same quarter last year, the company achieved total revenue of $776.5 million, posting a loss of $105 million.

While Nvidia saw strong results in the professional Quadro graphics card, Tesla GPU and Tegra mobile chip sectors, the company said that GeForce cards performed significantly worse than expected in Europe and China, resulting in a large inventory write-down.

Quote:

Really? Please point me to this magical PC video card; most of the cards that I find list an SRP in the $800s and a selling price in the $700s.

Quote:

Good news for Nvidia shareholders: the company seems to have bounced back from its none-too-pleasing second quarter, posting an $84.9-million net profit for its third fiscal quarter ended October 31....

nVidia's 2nd quarter was when they were dealing with the disastrous Fermi launch, about six months behind schedule and with ridiculously poor yields from TSMC. AMD was having similar problems at the time.

To suggest that the Quadro/Tesla cards are nVidia's profit center is hilariously incorrect. They contribute to the bottom line, but they're by no means even half of what nVidia/AMD do in the standard GPU business; those cards move in very small quantities. The consumer components are what make or break their quarters, and when the launch of a new consumer architecture goes tits-up like Fermi did (where nVidia had to beeline the GF104 chip to staunch the bleeding that selling nothing but cut-down Fermis was causing), their financials end up getting pretty badly hit.

From this quarter's 8-K release (not the Q2 one you were quoting), nVidia did $581.9 million in revenue in standard GPUs versus $210.1 million in pro (Quadro/Tesla). The standard GPU business is lower margin, to be sure, but the pro end is entirely dependent on the research and designs done for consumer GPUs; without consumer GPUs being sold in the quantities they are, there's no way a chip like Fermi would even exist.

Also, there's this lovely tidbit: Q2's loss was because of a one-time weak die/packaging material charge, and without it Q2 would have had a net profit of $20.1 million, per that same 8-K.

OK - removed mention of losing money in the last quarter. I think it goes without saying that these cards rely on tech from the gaming parts. But the margins are higher, so they'd obviously want these markets to grow, and I think that's where they see growth on the desktop. It's also one area where they're still leading AMD by a wide margin, so they're making a push into GPGPU and high-end apps as a means of entrenching themselves with CUDA, without having to compete directly in the tit-for-tat way they do with gaming. They're doing a good job of it.

Quote:

If gaming cards, or any card, loosed money that would be awesome! Or did you mean "LOSE money," Mr. Quality of Ars Has Declined? Here's a little mental image that might help everyone remember: Goatse man has a loose (rhymes with moose) asshole. You could lose (sounds like "looz") your arm in that wet, gaping maw.

Your loose use of metaphors makes me want to lose my lunch.

You could've at least used a car metaphor. That was... disturbing when I'm in the office taking a five-minute mental break and looking to Ars for relaxation.

Quote:

nVidia's spec page actually reports that OpenGL 3.1 is supported on OS X. Is this a typo, or are things finally moving forward in OS X? Maybe someone can confirm this with nVidia? Full OpenGL 3.1 support would be a good place for Snow Leopard to end, with OpenGL 4.1 hopefully coming in Lion.

Maybe they know something we don't. GLSL 1.4 looks to be a smallish update to 1.3, so it's not an unreasonable target to aim for in a future point update.

Quote:

But the lack of an option for modern high performance (and good OS support) at any price has always been galling.

This is what has kept me from getting a Mac Pro over the years.

The supposed advantage of "upgradable" video cards is a myth on this beast, and it's been that way since the first Mac Pro came out... not because "they're only concerned with iOS now," as the troll posted above.

It's been years... so there has to be something "institutional" about it, or they would have fixed the problem by now.

Aha! So the driver has been updated. This bodes well for performance improvements. We still seem to be stuck on OpenGL 2.1, though. A quick run of OpenGL Extensions Viewer confirms it: OpenGL 3.0 support is still only 22 out of 23 extensions, 3.1 only 1 out of 8, and 3.2 only 3 out of 9. Same as in 10.6.4.
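
For anyone who wants to reproduce that check without the viewer app, the GL 2.1-era way is to scan the extension string. A small sketch testing a few of the ARB extensions those 3.x groups are built from (the list here is just a sample, not the viewer's exact set; assumes a current GL context):

```cpp
// Scan GL_EXTENSIONS for a few OpenGL 3.x-era ARB extensions.
// Note: strstr() substring matching is a rough check, fine for a sketch.
#include <OpenGL/gl.h>   // <GL/gl.h> elsewhere
#include <cstdio>
#include <cstring>

void checkExtensions() {
    const char* all = (const char*)glGetString(GL_EXTENSIONS);
    const char* wanted[] = {
        "GL_ARB_framebuffer_object",     // core in 3.0
        "GL_ARB_uniform_buffer_object",  // core in 3.1
        "GL_ARB_copy_buffer",            // core in 3.1
        "GL_ARB_sync",                   // core in 3.2
        "GL_ARB_texture_multisample",    // core in 3.2
    };
    for (const char* name : wanted)
        printf("%-32s %s\n", name, strstr(all, name) ? "yes" : "NO");
}
```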

Apart from that, ATI's 5770 and 5870 gaming cards do work in the 2006-through-current Mac Pros, so there are GPU upgrade options outside of this new pro card. See the thread in the Mac Ach for reference.