Nvidia has finally confirmed that it has included beta support for Maxwell in its newest drivers. I've been trying to tell people for a while that Maxwell support wasn't included in the drivers until now, and that our dear 9-series cards have been running on Nvidia's Unified Driver Architecture (UDA), which is forward-compatible, especially for reference-design graphics cards. Now Nvidia has confirmed it in their latest blog post, as well as promising big performance gains going forward. Enjoy, but bear in mind that at this stage the performance benefits are mostly for Kepler GPUs. Maxwell should see Windows-like gains later.

'Our new driver for the Mac Pro offers up to 80 percent improved performance for Macs with Kepler GPUs. And, for the first time, our driver includes beta support for MacBook Pros and iMacs with Kepler GPUs, as well as beta support for those using Maxwell GPUs in older Mac Pro systems.'

In related news, the current El Cap beta driver continues to work in Public Beta 6. Thanks to @flowrider for being first to test.

Yes, it's the Web Driver released on 8/18/15 for Yosemite 10.10.5, and it is point-specific. It is well known that it supports both Kepler and Maxwell cards. Folks have already said that driver is faster than previous drivers. And AFAIK, the 346.02.03f01 Web Driver is not a beta; it is a final release. The beta Web Driver for El Cap has not yet been updated to the level of the latest Yosemite Web Driver.

Really interesting how Nvidia publicly acknowledges Maxwell even though there aren't any "official" Maxwell GPUs for Mac - yet they say sure, go ahead and add a Maxwell card to an older Mac Pro. Tacit acknowledgement of our little scene, and egg on Apple's face for doubling down on AMD GPUs? Looks like Nvidia gave up on trying to make a deal with Apple and is now resorting to shaming them. Hope it works.


That would be funny but there are probably only a couple of thousand cMP machines in the world with Maxwells added. But we are a vocal bunch and as our debates and benchmarks demonstrate we really do not want second best. The beta drivers could signal that Maxwell GPUs are going to be forthcoming, but possibly only in the iMac.

Nvidia and Apple are now competing in the mobile, television set-top box, and automotive markets, so there will naturally be politics involved.

Small but vocal is right. It can't escape Apple's notice that modern Nvidia GPUs in cMPs are pummeling their AMD counterparts in tests. Would explain why Nvidia is even bothering to keep updating the drivers for such a small userbase.


No on both counts. Apple couldn't care less whose chips are best. Long ago, its analysts will have figured out that AMD and Nvidia are close enough in everything but the very top end. And, until Fiji, AMD's much smaller die size has meant it offered the best cost/performance ratio.

The only reason that Nvidia keeps updating its drivers is in the hope that they beat AMD to the next supply contract. To be in with a chance, they need to show Apple that there are drivers ready to go.

I agree, Apple couldn't really care less which GPU is at the top. I pretty much can guarantee AMD is offering bargain basement prices for Apple to keep using their parts, the embedded business is nearly all AMD has at this point.

GM107 (i.e. the GTX 750 Ti) is way better than the GPU in the new rMBP though. GM204 (GTX 970, GTX 980) is way better than the GPU in the riMac. It's not just the high-end where NVIDIA is ahead, they are clobbering AMD in perf/watt across the entire spectrum.


Agreed. If Apple wanted the "best" GPUs in their products, they would have switched back to nVidia years ago, because nVidia has absolutely clobbered ATI in every metric, at every point along the performance continuum, for quite a while. If ATI's cards get any worse, they will have to pay Apple to use their parts just so they can move product and claim the sales volume.


The "problem" is that AMD is cheaper across the entire spectrum, and that's the only thing Apple cares about.

That's especially sad when you look at the machines Apple sells and realize that they need low-TDP cards for their tiny, shiny cases, but they prefer to roast their hardware to save a few $$. In my hackintosh I don't care about TDP at all, so I'll just go for the cheapest option, but Apple should care.

With the old GTX470/480 cards I used to see GPU temps of 90C and I was concerned about longevity.

Apple and AMD are driving that 5K display with a part running at 105C, just millimeters away from a very pricey screen. I'll bet $20 that these iMacs will eventually develop brown/yellow areas over the GPU (if the logic board lives long enough).

I have said it before: the presence of these second-rate GPUs proves that Apple isn't the least bit interested in making the BEST-performing computers, just computers that create an image of solidity and are the most profitable.


Yep. I seriously do not know what I will do when I finally reach a point where a cMP doesn't "cut it" anymore for me but there is no desirable Apple computer left to buy. Not interested in an iMac whose video card will cook the screen, and not interested in a trash can workstation full of proprietary hardware that cannot be upgraded. I could end up an iOS-only Apple user plugging my iPhone, iPad, etc. into a Windows box. Hard to imagine when I have owned Apple computers since the early 90s. I still think back to my beloved Pismo PowerBook - favorite Mac I have ever owned - and realize how far Apple's priorities have shifted away from upgradeability.


So many of us are in this exact same boat. Apple doesn't seem to want us any more.
