Nvidia, the maker of Intel-compatible chipsets for Apple's line of Macs, has announced it will cease development of future chipsets until its legal dispute with Intel is settled, expected sometime in 2010.

The announcement made this week means that Nvidia has placed its nForce chipset line on hold, pending the outcome of Intel's suit against the chip maker, according to PC Magazine. Intel has alleged that a previous chipset agreement between it and Nvidia does not apply to the Core or Nehalem series of processors.

"We have said that we will continue to innovate integrated solutions for Intel's FSB architecture," said Robert Sherbin, spokesman for Nvidia. "We firmly believe that this market has a long healthy life ahead. But because of Intel's improper claims to customers and the market that we aren't licensed to the new DMI bus and its unfair business tactics, it is effectively impossible for us to market chipsets for future CPUs. So, until we resolve this matter in court next year, we'll postpone further chipset investments."

A year ago, Apple officially made the move to Nvidia chipsets with the GeForce 9400M G integrated controller, a single chip of which 70 percent is devoted to graphics processing functions. It was then that Apple embraced Nvidia's MCP79 platform, paired with Intel Core 2 processors. The chipset was later extended to iMacs and Mac minis.

Earlier this year, Intel sued Nvidia in an attempt to stop the company from developing compatible chipsets for future generation Intel processors. Many of Nvidia's gains -- including the partnership with Apple -- have amounted to Intel's loss.

This summer, Apple was rumored to be abandoning Nvidia chips in its Macs following a contract fight, though nothing official came of it.

Nvidia's recent announcement of the Fermi architecture, geared toward the scientific community rather than PC graphics, has led some to believe that the company is changing its business strategy and moving away from the high-end gaming market. Nvidia has denied those assumptions.

Nah, Nvidia already has the license it needs for the Core series I think...

But that license will be moot when Core i5 and i3 mobile CPUs hit the market in Q1 2010. And I don't think any of Intel's IGPs have anywhere near the performance of nVidia's GeForce 9400M. It's going to be a huge step back in GPU performance, much like when Apple moved from the dedicated GPU of the PPC Mac mini to the horrible performance of the GMA 950 in the first Intel Mac mini.

nVidia would be foolish to spend millions of dollars developing new chipsets they might never be allowed to sell. It's far smarter to spend their time and money developing a new product.

So if they lose in court they'll already have a new business to take the place of the PC chipset one.

If they win in court they can decide whether it's worth it to go back into competition with Intel in the chipset business. They've proven in the past that they can make better chipsets than Intel and make good profit on them so it wouldn't take long to make up for lost time.

This is Intel wanting to take over the graphics chip market with their integrated graphics.

Expect future Macs to have considerably less performance. This move is forcing Apple's hand, but the writing has been on the wall for quite some time, ever since AMD bought ATI.

PC 3D gaming is nearly dead anyway in favor of consoles.

Road warriors: Get your MacBook Pro with matte screens, superdrives and separate graphics now before they are all turned into glorified glossy netbooks (MB AIRS) that can't do squat.

Netbook sales are predicted to increase greatly over the holidays this year.

This is the new trend: inexpensive, stripped-down, entirely machine-produced, with every component soldered to the logic board, wrapped in a plastic shell (or metal for premium Macs), and sold for a little less than older models but with a hell of a lot more profit margin.

Sad days of performance are upon us.

iMacs with integrated graphics are going to go over like a lead balloon.

I think Apple is going to rock the world with an iTablet/iMac combination. The tablet doubles as the monitor when it's in the iMac cradle and taps off the quad core/integrated graphics.

Don't expect to get any good 3D performance out of the cradle though.

The danger is that we sleepwalk into a world where cabals of corporations control not only the mainstream devices and the software on them, but also the entire ecosystem of online services around...

Well, Apple can use Intel chipsets... and put dedicated GPU cards on them.

Remember when the Mac mini used to cost less with a PPC and a dedicated GPU? Now with Intel/Nvidia chipsets it costs more. *Shrugs.* So what. GPUs that do way more and cost a 'nominal' amount are out there. Heck, the recent low-end ATI card is well under £100... and it's the kind of card Apple should have in a low-end consumer desktop...

I'd like Apple to stop taking the skinflint road on specs and truly put the GPU first. They've been lousy with GPU choice and value for money for freakin' years.

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

iMacs with integrated graphics are going to go over like a lead balloon

And what about iMacs with non-integrated graphics? Isn't that what the geeks around here have been demanding, saying that if Apple doesn't go that way, the company is doomed?

It is my understanding that what Intel is putting a halt to is the single-chip set that the local geeks don't like. Not that Apple couldn't put two chips, one Intel and one from whatever graphics maker they choose, into the systems.

So this is actually a potential forcing of Apple's hand and a win for you geeks (assuming Apple wishes to stay with Nvidia for graphics in one form or another).

Of course, this is a chipset that probably wouldn't go into any machines for another year, so by then the issue could be resolved.

I think you're confused about the difference between an integrated chipset and a graphics processing unit. Nobody is stopping Apple from picking ATI or nVidia to supply discrete GPUs.

What's happening is Intel is preventing anyone else from making chipsets for the Core i family of CPUs.

Intel never has and never will allow AMD (who owns ATI) to make compatible chipsets, because the two compete against each other in the CPU business.

In the past Intel did have a licensing agreement that permitted nVidia to make chipsets for Intel processors. What's in question now is whether that agreement covers all Intel processor and bus designs or whether, as Intel claims, it only covered processors that used the Front Side Bus design to communicate with a support chip commonly referred to as the northbridge.

Anyone wanting to use the old Core 2 series processors can either buy a matching Intel chipset or one from another company that makes them like nVidia. Apple chose nVidia because their chipset works well and includes an integrated GPU that's good enough for low to mid range customers. That saves both Apple and their customers money because one chip does it all.

Anyone wanting to use Intel Core i series processors is required to purchase Intel chipsets and then either live with truly awful Intel graphics or spend more money and buy separate GPUs from ATI or nVidia.

Intel felt a little insecure about their graphics *cough-slow* processors, and was beginning to feel threatened by the emerging power of the GPGPU. I understand why they'd want to squeeze out the competition (you can read a little more about it here: http://www.siliconmadness.com/2009/0...clarkdale.html), but I'm disappointed with how Intel's dealing with it, and they come off looking like a douche. Not to mention it hurts consumers. Intel should have picked up the slack long ago and developed more competitive GPUs.

nVidia would be foolish to spend millions of dollars developing new chipsets they might never be allowed to sell. It's far smarter to spend their time and money developing a new product.

So if they lose in court they'll already have a new business to take the place of the PC chipset one.

If they win in court they can decide whether it's worth it to go back into competition with Intel in the chipset business. They've proven in the past that they can make better chipsets than Intel and make good profit on them so it wouldn't take long to make up for lost time.

Or put AMD CPUs in bed with nVidia... Can you hear the screaming yet?

Nvidia and AMD used to be big partners, but NV hasn't done much work on modern chipsets for AMD since AMD merged with ATI. Supporting AMD would be supporting their chief graphics competitor, and so far they have refused to do it.

This will not affect anything Apple currently uses (the 9400M chipset). It means that Nvidia will not make any chipsets for future Intel processors (i3/i5/i7), something we already knew.

So, come next year, it will be back to Intel integrated graphics for the Mac mini and MacBook.

Or, the introduction of the MacBiggie, which has real upgradeable cards instead of discrete graphics. The MacBiggie will be 10x10x4, have 2x HDMI and quad-core for $1199. It's gonna be awesome. I saw it on the internet.

Wrong, actually... It's not nearly dead, or on the way out or anything.

Yes it is.

For example, all the retail game stores used to sell only PC and Mac games; now they sell only console games.

Computer processors have gone multi-core to reduce heat because clock speeds can't be pushed any faster. That doesn't lend itself well to advances in 3D game programming, since very few functions in a constantly changing game engine can be passed off onto other cores for a performance gain.

Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC to even come close to a console in performance. This means more people can afford a 3D gaming device, thus more games.


Steam seems to be doing fine

I don't play as many games as before, but there's more quality and less quantity these days, unlike consoles.

Reading through the thread, it seems some folks are confused about what is going on with the lawsuit. This graphic gives a decent outline of how Intel is changing its architecture with Westmere. IMO Intel is purposely leaving Nvidia out of the mix. The key point is that Nvidia will no longer provide a northbridge controller/GPU for Westmere.

The second question is how this affects Apple. Apple was using the 9400M to improve graphics performance in designs that didn't use discrete graphics, so we need to compare the next-gen Intel graphics against Nvidia's solution. Intel was licensing SLI from Nvidia, so any system with discrete graphics will see little change from the current situation.

The iMac and MacBook Pro have included the option for discrete graphics, so this really affects the Mac mini and MacBook of the future. If you're buying those machines, you probably aren't looking for a graphics powerhouse. If nothing else, there is a die shrink involved, though not all the way to 32nm: while Westmere processors will be 32nm, the graphics and other functions that used to be handled by the northbridge will be made on a 45nm process. The smaller transistors enable much higher performance. While the G45 had 10 shader cores, the new Intel GPU increases that to 12. A number of performance-limiting issues have now been resolved, so we should see much more competitive performance from Intel's graphics vs. the 9400M.

Quote:

For example, all the retail game stores used to sell only PC and Mac games; now they sell only console games.

As mentioned, PC games, and console games for that matter, are moving to digital distribution. The reason store shelves are filled with console games is because the used game market makes the retailers more money, and you can't resell a PC game.

Quote:

Computer processors have gone multi-core to reduce heat because clock speeds can't be pushed any faster. That doesn't lend itself well to advances in 3D game programming, since very few functions in a constantly changing game engine can be passed off onto other cores for a performance gain.

Less powerful than what, exactly? I'm sure a single Nehalem core can beat a single Xenon core any day of the week. Or a single core of any other consumer processor.

As I'm sure you know, even the initial Core Duo processors were faster per clock than the PPC970, and the SPEs of the Cell are certainly not comparable to a fully featured core. The graphics cards in the consoles are almost four generations old. Frankly, it's silly to expect a $200 box to be able to outperform a $1000 gaming PC.

Quote:

Computers are trending toward cooler, more portable designs with integrated graphics, like netbooks.

Perhaps so, but that is a different market segment. Why would a gamer trend toward a netbook? They're a cheap novelty.

Quote:

Computers are designed to do a lot of different things at once; a 3D console is designed to do one thing very well.

Not unless you're still living in 2002. The current trend is for consoles to act as media extenders and/or media hubs.

Quote:

A 3D gaming console can be had for a few hundred; it would take a few thousand for a gaming PC to even come close to a console in performance. This means more people can afford a 3D gaming device, thus more games.

Really? I can put together a computer for less than $500 that can beat today's consoles on performance, crappy console ports aside. Remember, the Xbox 360 and PS3 play at low to medium resolutions.


Quote:

Originally Posted by al_bundy

Steam seems to be doing fine

I don't play as many games as before, but there's more quality and less quantity these days, unlike consoles.

Several things here.

Steam has a very limited game selection and only offers the PC versions of games that had Mac versions. Also, there is the issue of the verification method Steam uses for those games.

MacTrippe is right in that console gaming has stomped PC gaming into the ground. Go to Best Buy, Hastings, GameStop, or even Walmart and compare the number and type of console games to PC games.
Go online and the reality that Sturgeon's Law is alive and well in the PC gaming market is obvious, especially if you look at the cheap-game market of Flash and game-creation programs. The good-to-crap ratio is just as bad in the PC world as on the console; you just don't see it as much in brick-and-mortar stores because they go with what sells. So you get a mixture of hits with the occasional 'why did they make this' thrown in.