The Inquirer, which has been sticking it to NVIDIA for some time, is again taking the chip maker to task this week, claiming that it's supplying Apple with graphics chips for the new MacBook Pro that use materials that have caused problems in the past.

Following its own investigation into the matter, the British technology tabloid concluded that the dedicated NVIDIA 9600M GT graphics chips in the unibody MacBook Pros use the same non-eutectic solder contact bumps as the GeForce 8400M and 8600M family of chips.

Those chips proved prone to long-term heat damage, which led to notebook recalls from Apple, HP, and Dell, and caused NVIDIA to take a $200 million charge back in July. In those cases, the bumps, or tiny balls of solder, that hold the chip to its circuit board would crack under thermal stress and render the hardware defective.

According to NVIDIA documentation, non-eutectic solder contacts (or "bad bumps") consist mostly of lead (95%) with a bit of tin (5%), a mix that can clump as the material cools. By contrast, eutectic solder contacts (or "good bumps") are made of roughly 63% tin and 37% lead, a composition which cools more uniformly and produces a more consistent grain that's not prone to the same long-term heat damage.
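The practical difference between the two alloys can be made concrete with a short sketch. The melting figures below are approximate, widely published handbook values (not from the article): the 63/37 tin-lead mix is a true eutectic, solidifying at a single temperature with a uniform grain, while the 95/5 high-lead mix freezes over a range, which is what allows the material to segregate, or "clump," as it cools.

```python
# Rough comparison of the two solder-bump alloys described above.
# Melting points are approximate handbook values for illustration only.
alloys = {
    "high-lead (bad bumps)": {"Pb": 0.95, "Sn": 0.05,
                              "solidus_C": 308, "liquidus_C": 312},
    "eutectic (good bumps)": {"Pb": 0.37, "Sn": 0.63,
                              "solidus_C": 183, "liquidus_C": 183},
}

for name, a in alloys.items():
    # A eutectic alloy has no gap between solidus and liquidus: it
    # freezes all at once, so the grain stays uniform.
    melt_range = a["liquidus_C"] - a["solidus_C"]
    print(f"{name}: {a['Pb']:.0%} Pb / {a['Sn']:.0%} Sn, "
          f"freezes over a {melt_range} degC range")
```

The zero-width freezing range of the eutectic mix is the whole point of the "good bumps" label: there is no window during cooling in which one metal can separate out.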

Since the bumps sit permanently sandwiched between the chip die and the green fibreglass package, the only way to determine their composition would be to take a MacBook Pro, disassemble it, desolder the chips, saw them in half, encase them in Lucite, and run them through a scanning electron microscope equipped with an X-ray microanalysis system.

The Inquirer claims to have done just this with a MacBook Pro it bought off the shelf in California shortly after the systems were announced in mid-October. It was reportedly aided by a team of unnamed scientists who have access to the multi-million dollar tools required to properly examine the chips.

A profile of the materials found in the bumps used on the MacBook Pro's NVIDIA 9600M GT shows a huge spike of lead and only a tiny spike representing tin, leading the Inquirer to conclude that the chip "is unquestionably using bad bumps." The same test run on the MacBook Pro's second, integrated NVIDIA 9400M graphics chip turned up a profile consistent with eutectic solder contacts, so that chip is said to be free of the issues that may plague the 9600M GT.
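The way such a spectrum is read can be sketched as follows. X-ray microanalysis (commonly energy-dispersive, or EDX) records peaks at energies characteristic of each element; the line energies below are approximate standard reference values, but the "measured" spectrum is invented here purely to illustrate the article's described result, not taken from the Inquirer's data.

```python
# Toy EDX peak identification: label measured peaks (energy in keV,
# intensity in counts) with the nearest known characteristic X-ray line.
# Line energies are approximate reference values; the spectrum is made up.
LINES_KEV = {"Pb Ma": 2.35, "Sn La": 3.44}

def identify(peaks, tolerance=0.05):
    """Map each (energy, intensity) peak to the nearest known line."""
    labeled = {}
    for energy, intensity in peaks:
        for line, ref in LINES_KEV.items():
            if abs(energy - ref) <= tolerance:
                labeled[line] = intensity
    return labeled

# Hypothetical spectrum resembling the article's description:
# a huge lead spike and only a tiny tin spike.
result = identify([(2.34, 9800), (3.45, 350)])
if result.get("Pb Ma", 0) > result.get("Sn La", 0):
    print("dominant element: lead -> consistent with a high-lead bump")
```

Real EDX software does quantitative composition estimates with calibration standards; this sketch only captures the qualitative "big lead peak, small tin peak" reading described in the story.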

A profile of the bumps on the MacBook Pro's integrated 9400M (left) compared to that of its discrete 9600M GT counterpart (right).

The Inquirer suggests that the bad bumps used on the 9600M GT are causing a problem on the new MacBook Pros often referred to as the "black screen of death." The issue, which manifests itself when the systems heat up during game play, sometimes causes the notebooks' screens to go black after just a few minutes of gaming, while the systems lock up and the audio enters into an infinite loop. Apple is said to be investigating the problem.

For its part, NVIDIA has vehemently denied the Inquirer's assessment, claiming that the "GeForce 9600 GPU in the MacBook Pro does not have bad bumps. The material set (combination of underfill and bump) that is being used is similar to the material set that has been shipped in 100s of millions of chipsets by the world's largest semiconductor company."

A representative for the chip maker had not responded to AppleInsider's individual request for comment as of press time.

We've had this problem twice at work with the early 2008 17" MacBook Pros. The Nvidia chips in them constantly give the BSOD. I get it most of the time when I step away from my computer and my computer freezes either going in or coming out of sleep. One of our MBPs was completely taken out of commission for the issue. Sad to hear that it plagues their new MBP too...

Seriously though, I would think Apple did their homework on the 9600 before using it... "Fool me once," etc. Either way, it's one more reason to go with the smaller MacBook unless you absolutely need a "Pro" (which isn't really a Pro product anymore anyway).

My early '08 MB Pro (that I just got about four weeks ago on Amazon) has an nVidia 8600M GT graphics chip in it, which would seem to be one of the defective ones. My early '08 MBP *is* from the end of the production run, though. You'd like to think the prob got fixed by then.

Hmm... even so, guess I should consider getting the Apple Care if I still can, huh?

I think $350 is an awful lot of cheddar, tho'. 25% of the cost of the machine.

Reading the story, the journalists do seem to have done their homework this time out. I don't have a problem with the source.

...

Yeah. We are careful in selecting stories from certain publications, but the Inquirer has followed the NVIDIA story pretty well and their evidence in this case is clearly sound. NVIDIA has also not been very forthcoming on subjects similar to this, and I can attest to that first hand. They have every opportunity to tell their side of the story. In addition to contacting them this morning, they know us well and how to get in touch with us. If they issue comment, I will have the story updated. I have a feeling they will not, however, citing the privacy of their client (Apple). This was the response I was given while researching the problems with the MacBook Pro's 8600 chip, many weeks before Apple ultimately confirmed the problem independent of NVIDIA.

Apple may well have known that the 9600M GT still uses the old bump material, but may have concluded that it was an acceptable risk. The 9600M GT is produced on a 65nm process instead of 80nm like the 8600M GT, so its heat output should be lower, reducing the risk of fracture. There may also have been tweaks made between generations to mitigate the problems even if the same material is used. I believe the new unibodies also generally run cooler than the old MBP. Combined, the risk may still be there, but may be close enough to tolerances that Apple and nVidia felt no additional measures were needed.
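The die-shrink part of that argument can be quantified very roughly. Feature area scales with the square of the process dimension, so the same design moved from 80 nm to 65 nm occupies about two-thirds the silicon area, which (all else being equal, a big assumption, since voltage, leakage, and clock speed all matter) is what drives the "runs cooler" expectation:

```python
# Back-of-envelope die-shrink scaling for the 80 nm -> 65 nm move.
# This shows only the idealized area ratio; real power and heat depend
# on voltage, leakage, and clock speed, which are ignored here.
old_nm, new_nm = 80, 65
area_ratio = (new_nm / old_nm) ** 2
print(f"area ratio: {area_ratio:.2f}")  # roughly two-thirds the area
```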

Or maybe nVidia tried to pull a fast one on Apple (and others), because nVid didn't want to eat the (pretty tremendous) cost of fixing the issue.

That wouldn't surprise me at all, particularly if the costs of fixing it would jeopardize nVidia's continued existence, as opposed to merely its bottom line. That, however, is still not really a good excuse. If you effed up, you should admit it as soon as you're aware of it, and fix it.

No doubt nVidia's lack of candor on the issue is going to really hurt them going forward. Who's going to trust them anymore?

While it may be in nVidia's interest to keep it quiet from the public, I can't see them withholding that information from Apple. I thought Steve Jobs publicly said they are supporting nVidia's platform and are going to use it in future models. It's almost a certainty that Jobs extracted guarantees from nVidia that this won't blow up in his face before he gave his support, especially given the prior history of the 8600M GT. Just like Jobs supposedly told Motorola during G4 negotiations that he couldn't wait for Apple to never have to work with them again, nVidia would probably perceive Apple pulling away from them and going completely ATI as a real threat, and given Apple's media influence, it'd certainly be a PR catastrophe. If bump material is still an issue, I can't really see nVidia keeping it from Apple and risking Steve Jobs' wrath.

That's what bothers me: why would they manufacture all these chips while knowing that they have "bad bumps" and running the risk of losing another few million? Or did they have to rush it out so Apple could announce it on time?
Either way, I'm really disappointed by the way Apple didn't notice this and how nVidia simply keeps lying about it...

It's possible that nVidia wasn't aware of just how bad the problem was at first.

Or perhaps they were worried that admitting it and disclosing fully would kill the company, and thought they could somehow skate by, hoping that the problem might end up being not so bad. Companies make strategic calculations like that all the time. Sometimes it even pays off for them. I'm thinking of Ford Explorers and SUV rollover deaths, for example. That wasn't entirely caused by defective tires, but rather, also by Ford's design of the vehicle. But even so, Ford stonewalled, and pretty much got off scot-free.

I dunno, there are a lot of "what ifs" here, but if I were Steve, I'd be taking a long hard look at ATI graphics chips right now. If nothing else, nVidia does not seem to be able to get it together.

Kasper, you (and The Inquirer) have missed a much more serious consequence of this story if it's true. Lead solder is banned in Europe as part of RoHS, so if the report is accurate then Apple would be breaking the law in several European countries (any country that's a member of the EU) by selling this system there.

Given that Apple have stated that the systems are RoHS compliant I highly doubt the accuracy of The Inquirer's report.

That would be a good question for Kasper to put to the Inquirer. He did say that he and the Inquirer could get ahold of each other.

Edit- But wait a sec... RE-READ the article:

The Inquirer claims to have done just this with a MacBook Pro it bought off the shelf in California shortly after the systems were announced in mid-October.

Maybe that clears it up? I assume lead solders are still legal in the US? Though I don't know that one off the top of my head.

Kasper, you (and The Enquirer) have missed a much more serious consequence of this story if it's true. Lead solder is banned in Europe as part of RoHS, so if the report is accurate then Apple would be breaking the law in several European countries (any country that's a member of the EU) by selling this system there.

Given that Apple have stated that the systems are RoHS compliant I highly doubt the accuracy of The Enquirer's report.

There are exemptions where there is no suitable alternative, and I would say this is why they can still use lead.

Quote:

Yeah. We are careful in selecting stories from certain publications, but the Inquirer has followed the NVIDIA story pretty well and their evidence in this case is clearly sound.

Frankly, I don't buy the idea that the evidence is strong. What's more, I'm not convinced that we know exactly what the problem was with the Nvidia chips. Lots of rumors, sure, but where are Apple's service reports and defect investigations?

Quote:

NVIDIA has also not been very forthcoming on subjects similar to this, and I can attest to that first hand. They have every opportunity to tell their side of the story. In addition to contacting them this morning, they know us well and how to get in touch with us.

Hopefully you can see why they wouldn't want to get into public discussions of this sort.

Quote:

If they issue comment, I will have the story updated. I have a feeling they will not, however, citing the privacy of their client (Apple). This was the response I was given while researching the problems with the MacBook Pro's 8600 chip, many weeks before Apple ultimately confirmed the problem independent of NVIDIA.

This is pretty much standard operating procedure for companies that want to keep their customers.

Quote:

Best,

K

Some of the problems I have with this reporting are listed below:

1. I'm not sure how any current production computer can have lead-based solder in its construction and be sold in the EU. RoHS would seem to prevent that, but there could be exemptions.
2. Tin-based solders have their own issues, including the growth of whiskers. The whole electronics industry has had to deal with problems associated with the move to lead-free solders, and frankly it has been a step backwards. One of the biggest complaints about RoHS was the mandating of lead-free electronics before suitable replacements had been found.
3. Something as nasty as a process issue can lead to bad solder joints. Often these are referred to as cold solder joints and can cause issues under thermal stress, vibration, or other conditions.
4. Solder joints can be thermally stressed by trying to pass too much current through them. A thermal failure of a solder joint does not automatically imply bad solder. All joints have mechanical and thermal limits outside of which reliability becomes an issue.

In any event, enough with the list! What one needs to know is whether Apple's products are RoHS compliant or not. If they are, I can't imagine much lead being in the machine.

I believe the OP was making a joke... the Enquirer versus the Inquirer. As in the National Enquirer.

Quote:

Originally Posted by Kasper

Yeah. We are careful in selecting stories from certain publications, but the Inquirer has followed the NVIDIA story pretty well and their evidence in this case is clearly sound. NVIDIA has also not been very forthcoming on subjects similar to this, and I can attest to that first hand. They have every opportunity to tell their side of the story. In addition to contacting them this morning, they know us well and how to get in touch with us. If they issue comment, I will have the story updated. I have a feeling they will not, however, citing the privacy of their client (Apple). This was the response I was given while researching the problems with the MacBook Pro's 8600 chip, many weeks before Apple ultimately confirmed the problem independent of NVIDIA.

Normally you do need to take what the Enquirer says with a grain of salt, but if I recall they were the ones that really broke the 8000-series issues and continued to follow up on the huge scope, including the fact that it affects certain desktops with integrated graphics. HP only confirmed this (for my Slimline, by extending the warranty) after the Enquirer broke that as well.

So yes, I would take them at their word about this issue, and it's what caused my reaction and promise not to buy the new MacBooks, as I wanted a more powerful Intel or ATI chip. I'll stick with my white 2.4GHz MacBook for now. Also, the fact that it didn't have FireWire has caused me to criticise. Hopefully the next edition will be one colour (not aluminum and black, just black or aluminum), drop Nvidia, and add FireWire. I can dream, can't I... (And I do realize that only the Pro is affected SO FAR.)

I'm sorry, but you don't reward a supplier with quality issues by giving it an even more lucrative contract for your whole chipset. Apple will reap the consequences of this bad judgement.

Engadget also claims to have gotten word that some users are experiencing graphics issues, which further cements this as a reality.

I've been saying for some time here that the main reason why Apple has been underclocking their GPUs in certain machines is because of this Nvidia problem, which, by the way, has been known about for well over a year, and likely started before then.

Understand that this problem first came to light with HP's laptops. Nvidia at first claimed that it was just a short production run affecting those 20,000 machines, but it was later seen that this wasn't the case.

Note that many machines from other manufacturers that are using Nvidia's chips have had actual meltdowns from this, while Apple's haven't.

I've been hoping that Apple would have switched to ATI chips during this time, but they didn't.

I assumed that by underclocking by the amount they did, they would have avoided the heat problems.

Nvidia's chips made since late October are supposed to be free from this problem.

Quote:

Frankly, I don't buy the idea that the evidence is strong. What's more, I'm not convinced that we know exactly what the problem was with the Nvidia chips. Lots of rumors, sure, but where are Apple's service reports and defect investigations?

The evidence is strong that all of Nvidia's chips were manufactured with this older technology for some time, and that this only recently stopped.

Quote:

Hopefully you can see why they wouldn't want to get into public discussions of this sort.

Nvidia has been denying this whole thing from the beginning.

Nvidia's been protecting themselves. It has nothing to do with their clients, which have been demanding hundreds of millions in compensation. Nvidia had to put hundreds of millions into a fund earlier this year for those very payments, and the financial debacle resulting has caused them much investor grief.

Quote:

This is pretty much standard operating procedure for companies that want to keep their customers.

A much better way is not to go the cheap route in manufacturing, which is what they did in the first place, and not to try to contain knowledge of the disaster rather than just coming out and admitting it, which they also did.

Quote:

Some of the problems I have with this reporting are listed below:

1. I'm not sure how any current production computer can have lead-based solder in its construction and be sold in the EU. RoHS would seem to prevent that, but there could be exemptions.

Some parts production is grandfathered in, until the parts are superseded by new ones.

Quote:

2. Tin-based solders have their own issues, including the growth of whiskers. The whole electronics industry has had to deal with problems associated with the move to lead-free solders, and frankly it has been a step backwards. One of the biggest complaints about RoHS was the mandating of lead-free electronics before suitable replacements had been found.

Whisker growth is not a problem for these connections. It's mostly a problem at far smaller nodes, such as those on the chip itself, which is why manufacturers have moved to gold-plated wiring rather than the older tin plate.

Quote:

3. Something as nasty as a process issue can lead to bad solder joints. Often these are referred to as cold solder joints and can cause issues under thermal stress, vibration, or other conditions.

The problem here is that the tin and the lead have different rates of expansion as the temperature rises, leading to breakage of the connections. It's well known that one should never mix lead-based solder with tin bumps. This was a major error.

Quote:

4. Solder joints can be thermally stressed by trying to pass too much current through them. A thermal failure of a solder joint does not automatically imply bad solder. All joints have mechanical and thermal limits outside of which reliability becomes an issue.

That's a different problem. No one is saying that too much current is present. The problem is the point heat generated at the solder joints. It isn't too hot per se; the problem is that the tin softens earlier than the higher-temperature lead. Normally this wouldn't be a problem, as the tin from the chip would soften by the same amount (I'm not talking about softening to the point of melting, just a lowering of tensile strength), but the lead remains stiff. As the tin bumps on the sub-board move, the lead on the chip doesn't, resulting in a break.
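The mismatch mechanism being described can be sketched with a simple thermal-strain calculation. The expansion coefficients below are approximate handbook values for the pure metals (real solder alloys differ), and the 60 K temperature swing is an invented illustration of an idle-to-gaming-load cycle:

```python
# Differential thermal expansion between two joined materials.
# CTE values are approximate handbook numbers for the pure metals in
# ppm per kelvin; actual bump alloys and package materials differ.
CTE_PPM_PER_K = {"tin": 22.0, "lead": 29.0}

def mismatch_strain(mat_a, mat_b, delta_t_k):
    """Dimensionless strain difference between two materials over a
    temperature swing of delta_t_k kelvin."""
    d_cte = abs(CTE_PPM_PER_K[mat_a] - CTE_PPM_PER_K[mat_b]) * 1e-6
    return d_cte * delta_t_k

# A hypothetical 60 K swing across a tin/lead interface: each heat/cool
# cycle imposes this strain on the joint, fatiguing it over time.
strain = mismatch_strain("tin", "lead", 60)
print(f"mismatch strain per cycle: {strain:.1e}")
```

The absolute number is small, but fatigue cracking comes from repeating the cycle thousands of times, which is why the failures described in the thread appear only after long-term use.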

You keep referring to the MB's. The story references the MBP's. Are they both affected? I assumed the MBP would be the only one to be affected?

Sorry, I rambled really about both the MB and the MBP. And if you were to READ it entirely, I did explicitly state that only the MBP had reported issues so far. Though I have my worries about Nvidia in general, and that's a reason for me avoiding the MB (together with the FireWire issue).

I guess in my mind the MB and MBP are really the same, since they look the same, have the same CPUs (in the high-end MB), same chipset, etc... But yes, it's only the dedicated graphics, not the integrated, that is having issues. Though I don't really trust Nvidia... especially not after this. My post was more what I wish Apple had released as a new MacBook, preferably using ATI.

Oh no. ATI is dropping their implementation of CTM to use OpenCL instead, whereas Nvidia will likely be using it on top of their own CUDA.

I'm not talking about OpenCL. I'm talking about inclusion of the chips themselves. If Apple were to request certain things of ATI with respect to inclusion of the chip on a motherboard they may not be as likely to get what they want because of not using AMD.

While I'm at it, I'll mention that Apple has *always* underclocked their GPUs. Even on the PowerPC, and even ATI chips as well. My Power Mac G5 dual 2.0 GHz came with the ATI Radeon 9800 and it was underclocked relative to the PC version.

Quote:

I'm not talking about OpenCL. I'm talking about inclusion of the chips themselves. If Apple were to request certain things of ATI with respect to inclusion of the chip on a motherboard they may not be as likely to get what they want because of not using AMD.

I can't agree with that at all.

First of all, it's much more likely that ATI would respond favorably than Nvidia, as AMD is in such desperate straits these days and will be trying to get all the business it can. ATI right now is the light of the company, whereas it was originally thought to be its anchor.

Secondly, there's no evidence that Nvidia is making any changes to its chips for Apple that it isn't making for anyone else, which is to say, just about none.

I am not inciting a riot over it; I just happened to receive my new 2.4 GHz aluminum MB the other day. I was like, WTH??? I had the first-generation Intel MBP that had the noise problem that took two trips to Apple to fix before they found the cause was a faulty part. So I am sensitive to this.

So, I guess my question at this point is as follows: If Apple doesn't show an inclination to switch GPU manufacturers when will it be okay to upgrade to the new unibody MBP and feel reasonably secure I'm not purchasing inherently defective hardware?

I wonder why they used a scanning electron microscope equipped with an X-ray microanalysis system. With what I have access to, a mass spec with a mass analyzer is what I would have used; cheaper than SEM with X-ray. But hey, I guess if you have it and have the cash to waste, have fun with it.

There's a reason AMD kept the ATI name when it bought out the graphics company, and that's because it needs every sale it can get. Intel quite happily lists "ATI Crossfire" among the features of its chipsets; they don't care that the company is owned by their only competitor.

AMD doesn't make chipsets for Intel processors (although thanks to ATI they have a license to do so), so you're obviously not going to see ATI's integrated graphics in any Macs, and that was Apple's focus with the MacBook line. But Apple has always tried not to play favorites with graphics in the past, and I don't see that changing. Let's see what GPUs get put in updated iMacs and Mac Pros before we declare Apple's relationship with ATI dead.

You'd have to ask an Apple engineer why they always underclock GPUs, but I am 99% sure it's for noise reasons. Lower clock = less heat = slower fans = less noise, and Steve is a freak about noise.