Posted
by
simoniker
on Thursday April 08, 2004 @03:38AM
from the blooming-silicon dept.

crem_d_genes writes "According to the San Jose Mercury News, Intel is planning microprocessors that have a reduced amount of lead in them (reportedly 95% lower). It's about time a company started this - good job - and let's hope other tech companies take the hint. While many places in the US have banned the disposal of computer parts, there have been unintended consequences of these eco-friendly laws. Many 'recycled' computers currently get shipped overseas, where parts eventually make their way into the hands of workers who usually 'burn' the parts to get rid of plastic and recover small amounts of valuable metals. In the process they are exposed to the toxic compounds that are released. In other cases, lead makes its way into drinking water."

Semiconductor fabs have a truly colossal ecological footprint; good thing what they make is worth more than gold. They consume tremendous amounts of water and energy, to say nothing of the photoresists, acid baths, and slag from the parts of the ores that aren't used. There are no doubt a lot of computers, but if you want to make an impact, invent a lightbulb that costs the same or less, can be adapted to all the same fixtures, lasts longer, and uses half as much power.

Read the title for your reduced-power-consumption chips, which should be hitting the desktop within a few months or so. Banias is just wiping the floor with any competitor, battery-life- and speed-wise, and its greatest competitor is actually Intel itself: those god-awful Celeron notebooks with 30-minute battery lives. But what's cheap usually outsells what's new.

I fully believe that the Pentium V (Pentium 5, whichever they choose to call it) will be Dothan, introducing power-saving logic to the desktop for the first time. Not only does this make sense for quieter, smaller computing (two of the biggest buzz factors on the market right now; those micro cases and motherboards are selling like wildfire), it makes for cheaper, faster computing. I believe that the clunky Pentium 4M will be dropped altogether, and the Pentium 4 technology (Tejas, the last NetBurst architecture chip I know of) will be integrated into the Xeon line to run head-on against the AMD64 chips (hence the reason they're adding the x86-64 extensions to that processor).

I think power consumption has always been a large factor. You can't increase switching speed and transistor count without either utilizing a LOT more power, or making the process more efficient.

Now granted, the "need for speed" in recent years has caused the net effect to be higher power consumption, but if per-transistor consumption were anywhere near the level of older processors, combined with the transistor count and switching speed of current processors, we'd need some very serious heat dissipation.

Right now, speeds are fast enough that raw clock speed isn't as much of a concern for consumers any more. Even I don't feel the need to upgrade at this point, as the gains would be minimal. Any machine you buy new today will be more than sufficiently fast for what most users do, and most of them will play current and near-future games with ease.

So the push is now back to power consumption, just like when all the "Energy Star Compliant" stuff first started appearing. Notice that most PC companies are making much quieter PCs, giving them smaller and more stylish designs, etc. -- rather than chasing the fastest available CPU. Lower power consumption falls in line with this, especially with making PCs quieter (less power means less required cooling, smaller power supplies, and ultimately smaller and quieter PCs).

It's all a matter of what's going to sell at a given moment. If we required more CPU speed, it would be delivered, power consumption be damned. Since we don't really need more speed, focus can go to power consumption and efficiency.

This is insightful? Somebody needs to learn to read a bit more carefully.

"Right now, speeds are fast enough..."

Note the use of the qualifying term. He's not indicating that nobody will ever need a faster processor, but that for most everyday uses computers are fast enough, and he has a point. Sure, there are some folks out there for whom instantaneous won't be fast enough, but as it is, until the next must-have, push-the-envelope app is unleashed on the masses, current computer speeds are good enough for most.

In fact, I feel this should be the *first step*. There are huge amounts of lead in many other products, making this a relatively small improvement when viewing "the big picture". However, power requirements have skyrocketed since the computer became a common household device. Laptop tech shows it's possible, why not apply that in desktops as well?

This was the first thing I thought of as well; it is completely insane the amount of power some CPUs are using now. Granted, there are chips which are efficient, but there should be steps taken by all those designing hardware to decrease the power consumption. I mean all components in a computer: hard drives, video cards, etc.
But not all blame should be placed on the hardware companies. Those designing software could be more efficient in their usage of the processor. I think most people have become a

Q1. Do Intel products contain lead?
A1. Yes, most of our products contain lead in very small amounts. The use of lead in very small quantities in electronic products is ubiquitous. Lead is found throughout electronic components, component packaging, printed circuit boards, and other products. Intel estimates that approximately 90% of all electronic components contain some lead -- mostly due

Not all do. It's a marchitecture thing: Intel wanted high frequencies no matter what. As a result we have processors which do less work per clock cycle, with huge pipelines and high power consumption.

Not all x86 processors have this issue. Via's C3 is miles away from Intel's Pentium 4. AMD is also somewhat better than Intel, and Intel's own Banias (Pentium M) is also rather low-power.

The problem is, Intel's been brainwashing the public that YOU WANT A COMPUTER WITH MANY MANY GIGAHERTZ for so long now that they're more or less stuck with high power consumption until they have time to create a whole new architecture.

The problem is, Intel's been brainwashing the public that YOU WANT A COMPUTER WITH MANY MANY GIGAHERTZ for so long now that they're more or less stuck with high power consumption until they have time to create a whole new architecture.

Actually, Intel is moving away from measuring chip speed by GHz. Wired just had this article [wired.com] about it.

Basically, Intel is a couple of years behind AMD, which is now using numbers like 2300+ to describe chip speed.

AMD may be using numbers like 2300+ to describe the speed, but in the end, when a person goes to their local Walmart, or Dell.com, or wherever they go to buy their next PC, they're only going to look as far as "hmm, 2300+ is bigger than 2200+". They're not even going to know what the actual speed difference is, because they don't care to know, just as long as what they're getting is faster. GHz IMO is at least a little more honest when it comes to Intel processors, because the IPC (instructions per clock) shouldn't change all that much from a 2.0GHz CPU to a 2.2GHz CPU, whereas the instructions per clock on a 2600+ CPU can be drastically different from that of a 2700+ (in fact, it can be a whole different core). The same holds for the Pentium 4s as well. Damn, we just need a standard... someone, anyone, PLEASE!?

Benchmarks. In effect, AMD's 2300+-style rating is based on independently audited benchmarking, which is then normalised to the Intel CPU speed.

IT purchasing is notoriously independent of standards, and it is not just clock speeds. We see jokes such as 500W PC speakers (supplied with a 20VA transformer) and the ubiquitous use of 'X' (4X AGP, 56X CDR).

Standards exist but, apparently, buying a PC is more of an emotional experience than a scientific one!

Sadly, Intel isn't Apple. Intel's never been about marketing their product based on how well it's completed a benchmark; just how fast it can clock is what matters to most consumers, because it's a big, flashy number, and big flashy numbers are quite distracting.

Apple got it right by using Benchmarks to sell their product, even if the benchmarks are strange and deceptive. Hey, lying, cheating, and stealing are what got Microsoft to the top, everyone's gotta play a little dirty.

Ever since the more-watts-equals-more-power advertising race started, people started looking at just the watts, not how they were measured. In the beginning, true RMS watts per channel was the standard. It specified how much distortion of a sine wave was permitted, such as 1%, 0.1%, 0.005%, etc., into a specified resistive load such as 4 or 8 ohms.

Some smart advertiser found that if they take all the channels of a 2- or 4-channel amplifier, ignore low distortion (square-wave clipped output is OK), and list the combined power delivered across all channels, they get one much more impressive number.
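To make the inflation concrete, here's a minimal sketch of the arithmetic. The voltage swing, load, and channel count below are made-up illustrative figures, not specs from any real amplifier; it just contrasts an honest per-channel RMS rating with the summed, clipping-tolerant "marketing watts" described above.

```python
import math

def rms_watts(v_peak, load_ohms):
    """Continuous (RMS) power of an undistorted sine into a resistive load."""
    v_rms = v_peak / math.sqrt(2)
    return v_rms ** 2 / load_ohms

# A modest amplifier: 12 V peak swing into an 8-ohm speaker.
per_channel_rms = rms_watts(12, 8)      # 9.0 W RMS per channel, honest rating

# The marketing version: drive the output into square-wave clipping
# (roughly doubling the delivered power, distortion be damned)...
per_channel_clipped = 12 ** 2 / 8       # 18.0 W of heavily distorted output

# ...then sum all four channels into one big number for the box.
advertised = 4 * per_channel_clipped    # "72 watts!"

print(per_channel_rms, advertised)
```

Same hardware, and the advertised figure is eight times the honest per-channel RMS number.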

GHz IMO is at least a little more honest when it comes to Intel processors, because the IPC (instructions per clock) shouldn't change all that much from a 2.0GHz CPU to a 2.2GHz CPU, whereas the instructions per clock on a 2600+ CPU can be drastically different from that of a 2700+ (in fact, it can be a whole different core).

And therein lies the major problem with GHz-based speed comparisons. As long as you're dealing with the same core (which is not the same as the processor name, i.e. "Pentium 4"), speed will scale rather linearly with core clock (ignoring bus speeds, etc.).

But you simply can't compare an N-GHz processor with core X to an N-GHz processor with core Y. The problem is, there really is no objective measurement system, as of yet, anyway.
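A rough model makes the point: sustained performance is approximately instructions-per-clock times clock rate, so clock alone only ranks chips that share a core. The IPC figures below are invented for illustration and don't correspond to any real processor.

```python
# Toy model: performance ~ IPC x clock (GHz), i.e. billions of
# instructions retired per second. IPC values here are made up.

def perf(ipc, ghz):
    """Approximate sustained throughput for a core."""
    return ipc * ghz

deep_pipeline_core = perf(ipc=0.9, ghz=3.0)   # high clock, low work per clock
wide_core          = perf(ipc=1.6, ghz=2.0)   # lower clock, more work per clock

# Within one core design, performance scales with clock...
assert perf(1.6, 2.2) > perf(1.6, 2.0)

# ...but across core designs, the "slower" 2 GHz part wins.
print(wide_core > deep_pipeline_core)
```

Which is exactly why a single objective cross-core number is hard: you'd have to agree on a workload to measure IPC against in the first place.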

Actually, Intel is moving away from measuring chip speed by GHz. Wired just had this article about it.

Basically, Intel is a couple of years behind AMD, which is now using numbers like 2300+ to describe chip speed.

The difference here is this: AMD's numbers were intended for comparison with a P4; for example, an Athlon 2600+ is supposed to be roughly equal to a P4-2.6 GHz. And to AMD's credit, most benchmarks showed that they were quite generous to Intel.

Intel designed the P4 to do less work per clock, but at a much higher potential clock. Thus even a P-III would out-do a P4 for the same clock frequency. Whether this was a marketing decision or not, I don't know...

Point being, Intel is getting away from clock-speed ratings for different reasons. I personally think that it's because demand has gone down significantly. Computers are today more than fast enough for almost everything the average user wants to do. Even I don't really need a faster machine at this point, and I write software...

So the market isn't going to be driven by faster CPUs. Most of my family won't buy a new PC based solely on that. But if the new machine was smaller, quieter, and more power-efficient, that might be incentive to upgrade (again, even I would probably go for that if it were at least as fast as my current PC).

It's all about market demand. For the last few years, consumers demanded faster CPU speeds; this has changed, and the smarter companies in this industry are changing as well.

Actually, while they were clearly intended for comparison with Intel, AMD said they were for a Thunderbird (IIRC) of equivalent megahertz. And the benchmarks were often less than generous [tomshardware.com], though the newer Athlon 64 seems to be doing much better than equivalently rated Prescotts.

Anyways, if Intel can get away from clock-speed ratings, I hope it can get away from 100 watt processors. Where are the quiet and efficient Pentium M desktop systems? Some companies [radisys.com] are designing motherboards for them, but ther

I think we are seeing this trend a bit more than before. Where a few years ago, speed was the all-important marketing factor (consumers wanted faster CPUs), these days computers are fast enough for most.

Look at a typical HP or Dell (or even e-Machines) people buy these days. My cousin's HP Pavilion has a DVD+/-R, CD-RW, 80 GB disk, fast P4 etc -- yet is a very quiet and small machine. There's a shroud over the CPU leading to a case fan (there is also a separate CPU and PSU fan; some Gateways from a couple

I agree. While there will always be a need for more computing horsepower, consumer PC purchases in the near future will be largely based on size, style, and noise, and processor horsepower will be a secondary concern.

It is similar to the muscle car days of the 1960's and 70's - everyone was wanting more power, more speed. They got what they wanted, but there was a sacrifice of handling, fuel consumption, etc. Then we saw a shift in the 80's and 90's to the econoboxes. Now for many consumers, the look at

I'd fathom that Intel would love for their chips to use less power, but they are more concerned w/ the MHz race. So when they design their latest chips, power/heat are thought about, but aren't given as much concern as Transmeta may give those issues. Notice how the newest Transmeta chip is only about 1GHz. Also notice how long Intel has been at speeds greater than 1GHz (~3 years). So companies like Transmeta spend a lot more time looking into power/heat issues, and it shows. They seriousl

I'd fathom that Intel would love for their chips to use less power, but they are more concerned w/ the MHz race...

I agree to an extent, but you have to realize how much thought must go into power consumption when you increase speed and transistor count. To get a higher clock frequency, and to pack more transistors on the CPU, you must lower the per-transistor power consumption quite significantly.

The difference, of course, is that Intel's market is mostly performance machines, where power consumption is secondary.

8080 processor, actually. The 8086 is a 16-bit extended version of the 8085, which is an enhanced 8080 meant as an answer to Zilog and their Z80, and the 8080 and 8085 should be considered the only 8-bit x86s. I didn't include the 8008, because I don't know if the 8080 is code-compatible with it. If so, the x86 family should also include the 8008.

One day in class, the wacky department head at the engineering school I went to told us a little story from his youth. He said that while in chemistry class he discovered that if you dip a nickel (a 5 cent piece for international/.ers) into a pool of mercury it gets very shiny. And if you put it in your mouth, it tastes funny! That story definitely explained why the fellow was almost as mad as a hatter.

They're not doing it out of the kindness of their hearts. Some countries (Japan) are phasing in laws requiring that chips be made lead-free. Otherwise, they can't be sold there.
A Pb-free chip only costs 1-3 dollars more than otherwise, in my experience... (consumer electronics ASICs)

A Pb-free chip only costs 1-3 dollars more than otherwise, in my experience... (consumer electronics ASICs)

I believe your experience is quite different than designing billion-transistor, ultra-high-clock chips, though. When they make a major structural change (such as material type or die size), it affects all aspects of the process.

I'm not an engineer, so I don't know how much of an impact this particular change makes, and I don't know if it will increase the ultimate cost at all (who knows - material cos

Okay, I thought I'd sit back and moderate on this one, but I'm already tired of reading the garbage.

Ever wonder why Intel's not been cranking out Prescott cored processors that run even faster/hotter? Is it because they couldn't just bolt a jet engine and a copper block to the thing and ship it? No. It's because they're shifting their attention (once again).

AMD fanboys, listen up: Yeah, you guys are winning the strongarm race right now. You've got the faster middle-class processor (upper-end desktop/lower- to medium-end server) and Intel knows this quite well. They could scale Prescott up very quickly, but with that would come heat, and therefore energy costs.

Now, let's look at other moves Intel's made lately. They've announced they're going to a PR rating for selling processors. What sense does this make if they're just going to ramp up their processors' clocks even faster? Why do they need to compare anything except clocks? Well, because AMD is wiping the floor with them, that's why.

90nm technology has also been refined with Prescott, meaning lower voltages, higher yields, and less wasted silicon on the wafer.

Both of these things bring us to the successor to Banias, Dothan. Extremely large cache, 90nm technology, extremely fast CPU. Not only will this be one of the most energy-efficient (per clock) chips ever made, it most likely will be the next desktop processor. But, here's the kicker: for them to be able to do this, they need to take a short pause from ramping up their current technology's speed, and move Dothan over to a bigger mass-production line. This is why we find Intel pretty silent right now, and most likely the same with AMD (anticipation; AMD's a very reactionary company).

So, I'm very sure that this is one of the top priorities sitting on the desks of Intel engineers, and I applaud them for taking every step towards a cleaner environment while making my newest tech gadgets.

> 90nm technology has also been undergoing perfection with Prescott, meaning lower voltages, higher yields, less wasted silicon on the wafer.

Not to mention higher leakage current, the possibility of increasing the clock rate to 5GHz, and higher power consumption. Oh, the 5GHz was an estimate of Intel's. It is their current target for the end of the year. So, no Dothan on the desktop.

The 3.2GHz Prescott consumes even a fair amount more energy than its 3.2GHz predecessor.

> Ever wonder why Intel's not been cranking out Prescott cored processors that run even faster/hotter? Is it because they couldn't just bolt a jet engine and a copper block to the thing and ship it? No.

No, it is because the mainboards and PSUs can't deliver the 100A those devices would require. And it is quite a problem to dissipate the heat of such a thing. Remember the new mainboard layout [slashdot.org] which should cope with that thing? Also an idea of Intel's.

> Why do they need to compare anything except clocks? Well, because AMD is wiping the floor with them, that's why.

Quite the contrary. AMD introduced the X+ rating for that reason. Intel's problem is totally self-made: they've developed a design which has better performance (Banias/Dothan...) at an even lower clock speed. Now they have a problem placing that chip against their own products.

> Not to mention increasing the higher leak current, the possibility to increase the clockrate to 5GHz, and higher power consumption. Oh, the 5GHz was an estimate of Intel. It is their current target for the end of the year. So, no Dothan on the desktop.

5GHz may be their target, and they may hit or miss it with Prescott, but the fact is, what Prescott is for has changed. They know as well as we do that this is a totally unacceptable chip for the desktop (except for the extreme high end: gamers, case modders)

Quite the contrary. AMD introduced the X+ rating for that reason. Intel's problem is totally self-made: they've developed a design which has better performance (Banias/Dothan...) at an even lower clock speed. Now they have a problem placing that chip against their own products.

To clarify (and make sure I'm understanding correctly): Intel's "more MHz/GHz is better" marketing approach is presenting a problem even to themselves, much like it did for AMD a couple of years ago. Now that Intel is making more efficient (work done per clock) processors, like AMD has been doing, simply comparing MHz among even just Intel processors is no longer a good performance measure, and might even make their new line appear slower (again, when comparing only clock-speed numbers).

It sounds like they're taking a step back from the P4 design, which was slower clock-for-clock than even their older (PIII) processors, but capable of higher clock speeds; so at the time the MHz myth worked to their advantage, whereas now it no longer does.

That, and the market (in my opinion) isn't as speed-hungry as it was just a year ago. A quieter, smaller, more energy-efficient PC design is more likely to make the average user upgrade than a faster, beefier PC. Computers are "fast enough" for most people's needs (most of the time even for myself, a programmer and FPS game player).

This is a little off-topic, but I have recently become quite "fan fatigued" and would absolutely kill for a processor that could just rely on a heatsink. In addition to being quieter, it would be a hell of a lot more reliable -- I find that fans, even supposedly higher-quality brand-name ones, are the least reliable component in machines.

In addition, I am surprised at the lack of implementation of more SpeedStep-like features. I leave my PC on all the time. Even when I'm using it, I'm usually surfing the web.

AMD fanboys, listen up: Yeah, you guys are winning the strongarm race right now. You've got the faster middle-class processor (upper-end desktop/lower- to medium-end server) and Intel knows this quite well. They could scale Prescott up very quickly, but with that would come heat, and therefore energy costs.

AMD has the faster high-end processors, too. I just ordered a high-end workstation for modeling and simulation at work. I chose a 64-bit AMD CPU both for the speed it gives now as well as for the future growth it allows.

I'm wondering what would happen if all manufacturers of electronic equipment were required to provide a 5-year warranty on all their products. Anyone think it would reduce the amount of cheap electronic stuff that ends up in the garbage after a week and contributes to pollution?

I'm wondering what would happen if all manufacturers of electronic equipment were required to provide a 5-year warranty on all their products. Anyone think it would reduce the amount of cheap electronic stuff that ends up in the garbage after a week and contributes to pollution?

I see three problems with this. The most obvious is that the market doesn't want this; otherwise people would buy higher-quality products (at an appropriately higher price). But many people (possibly most) buy cheaper equipment,

"The Santa Clara, Calif.-based chip maker, the world's biggest, said it is working with the rest of the industry to remove the remaining amount of lead that's needed to connect the processor's core with its packaging."

Lead free solder. Most lead free formulations have a high silver content, and reflow temperatures are higher.

All major solder manufacturers already have lead-free products in place; check out their websites for exact formulations.

BTW, a lot of chip manufacturers have already done their lead-free packaging. Intel's move is late in the day, which is ironic because they are making high-end, high-cost chips where gold is often used for bonding and plating, rather than the solder used to tin the pins of lower-cost chips.

Solder is correct. But the article mixes two separate issues, thus the answer is a bit longer: if you look at a BGA package on a PCB, then there are two interconnects: first the silicon die is connected to an intermediate substrate, the interposer. The result is the BGA. Then the BGA is connected to the board.

For the second-level interconnect (interposer to board), eutectic or near-eutectic lead-tin solder is used right now - around 37% lead, melts at 187 deg C. SnAgCu (~95% tin, ~3.5% silver, ~0.5% copper) is

If you think that even a 95% decrease in the lead in the microprocessor would have as much as a 0.1% impact on the amount of lead in a desktop computer, think again! The lead in the solder on the boards and in the power supply is a far greater factor than the very small amount of lead in a CPU. Sure, you can say "any decrease is an improvement", and maybe it even really is (that depends an awful lot on what the lead is replaced with, though), but don't let yourself be fooled by someone pointing at the CPU and calling attention to it when the Intel chip is not the real problem.
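A back-of-envelope check supports this, using two figures quoted elsewhere in this thread: roughly 0.4 grams of lead in a flip-chip CPU package (per The Register), and "five to eight pounds" of lead in a typical processor-plus-monitor combination (per the state site). Taking the low end of that range:

```python
# Rough arithmetic only; both input figures come from other comments
# in this discussion, not from independent measurement.
GRAMS_PER_POUND = 453.6

cpu_lead_g = 0.4                       # lead in one flip-chip package
system_lead_g = 5 * GRAMS_PER_POUND    # low end of "five to eight pounds"

saved_g = 0.95 * cpu_lead_g            # the reported 95% reduction
fraction = saved_g / system_lead_g

print(f"{fraction:.4%}")               # well under 0.1% of the system's lead
```

So even granting the most generous numbers, the CPU change removes on the order of hundredths of a percent of the lead in the whole setup.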

If you think that even a 95% decrease in the lead in the microprocessor would have as much as 0.1% impact on the amount of lead in a desktop computer, think again!......pointing at the CPU and calling attention to it while the Intel chip is not the real problem.

I wouldn't say it is even .1% of the problem.

CPUs in desktops often get pulled and used in other systems. Pulling a CPU out of its socket requires no burning or chemical reaction, hence nothing is released into the environment.

The lead in the solder on the boards and in the power supply is a far greater factor than the very small amount of lead in a CPU.

I'm not so sure that this is true these days. I have no sources here, but I believe the majority of solder used in consumer electronics (including PCs) is of the lead-free variety (mostly silver and nickel, I believe).

I do know that some cheaper consumer electronic devices have warnings in the manual about proper disposal because "this product contains lead...", but most things

Lead free soldering represents a minority in manufacturing, with companies now only starting to switch over with pressure from Japan and eventually the EU.

I'm not sure why I was under the impression that companies had started doing this a while ago, but I guess it's good that something is being done now anyway. I don't really know what kind of dangers lead poses, but even if they're minor, if it's not *that* difficult to start using something else, it probably should be phased out...

With the amount of heat the chips give off, you can keep entire rooms warm. My P4 3.2GHz keeps my office warm when it's cold. I also have a 1.2GHz P4 sitting next to it, but that doesn't give off much heat. Just a thought.

The Register [theregister.co.uk] has an article [theregister.co.uk] with more info.
A flip-chip package currently contains 0.4 grams of lead. A tiny amount compared to that in the solder in a motherboard, let alone a monitor.

Legislators looked at the amount of lead used by the electronics industry as a whole. Overall it is a lot, and as it is not actually necessary (it just costs a bit less), they simply said no lead in electronics.

on the site [state.me.us] linked to in the article they claim "A typical computer processor and monitor contain five to eight pounds of lead..."

Now I've never cracked open a monitor so I don't know if they really contain 8 lbs of lead, but where is all this lead in a PC? The entire motherboard can't weigh more than a pound or two so that's not it. The case? No, that's sheet metal. Is it in the hard drive? Average mid-tower PC probably doesn't weigh much more than 8 lbs total so I can't imagine where all this lead is at.

Also monitors are rarely thrown out. I've gone through about half a dozen PCs but kept the same monitor. They're just too freaking useful, even old 14" monitors are great for a second PC and still easily sell on eBay. Are these broken monitors people are tossing out?

1st This is Intel, so we are talking only about the processor and other chips, not the whole machine. The vast majority of the lead is in the soldering on the motherboard and other printed circuits - outside Intel's control.

2nd You won't stop 3rd-world countries trying to kill themselves. A colleague of mine once worked for a crane company which sold to India, among other places. He went out there once to check the installation of a new crane and found they had removed all the hand rails around ladders and platforms etc. and sold them for scrap! You cannot impose western standards on these places.

3rd It's not just 3rd-world countries. I work as a safety engineer, and anyone -- even supposedly "sensible" workers within my own industry (they have to pass various aptitude tests here) -- has limitless imagination in devising new ways to try to kill themselves. Only constant monitoring and supervision stops them from doing so. We can only leave 3rd-world countries to regulate themselves.

4th Sounds like a publicity gesture by Intel to me. "Lead" is one of those trigger words which switches people into self-righteous mode. These gestures always seem to work - even among people of above average knowledge and intelligence. Just watch the posters here for example.

There is always a risk that the first generations of such environmentally-friendly products have some kind of malfunction, and need to be returned and replaced. This has happened in several cases, including in the semiconductor industry. Dumping failed environmentally-friendly but useless products probably damages the environment more than the product they originally replaced.

I would wait for the second generation of such a processor before buying it myself or recommending buying a lot of them at work. For me, the small amount of lead that could be saved in a single processor is not worth the risk of having it fail.

It really is. On many levels, modern chip production is horrendously bad for the environment. It's a little-known fact, but pure silicon doesn't exist naturally on earth; producing it is a multistep process involving some really nasty chemicals. Lithography is again a multistep process with some truly nasty chemical waste, including some strong acids. The machines used to "dope" silicon to produce p/n junctions are often sold off cheap to hobbyists because of the large costs associated with cleaning and recycling them. If you find one, don't take it; they often explode if opened. Then let's not forget that the next-gen P4 is slated to run at, what, 150 watts?

Oh, but wait, at least now there'll be a quarter-gram less lead in my computer.

Most people have all the computing power they need. It's time more people worry less about clock speed and more about their electric bills and what happens to all those chemicals after Intel's done with them. Cheers.

But this isn't anything unique to Intel, and it isn't done out of the goodness of their green little hearts.

Every IC manufacturer, in fact practically every manufacturer of anything electronic, is already investigating lead reduction or elimination at some level or other. Not all are making a public hoopla about it, though.

Lead-free solder requires the development of new alloys and new processes. The changeover isn't trivial, but some promising candidates exist. Typically they have very high tin content, plus some mix of silver, copper, and antimony.

There are several reasons for this trend: Regulatory changes (pending in the US, and I think already passed in Europe?), Liability/Insurance cost (employee lawsuits), and waste treatment cost, including waste water.

My opinion: I don't believe lead in electronics will ever be totally eliminated, nor outright outlawed. I'm no solder/process expert, but those I know tell me that leadless soldering presents many challenges. More likely, in my opinion, regulations will take the form of taxes and fees on lead content, driving manufacturers to use it only where no good alternative exists.

The main problem relates to the higher temperatures needed to melt lead-free solder. These higher temperatures can stress components and are particularly worrying in products that have to last 20 years.

But this isn't anything unique to Intel, and it isn't done out of the goodness of their green little hearts.

I agree with you for the most part. However, lead-free solder isn't much more difficult to work with (at least as an electronics hobbyist). I think the concern is more the cost of the solder, given that (I believe) it usually contains a lot of silver. Maybe it's harder to manufacture (or manufacture with), or perhaps there are other mechani

I work for a European semiconductor company, and have some involvement in our drive to be lead-free, so I know a little about this.

Lead is used in the lead frame of the chip, as the coating to make it solderable, and also in some BGA packages as the balls. Pb is not used in the actual chip manufacture.

There are alternatives to Pb, but normally they require higher soldering temperatures, which affect the package's thermal characteristics and materials, which in turn may influence the performance of the chip itself, so these changes have to be handled carefully.

At the moment the US does not have a deadline for phasing out Pb (I think they refused to sign up?) but the EU does, so if Intel wants to sell chips in the EU, or Japan, they have to provide Pb free alternatives.

One person mentioned that this is a small percentage compared to the rest of the Pb in a PC, which comes mainly from the solder. But remember that the EU directive applies to ALL Pb products, so all circuit boards will be Pb free too.

It's only in the US that you might get a Pb-free chip with no reduction in the amount of Pb in the rest of the machine.

This is a lot of work for a lot of people. It's not a small change, and all companies have to do this, not just Intel.

In some places, a deposit is required for disposable goods, usually things like soda / water / beer bottles / cans / jugs / what have you. There was a time when it was the norm to provide reusable glass containers, but this is no longer the fashion. Simply put, it's more cost-effective to let the consumer junk the bottle and not worry about the cost of disposal. This keeps prices down, and everyone's happy.

When you say "found homes" for them I'm tempted to ask whether there's some adoption agency out there that matches up geeks and computers.

FWIW, I'd suggest you consider keeping your old gear. You may surprise yourself and discover a need you didn't think you had. Even in a home environment, extra gear could easily be used for a test system (new program installations, alternative distributions, major upgrades, etc.), or alternatively be put to use as a file server, backup storage, multi-boot replacement, a firewall, yada yada yada.

I have a file server, I have a NAT firewall, I have a web server, I have my PC, that other PC, and some other PC ov

Many have already written that the lead is in the glass of the CRT. If I'm not mistaken, lead is added to glass to improve its clarity.

However, the lead in the soldering alloy is significant, too: the so-called "eutectic" alloy contains 37 to 40% lead, the rest being tin (Sn). Eutectic alloys have a lower melting point than any of their components, which is exactly why lead is added to tin in soldering alloys. Another very efficient dopant is silver, which decreases the melting point considerably. Unfortunately, it's expensive.

Tin is basically innocuous, while lead is toxic. The problem with lead is that it causes a chronic poisoning called saturnism, in which your brain suffers considerable, and largely unrecoverable, damage. I should add that some historians think one reason for the fall of the Roman Empire lies in the use of lead cups for drinking wine. These lead cups were quite popular in the Roman army, and it's not inconceivable that this decreased the soldiers' mental and physical abilities.

The problem with the lead-free soldering technologies is exactly the higher melting temperature of pure tin compared to the eutectic alloys. Reflow and other processes have to be fine-tuned for higher temperatures, and the risk of damaging some components is significantly higher. I, for one, much prefer to use the normal eutectic alloy for my hobby work.
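To put some rough numbers on the temperature gap being discussed, here's a small sketch. The values are approximate textbook melting points, not figures from the posts above, and SAC305 is just one common lead-free alloy chosen as an example:

```python
# Approximate melting points (degrees C) for common solder materials.
# These are ballpark textbook values; exact liquidus temperatures vary
# with composition and source.
MELTING_POINTS_C = {
    "Sn63/Pb37 (eutectic tin-lead)": 183,
    "SAC305 (Sn96.5/Ag3.0/Cu0.5)": 217,  # a common lead-free alloy
    "Pure tin (Sn)": 232,
    "Pure lead (Pb)": 327,
}

eutectic = MELTING_POINTS_C["Sn63/Pb37 (eutectic tin-lead)"]

# Show how much hotter each material melts compared to eutectic Sn/Pb,
# which is the extra thermal stress the reflow process has to absorb.
for alloy, mp in sorted(MELTING_POINTS_C.items(), key=lambda kv: kv[1]):
    print(f"{alloy:32s} {mp:3d} C  ({mp - eutectic:+d} C vs. eutectic)")
```

The roughly 30-50 degree jump from eutectic Sn/Pb to the lead-free alternatives is the "fine-tuning for higher temperatures" the parent describes.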

Some twenty years ago, a friend and I made about a kilogram of pure gold from computer parts through dirty and costly chemical work, mainly from Russian mainframe parts (remember "Minsk", anyone?). One Russian card contained 10x more gold than a Japanese memory card or a French connector, mainly because of the plating thickness; Western parts are practically of no use.

The problem with the pure gold was that it was contaminated with about 0.9% of a mix of platinum and iridium, so it was much harder than normal soft pure gold. It

workers who usually 'burn' the parts to get rid of plastic and recover small amounts of valuable metals

I don't know why the submitter/editor put quotes around "burn"; there is nothing metaphorical about this. The parts are burnt in a fire to get rid of the plastic coating on wires, etc., to make separation of copper and other metals easier.

Coincidentally, just this week a Japanese customer of ours asked us to modify the firmware on our embedded device to support a different flash chip, because the only one we currently support uses lead. We happily obliged. So Intel definitely isn't the only company out there trying to be more "green".

Semiconductor manufacturing is often an environmental disaster, due to the weird chemicals and strong reagents used, in bulk, in the doping and etching processes.

The other popular alternative to silicon is Gallium Arsenide. Gosh, arsenic, another heavy metal with a place in the history of poisonings.

Lead, mercury, and arsenic are famous just because they're common in the earth and have been known since ancient times. All heavy metals accumulate in the body and cause problems, and I'm not sure that exoti

In the process they are exposed to the toxic compounds that are released. In other cases, lead makes its way into drinking water.

But now it won't get into OUR drinking water, and the lead in the water of the enemy means their babies will talk and walk slower, making them easier military targets when they grow up. This could be a nice long term strategy in our war on terrorism, and helps keep our streams and lakes lead free, too.

I think a big reason behind Intel's elimination of lead is that lead in the packaging is responsible for a large number of the alpha particles that cause soft errors. Intel's processors are on the cutting edge of technology, and they'll be susceptible to these errors long before, say, the real-time clock. Here are Mitsubishi's low-alpha solders [mmc.co.jp], and here's a quote from this article [psu.edu]:

The charged particles causing soft errors can come from three different sources.
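To make "soft error" concrete: it's a transient bit flip, not permanent damage to the chip. A toy sketch, purely illustrative and nothing like real ECC hardware, of a single-bit upset being caught by a parity check:

```python
def parity(word: int) -> int:
    """Even parity over the bits of a word: 0 if the popcount is even."""
    return bin(word).count("1") % 2

# A word as written to memory, plus its parity bit.
stored = 0b10110100
stored_parity = parity(stored)

# An alpha-particle strike flips a single bit (bit 3 here, arbitrarily).
upset = stored ^ (1 << 3)

# On readback, the recomputed parity no longer matches the stored parity
# bit: the soft error is detected (though a lone parity bit can't fix it).
assert parity(upset) != stored_parity
```

This is why lowering the alpha emission of the package material matters: fewer strikes means fewer of these transient upsets in the first place.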

It's about time a company started this - good job - and let's hope other tech companies take the hint.

Hello, wake-up call. This is a major industry trend, and Intel is following along. They're definitely not the ones starting this in hopes the rest of the industry will catch on. It is a European Union directive that deserves the "good job" credit... and it is Intel and every other major manufacturer in the electronics industry that is "taking the hint".

Most new electronic components are being made with little or no lead. Major companies and contract manufacturers (who solder boards for most smaller companies) are switching to lead-free soldering processes.

Already this forum is filled with +5 comments about power consumption and how the solder contains much more lead than the chips. Well, here's the news... the whole industry is migrating to lead-free solder.

Much of the conversion is driven by an EU directive that all electronic products sold in Europe be lead-free by 2008.

I am an electrical engineer, and even at the US-based company where I used to work, they were having to go through the painful process of switching the wave-solder and reflow ovens (surface-mount soldering) to lead-free fluxes and solder alloys.

So give credit where credit is due. It's the European Union, not Intel, that deserves "good job". The whole industry is taking the hint, as selling or being able to sell in the EU is important to almost everybody.

There are **other** initiatives that promote reduced power consumption in PCs. Note that the average power draw of a CPU in a typically used desktop PC can be quite low; how low depends more on the OS than on the chip design.

I suspect this has more to do with complying with the law than a desire to become more green. EU law requires Pb use to be reduced imminently. If they really wanted to become more green then power consumption would be a more profitable place to start. As would reducing the evil chemicals (and power) used in manufacture.

Actually, they are required to do this if they intend to keep selling chips in Europe and Japan. A recent group of laws in the EU (or is it some individual EU countries? I'm not sure) and Japan requires that consumer electronics be nearly lead-free, both in the final product and in the manufacturing process. This includes PCBs and integrated circuits. Most manufacturing operations, and any electronics makers that want to do business outside North America, have been transitioning to lead-free products recently.

Intel is meeting its upcoming legal requirements. The real win here (for Intel) is turning something they are legally obligated to do into an "environmentally friendly" PR victory. The news media seems to be eating it up.