Posted by timothy on Thursday August 26, 2004 @02:22AM
from the but-the-ribs-aren't-done-yet dept.

servantsoldier writes "There's a new solution for the transistor heat problem: Make them out of charcoal... The AP is reporting that Japanese researchers, led by Daisuke Nakamura of Toyota Central R&D Laboratories Inc., have discovered a way to use silicon carbide instead of silicon in the creation of transistor wafers. The Japanese researchers discovered that they can build silicon carbide wafers by using a multiple-step process in which the crystals are grown in several stages. As a result, defects are minimized. Other benefits are decreased weight and a more rugged material. The researchers say that currently only a 3" wafer has been produced and that a marketable product is at least six years away."

Silicon Carbide does work -- Cree, Inc. [cree.com] of Durham, NC has been manufacturing electronics (particularly blue LEDs) for years using silicon carbide as the substrate. The technology was developed at NC State University, as I recall.

You may be dyslexic when reading English. Learning a different language that reads from right to left may eliminate this problem [bmjjournals.com]. Alternatively, you could have your optical system replaced with that of the mantis shrimp [blueboard.com], which has eight different retinal pigments ranging from ultra-violet to ruby red, with a couple of layers of polarisation filters added for good measure. This should sort out your reading problem.

OFFTOPIC: You may be dyslexic when reading English. So for kicks I read the articles, and it would seem that if learning to read right-to-left were to fix the person in question, I would be very worried for their general health: if a hematoma bursts in the brain you're in really deep schnizzit, and anything else causing that lingual defect can only be worse...

A couple of those, connected via heatpipe to a plate at the top of the case, would make an excellent hot-plate for a coffee or tea pot =)

As for the plugs - well, there's some way to go yet. At the moment, power supplies are on the order of 500-600 W. An electric heater can put out up to 3000 W or so.

I used to run a constantly-on heater, two PCs, three monitors, some random home networking equipment and a desk lamp all off a series of four-way power bars connected through a single 13 A 230 V UK plug.

Oh, before anyone tries the stage-lighting thing: it *worked*, but the plug got pretty hot and eventually the circuit breakers tripped. The problem was solved by splitting the load over two plugs on opposite sides of the stage =P

Ah, the days of helping out with school stage tech. I still don't think the music dept. has forgiven me for blowing up two of their (old, crappy, faulty-but-not-diagnosed-until-they-failed) PA amps in one night...

You're lucky. With our lowly 120 V supplies here, 2000 watts is about as much as you can ever expect on a single circuit. (Theoretically 2400 W on a 20 A circuit, but once you're pulling close to 20 A, the wires and cords themselves start to dissipate enough heat that it adds up.)
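The arithmetic behind that comparison is just Ohm's-law bookkeeping, P = V * I. A quick sketch (the 2400 W figure is from the post above; the breaker ratings are the usual US/UK values, used for illustration):

```python
# Current drawn by a load at a given mains voltage: I = P / V.
def current_amps(watts, volts):
    return watts / volts

# The same 2400 W load sits right at a US 20 A breaker's limit,
# but is comfortable on a UK 13 A / 230 V plug.
print(current_amps(2400, 120))  # 20.0 A
print(current_amps(2400, 230))  # roughly 10.4 A
```

Halving the current also quarters the I^2*R heating in the cords, which is why the 230 V plug runs cooler for the same wattage.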

On the other hand, I have accidentally touched live AC wires a few times (and even stuck my finger in a light socket as a kid) and had relatively minor effects from it. I'd imagine 220/240 has a bit more of a kick...:)

Yes, silicon carbide and water cooling will get the heat out of the CPU faster.

The problem still remains that a metric buttload of heat is produced, and that it comes out of the electricity bill. Sometimes twice: in the summer you also pay for the air conditioning, since that shiny new CPU is heating the room some more.

I think it's getting ludicrous.

The Prescott is already over 100 W, and Intel apparently plans dual core versions. Whoopee for 200+ W CPUs. NVidia 6800 Ultras are rated for 120 W, and they're hyping SLI setups now. Yep, _two_ graphics cards, if just 120 W worth of hot air blowing off the back of the case wasn't enough.

Add hard drives, motherboard, and the PSUs own inefficiency, and you're already looking at 1000W worth of heat for the whole computer. That's already like a space heater.

In fact, go ahead and turn a space heater on near your desk in the summer, and you've got a pretty good approximation of what the next generation of computers promises to be like. Now picture some 4 of them in the same room, at the office.
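Tallying the numbers from the posts above makes the space-heater comparison concrete. The CPU and GPU wattages are the figures quoted in the thread; the drive/motherboard figure and the PSU efficiency are assumptions for illustration:

```python
# Rough heat budget for a hypothetical next-generation box.
components_w = {
    "dual-core CPU (2 x ~100 W)": 200,
    "two 6800 Ultras in SLI (2 x 120 W)": 240,
    "drives, motherboard, fans (guess)": 100,
}
load_w = sum(components_w.values())     # heat dissipated by the parts
psu_efficiency = 0.70                   # assumed; typical for the era
wall_draw_w = load_w / psu_efficiency   # what the meter actually sees
print(load_w, round(wall_draw_w))       # the PSU loss alone exceeds 200 W here
```

Every watt drawn at the wall ends up as heat in the room, so the wall-draw figure is the honest one to compare against a space heater.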

And it's rising exponentially. Carbide and water cooling will only help them get further along that curve.

And I'll be damned if I'm thrilled at the prospect.

This also brings the problem of even more fans. Even with water cooling, you then have to get the heat out of the water. It still means fans. More heat will just mean more fans, bigger fans, or faster fans. Or all the above.

And I'm not thrilled at the prospect of the return of the noisy computer either. I can jolly well do without the machine sounding like a jumbo jet. Especially when I'm watching a DVD or such, I can do without having to turn the volume sky high just to be able to hear what they're saying. And at the office I can do without four noisy hovercrafts in the same room.

The amount of heat being generated by chips does not seem to be decreasing at all...

I disagree. I've just upgraded an Athlon XP 1800+ system to an Athlon64 3500+.
The new box runs around 20 degrees C cooler than the old one at idle and under heavy load; both use the supplied retail AMD heatsinks. I'm not using "Cool 'n Quiet" on the '64; it might take a bit off the idle temperature, but I don't see the point.

A friend has a 1.2 GHz Thunderbird Athlon that runs pretty much consistently at 60 degrees no matter what you do, whereas my Athlon XP 1700+ with stock heatsink barely ever crosses 40. We have the same case, and I've never bothered with case fans or hard drive fans... there's just the CPU fan.

The heatsink supplied with the XP1800 was horrible. A tiny all-aluminum sink with a high-speed 60x15 fan just doesn't cut it for cooling that CPU. I've heard that the 2200+ and beyond have decent stock heatsinks, but I haven't verified that myself.

Devices built with the rugged material would not require cooling and other protections that add size, weight and cost to traditional silicon electronics in power systems, jet engines, rockets, wireless transmitters and other equipment exposed to harsh environments.

So you see, besides the fact that it is nearly as hard as diamond and can survive the temperatures of re-entry into the Earth's atmosphere, they want to use it to replace silicon electronics that are used in more stressful environments.

They do; they're called thermocouples, and they operate on the Seebeck effect (the Peltier effect is its reverse).
Take two different wires, twist them together into two junctions, break one wire and put in a meter; then heat one junction, cool the other, and electrical current flows. A Peltier cooler works the other way around: adding current causes one junction to warm and the other to cool.

You should be able to take a Peltier cooler, heat one side and cool the other, and get some electricity out of it. I imagine the efficiency is pathetic, but it's just waste heat anyway.
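For a feel of the numbers: an open-circuit thermoelectric junction gives V = S * dT. The Seebeck coefficient below is a ballpark figure for a bismuth-telluride couple (an assumption, not a measured value):

```python
# Open-circuit voltage from a single thermoelectric couple: V = S * dT.
S = 200e-6      # Seebeck coefficient in V/K (assumed, Bi2Te3-class material)
dT = 50         # temperature difference across the junctions, in K
voltage = S * dT
print(voltage)  # about 0.01 V per couple
```

That's why practical thermoelectric generators stack hundreds of couples in series, and why harvesting CPU waste heat this way yields very little.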

The voltage doesn't matter; it's the wattage. So, you probably won't need more than 120V for future machines, but you may need better wiring so that more amps can be carried to it without blowing a fuse (or lighting your house on fire).

Honestly, I do that as it is today! During the winter, I'm a cheap miser, and keep the rest of the house at about 50. I keep my computer in my room and always keep the door closed, and it'll reach a balmy 70 degrees just from the PC.

I've got quite a bit of experience with SiC abrasives, what with the materials engineering and being a bit of a lapidary.

First off, it's nowhere near diamond in terms of hardness. The Mohs scale is semi-arbitrary in assignment, and not even vaguely linear. On a proper hardness scale (in this case Vickers), diamond has a hardness of around 90 GPa, compared to about 25 GPa for SiC. That's the reason I've got a box full of diamond abrasives: despite the cost (about 30 times more expensive), they are much faster and last almost indefinitely. More later on this.

Secondly, SiC needs to be rough. If you don't believe me, try grinding a carrot into shape on a window. The glass is very much harder than the carrot, but is nearly perfectly smooth, and as such the carrot just slides about. Compare that with rubbing the carrot on something like a concrete paving slab, which grinds it much better. The relative hardnesses are wrong here, but it shows the need for surface roughness.

As an aside, if you think that paper cuts are bad from standard office paper, then try getting one from fine SiC abrasive paper. Stiffer paper, cuts deeper, and the abrasive roughs up one side of the cut, so it takes about four times as long to heal. It's a mistake I've made exactly once.

A processor is not a single pure material; if it were, it wouldn't do anything. It's a complex layered system, with layers of copper and SiO2. Trying to grind anything with a processor die will just succeed in scraping off all that important stuff. The hardness of SiO2 is Mohs 7, well below that of anything actually used as an abrasive for metals. (It's near enough the same as ground glass, which is sometimes used for abrading wood or plastics.)

For comparison silicon has a hardness of 12 GPa Vickers. SiC is only around twice as hard as that.
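Collecting the Vickers figures quoted above in one place makes the ratios easy to check (the numbers are the poster's own round figures, not handbook values):

```python
# Vickers hardness figures from the post, in GPa.
vickers_gpa = {"diamond": 90, "SiC": 25, "Si": 12}

sic_vs_si = vickers_gpa["SiC"] / vickers_gpa["Si"]
diamond_vs_sic = vickers_gpa["diamond"] / vickers_gpa["SiC"]
print(sic_vs_si)       # roughly 2: "SiC is only around twice as hard" as Si
print(diamond_vs_sic)  # diamond is still several times harder than SiC
```

The spread also shows why the Mohs scale misleads here: SiC and diamond sit one step apart on Mohs but differ by a factor of ~3.6 in Vickers hardness.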

So, no, you can't really use it as an abrasive. If you really want to be very careful, you might be able to use the edge of the die as a scraper, but you'd probably just remove the important stuff.

That's all a moot point, however. I strongly suspect that you'll never see the actual die; it will be under a metal heat spreader. Because these chips can cope with higher temperatures, there is even less need to take the risk of mishandling breaking the die.

And lest you think that SiC would be less likely to break than silicon, I'm afraid not. Aside from the fact that many broken Athlons are due to the top few layers of SiO2 and metal breaking, SiC is not that much tougher than silicon. As any lapidary will tell you, it's perfectly possible to chip sapphire and diamond if you're not careful.

Nah, you just use a diamond saw. Same as for the silicon wafers. It's conceptually the same thing as a very thin diamond tipped grinding wheel, and it grinds a cut through the material. You can also use a diamond encrusted wire as a saw, like, erm, this one *holds one up*, but they are much slower, and only really good as hand saws, or for chopping thin sheets [0].

It's going to be a little slower, as SiC is about twice as hard as silicon, but that's not going to slow it down that much. Diamond saws are also used to chop up boules of sapphire and ruby, which are of similar hardness to SiC (a little softer), and also diamond (harder), so it's no big technical problem.

Or, a laser. A nice big excimer laser would slice it neater than a diamond saw. With the improved surface texture after cutting, the decrease in polishing coupled with the increase in hardness might make it worthwhile. Probably not, though.

[0] I use my saw for cutting rocks for lapidary purposes, principally quartz of various sorts.

The article is kind of vague on the details, for instance, just how much hotter are these semiconductors going to be able to run? Is it possible that chips made from these will have to use a non-plastic casing material? If so, that would be very cool. I doubt it though, that'd have to be pretty hot.

The interconnects I was referring to are those conductors which attach one device to another within the same chip, ie the power and signal traces. Your link refers to connections from one chip to another.

From the article.... In an advance that could lead to lighter spacecraft and smarter cars, researchers have developed a new technique for producing a high-quality computer chip that is much more resistant to extreme conditions than the silicon found in most of today's electronics.

So a chip more resistant to extreme conditions is also somehow 'lighter' and 'smarter'...

I'm all for being able to OC the hell outa my proc and not be worried about burning it..

BUT

These CPUs would be far more durable and last a lot longer. Why is that a problem? Think about the last time your job/office/place of business replaced computers. You're gonna be stuck with that slow machine a whole lot longer.

If thermal conductivity is all that important, is anyone commercially producing isotopically purified silicon wafers? This stuff has better thermal conductivity (phonons tend to scatter off mass irregularities).

Silicon carbide and diamond both have significant potential use as power semiconductors. Forget CPUs, think I/O. Think smaller power supplies, smaller audio drivers, more rugged automotive systems, and, ultimately, being able to shrink robotics controllers as a next step to producing very small robots. If a robot's motors are running at 80C, you want the power semis to be able to handle that. Furthermore, a lot of possible fuel cell designs run at fairly high temperature and, again, you want the electronics to survive the environment without too much cooling.

There are also huge potential benefits for rad-hard communications satellites, where cooling is a major problem (radiation is the only way to shed heat).

80C is a realistic maximum case temperature for DC motors, which I used as an example. If the environment reaches 80C, what do you think the junction temperatures of the transistors will be?

Also, please note that the junction temperatures you quote are maxima. You will not get good life at high temperatures with silicon; more importantly, the ability to handle pulses and voltage drops degrades as junction temperature rises. I suggest you look at the SOAR curves for a few power devices to see what I mean.

The main problem is getting the required purity; silicon-based chips already involve a multi-step process to manufacture the substrate. Basically, they take very pure silica sand (SiO2), purify it as much as possible chemically, reduce it to remove the oxygen, melt it, then extract it by growing a single crystal. That crystal of Si is then heated to just short of the melting point and moved through an electric induction heater so that a small portion of the crystal melts at a time, and any remaining impurities are swept along with the molten zone to one end (zone refining).

Different markets. X86 is under extreme competitive pressure to produce the fastest possible processors in the medium price range. This means more complicated circuitry to produce the same function. (As a trivial example, compare a simple adder to a look-ahead-carry adder.) The complication adds heat.

It makes a mockery of "Green PCs" though. In the last 18 years that I have had various PCs the power usage has gone up from ~100W to ~350W for the box. CRT monitor power has gone up too and only switching to an LCD has improved things.

A machine built with 8x ARM cores would have as much grunt as a P4, but cost less and would use only a fraction of the power.

It's very nice that SiC can withstand high temperatures and is very hard, but are these the most important features of a semiconductor material? I would be more interested in band gap energy, electron/hole mobility etc. Who needs a chip that can run hot when it cannot run fast? Maybe for specialized hardened applications like space, but I don't see these being used for mainstream applications.

Well, SiC has a wide range of bandgaps across its polytypes, 2.2 to 3.25 eV, which is much less stable vs. temperature than Si. This is one of its "problems" for ICs. The other is the difficulty of making large wafers. The huge benefit of its large bandgap is long minority carrier lifetimes... think standard RAM cells that can hold their charge for hundreds of years. The real focus these days for SiC has been discrete power devices, since they can function with a much higher junction temperature than silicon devices.
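The high-junction-temperature point follows from the textbook scaling of intrinsic carrier density, n_i ~ T^1.5 * exp(-Eg / (2*k*T)): a wide bandgap keeps thermally generated carriers from swamping the doping at high temperature. A simplified sketch (prefactors omitted; the Eg values are the usual round figures for Si and 4H-SiC):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def relative_ni(eg_ev, temp_k):
    """Intrinsic carrier density, up to a material-dependent prefactor."""
    return temp_k**1.5 * math.exp(-eg_ev / (2 * K_B * temp_k))

# At 600 K: Si (Eg ~ 1.12 eV) versus 4H-SiC (Eg ~ 3.25 eV).
ratio = relative_ni(1.12, 600) / relative_ni(3.25, 600)
print(f"{ratio:.1e}")  # silicon has many orders of magnitude more carriers
```

The same exponential is what gives the long minority-carrier storage times mentioned above: far fewer thermal carriers means far less leakage.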

Now the chips will get hot enough to ignite combustibles (paper, plastic insulation, dust) and still operate. Then you'll cut your hand on the edge of the SiC chip as you're trying to put out the fire...

If I remember correctly diamond chips are interesting because they can easily bind to organic molecules. I believe I saw a sample chip made by some students and Sumitomo is into it too.

Does silicon carbide have any such properties? (i.e. anything besides heat resistance?)

The flip side of course is for high temperature operation which I think is a bit scary, maybe the chip itself can handle it but what about the stuff next to it? I would rather have lower temperature circuits. As it is only a very tiny vo

Developments like this in Japan and other countries tell me the US isn't losing its technological edge: it has already lost it. Japan patents brand-new tech like this, while in the US we patent SUDO and 1-click shopping.

I live in Japan and work for a Japanese company. Trust me, stupid business process patents are not unique to the U.S. Our company has attempted (sometimes successfully) to patent some of the most obvious, blatant crap by tagging "online" onto it. AND WE'RE NOT EVEN A TECH COMPANY!!

In the event that we find out that someone else already HAS "invented" this idea, it is usually NTT (Nippon Telephone and Telegraph) which has also registered the hell out of a shit load of trademarks that it doesn't use.

And, over here, there are a lot of people worried that we've really lost our tech edge against China and Taiwan. To a certain extent, I think they're right. China and Taiwan used to be copiers, not innovators. But then again, so was Japan half a century ago. Recently, China and Taiwan have started innovating too. It should have been obvious that they "could" innovate, about 18 years ago when the first fake Nintendo consoles from Taiwan were found. They say over 80% of the circuitry, including the CPU, was original, and not a copy. (Then again, a lot of the fake Apple IIe machines back then were pretty original too, sometimes with features that weren't available on the real thing!)

Perhaps this will lead to computers that are built like modern tube amps. You'll have windows in the front of the box, not so you can see your neon light case mod, but so you can see the warm glow off of the CPU, RAM and chipsets.