Intel announces 3D Tri-Gate transistors

Intel's new Tri-Gate transistors will allow for greater performance and energy efficiency

Intel has revealed a brand new type of transistor, which uses a three-dimensional design to operate in a smaller space and consume less power than existing designs.

The new transistors are called Tri-Gate units, in reference to their use of three conductive surfaces. The company claims they offer an overall 50% power saving over current planar transistors, including greatly reduced power leakage when in the off state. Alternatively, the new transistors can deliver 37% faster performance with the same power draw.

The three-dimensional design additionally allows the new transistors to be packed more densely on a silicon wafer than was previously possible, enabling a reduction in the size, and hence price, of future chips.

'The performance gains and power savings of Intel's unique 3-D Tri-Gate transistors are like nothing we've seen before,' said Intel senior fellow Mark Bohr. 'The low-voltage and low-power benefits far exceed what we typically see from one process generation to the next.' Despite these advances, Bohr said that Tri-Gate transistor wafers would cost only around 2-3% more to manufacture than 32nm transistor wafers using planar designs.

Though the Tri-Gate design was first proposed by Intel engineers back in 2002, it has taken until now to reach high-volume production. The first chips to use the technology will be Intel's forthcoming Ivy Bridge CPUs, the 22nm successors to Sandy Bridge that are expected to arrive at the end of 2011.

Intel then plans to extend the technology across its range, including to the low-power Atom processor series.

Are you looking forward to the performance benefits that these new transistors could bring? How long will it take Intel's competitors to catch up? Let us know your thoughts in the forums.

Originally Posted by Guinevere: Only 2% to 3% more to manufacture, but what'll that equate to at the tills? A 50% premium?

That really wouldn't surprise me. The lack of competition in this field is hurting us consumers. Intel is dominating and AMD is lagging behind (AMD fanboys need not shout, it's the bloody truth).

AMD's offering is and has been very poor when compared to Intel's.
And there's no point in saying that AMD offers bang for buck (the Phenom II, for example): if you were after the most bang for buck then you've got no business looking at the latest and fastest CPUs, since they all come with a premium price tag.

There go AMD's hopes of a long-term comeback. I doubt Intel will license this tech to GlobalFoundries and the like.
It's a really cool bit of design though. I'll look forward to it coming to market, even if I do have to wait a year or two. The only downside I can see is that Intel may use this to extend the life of the Atom design, rather than actually designing a good power efficient architecture.

Originally Posted by ZERO <ibis>: Wait, if the power draw is that much lower, and thus the heat too, does that not mean that on the high end we are looking at a 50% increase in base clock speeds!?

From what Intel said I doubt we'll see significantly higher clock speeds on Ivy Bridge; instead the power savings of these really rather clever 22nm transistors will be used to improve the architecture, thus resulting in more performance.

After all, most Intel desktop chips have been 'stuck' at around 2.66GHz for the last 5+ years, yet there's been a massive increase in performance during that time.

Originally Posted by ch424: All they've done is make the gate thicker and gone marketing mental. I'm sure Samsung, TSMC and the others will catch up soon enough.

I don't understand: surely if it's just a larger gate then the transistor would be bigger, not smaller and faster?

Larger in terms of thicker. Other news websites have better explanations of how the technology works - effectively it's built in three dimensions. It's probably not simply a marketing ploy though; we don't hear about many random, useless low-level technologies from Intel these days, so it would be out of the ordinary.

i'm pretty sure the prices are not going to drop much on this. just because it's smaller doesn't mean it's cheaper. keep in mind the overall materials to create a cpu are extremely cheap: silicon is one of the most abundant non-gas elements in the world, cpu pins are gold plated, and the dies as a whole aren't very big. from what i can tell, 90% of what you pay for in a cpu is the architecture, the research, the machines, and everyone who contributed. 5% would be advertising, and the rest is the actual materials. i've seen brand new CPUs being sold for as little as $32. considering a certain percentage of that goes to the retailer, that shows how little a cpu is actually worth.

considering this is new technology requiring more advanced machines, i'd expect the price to go up, not down.

as for amd, sure it sucks for them but they'll get it eventually. for everything intel thought of first, amd has eventually released the same thing and improved upon the idea. typically the only reason their improvements don't surpass intel's is other technical downsides such as fab size, number of memory channels, instruction sets, and time for testing.
for example, when intel first released quad-core CPUs, it was really just two dual cores slapped next to each other with a few changes here and there. it worked as a quad core but it wasn't a "true" quad core. amd wanted to out-do intel by creating a true quad core with communication between all cores, but they rushed it so it wasn't as good and some models failed, hence the phenom x3. i'm sure that's why amd is taking so long to release bulldozer: it was probably ready for release 5 months ago, but they want to make sure it will get them where they want to be.

Ah well, it'll probably take like 10 months from now for Ivy Bridge laptops to really start appearing. Still, this makes me sad =(. Oh well, had to get a new system at some point. Useless to just keep putting it off because "something better is coming", since something better is always coming ^^;

I think the price of these chips will be determined more by how much Intel will have to spend to re-tool their factories, along with how much they have actually spent on R&D. It will also be interesting to see what, if any, price drop there will be in Sandy Bridge processors once Ivy is widely on the market.

Nice one Intel. I just hope they don't give this tech to any foundries; otherwise AMD will beg for help with their BD "comeback" CPUs. Leave them out to dry to develop their own stuff and not copy what IBM and Intel have made through the years.

with more tightly packed transistors, the heat per unit area could increase dramatically. this would make it more difficult to overclock these chips due to the heat generated. also, the narrower channel in these transistors could mean they degrade faster due to electromigration.
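Whether heat per area actually rises depends on how the density gain and the per-transistor power saving trade off. A back-of-the-envelope sketch (illustrative scaling factors only, not Intel's published numbers):

```python
# Rough check on the "heat per area" worry: relative power density after a
# process shrink is just (transistor density scale) x (power per transistor
# scale). The 2.0x density and 0.5x power figures below are illustrative
# assumptions, not Intel's data.

def power_density(density_scale, per_transistor_power_scale):
    """Relative power per unit area compared to the previous process."""
    return density_scale * per_transistor_power_scale

# If a shrink roughly doubles transistor density...
no_saving      = power_density(2.0, 1.0)  # no per-transistor saving: 2.0x hotter
claimed_saving = power_density(2.0, 0.5)  # with a 50% saving: unchanged, 1.0x

print(no_saving, claimed_saving)
```

So if the claimed 50% per-transistor saving holds, denser packing need not mean hotter silicon; the worry applies mainly where the saving is smaller than the density gain.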

i will be holding off my purchase decision on Ivy Bridge until i've seen normal users' overclocking results, and seen whether extreme overclockers' chips can survive a month at the very least.

Oh dear, how on earth is AMD going to compete with this at the low end? Ivy Bridge will be DX11, 22nm and offer a 50% power saving with this new Tri-Gate design. This seems like the end of the road for AMD: they are a whole process generation behind, and the addition of Tri-Gate is going to kill them. Fantastic tech news, but I'm afraid this is hailing the end of any realistic hope of competition.

At higher voltages the improvement over a 32nm planar transistor starts to diminish, it seems. AFAIK, all the current Sandy Bridge processors have a VID over 1V. The planar transistors Intel uses are produced on a 32nm bulk process.

It would be interesting to see the advantage when compared to planar transistors produced on a 32nm SOI process, which AFAIK is what AMD is using.

Anyway, hopefully the H67 chipset supports Ivy Bridge, as it would mean a better upgrade path for my computer.

You read an Intel press release about their amazing "breakthrough" and you swallow it hook, line and sinker without a single critical thought. It is not helped by a technology press industry largely ignorant of such technical matters (not directed towards Bit-tech in particular).

IT IS A F*****G FINFET!!! FinFETs aren't anything special to Intel; Intel didn't invent them, nor is it alone in developing them. Intel is just the first to put them into production. I like that Bit-tech just regurgitates the Intel announcement. "Tri-gate" is just Intel's marketing name for a non-unique concept.

"Tri-gate" is no more a "breakthrough" in transistor design than Intel's HKMG was. It is a small evolutionary step along the way. It especially won't make much of a user difference, so if you are holding off on Sandybridge for these fancy new FinFETs you are quite the fool. Now, the IB 22nm process shrink is more of a reason...

As for the fabulous performance figures "50%!!"/"37%", did you even bother to notice that these were over Intel's 32nm process (once again, something Bit-tech failed to say)? What would the increases be over a traditional planar 22nm transistor? Much less, I guarantee. Also, did you notice that these figures were for low voltages (0.7V)? That's right, the performance advantages DECREASE as you raise the voltage to more normal desktop CPU levels of 1.3/1.4V. It means that power reductions will be better when the core is clocked down at idle, but not so much when at load.

Meanwhile, the SOI consortium (AMD, GloFo, IBM, etc) will be fielding FD-SOI, which delivers a similar performance advantage (perhaps even better) at low voltages while still using planar transistors. Why is this important? Because it is a tried design/manufacturing process that utilises existing tools and experience.
If Intel thinks "Tri-gate" Atoms are going to kill off ARM, then they are going to find future FD-SOI ARM chips a problem.
See here (PDF) for more.

Okay, I was planning on making a new build this summer, but apparently, as Adnoctum and CAT said, the power reduction will most likely only benefit low-powered devices below typical voltage. Since my build won't be low-powered, this won't really affect it then, would it?

Reminds me of LED and LCD and the like.
