bit-tech.net

Intel announces 3D Tri-Gate transistors


Intel's new Tri-Gate transistors will allow for greater performance and energy efficiency

Intel has revealed a brand new type of transistor, which uses a three-dimensional design to operate in a smaller space and consume less power than existing designs.

The new transistors are called Tri-Gate units, in reference to their use of three conductive surfaces. The company claims they offer an overall 50% power saving over current planar transistors, including greatly reduced power leakage when in the off state. Alternatively, the new transistors can deliver 37% faster performance with the same power draw.
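The two figures can be put side by side with a quick back-of-the-envelope sketch. Note the P ∝ V² relation used here is a textbook simplification assumed for illustration, not a figure from Intel:

```python
import math

# If active power scaled purely as P ∝ V^2 at fixed frequency (a textbook
# simplification, not Intel's model), a 50% power cut would correspond to
# running at roughly 71% of the original voltage.
v_scale = math.sqrt(0.5)
print(f"Voltage scale for half power: {v_scale:.2f}")  # ≈ 0.71

# The two operating points Intel quotes, normalised against a 32nm
# planar transistor:
same_speed_power = 0.5    # >50% power saving at equal performance
same_power_speed = 1.37   # 37% faster switching at equal power draw
```

In practice a chip designer would trade these off against each other rather than take either extreme.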

The three-dimensional design additionally allows the new transistors to be packed more densely on a silicon wafer than was previously possible, enabling a reduction in the size, and hence price, of future chips.

'The performance gains and power savings of Intel's unique 3-D Tri-Gate transistors are like nothing we've seen before,' said Intel senior fellow Mark Bohr. 'The low-voltage and low-power benefits far exceed what we typically see from one process generation to the next.' Despite these advances, Bohr said that Tri-Gate transistor wafers would cost only around 2-3% more to manufacture than 32nm wafers using planar designs.

Though the Tri-Gate design was first proposed by Intel engineers back in 2002, it has taken until now to reach high-volume production. The first chips to use the technology will be Intel's forthcoming Ivy Bridge CPUs, the 22nm successors to Sandy Bridge that are expected to arrive at the end of 2011.

Intel then plans to extend the technology across its range, including to the low-power Atom processor series.

Are you looking forward to the performance benefits that these new transistors could bring? How long will it take Intel's competitors to catch up? Let us know your thoughts in the forums.

39 Comments

battles_atlas 4th May 2011, 19:28 Quote
Will I have to wear stupid goggles to use one?
mi1ez 4th May 2011, 19:28 Quote
Wow. Genuinely impressed!
Valinor 4th May 2011, 19:30 Quote
Quote:
Originally Posted by battles_atlas
Will I have to wear stupid goggles to use one?

They only work with a 120Hz motherboard.
doggeh 4th May 2011, 19:34 Quote
Quote:
Originally Posted by battles_atlas
Will I have to wear stupid goggles to use one?
Quote:
Originally Posted by Valinor
They only work with a 120Hz motherboard.

hahahaha
Jimbob94 4th May 2011, 19:44 Quote
Looks awesome
Kenny_McCormick 4th May 2011, 19:44 Quote
I'll buy the non-3D version. 3D gives me headache.
Guinevere 4th May 2011, 19:52 Quote
Only 2% to 3% more to manufacture - what'll that equate to at the tills? A 50% premium?
cgthomas 4th May 2011, 20:01 Quote
Quote:
Originally Posted by Guinevere
Only 2% to 3% more to manufacture - what'll that equate to at the tills? A 50% premium?

That really wouldn't surprise me. The lack of competition in this field is hurting us consumers. Intel is dominating and AMD is lagging behind (AMD fanboys need not shout, it's the bloody truth).

AMD's offer is and has been very poor when compared to Intel's.
And there's no point in saying that AMD offer bang for buck (Phenom II for example) - well, if you were after the most bang for buck then you've got no business looking for the latest and fastest CPUs, since they all come with a premium price tag.

</rant>
Centy-face 4th May 2011, 20:07 Quote
I have a complete techrection at this.
Technobod 4th May 2011, 20:08 Quote
I'm intrigued as to how well they'll overclock compared to 2D transistors...
jimmyjj 4th May 2011, 20:08 Quote
Impressive.

Intel's research budget is in the billions, I am not sure if AMD will ever catch up.
Bloody_Pete 4th May 2011, 20:12 Quote
Probably less, as they'll be able to make more per wafer :) So faster and potentially cheaper. Win :)
flibblesan 4th May 2011, 20:36 Quote
Wasn't AMD working on this sort of technology years ago?
ch424 4th May 2011, 20:54 Quote
All they've done is make the gate thicker and go marketing mental. I'm sure Samsung, TSMC and the others will catch up soon enough.
l3v1ck 4th May 2011, 21:47 Quote
There go AMD's hopes of a long term comeback. I doubt Intel will license this tech to GlobalFoundries and the like.
It's a really cool bit of design though. I'll look forward to it coming to market, even if I do have to wait a year or two. The only downside I can see is that Intel may use this to extend the life of the Atom design, rather than actually designing a good power efficient architecture.
TheLegendJoe 4th May 2011, 22:13 Quote
22nm : | ARRRG

I'm rocking a Phenom II 965BE, pretty sure that's 45nm... And it came out 2-ish years ago. Proof of Moore's law? ;) haha.
ZERO <ibis> 4th May 2011, 22:20 Quote
Wait, if the power draw is that much lower, and thus the heat, does that not mean that on the high end we are looking at a 50% increase in base clock speeds!?
Lizard 4th May 2011, 22:24 Quote
Quote:
Originally Posted by ZERO <ibis>
Wait, if the power draw is that much lower, and thus the heat, does that not mean that on the high end we are looking at a 50% increase in base clock speeds!?

From what Intel said I doubt we'll see significantly higher clock speeds on Ivy Bridge - instead the power savings of these really rather clever 22nm transistors will be used to improve the architecture - thus resulting in more performance.

After all, most Intel desktop chips have been 'stuck' at 2.66GHz or so for the last 5+ years, yet there's been a massive increase in performance during that time.
Oggyb 4th May 2011, 22:28 Quote
Quite. I'd rather have more performance at the same clock setting, with a 50% reduction in power draw, than way more performance than I'll ever need and an unchanged electricity bill.
Boogle 4th May 2011, 22:32 Quote
Quote:
Originally Posted by ch424
All they've done is make the gate thicker and go marketing mental. I'm sure Samsung, TSMC and the others will catch up soon enough.

I don't understand, surely if it's just a larger gate then the transistor would be bigger, not smaller and faster?
Bakes 4th May 2011, 22:47 Quote
Quote:
Originally Posted by Boogle
Quote:
Originally Posted by ch424
All they've done is make the gate thicker and go marketing mental. I'm sure Samsung, TSMC and the others will catch up soon enough.

I don't understand, surely if it's just a larger gate then the transistor would be bigger, not smaller and faster?

Larger in terms of thicker. Other news websites have better explanations of how the technology works - effectively it's in 3 dimensions. Probably not simply a marketing ploy though - we don't hear so many random useless low-level technologies from Intel nowadays, so it would be out of the ordinary.
schmidtbag 5th May 2011, 00:16 Quote
i'm pretty sure the prices are not going to drop much on this. just because its smaller it doesn't mean its cheaper. keep in mind the overall materials to create a cpu is extremely cheap. silicon is one of the most abundant non-gas elements in the world. cpu pins are gold plated and the dies as a whole aren't very big. from what i can tell, 90% of what you pay for in a cpu is the architecture, the research, the machines, and everyone who contributed. 5% would be advertisements, and the rest is the actual materials. i've seen brand new CPUs being sold for as little as $32. considering a certain percentage would go to the retailer, that shows how little a cpu is actually worth.

considering this is new technology requiring more advanced machines, i'd expect the price to go up, not down.

as for amd, sure it sucks for them but they'll get it eventually. for everything intel thought of first, amd has eventually released the same thing and improved upon the idea. typically the only reason their improvements don't surpass intel is because of other technical downsides such as fab size, amount of memory channels, instruction sets, and time for testing.
for example, when intel first released quad core CPUs, it was really just 2 dual cores slapped next to each other with a few changes here and there. it worked as a quad core but it wasn't a "true" quad core. amd wanted to out-do intel by creating a true quad core with communication between all cores, but they rushed it so it wasn't as good and some models failed, hence the phenom x3. i'm sure because of this moment, this is why amd is taking so long to release bulldozer. i'm sure bulldozer was ready for release 5 months ago but they want to make sure it will get them where they want.
ch424 5th May 2011, 00:16 Quote
Sorry, I meant thicker relative to the other parts of the transistor - it's still smaller overall!
Neogumbercules 5th May 2011, 02:50 Quote
Glad i didn't upgrade to Sandy Bridge. I think I'll be looking at an upgrade come the end of the year...
Dae314 5th May 2011, 03:14 Quote
ack -_- just bought a sandy bridge based laptop yesterday T_T

Ah well it'll probably take ivy bridge like 10 months from now to start really getting laptops out. Still this makes me sad =(. Oh well had to get a new system at some point. Useless to just keep putting it off because "something better is coming" since something better is always coming ^^;
sinner666999 5th May 2011, 05:25 Quote
I think the price of these chips will be determined more by how much Intel will have to spend to re-tool their factories along with how much they have actually spent on R&D. It will also be interesting to see what, if any, price drop there will in Sandy Bridge processors once Ivy is large on the market.
flipman 5th May 2011, 06:01 Quote
Nice one, Intel. I just hope they don't give this tech to any foundries when AMD begs for help for their BD "come back" CPUs - leave them out in the dry to develop their own stuff instead of copying what IBM and Intel have made through the years.
Xir 5th May 2011, 07:31 Quote
Is this "just" a FINFET?
Snips 5th May 2011, 07:54 Quote
"Intel currently accounts for around 80% of global microprocessor sales, according to market analysts IDC.

Its nearest rival, Advanced Micro Devices (AMD) has a 19% share.

AMD was the first to produce a prototype 22nm chip in 2008.

It is widely expected to pursue a similar fin-based system to Intel, known as FinFET.

However, the company has yet to announce its plans for a commercial product.

AMD spun off its chip fabrication arm in 2009, creating GlobalFoundries.

Analyst Dan Hutcheson said that the separation of design and manufacture had damaged AMD's ability to innovate.

"There is a huge competitor advantage to having your own fab [fabrication facility] and being able to tune the process.

"This is Intel pulling away," he said.

And who was it who said selling GF was a good thing for AMD?

As for AMD clawing something back short term, where is the evidence for such a statement?

Even with Intel's Sandy epic cock up, AMD still have not and cannot capitalise on it.

Intel = Barcelona

AMD = Tranmere Rovers

On a very bad day for Barcelona, TRFC have a chance but even then it isn't a realistic one.
sotu1 5th May 2011, 09:43 Quote
Are Intel still using MS Paint for their artwork?
Fizzban 5th May 2011, 11:21 Quote
Very nice. I'm impressed.
wuyanxu 5th May 2011, 12:27 Quote
why has no one picked up on an issue with this: heat density.

with more tightly packed transistors, the heat per unit area has increased dramatically. this means it will be more difficult to overclock these chips due to the heat generated. also, the narrow channel of those transistors could mean they degrade faster due to electromigration.


i will be holding my purchase decision on Ivy Bridge until i've seen normal users' overclocking results, and seen whether an extreme overclocker's chip can survive a month at the very least.
[USRF]Obiwan 5th May 2011, 13:45 Quote
Horizontal gates going vertical awesome!

In other news: AMD strikes back with Quad gates :P
dicobalt 5th May 2011, 19:47 Quote
What was that noise? I hope it wasn't the sound of Bulldozer dropping dead.
Valinor 5th May 2011, 20:32 Quote
No worries about bulldozer, AMD were using 4D gates in it...
technogiant 6th May 2011, 08:12 Quote
Oh dear, how on earth is AMD going to compete with this at the low end... Ivy Bridge will be DX11, 22nm and 50% power saving with this new Tri-Gate design... this seems like the end of the road for AMD. They are a whole process generation behind, and the addition of Tri-Gate is going to kill them... fantastic tech news, but I'm afraid this is hailing the end of any realistic hope of competition.
CAT-THE-FIFTH 6th May 2011, 17:49 Quote
Looks pretty cool technology although it seems most of the improvements will be in low power devices. It would mean Intel is in a much better position against ARM based products.

The following image is from the Anandtech article on the new transistors:

http://www.anandtech.com/show/4313/intel-announces-first-22nm-3d-trigate-transistors-shipping-in-2h-2011

http://images.anandtech.com/reviews/cpu/intel/22nm/power.jpg

At higher voltage the improvement over a 32nm planar transistor starts to diminish, it seems. AFAIK, all the current Sandy Bridge processors have a VID over 1V. The planar transistors Intel use are produced on a 32nm bulk process.

It would be interesting to see the advantage when compared to planar transistors produced on a 32nm SOI process, and AFAIK this is what AMD is using.

Anyway, hopefully the H67 chipset supports Ivy Bridge, as it would mean a better upgrade path for my computer.
Adnoctum 13th May 2011, 18:03 Quote
FFS, the number of sheeple in this thread is incredible.

You read an Intel press release about their amazing "breakthrough" and you swallow it hook, line and sinker without a single critical thought. It is not helped by a technology press industry largely ignorant of such technical matters (not directed towards Bit-tech in particular).

IT IS A F*****G FINFET!!! They aren't anything special to Intel, they didn't invent them nor are they alone in developing them, Intel is just first to put it into production. I like that Bit-tech just regurgitates the Intel announcement. "Tri-gate" is just Intel's marketing name for a non-unique concept.

"Tri-gate" is no more a "breakthrough" in transistor design than Intel's HKMG was. It is a small evolutionary step along the way. It especially won't make much of a user difference, so if you are holding off on Sandy Bridge for these fancy new FinFETs you are quite the fool. Now, the IB 22nm process shrink is more of a reason...

As for the fabulous performance figures "50%!!"/"37%", did you even bother to notice that these were over Intel's 32nm process (once again, something Bit-tech failed to say)? What would the increases be over a traditional planar 22nm transistor? Much less, I guarantee. Also, did you notice that these figures were for low voltages (0.7V)? That's right, the performance advantages DECREASE as you raise the voltage to more normal desktop CPU levels of 1.3/1.4V. It means that power reductions will be better when the core is clocked down at idle, but not so much when at load.

Meanwhile, the SOI consortium (AMD, GloFo, IBM, etc) will be fielding FD-SOI which delivers a similar performance advantage (perhaps even better) at low voltages while still using planar transistors. Why is this important? Because it is a tried design/manufacturing process that utilises existing tools and experience.
If Intel thinks "Tri-gate" Atoms are going to kill off ARM, then they are going to find future FD-SOI ARM's a problem.
See here (PDF) for more.
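The voltage argument in the post above can be illustrated with a toy calculation: dynamic power grows as V²·f, so a fixed improvement in leakage becomes a smaller slice of the total as voltage rises. All numbers below are invented for illustration, not Intel's or Anandtech's figures:

```python
# Toy model: dynamic power scales as C*V^2*f (C folded into the units),
# while leakage is treated as a fixed fraction of the dynamic power at
# 0.7V and held roughly constant as voltage rises.

def total_power(v, freq, leak_fraction_at_0v7):
    dynamic = v**2 * freq
    leak = leak_fraction_at_0v7 * 0.7**2  # leakage term stays put
    return dynamic + leak

def saving(v, freq=1.0, planar_leak=0.5, finfet_leak=0.05):
    # Relative power saving of a low-leakage (FinFET-like) transistor
    # over a leaky planar one, at a given supply voltage.
    planar = total_power(v, freq, planar_leak)
    finfet = total_power(v, freq, finfet_leak)
    return 1 - finfet / planar

for v in (0.7, 1.0, 1.4):
    print(f"{v:.1f} V: ~{saving(v):.0%} lower total power")
```

With these made-up numbers the saving shrinks as voltage climbs from 0.7V towards desktop levels, which is the shape of the curve being discussed - the absolute percentages here mean nothing.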
sdfx 26th June 2011, 18:21 Quote
Okay, I was planning on making a new build this summer, but apparently, as Adnoctum and CAT said, the power reduction would most likely only benefit low-powered devices below typical voltage... so since my build won't be low-powered, this won't really affect it then, would it?

Reminds me of LED and LCD and the like.