bit-tech.net

Rattner talks about future transistor technology

Justin Rattner thinks CMOS transistor technology will be around for quite some time... at least until 2020, if not beyond.

Intel Chief Technology Officer Justin Rattner said it’s likely that more changes to the way transistors work will be needed sooner than we think.

He said that Intel is locked and loaded when it comes to its 32nm process – it demoed working 32nm SRAMs as early as last September – but that there’s still quite a bit of work to do before things are finalised at 22nm.

“The jury is still out at 22nm and beyond,” he said, “as to whether we’ll continue with traditional transistors, or whether we’ll move to a surface device such as the tri-gate transistors that we’ve been talking about for at least the last four or five years.”

Rattner added that it’s a decision Intel is going to face at either 22nm or, more probably, at 16nm. “It’s a very important transition – it’s as important as high-k metal gate – because once you’re on the surface you’re going to have a much wider choice of materials. And you can build conventional CMOS devices with tri-gate transistors, which are essentially just better CMOS transistors.”

He then went on to talk about how he believes CMOS technology will be a workhorse for years to come, without getting overly specific.

When I asked Rattner to be more specific about how long we can expect CMOS technology to be used, he said that he’s confident that Intel will be using fairly familiar charge-based transistor technology until at least 2020 and maybe beyond that, too.

“It’s sometime in that next decade [after the transition to surface CMOS technology], around the time when we’re making 10nm devices, that we want to start looking at other quantum properties,” said Rattner. “Historically, we’ve not been able to look more than a decade out, and it takes around a decade to perfect the use of these new materials. For example, our work on high-k started about a decade ago, and that’s a good measure of how long these technology advancements take.”

So, according to Rattner, we’ll be seeing devices using CMOS technology for quite some time yet, and Intel is already starting to think beyond that, too. Another thing I learned from the chat I had with him was just how much of a step forward the transition to high-k metal gate was – we’re not going to see a similar leap in technology for quite a few years.
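
For a rough sense of where that roadmap leads, here’s a minimal sketch of the classic Moore’s Law rule of thumb: each full process node shrinks linear feature sizes by roughly 0.7x (halving transistor area) on an approximately two-year cadence. The 32nm-in-2009 starting point and the cadence are illustrative assumptions rather than Intel figures.

    # Rough Moore's Law projection: ~0.7x linear shrink per node on an
    # assumed two-year cadence, starting from 32nm in 2009 (illustrative only).
    node_nm = 32.0
    year = 2009
    SHRINK = 0.7    # classic full-node linear scaling factor
    CADENCE = 2     # assumed years between nodes

    print(f"{year}: {node_nm:.0f}nm")
    while node_nm > 8:
        node_nm *= SHRINK
        year += CADENCE
        print(f"{year}: {node_nm:.0f}nm")

The output lines up with the 32nm, 22nm, 16nm and roughly 10nm steps discussed above, and puts 10nm-class devices somewhere in the middle of the next decade, which is consistent with Rattner’s timeline.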

How long do you think Intel can continue to follow Moore’s Law? Share your thoughts in the forums.

5 Comments

Arkanrais 23rd August 2008, 03:32 Quote
What happened to that light-based transistor technology I heard of a year or two back (I think that was it)?
Something about replacing the metal between the transistors with (something like) fiber optics (or just empty spaces/vacuums between photosensitive and light-emitting cells) so that you don't have to wait for the electrical charge to reach the other end of the wire before sending out the next one, instead sending out sequential light flashes, eliminating the latency of an electrical current.
I think that was what the tech was on about. Could be completely wrong or have dreamed it up.

Still, I was initially surprised to see Intel thinking right down to the 10nm level, though giving it some thought, I probably should have expected as much. I wonder what the next levels down from that will be? 7nm, 5nm, 3nm, then possible journeys into picometer territory (or is that getting into the scale of atoms and sub-atomic particles)?
Phil Rhodes 23rd August 2008, 04:29 Quote
> instead sending out sequential light flashes, eliminating the latency of an electrical current

Electrical currents go at the speed of light.

Depends on the material, of course.
johnmustrule 23rd August 2008, 04:41 Quote
Quote:
Originally Posted by Arkanrais
What happened to that light-based transistor technology I heard of a year or two back (I think that was it)?
Something about replacing the metal between the transistors with (something like) fiber optics (or just empty spaces/vacuums between photosensitive and light-emitting cells) so that you don't have to wait for the electrical charge to reach the other end of the wire before sending out the next one, instead sending out sequential light flashes, eliminating the latency of an electrical current.
I think that was what the tech was on about. Could be completely wrong or have dreamed it up.

Still, I was initially surprised to see Intel thinking right down to the 10nm level, though giving it some thought, I probably should have expected as much. I wonder what the next levels down from that will be? 7nm, 5nm, 3nm, then possible journeys into picometer territory (or is that getting into the scale of atoms and sub-atomic particles)?

Once you make a chip small enough it stops behaving according to normal physics, and ultimately you have to focus on quantum mechanics when you're developing parts that small. That's what the guy was getting at in the article: CMOS can only go so small. I believe optical transistor technology is still very much in development, but they obviously won't be serious about anything that's not cost effective. If Intel wanted to they could release them pretty quickly, but this isn't going to happen because they've got CMOS manufacturing being done very efficiently.

FYI, there is a quantum processor if you care to Google it; there's probably a video on YouTube too. It's the ultimate graphics workhorse but not as generalized as a normal CPU. Basically it can crack any encryption and render any scene instantaneously; we're talking Pixar graphics for the home. Right now they're only owned by a couple of colleges and the US government, though I haven't read about them in a while.
FeRaL 25th August 2008, 06:12 Quote
Quote:
Originally Posted by Phil Rhodes
> instead sending out sequential light flashes, eliminating the latency of an electrical current

Electrical currents go at the speed of light.

Depends on the material, of course.

Actually in a vacuum the speed of electricity (electrons) is about 1/10 the speed of light.
In a solid medium it is even slower....
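
For what it's worth, the number that matters for latency is the signal propagation speed, which is set by the dielectric around the conductor rather than by how fast individual electrons move (their drift velocity is tiny). A minimal back-of-the-envelope sketch, with the dielectric constant, current and wire cross-section chosen purely as illustrative assumptions:

    import math

    C = 3.0e8        # speed of light in vacuum, m/s

    # Signal propagation speed: v = c / sqrt(relative permittivity)
    eps_r = 4.0      # assumed, typical of FR4 circuit-board material
    v_signal = C / math.sqrt(eps_r)

    # Electron drift velocity in copper: v = I / (n * q * A)
    current = 1.0    # assumed current, amps
    n = 8.5e28       # free-electron density of copper, per m^3
    q = 1.6e-19      # electron charge, coulombs
    area = 1.0e-6    # assumed cross-section, 1 mm^2
    v_drift = current / (n * q * area)

    print(f"signal speed: {v_signal / C:.0%} of c")        # ~50% of c
    print(f"electron drift: {v_drift * 1000:.2f} mm/s")    # well under 1 mm/s

So signals already travel at a healthy fraction of c even over copper; the appeal of optical interconnects is mainly bandwidth and power rather than raw propagation speed.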
Xir 25th August 2008, 09:50 Quote
Quote:
Originally Posted by Arkanrais

I wonder what the next levels down frm that will be? 7nm, 5nm, 3nm then possible journeys into picometer territory (or is that getting into the difference between atoms and sub-atomic particles)?

Well, we are talking lithography sizes here... the devices are already thinner... the gate oxide thickness for 65nm was about 1.2nm... that's... 3-4 atoms* of Si... while the gate oxide is SiO2 and I don't know the lattice constant for that.

Intel says (http://en.wikipedia.org/wiki/45_nanometer) that for the 45nm process they use a 1nm oxide with a 7 Angstrom transition layer... 7 Angstrom is 0.7nm.

*Silicon atoms form a lattice with a step width of about 0.5nm (in their non-stressed form, anyway).


So: the computers we use today have features less than 2nm thick... which means we're deep into quantum effects like tunneling.
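
To put those numbers another way, here's a minimal sketch of the same arithmetic, using the crystalline-silicon plane spacing as a stand-in (the real gate oxide is amorphous SiO2, so treat the layer counts as rough figures):

    # How many atomic layers thick is a modern gate oxide?
    SI_LATTICE_NM = 0.543                   # silicon lattice constant
    LAYER_SPACING_NM = SI_LATTICE_NM / 2    # ~0.27nm between atomic planes

    for node, oxide_nm in [("65nm", 1.2), ("45nm", 1.0)]:
        layers = oxide_nm / LAYER_SPACING_NM
        print(f"{node} process: {oxide_nm}nm oxide is roughly {layers:.1f} atomic layers")

That works out to about four atomic layers at 65nm and three to four at 45nm, which is exactly the regime where gate leakage from tunneling forced the move to high-k dielectrics.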

