Researchers develop all-optical 'transistor'

Researchers at MIT have developed the optical equivalent of a transistor (not pictured), laying the groundwork for future computers that use light in place of electricity.

Researchers at the Massachusetts Institute of Technology have developed an all-optical 'transistor', which they claim could pave the way to optical computers as well as powerful quantum computing systems.

The team from MIT's Research Laboratory of Electronics, in partnership with Harvard University and the Vienna University of Technology, have released details of their experimental realisation of an optical switch which can be controlled using a single photon. As a result, it's possible for light to interact with light - something that isn't normally possible: two photons meeting in a vacuum simply ignore each other and pass straight on through.

The team is comparing the switch to the electronic transistor, the revolutionary device that replaced the vacuum tube and made possible all the wonderful high-performance computing equipment we enjoy today - claiming that what they have created is the equivalent of a transistor for light, rather than electricity.

The system takes the form of an optical resonator, a pair of highly reflective mirrors designed to form a switch. 'If you had just one mirror, all the light would come back,' explains Vladan Vuletić, the Lester Wolfe Professor of Physics at MIT, in the establishment's announcement on the research. 'When you have two mirrors, something very strange happens.' When the switch is in the on position, a light beam can pass through both mirrors; when in the off position, the intensity of the beam is reduced by around 80 per cent.

The mirrors achieve this trick by being positioned in such a way that the gap between them is the same as the wavelength of the light. As a result of light's somewhat odd status as being both a wave and a particle, an electromagnetic field builds up between the two mirrors, causing the forward-facing mirror to become transparent - but only to that particular wavelength of light.
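
For the curious, this is the classic Fabry-Pérot resonance condition: strictly, a two-mirror cavity transmits whenever the gap is an integer number of half-wavelengths, and blocks light in between. The sketch below illustrates the idea using the standard Airy transmission formula; the mirror reflectivity and the cesium-line wavelength are illustrative assumptions, not figures from the paper.

# Minimal Fabry-Perot sketch: transmission peaks when the mirror gap
# is an integer number of half-wavelengths. Illustrative values only,
# not parameters from the MIT experiment.
import math

def cavity_transmission(gap_m, wavelength_m, reflectivity=0.99):
    """Airy transmission of an idealised lossless two-mirror cavity."""
    phase = 2 * math.pi * gap_m / wavelength_m   # half the round-trip phase
    coefficient = 4 * reflectivity / (1 - reflectivity) ** 2
    return 1 / (1 + coefficient * math.sin(phase) ** 2)

wavelength = 852e-9  # roughly the cesium D2 line, in metres (assumed)
for gap in (1.0 * wavelength, 1.25 * wavelength, 1.5 * wavelength):
    print(f"gap = {gap / wavelength:.2f} wavelengths -> transmission = "
          f"{cavity_transmission(gap, wavelength):.4f}")

On resonance (1.00 and 1.50 wavelengths) transmission is near total; a quarter-wavelength off, it collapses - broadly the kind of on/off contrast the switch exploits.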

The team claims that the experimental discovery could lead to mainstream optical computing, in which the electrons of a computer chip are swapped out for photons. As a result, an optical computer would run significantly cooler with much less energy wasted as heat - and, in turn, would draw less power than its electronic equivalent. The system also holds promise for the growing field of quantum computing, where 'qubits' are held in superposition - being both 0 and 1 simultaneously - in order to solve problems in parallel, with photons easier to hold in superposition than electrons.

The team's work is unlikely to leave the lab any time soon, however. The current prototype works by filling the gap between the mirrors with a supercooled cesium gas - not something that really lends itself to being shrunk down and duplicated a few million times to form a modern processor. 'For the classical implementation, this is more of a proof-of-principle experiment showing how it could be done,' admits Vuletić. 'One could imagine implementing a similar device in solid state — for example, using impurity atoms inside an optical fibre or piece of solid.'

MIT's work will likely raise interest among chipmakers, many of whom are investing in optical computing technologies in response to the growing difficulty of shrinking electronic components to ever-smaller sizes. For now, however, optics are likely to be reserved for inter-chip communications, such as IBM's Holey Optochip and the work being carried out at the Intel co-funded Optoelectronic Systems Integration in Silicon (OpSIS) centre at the University of Washington.

The team's paper is published in the most recent issue of the journal Science.

33 Comments

greigaitken 5th July 2013, 10:48 Quote
"positioned in such a way that the gap between them is the same as the wavelength of the light"
Surely this means you can't shrink each unit smaller than this gap.
UV light goes down to 10nm, not very small.
Gareth Halfacree 5th July 2013, 10:57 Quote
Quote:
Originally Posted by greigaitken
UV light goes down to 10nm, not very small.
But gamma ray photons are 10 picometres, which is a damn sight smaller than we're ever going to get using electronic components.
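
A quick back-of-the-envelope conversion puts this exchange in perspective: a photon's energy scales inversely with its wavelength (E = hc/λ), which is why shorter wavelengths are progressively more ionising. A sketch in Python, using only the wavelengths quoted in the thread:

# Photon energy from wavelength via E = hc / lambda, reported in keV.
# Wavelengths are the numbers quoted in the comments, not design targets.
PLANCK = 6.626e-34      # Planck constant, J*s
LIGHT_SPEED = 2.998e8   # speed of light, m/s
JOULES_PER_EV = 1.602e-19

for label, wavelength_m in [("deep UV", 10e-9),
                            ("hard X-ray", 0.01e-9),
                            ("gamma", 10e-12)]:
    energy_kev = PLANCK * LIGHT_SPEED / wavelength_m / JOULES_PER_EV / 1e3
    print(f"{label:>10}: {wavelength_m * 1e12:6.0f} pm -> {energy_kev:8.3f} keV")

Notably, 10 picometres and 0.01nm are the same length, so the gamma figure sits right on the conventional X-ray boundary - there are plenty of tamer wavelengths in between, as is pointed out below.
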
maverik-sg1 5th July 2013, 11:04 Quote
Graphene will be mainstream before this gets out of the lab - that's how far off this tech is.

Light beams are too fat right now; a real optical CPU die would be massive compared to today's. It's interesting tech and I am interested to see how it develops, because there are lots of benefits to using light - they just need to work on making the beams smaller at the same time.
greigaitken 5th July 2013, 11:06 Quote
"But gamma ray photons are 10 picometres"
what would my case be made of to stop it killing me?
Griffter 5th July 2013, 11:07 Quote
Do I see light at the end of the tunnel?
Gareth Halfacree 5th July 2013, 11:08 Quote
Quote:
Originally Posted by greigaitken
"But gamma ray photons are 10 picometres"
what would my case be made of to stop it killing me?
Lead. But no, seriously, there are plenty of wavelengths between gamma rays and UV that would be suitable. I was merely pointing out that photons go to significantly smaller wavelengths than UV.
greigaitken 5th July 2013, 11:10 Quote
Ok, I thought that after UV, the smaller you go, the more damage it does to you.
If GH says it's safe that's good enough for me.
Gareth Halfacree 5th July 2013, 11:12 Quote
Quote:
Originally Posted by greigaitken
If GH says it's safe that's good enough for me.
Worst case, you go full-on Bruce Banner.
Corky42 5th July 2013, 11:14 Quote
Take care when overclocking, because you won't like me when I'm angry
maverik-sg1 5th July 2013, 11:29 Quote
Quote:
Originally Posted by Corky42
Take care when overclocking, because you won't like me when I'm angry

I spat my drink reading this one....brilliant!! :)
Stanley Tweedle 5th July 2013, 13:37 Quote
I always find it amusing whenever these physics tech articles appear. There's always a string of comments from people who think they're more knowledgeable than the scientists in question and it invariably goes along the lines of... "it won't work because......."

:)
lp rob1 5th July 2013, 14:51 Quote
But even so, why would you need a particularly small CPU die for this? In silicon microelectronics, the whole design is planar, so having a small CPU die is necessary to fit a large number of transistors in. But with light, little to no heat is produced, so the same circuit could be stacked on multiple layers. So even though in a single layer you might only have, say, 30% of the 'transistors' of a comparable silicon CPU, you would have many thousands of layers.
ev1lm1nd666 5th July 2013, 20:20 Quote
Plus with this new tech you could build CPUs in 3D, stacking layer upon layer. This could lead to CPUs that need no/very little in the way of active cooling while being light years (pun intended) ahead of anything we have today. Technology has advanced to the level where we can produce things on a molecular level - just imagine the gaming computer of 50 years from now...
Blackshark 5th July 2013, 20:39 Quote
Graphene is a red herring. Whilst a lot of money is being pushed by governments desperate not to be left out, there have been few real-world CPU-applicable advances.
Alecto 5th July 2013, 21:53 Quote
Quote:
Originally Posted by Gareth Halfacree
Quote:
Originally Posted by greigaitken
UV light goes down to 10nm, not very small.
But gamma ray photons are 10 picometres, which is a damn sight smaller than we're ever going to get using electronic components.

And the gamma-ray deflecting mirrors at this scale would be made of what?
siliconfanatic 6th July 2013, 04:14 Quote
Once again, it does not have to be gamma. IE:

(I've always wanted to use that one)
sub routine 6th July 2013, 07:45 Quote
Hold on Chewie, we're gonna overclock the nuts off this.
http://i40.tinypic.com/wbdwqq.jpg
Corky42 6th July 2013, 09:24 Quote
The wavelength that springs to mind for me is X-rays, as they are commonly used already, and X-rays range from 0.01 to 10 nm
themassau 6th July 2013, 10:55 Quote
Quote:
Originally Posted by greigaitken
"But gamma ray photons are 10 picometres"
what would my case be made of to stop it killing me?

Don't forget that these gamma rays would be low power, so they would not penetrate the chip walls. Also, there would only be a few photons per mirror.
If the frequency is higher then the penetration is less; that's why infra-red is line of sight and Wi-Fi isn't.
The dangerous gamma rays from a nuclear reactor are really high energy, and there are many at once.
Pinguu 6th July 2013, 13:14 Quote
Quote:
Originally Posted by Gareth Halfacree
But gamma ray photons are 10 picometres, which is a damn sight smaller than we're ever going to get using electronic components.

Huh? I thought, as fundamental particles, photons don't have dimensions.
Cobalt 6th July 2013, 13:48 Quote
Photons have a wavelength, which is what is being referenced here. All this talk of light beyond UV is nonsense, though. How are you going to make the X/gamma rays? Free-electron lasers were pretty big last time I checked. Unless you don't need a coherent source for the photons.
Corky42 6th July 2013, 19:07 Quote
Small X-ray emitters are already being developed.
http://en.wikipedia.org/wiki/X-ray_generator#Advances_in_X-ray_technology
Quote:
Engineers at the University of Missouri (MU), Columbia, have invented a compact source of x-rays and other forms of radiation. The radiation source is the size of a stick of gum and could be used to create portable x-ray scanners. A prototype handheld x-ray scanner using the source could be manufactured in as soon as three years.
Dr. Strangelove 6th July 2013, 20:47 Quote
I still don't understand why people are so worried about size???

Who cares if it is the size of a shoebox, if it is XXXX number of times more powerful than what we have now. With single photons being the driving force there is probably next to no heat development within the "CPU". And with photons travelling at the speed of light (literally), distance within the chip will probably not matter much either....

-Jacob
Dr. Strangelove 6th July 2013, 20:48 Quote
oops... double post
edzieba 7th July 2013, 15:37 Quote
Quote:
Originally Posted by greigaitken
"But gamma ray photons are 10 picometres"
what would my case be made of to stop it killing me?
Air. Or maybe tissue paper? Even if the entirety of the power input was emitted as raw gamma output (rather than the more likely waste heat and other lower energy photons) 100W of gamma radiation emitted omnidirectionally isn't going to do squat unless you enjoy holding your head inside the chassis while using your computer.
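
The inverse-square law makes the point concrete: an isotropic source's power spreads over a sphere of area 4πr². A minimal sketch, assuming the same deliberately pessimistic 100W figure from the comment above:

# Inverse-square estimate for a hypothetical 100 W isotropic emitter.
# The 100 W figure is the worst case assumed in the comment above.
import math

POWER_W = 100.0

for distance_m in (0.1, 0.5, 1.0):
    intensity = POWER_W / (4 * math.pi * distance_m ** 2)
    print(f"at {distance_m:.1f} m: {intensity:7.2f} W/m^2")
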
Quote:
Graphene is a red herring. Whilst a lot of money is being pushed by governments desperate not to be left out, there have been few real-world CPU-applicable advances.
What idiot would build a CPU with graphene? It's not a very good semiconductor at all. What it is excellent at is other sorts of microelectronics, like power amplifiers, and its useful mechanical and structural properties.

Complaining Graphene is no good for CPUs is like complaining Carbon Fibre is no good for making windows. Technically correct, but missing the point somewhat.
Corky42 7th July 2013, 16:40 Quote
Quote:
Originally Posted by edzieba
What idiot would build a CPU with graphene? It's not a very good semiconductor at all. What it is excellent at is other sorts of microelectronics, like power amplifiers, and its useful mechanical and structural properties.

Complaining Graphene is no good for CPUs is like complaining Carbon Fibre is no good for making windows. Technically correct, but missing the point somewhat.

Graphene is actually a very good conductor of electrons, offering nearly no resistance.
http://en.wikipedia.org/wiki/Graphene#Electronic

But in its current state it won't be seen in CPUs for decades or more.
http://www.bit-tech.net/news/hardware/2011/01/21/ibm-graphene-wont-replace-silicon-cpus/1
Quote:
Graphene is a semi-metal or zero-gap semiconductor

From the massive list of problems preventing its use:
1. Graphene does not have an energy gap, and therefore graphene cannot be “switched off”
2. Much of the research done on graphene to date has focused on proving basic principles
3. When the transistor was discovered, in 1947 at Bell Labs, the three scientists working on the problem knew they’d found something big, but refining the first transistor into a marketable product took years
ferret141 8th July 2013, 12:33 Quote
Quote:
Originally Posted by ev1lm1nd666
Plus with this new tech you could build CPUs in 3D, stacking layer upon layer. This could lead to CPUs that need no/very little in the way of active cooling while being light years (pun intended) ahead of anything we have today. Technology has advanced to the level where we can produce things on a molecular level - just imagine the gaming computer of 50 years from now...

3D light processor.......am I the only one reminded of this?

http://farm4.static.flickr.com/3663/3420375543_7799dffe73.jpg

Or this?

http://thehumanscorch.files.wordpress.com/2012/05/tesseract-in-the-avengers.jpg
ashchap 8th July 2013, 13:09 Quote
Quote:
Originally Posted by Dr. Strangelove
I still don't understand why people are so worried about size???

Who cares if it is the size of a shoebox, if it is XXXX number of times more powerful than what we have now. With single photons being the driving force there is probably next to no heat development within the "CPU". And with photons travelling at the speed of light (literally), distance within the chip will probably not matter much either....

-Jacob

Assume shoebox is 30cm across

speed of light = 3x10^8 metres per second

time taken for a photon to travel from one side of the shoebox to the other = 0.3 / 3x10^8 = 1x10^-9 seconds = 1 nanosecond

1GHz clock cycle = 1 nanosecond

so you could not run the processor faster than 1GHz without having serious synchronization problems.
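
The same bound generalises to any die size: a fully synchronous clock cycle can't be shorter than the light-crossing time. A minimal sketch in Python (the 2cm die size is an assumption for comparison):

# One synchronous clock cycle must outlast one light traversal of the
# chip, so the crossing time caps the clock rate. Sizes are assumptions.
LIGHT_SPEED = 3e8  # speed of light in vacuum, m/s

for label, size_m in [("shoebox", 0.30), ("2cm die", 0.02)]:
    crossing_s = size_m / LIGHT_SPEED
    print(f"{label:>8}: {crossing_s * 1e9:.3f} ns crossing "
          f"-> max ~{1 / crossing_s / 1e9:.0f} GHz synchronous clock")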

Quote:
Originally Posted by Article
As a result of light's somewhat odd status as being both a wave and a particle...

Nitpicking, but light's status as being both a wave and a particle isn't odd. All particles are waves. All waves are particles. Light is nothing special in that respect; it just has a low enough energy that we can easily observe the wave behaviour.
Corky42 8th July 2013, 13:24 Quote
@Ashchap, you are mixing up what is used to generate the photon with what is used to do the switching; we don't measure the distance an electron has to travel from the PSU.
If they used X-rays, the distance a photon would travel inside the transistor would be between 0.01 and 10 nm
ashchap 8th July 2013, 13:40 Quote
Quote:
Originally Posted by Corky42
@Ashchap, you are mixing up what is used to generate the photon with what is used to do the switching; we don't measure the distance an electron has to travel from the PSU.
If they used X-rays, the distance a photon would travel inside the transistor would be between 0.01 and 10 nm

Agreed, but my point was that if you have one transistor on one side of the CPU that needs to send a signal to another transistor on the other side of the CPU then how long will it take for the information to get there? answer: (at least) the distance between them divided by the speed of light. If the clock cycle is over before an operation can physically be completed then you are going to have problems.
lacuna 8th July 2013, 13:41 Quote
Coincidentally, I was reading Congo (Michael Crichton) last week and this concept was discussed in there - and that was back in the early '80s. Apparently it's taken quite a while to realise.
Corky42 8th July 2013, 16:41 Quote
Quote:
Originally Posted by ashchap
Agreed, but my point was that if you have one transistor on one side of the CPU that needs to send a signal to another transistor on the other side of the CPU then how long will it take for the information to get there? answer: (at least) the distance between them divided by the speed of light. If the clock cycle is over before an operation can physically be completed then you are going to have problems.

Less time than current CPUs take to do the same thing; the main advantage of using photons vs electrons is the reduction in size and heat, I think :?
siliconfanatic 10th July 2013, 01:28 Quote
Ferret: loved The Avengers. Best reference ever, but good point. Tesseract = arc reactor. :D