bit-tech.net

Larrabee die size is massive

Pat Gelsinger holds a 300mm wafer of Larrabee dies. The line has been added by us for reference.

French tech site Hardware.fr seems to have taken the only photo of Intel Senior VP Pat Gelsinger showing off Larrabee at IDF in Beijing.

PC Perspective claims to count 64 dies in the picture, which means that even at full yield (which never happens) this would be a very costly part. That assumes Intel's Gelsinger is holding a recent wafer made on 45nm rather than an older one made on 65nm - there were claims last year that early samples were being produced internally on the older process.

Attempting to calculate the die size (on the heavy assumption that it's square), we think Larrabee is a massive 700mm² die. To work this out, we first tweaked the levels of the image in Adobe Photoshop and could just about make out eight dies sitting almost exactly along a diagonal. Assuming Gelsinger is holding a 300mm wafer, that equates to 37.5mm per die diagonal, or 26.5mm a side; 26.5 squared therefore makes 702.25mm² of total area.
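
For anyone who wants to check the working, here's a quick back-of-the-envelope sketch of the same sums in Python. The 300mm wafer, the eight-dies-along-the-diagonal count and the square-die assumption are exactly that - assumptions - so treat the output as a rough estimate rather than a confirmed spec.

# Die-size estimate from the photo: assumes a 300mm wafer, square dies,
# and eight die diagonals spanning the wafer corner-to-corner.
import math

wafer_diameter_mm = 300.0
dies_across_diagonal = 8

die_diagonal_mm = wafer_diameter_mm / dies_across_diagonal  # 37.5mm
die_side_mm = die_diagonal_mm / math.sqrt(2)                # ~26.5mm
die_area_mm2 = die_side_mm ** 2                             # ~703mm²; rounding the side to 26.5mm gives the 702.25mm² above

# Rough upper bound on complete dies per wafer, via the common
# gross-die-per-wafer approximation (ignores scribe lanes and the
# wafer's edge exclusion zone), as a sanity check on the count of 64.
wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = (wafer_area_mm2 / die_area_mm2
              - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(f"die side      : {die_side_mm:.1f} mm")            # 26.5 mm
print(f"die area      : {die_area_mm2:.0f} mm^2")          # ~703 mm^2
print(f"dies per wafer: {gross_dies:.0f} (upper bound)")   # ~75, so 64 complete dies is plausible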

In his article on the French site, however, writer Damien Triolet claims a die size of "approximately 600mm²". Either way, 600-700mm² has people claiming somewhere in the region of 1.6-1.7Bn transistors.

In comparison, Nvidia's GT200 is 576mm² with 1.4Bn transistors, making Larrabee somewhere in the region of 4-22 percent larger per die.
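
Showing our working again, that range comes straight from dividing the two die-size estimates above by GT200's widely reported 576mm²:

# Compare the 600-700mm² Larrabee estimates with GT200's ~576mm² die.
gt200_area_mm2 = 576.0
for larrabee_estimate_mm2 in (600.0, 702.25):
    pct_larger = (larrabee_estimate_mm2 / gt200_area_mm2 - 1) * 100
    print(f"{larrabee_estimate_mm2:.0f} mm^2 -> {pct_larger:.0f}% larger than GT200")  # 4% and 22%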

Another comparison, Intel's Tukwila, is a two billion transistor Itanium chip at 712mm²; however, Tukwila is mostly L3 cache, whereas Larrabee is mostly small IA cores. Transistor density also depends on core frequencies, and on whether the electrical paths have been buffed out for extra frequency, as with the ATI Radeon HD 4890 versus the 4870, for example.

The general consensus is that Intel will launch its super-sized part on 45nm - the largest die to date on High-K Metal Gate - although since the process is now mature, that should give more security for such a large die. 45nm High-K MG will lower power and leakage quite considerably, but the final figures depend on the actual transistor and "core" count, as well as power/clock gating, versus the aforementioned frequency fattening and IO routing: basically, there are still too many outstanding factors to draw firm conclusions.

What do you think? Big and hot, or the next super-seller? What do you make the die size to be? Let us know your thoughts and calculations in the forums!

36 Comments

Krikkit 13th April 2009, 15:09 Quote
I think it'll be rife speculation until we actually get one for some benchmarks.

Anyone know a release date? I'm assuming Q4 this year sometime?
Gremlin 13th April 2009, 15:44 Quote
Even if the hardware is mega potent and all-encompassing of the term pure awesome, the drivers will still be Intel-level **** and the price/performance will be ****ing terrible
wuyanxu 13th April 2009, 15:51 Quote
massive die, i love it! it'd be another consumer engineering milestone i can own :D (bought GT200 for this similar reason, 500mm+ die)

will Larrabee support Dx11?
notatoad 13th April 2009, 15:54 Quote
wait... 700mm squared or 700 square mm?
Bindibadgi 13th April 2009, 16:24 Quote
Quote:
Originally Posted by notatoad
wait... 700mm squared or 700 square mm?

Sorry, 700mm²

Fixed it now - I was on my laptop before which doesnt have a numberpad so I couldnt do alt+0178
Krikkit 13th April 2009, 16:30 Quote
Does it not have a NumLock? :p
aron311 13th April 2009, 16:31 Quote
Insert in Word and copy it across...
uncle_fungus 13th April 2009, 16:34 Quote
Or an Fn key for that matter? ;)

Btw, what is a wafter? (Image caption)
bowman 13th April 2009, 16:44 Quote
Quote:
Originally Posted by uncle_fungus
Or an Fn key for that matter? ;)

Btw, what is a wafter? (Image caption)

Wafer*, it's the silicon plate chips are etched on to. Each square on the wafer is an individual Larrabee prototype GPU.
uncle_fungus 13th April 2009, 16:47 Quote
Yes I know that. I was pointing out the (amusing) spelling mistake.
tejas 13th April 2009, 16:55 Quote
Larrabee= Pentium 4 Prescott of the GPU world!
Yemerich 13th April 2009, 16:59 Quote
I never understood why they are round...
wuyanxu 13th April 2009, 17:00 Quote
Quote:
Originally Posted by Bindibadgi

Fixed it now - I was on my laptop before which doesnt have a numberpad so I couldnt do alt+0178

is there a list of useful alt+numbers? it'd certainly be very useful.
FeRaL 13th April 2009, 17:25 Quote
Quote:
Originally Posted by Yemerich
I never understood why they are round...

They are round because they are cut from silicon rods that are drawn out of molten silicon. I imagine they could cut it square after the rod is extracted but that would not give you an optimum yield.

http://www.sehmy.com/Product/abtWafers.htm
thehippoz 13th April 2009, 17:26 Quote
so much for nvidias powerpoint presentation.. and the gt300 isn't going to be able to touch this- I bet they are shating bricks right now :D the driver argument is going to be null with this thing.. they can add optimizations over time.. but to tell you the truth this thing probably won't even need any out the gates

I can just imagine the stock heatsink.. this should be interesting.. at least ati has amd to fall on- they maybe able to dev something similar.. nvidia has alot ot worry about- there marketing machine better kick it up into super bs mode :D maybe the ntune guy can pretend he wrote a driver that doubles performance on every game ;p call it 'my ass got big banged by intel- help!'
Aterius Gmork 13th April 2009, 17:32 Quote
Quote:
Originally Posted by wuyanxu
is there a list of useful alt+numbers? it'd certainly be very useful.

http://www.usefulshortcuts.com/downloads/ALT-Codes.pdf
HourBeforeDawn 13th April 2009, 17:33 Quote
well it is a completely "new" approach to video processing so I would expect it to be large like any new tech and then over time start shrinking it down. Besides this may not be the final release of it.
tejas 13th April 2009, 17:36 Quote
Quote:
Originally Posted by thehippoz
so much for nvidias powerpoint presentation.. and the gt300 isn't going to be able to touch this- I bet they are shating bricks right now :D the driver argument is going to be null with this thing.. they can add optimizations over time.. but to tell you the truth this thing probably won't even need any out the gates

I can just imagine the stock heatsink.. this should be interesting.. at least ati has amd to fall on- they maybe able to dev something similar.. nvidia has alot ot worry about- there marketing machine better kick it up into super bs mode :D maybe the ntune guy can pretend he wrote a driver that doubles performance on every game ;p call it 'my ass got big banged by intel- help!'


What a crock of ****... The reason ATI have done well against Nvidia is something called performance per watt. Larrabee is clearly going to be a hot overpriced, crap yield, expensive to fabricate chip. Nvidia and ATI gpus rape x86 and Intel don't have a chance in hell in the GPU market against these two. Your intel fanboyism is totally blatant and larrabee will be another Pentium 4. And we all remember what a resounding success the Pentium 4 was dont we...
Bindibadgi 13th April 2009, 17:36 Quote
Quote:
Originally Posted by Yemerich
I never understood why they are round...

http://en.wikipedia.org/wiki/Czochralski_process

Very interesting! :)
Bindibadgi 13th April 2009, 17:38 Quote
Quote:
Originally Posted by tejas
What a crock of ****... The reason ATI have done well against Nvidia is something called performance per watt. Larrabee is clearly going to be a hot overpriced, crap yield, expensive to fabricate chip. Nvidia and ATI gpus rape x86 and Intel don't have a chance in hell in the GPU market against these two. Your intel fanboyism is totally blatant and larrabee will be another Pentium 4. And we all remember what a resounding success the Pentium 4 was dont we...

I agree, but Larrabee will simply be a GPGPU monster and Intel will rake in tons of money in the server space where it's already strong and businesses will pay through the nose. I doubt it cares too much about the PC gaming market until it gets in the PS4 (my prediction) where it can levy some game publishers with Sony's help.
thehippoz 13th April 2009, 17:48 Quote
Quote:
Originally Posted by tejas
What a crock of ****... The reason ATI have done well against Nvidia is something called performance per watt. Larrabee is clearly going to be a hot overpriced, crap yield, expensive to fabricate chip. Nvidia and ATI gpus rape x86 and Intel don't have a chance in hell in the GPU market against these two. Your intel fanboyism is totally blatant and larrabee will be another Pentium 4. And we all remember what a resounding success the Pentium 4 was dont we...

lol it's not fanboyism.. just wait and see- well I am kinda a intel fanboy.. but if you define fanboy as liking the best ocing chips then- that's what I am!
Panos 13th April 2009, 17:48 Quote
Up to now Intel is all smoke and mirrors. Heh, I bet it will be couple of generations old, compared with the current market when it comes out.

Nvidia and ATI are fine tunning the current technologies, and adding new stuff.

While Intel, has to design it, produce it, put it on sale and then they have to start trying to get in part with the competition.
If they trying new technologies and tunning, at the scale ATI and Nvidia do, they will never put the product on the market.

And if it's like P4, with my blessings, the fun boys should downgrade their monitors from now.
thehippoz 13th April 2009, 17:57 Quote
Quote:
Originally Posted by Panos
Up to now Intel is all smoke and mirrors. Heh, I bet it will be couple of generations old, compared with the current market when it comes out.

Nvidia and ATI are fine tunning the current technologies, and adding new stuff.

While Intel, has to design it, produce it, put it on sale and then they have to start trying to get in part with the competition.
If they trying new technologies and tunning, at the scale ATI and Nvidia do, they will never put the product on the market.

And if it's like P4, with my blessings, the fun boys should downgrade their monitors from now.

transistor count is through the roof though.. going to be something to contend with for sure- the driver thing only matters in that nvidia and ati have a huge headstart in optimizing specific titles.. even if larry releases without any optimizations..

it should be able to handle 1080p gaming- if anything nvidia has been sandbagging for years now.. I could really care less what they do nowdays because it's all sand.. maybe larry will be the kick in the ass to get gpu's beyond what we have today and engines like crysis the norm
perplekks45 13th April 2009, 18:10 Quote
Play nice, kids. Wait for the first tests and then start bashing/praising Larrabee please. There is no way anybody can really predict how it will perform.
iwod 13th April 2009, 18:53 Quote
Not much of a problem. Since CPU aren't selling well and Intel have lots of Wafer space left. An huge sized Larrabee is going to be good for them. To test and iron out all major software bugs as well as to test market response.
Krikkit 13th April 2009, 19:03 Quote
Quote:
Originally Posted by perplekks45
Play nice, kids. Wait for the first tests and then start bashing/praising Larrabee please. There is no way anybody can really predict how it will perform.

Quite.
Turbotab 13th April 2009, 19:05 Quote
What a bunch of highly intelligent, well researched posts!, thank God for the Bit-Tech regulars and Bindi.
To the Hippo, if you look at the increase in transistors from Nvidia's G80 to G200 core, you will see a 100% increase. Therefore the GT300 will probably end well in excess of 2 Bn transistors, not to mention significantly higher number of (shader) cores, no way will Larrabee outperform it in terms of pure computing power. As Bindi stated the Larrabee's game winner is its native X86 cores, perfect to run all those GPCPU and C / C+ etc business apps, that the Cloud computing movement will hoover-up.
If the PS4 does go with Intel, I hope it does not result in a hot, noisy beast, like the XBOX 360.
TreeDude 13th April 2009, 19:18 Quote
The different design of Larrabee is going to cause all kinds of issues out the door for gamers. It will take Intel some time to nail them all down. Drivers will be updated, games will be patched, it will be a mess at first.

However, fanboy or not, we should all be excited. More competition means lower prices (at least until one of them go under).
thehippoz 13th April 2009, 19:46 Quote
Quote:
Originally Posted by Turbotab
What a bunch of highly intelligent, well researched posts!, thank God for the Bit-Tech regulars and Bindi.
To the Hippo, if you look at the increase in transistors from Nvidia's G80 to G200 core, you will see a 100% increase. Therefore the GT300 will probably end well in excess of 2 Bn transistors, not to mention significantly higher number of (shader) cores, no way will Larrabee outperform it in terms of pure computing power. As Bindi stated the Larrabee's game winner is its native X86 cores, perfect to run all those GPCPU and C / C+ etc business apps, that the Cloud computing movement will hoover-up.
If the PS4 does go with Intel, I hope it does not result in a hot, noisy beast, like the XBOX 360.

GT300 is not going to be anything great when you look at price/performance.. they would rather keep technology stagnant to milk money.. I'm not saying larrabee will be the nvidia killer.. but I do think it will have a big impact on what goes into dev from that point on and it will hurt nvidia's bottom line.. I'm not so concerned with ati because they have amd to work with

I'm just sick of nvidia's marketing.. they love to say how much better they are than everyone else.. when you look at what they offer compared to ati- it's just rehashed with some clock plays.. I think this will hurt thier bottom line- far as gaming goes, how can you say it can't compete with what's out! that's fanboyism at it's finest when you talk trash on things that haven't even been tested! I'm basing my opinions on what I see out and what I've been following for the past couple of years since the G80.. I've owned these cards! :D I just want something better- GT300 in excess of 2billion is yet to be seen also
Evildead666 13th April 2009, 19:50 Quote
The idea is that Larabee will be able to emulate DX and OpenGL and OpenCL, if not run some of them natively.
There will be no issue for gamers, as this will either work or not.
To be compliant, they will have to have the whole feature set compatible, not just parts of it.
So DX will work or it will not, there will not be any specific problems with certain games.
optimisations, maybe, but not outright bug corrections, unless the problem appears on all the graphics adapters.

the first Larabee's or Larabi, will be for the business end of the stick, since they probably won't be good enough to run consumer level games at an acceptable speed.
GPGPU stuff will probably run like nuts on larabee tho, right from the beginning....

When the die shrink appears in 2010, the speeds/Number of cores, will be sufficient for it to enter the consumer market, or not.
Evildead666 13th April 2009, 19:54 Quote
Intel is also trying to pull the graphics market back to the cpu, since in X years we will have 16/32/64 x86 cores on our cpu's, and they will replace the gpu entirely.

Making the x86 a future standard for GPGPU or just GPU would be hard for Nvidia, and OK for AMD/ATi...
n3mo 13th April 2009, 20:35 Quote
Quote:
Originally Posted by tejas
Larrabee= Pentium 4 Prescott of the GPU world!

I see it as more of an Itanium of GPU world. Great concept, fantastic on paper but fairly useless in reality. I don't see nVidia and AMD afraid of that, especially given Intel's history of crappy GPUs.

Theoretically Larrabee could grow and mature to be useful, but I don't think that they have the time to do it. We're getting close to the end of what silicon-based technology can offer, multiplying cores doesn't give the expected advantages (memory access times and latency suffer greatly, not to mention heat output etc.)
Quote:
Originally Posted by Evildead666
Intel is also trying to pull the graphics market back to the cpu, since in X years we will have 16/32/64 x86 cores on our cpu's, and they will replace the gpu entirely.

Making the x86 a future standard for GPGPU or just GPU would be hard for Nvidia, and OK for AMD/ATi...

x86 is too old, slow and limited to be used for GPU or GPGPU. While introducing high performance double-precision computing would mean a revolution, this is not going to happen.
Faulk_Wulf 14th April 2009, 00:00 Quote
Quote:
Originally Posted by bowman
Wafer*, it's the silicon plate chips are etched on to. Each square on the wafer is an individual Larrabee prototype GPU.

A question i always wondered but never asked.
Learning +1.
willyolio 14th April 2009, 04:34 Quote
when you're as big as intel, you can always resort to brute-forcing your way into the market.
Xir 14th April 2009, 09:12 Quote
Haven't seen that little dies on a wafer since 200mm ;-)
[USRF]Obiwan 14th April 2009, 10:29 Quote
To me it just sounds like the "Matrox going 3D" situation again: excellent 2D imaging, but stepping onto the 3D bandwagon far too late. A lot of hype and in the end it was a total waste...