Nvidia trumpets world's fastest computer

The Tianhe-1A supercomputer uses 7,168 Fermi GPUs to churn through over 2.5 petaflops.

Nvidia has announced a major win in the world of high-performance computing, with its Fermi-based Tesla GPUs forming the heart of what is now the world's fastest supercomputer.

The Tianhe-1A, created by the National University of Defense Technology in China, uses a combination of 7,168 Nvidia Fermi-based Tesla M2050 graphics processing units and 14,366 Intel processors of unknown type to hit more than 2.5 petaflops, a significant boost over the current 'official' fastest system, the Cray-manufactured Jaguar, which measures 1.75 petaflops.

Nvidia claims that reaching the same level of performance with CPUs alone would require more than 50,000 processors and double the floor space, along with a whopping 12 megawatts of power, compared with Tianhe-1A's relatively modest requirement of 4.04 megawatts.
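
Taking Nvidia's quoted figures at face value, a quick back-of-the-envelope check shows what they imply; the sketch below (Python, using only the numbers from this article) derives the efficiency gap and the per-CPU draw that the claim assumes, rather than anything measured independently.

```python
# Rough efficiency comparison of the two scenarios Nvidia describes.
# All inputs are the figures quoted in the article above.
PERFORMANCE_PFLOPS = 2.5

HYBRID_POWER_MW = 4.04    # Tianhe-1A as built (GPUs + CPUs)
CPU_ONLY_POWER_MW = 12.0  # Nvidia's claimed CPU-only equivalent
CPU_ONLY_COUNT = 50_000   # CPUs Nvidia says would be needed

hybrid_eff = PERFORMANCE_PFLOPS / HYBRID_POWER_MW      # ~0.62 petaflops per MW
cpu_only_eff = PERFORMANCE_PFLOPS / CPU_ONLY_POWER_MW  # ~0.21 petaflops per MW
print(f"Hybrid system:     {hybrid_eff:.2f} petaflops/MW")
print(f"CPU-only system:   {cpu_only_eff:.2f} petaflops/MW")
print(f"Claimed advantage: {hybrid_eff / cpu_only_eff:.1f}x")  # ~3.0x

# Average draw implied per CPU in the hypothetical CPU-only machine
# (this bundles cooling and other overheads into the per-chip figure).
print(f"Implied draw per CPU: {CPU_ONLY_POWER_MW * 1e6 / CPU_ONLY_COUNT:.0f} W")  # 240 W
```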

Guangming Liu, head of the National Supercomputer Centre in Tianjin, said of the new system 'the performance and efficiency of Tianhe-1A was simply not possible without GPUs,' stating that 'the scientific research that is now possible with a system of this scale is almost without limits; we could not be more pleased with the results.'

Jen-Hsun Huang, Nvidia's chief executive, claimed that the new system proves that 'GPUs are redefining high performance computing,' and that 'GPU supercomputers are essential tools for scientists looking to turbocharge their rate of discovery.'

The Tianhe-1A will be offered as an open-access system to researchers looking to boost the speed of their scientific modelling and investigation across China. Let us know your thoughts on this computing breakthrough in the forums.

54 Comments

Mraedis 28th October 2010, 17:27 Quote
But can it run... ah screw it.

Isn't that like 1 GPU per TWO CPUs? I thought it would be at least 2:1 instead of 1:2.
Technobod 28th October 2010, 17:29 Quote
I wonder what the power requirements would have been if they had used a few more slightly slower GPUs like those from AMD...
I do hate to think what their electricity bill is XD
lacuna 28th October 2010, 17:39 Quote
Expect China to unveil its army of genetically enhanced super soldiers any time now then...
mecblade 28th October 2010, 17:40 Quote
But can it run Crysis 3?
proxess 28th October 2010, 17:44 Quote
Can it run Minecraft?
Gradius 28th October 2010, 17:46 Quote
14,366 Intel processors
7,168 Nvidia Fermi-based Tesla M2050 graphics processing units

14366 / 2 = 7183 vs 7168 ?! That's OVER 1 (GPU) : 2 (CPU)

"the new system proves that GPUs are redefining high performance computing"

What a bad joke!
TCoZ 28th October 2010, 17:51 Quote
With all those Fermi GPUs in one machine I'm surprised it doesn't instantly melt.
mecblade 28th October 2010, 17:51 Quote
o.O Just noticed it's for a defense university. What could require that much processing power?

I'm expecting a major hack on the Pentagon's systems now, or maybe mount cheyenne.
steveo_mcg 28th October 2010, 17:53 Quote
Quote:
Originally Posted by proxess
Can it serve Minecraft?

fixed that for you
Phalanx 28th October 2010, 18:10 Quote
Did anyone catch the folding PPD? :p
theevilelephant 28th October 2010, 18:59 Quote
Quote:
Originally Posted by mecblade
.....or maybe mount cheyenne.

The stargate won't be safe, oh noes!

The first thing I thought when I saw the picture was "oh... that's a bit... grey and dull". Where are the flashing lights?! It's clearly a fake, everyone knows you can't have a supercomputer without flashing lights.
thehippoz 28th October 2010, 19:16 Quote
Ha, China's brute force machine.. 'you need crack? ok'
TWeaK 28th October 2010, 19:55 Quote
Quote:
Originally Posted by Ph4lanx
Did anyone catch the folding PPD? :p

You beat me to it XD
Snips 28th October 2010, 20:15 Quote
"Conspiracy theory Alert!"

It will hack the LHC @ CERN and create its own time machine and bring robots from the future to kill mankind.

(In my best old Scottish accent) "We're all Doomed!"
HourBeforeDawn 28th October 2010, 20:38 Quote
4.04 megawatts, huh? So let's see, since it's Fermi involved here I bet 0.04 megawatts is CPUs and other hardware and the other 4 megawatts is the nVidia GPUs =p
SchizoFrog 28th October 2010, 21:06 Quote
Where is John Connor when you need him?
blackerthanblack 28th October 2010, 21:25 Quote
Quote:
Originally Posted by lacuna
Expect China to unveil its army of genetically enhanced super soldiers any time now then...

Each with their own Fermi and requiring cryogenic cooling a la Universal Soldier
thehippoz 28th October 2010, 21:38 Quote
1.21 jizzawatts
K.I.T.T. 28th October 2010, 21:39 Quote
Quote:
Originally Posted by theevilelephant
Quote:
Originally Posted by mecblade
.....or maybe mount cheyenne.

The stargate won't be safe, oh noes!

The first thing I thought when I saw the picture was "oh... that's a bit... grey and dull". Where are the flashing lights?! It's clearly a fake, everyone knows you can't have a supercomputer without flashing lights.

I've got to say I agree... it is a bit dull, and before you all give me the 'you don't say' face I think they could have at least thought about how it looks.
HourBeforeDawn 28th October 2010, 21:40 Quote
Quote:

But can it run Crysis 3?

Can it run Minecraft?

No, it actually can't. Not designed for such operations.
Sketchee 28th October 2010, 23:14 Quote
Bet it still only gets 'fast' on systemrequirementslab :P
borandi 28th October 2010, 23:34 Quote
Quote:
Originally Posted by Gradius
14,366 Intel processors
7,168 Nvidia Fermi-based Tesla M2050 graphics processing units

14366 / 2 = 7183 vs 7168 ?! That's OVER 1 (GPU) : 2 (CPU)

You have head nodes and dummy nodes that don't have GPUs in.

Quote:
Originally Posted by HourBeforeDawn
4.04 megawatts, huh? So let's see, since it's Fermi involved here I bet 0.04 megawatts is CPUs and other hardware and the other 4 megawatts is the nVidia GPUs =p

So what, 14k 130W CPUs and 7k 230W GPUs and you came to that conclusion? Please don't ever do my taxes.
HourBeforeDawn 28th October 2010, 23:56 Quote
Quote:
Originally Posted by borandi
Quote:
Originally Posted by HourBeforeDawn
4.04 megawatts, huh? So let's see, since it's Fermi involved here I bet 0.04 megawatts is CPUs and other hardware and the other 4 megawatts is the nVidia GPUs =p

So what, 14k 130W CPUs and 7k 230W GPUs and you came to that conclusion? Please don't ever do my taxes.

Wow borandi, you really need to develop a personality and a sense of humor, it was a joke; you know how terrible Fermi is at power consumption lol. Man, get a life, you will be happier than being so serious all the time. You know the =P typically indicates sticking out a tongue, which tends to also mean not a serious remark but one of humor; I gather you're not much of a people person, huh. ~_~
dactone 29th October 2010, 00:08 Quote
but can it cook sausages?
Technobod 29th October 2010, 00:12 Quote
Quote:
Originally Posted by dactone
but can it cook sausages?

As opposed to instantly cremating them? Doubt it; in fact, you could use it to calculate how long they would last near all those GPUs...
Sloth 29th October 2010, 00:25 Quote
Quote:
Originally Posted by HourBeforeDawn
Wow borandi, you really need to develop a personality and a sense of humor, it was a joke; you know how terrible Fermi is at power consumption lol. Man, get a life, you will be happier than being so serious all the time. You know the =P typically indicates sticking out a tongue, which tends to also mean not a serious remark but one of humor; I gather you're not much of a people person, huh. ~_~
You might want to consider your own words. You made a joke, he didn't see it was a joke (as is a common mistake on the internet) and applied readily available numbers to create a more reasonable model of power consumption. How is that justification to judge a person and tell them to get a life?

On topic, I wonder what the cost on that beast was. I also can't help but wonder if this is a victory for GPU computing or really just a victory for parallel computing. My money's on the latter.
thehippoz 29th October 2010, 00:38 Quote
Victory for Huang.. they are all in cahoots though.. bet they all sit around a table and vconference with AMD/ATI on where they are going to milk next.. with the anti-monopoly laws in place they need each other

On the supercomputer - I wonder what kind of failure rate they have compared to the Intel chips

Cray went with AMD CPUs too.. I'd be interested in the reliability of that setup they have in China - I'm sure it works but what kind of issues come up with replacement
HourBeforeDawn 29th October 2010, 00:40 Quote
Quote:
Originally Posted by Sloth
You might want to consider your own words. You made a joke, he didn't see it was a joke (as is a common mistake on the internet) and applied readily available numbers to create a more reasonable model of power consumption. How is that justification to judge a person and tell them to get a life?

On topic, I wonder what the cost on that beast was. I also can't help but wonder if this is a victory for GPU computing or really just a victory for parallel computing. My money's on the latter.

Nope, I stand by my post; go look through his other 75 posts, the majority take an attack stance in his messaging and he is always overly critical towards the respondent, which typically relates to antisocial behavior, i.e. not a people person.
robots 29th October 2010, 01:21 Quote
HourBeforeDawn 29th October 2010, 01:36 Quote
Posting pictures of your husband, I gather? Seriously though, if people are that anal about humor it's no wonder the world is screwed up like it is. ~_~
robots 29th October 2010, 01:40 Quote
duh I was backing you up :| And I'm not female btw, that pic is just some lady a moderator put in there to troll me, which I now use to help me find my posts easier. I've grown to like her smiling homely face, whoever she is.
HourBeforeDawn 29th October 2010, 01:43 Quote
Quote:
Originally Posted by robots
duh I was backing you up :| And I'm not female btw, that pic is just some lady a moderator put in there to troll me, which I now use to help me find my posts easier.

Ah, but how are you backing me up when it seems like it's coming across as you don't like technology jokes? But regardless, my bad on the gender assumption.

Edit: Oh, I think I get it, like it's a representation of borandi; ya, sarcasm doesn't always translate well on the internet lol.
Krayzie_B.o.n.e. 29th October 2010, 03:14 Quote
The Tianhe-1A, world's fastest computer!!!
Funny thing though, I don't see any overclocking benchmarks.

And will this piece of Chinese crap be sold at Wal-Mart also?
robots 29th October 2010, 05:24 Quote
Heh it looks like the thing itself is the size of a walmart. It's a pity the fastest computer in the world is going to get used for developing 'defense' equipment :/ Predictable, but lame.
HourBeforeDawn 29th October 2010, 05:36 Quote
Quote:
Originally Posted by robots
Heh it looks like the thing itself is the size of a walmart. It's a pity the fastest computer in the world is going to get used for developing 'defense' equipment :/ Predictable, but lame.

Well, in the US most supercomputers are first used to calculate nuclear fallout ~_~

Imagine the good it could do if it was instead put to use on stuff like SETI or Folding@Home related science and research.
Synay 29th October 2010, 09:41 Quote
First of all - when can we expect the review, bit-tech?
Second - when will it be folding for our team? (as said by others)
Third - now I know why Fermis were in such short supply, China bought them all!!! :D
BRAWL 29th October 2010, 10:08 Quote
It can play Crysis... 1.6 Million times (over 9000?)
I just read this in the news actually, but how mad is the power of this machine? Really?
cgthomas 29th October 2010, 10:44 Quote
Cool, when will Scan list it on their website? I'm looking for a pre-built system.
mecblade 29th October 2010, 10:52 Quote
How much can it overclock to?
Does it come with the multiplier unlocked or do you have to use QPI?
Will a Thermaltake Frio be sufficient for cooling?
Does it come with Intel's stock cooler?
Does it require a 1.5kW power supply?

Finally, when will this be in the stores for Christmas?

:D
PingCrosby 29th October 2010, 15:54 Quote
Good lord, my underpants have just exploded.
robots 29th October 2010, 20:42 Quote
Quote:
Originally Posted by HourBeforeDawn
Well, in the US most supercomputers are first used to calculate nuclear fallout ~_~

Imagine the good it could do if it was instead put to use on stuff like SETI or Folding@Home related science and research.
yeah :( Maybe science can get their hand-me-downs.
frontline 29th October 2010, 22:45 Quote
That's impressive kit to play mahjong online.
mclean007 30th October 2010, 12:10 Quote
Quote:
Originally Posted by borandi
You have head nodes and dummy nodes that don't have GPUs in.
I can explain much more easily - if you actually click through to the article it says 14,336 not 14,366, so it's exactly 2 CPU : 1 GPU. Bit-tech typo!
wiak 30th October 2010, 14:39 Quote
Don't Fermi GPUs use a lot more power than Intel CPUs? :O
Publ!c Enemy 30th October 2010, 16:31 Quote
Yes, I believe they use more power than the CPUs.
alecamused 30th October 2010, 21:59 Quote
So one would need 12 megawatts for 50,000 CPUs to achieve the same level of performance. That's ~240W per CPU.

240W * 14,366 (processors) ≈ 3.4 megawatts... that leaves ~600kW for 7,168 GPUs, so ~84W per GPU?

And that proves that - if an idiot like me does the math based on a few numbers from an article - Nvidia GPUs are incredibly power-efficient :D
general22 31st October 2010, 02:14 Quote
Well if you use the actual TDP numbers for the M2050 Tesla unit they are 225W a piece. So 7168 * 225 = ~ 1.6 MW for the GPUs. That leaves about 2.44MW for the CPUs and motherboard logic. If you allocate that power usage to CPUs only then that leaves them with a power usage of 168W each which seems high. I would estimate they are pretty high end 130W TDP parts and the rest of the power usage is taken up by other logic.
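
For anyone who wants to rerun general22's arithmetic, here's a minimal sketch in Python; the 225W Tesla M2050 and 130W CPU TDPs are the assumptions quoted in the comment above, and whatever is left over is lumped into 'other logic' (memory, interconnect, cooling and so on).

```python
# Re-running the power budget from the comment above, using its quoted TDP assumptions.
TOTAL_POWER_W = 4.04e6              # Tianhe-1A's quoted draw
GPU_COUNT, GPU_TDP_W = 7_168, 225   # Tesla M2050 TDP, as quoted in the comment
CPU_COUNT, CPU_TDP_W = 14_366, 130  # assumed high-end 130W TDP parts

gpu_power = GPU_COUNT * GPU_TDP_W                # ~1.61 MW for the GPUs
remaining = TOTAL_POWER_W - gpu_power            # ~2.43 MW for everything else
per_cpu_if_alone = remaining / CPU_COUNT         # ~169 W: too high for a CPU by itself
other_logic = remaining - CPU_COUNT * CPU_TDP_W  # what's left after 130 W per CPU

print(f"GPUs:                         {gpu_power / 1e6:.2f} MW")
print(f"Everything else:              {remaining / 1e6:.2f} MW")
print(f"Per CPU if CPUs took it all:  {per_cpu_if_alone:.0f} W")
print(f"Other logic at 130 W per CPU: {other_logic / 1e6:.2f} MW")
```

On those assumptions roughly 0.56MW is left over for the interconnect, memory and other hardware, which lines up with the comment's conclusion.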
Dr Dark 31st October 2010, 12:53 Quote
Regardless..... it'd be one hell of a case mod.... lol
murraynt 15th November 2010, 14:38 Quote
Wow, only 4.04mw really puts 1kw power supplies into perspective.
Mraedis 15th November 2010, 16:27 Quote
Megawatt, not kW. That's 4,040kW for you.
murraynt 15th November 2010, 16:46 Quote
Quote:
Originally Posted by Mraedis
Megawatt, not kW. That's 4,040kW for you.

Ahh read it wrong :o
thelaw 15th November 2010, 17:27 Quote
Hmm, if China uses it for "evil purposes" will we all boycott Nvidia in future, I wonder?
mixman 14th January 2011, 10:26 Quote
I would like to use it to unlock some SL3 Nokia (SHA-1) phones.... it took me 2 days to unlock one phone with my 5970 card..... this thing may only need two seconds.....