bit-tech.net

Nvidia Tesla scores IBM GPGPU server win

IBM's iDataPlex servers use Nvidia's latest Tesla hardware, based on its Fermi GF100 core.

It's been a rough few months for Nvidia, with Fermi facing stiff competition at launch and staunch allies such as BFG dropping out of the graphics card market.

There is some good news for the green team, though: Nvidia recently scored a design win with IBM, whose iDataPlex servers will use Nvidia Tesla 2050/2070 GPU computing modules based on its latest Fermi GF100 core.

HPC, workstation and server products typically carry higher margins than consumer products, enabling companies to make more money - something AMD and Intel have benefitted from for years with their Opteron and Xeon CPUs. Prior to the launch of Fermi, Nvidia talked about its desire to focus on GPGPU, and getting IBM on board is a big sign that Nvidia's strategy might pay off. IBM claims it can fit "38,000 processing cores in a single rack", with each 2U server containing two Tesla cards and two Intel Xeon 5600 CPUs.
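
That 38,000 figure stands up to a quick back-of-the-envelope check. The sketch below is our own rough arithmetic rather than IBM's, and it assumes an iDataPlex rack holding 42 of the 2U dx360 M3 nodes, 448 CUDA cores per Fermi-based Tesla module and six cores per Xeon 5600 - none of those numbers is spelled out in IBM's claim.

// Rough sanity check of IBM's "38,000 processing cores in a single rack" claim.
// The per-rack, per-Tesla and per-Xeon core counts below are assumptions,
// not figures taken from the article.
#include <stdio.h>

int main(void)
{
    const int servers_per_rack     = 42;   // assumed: 2U dx360 M3 nodes per iDataPlex rack
    const int teslas_per_server    = 2;    // from the article
    const int xeons_per_server     = 2;    // from the article
    const int cuda_cores_per_tesla = 448;  // assumed: Fermi-based Tesla 20-series module
    const int cpu_cores_per_xeon   = 6;    // assumed: six-core Xeon 5600

    const int cores_per_server = teslas_per_server * cuda_cores_per_tesla
                               + xeons_per_server  * cpu_cores_per_xeon;    // 908

    printf("cores per rack: %d\n", servers_per_rack * cores_per_server);    // 38136
    return 0;
}

Under those assumptions the total comes to 38,136 - close enough to IBM's round number, although it lumps CUDA cores and CPU cores together.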

What's interesting is that despite the adoption of Tesla by IBM - and the fact that it has promo videos on its site (which you can see below) - the official specs page for its dx360 M3 server makes no mention of Nvidia GPGPUs until you get right down into the features and benefits tab, where it's listed as an additional "solution integration" rather than a main selling point. The point is further reinforced in IBM's own video demo.

Do you think Nvidia will make it big with Tesla? Will its focus shift away from PC gaming, or will it always support it? Let us know your thoughts in the forums.

13 Comments

somewhereoveryonda 20th May 2010, 14:48 Quote
I'm guessing there will be a lot of money in this for nvidia.. thus more money for them to pump into R+D dept... so better graphics for us :D It's like F1 and the space industry.. the tech is developed there and fed into the consumer market.
FelixTech 20th May 2010, 15:05 Quote
Quote:
and getting IBM on board is a big sign that Nvidia's strategy might pay off. IBM claims


Fermi had to be good for something right? :P
Lizard 20th May 2010, 15:36 Quote
Yikes - just don't run them 24x7 and then act surprised when they produce a load of memory errors and start dying after a few months.
rickysio 20th May 2010, 16:44 Quote
nVidia has to earn money somehow, after all.
Domestic_ginger 20th May 2010, 16:50 Quote
I thought the chubby bloke was going to get vapourised by 100 fermis upon opening the oven door.

Fermi was designed for GPGPU, I hope they at least do this right.
rickysio 20th May 2010, 16:54 Quote
Sauna room is good for slimming down.
Floyd 20th May 2010, 17:21 Quote
Are those new 470s and 480s really that hot? I thought my 275s were hot as ****!
borandi 20th May 2010, 18:05 Quote
Quote:
Originally Posted by Lizard
Yikes - just don't run them 24x7 and then act surprised when they produce a load of memory errors and start dying after a few months.

For HPC and servers, they are usually run in air-conditioned (think 5-10°C ambient) rooms.

Downside is the price. I want 3 or 4 in a workstation for some GPU stuff I'm doing, but it's gonna require 10k roughly.
DbD 20th May 2010, 18:10 Quote
Quote:
Originally Posted by rickysio
nVidia has to earn money somehow, after all.

Nvidia always seem to be making money - even when geeky forums are certain of their doom the figures generally still show them making a profit.

This is obviously promising, although it's pointless putting GPUs in your data centre if none of the applications actually use them. Don't know how long that will take - a few years I expect?
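
For anyone wondering what 'actually use them' involves in practice, here's a minimal, hypothetical CUDA sketch (not from IBM or Nvidia) of the kind of kernel an application's hot loops have to be rewritten around before a Tesla card in one of these racks does any useful work.

// Minimal CUDA sketch: add two vectors on the GPU.
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

__global__ void vec_add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side input data
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    // Device buffers and host-to-device copies
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    const int threads = 256;
    const int blocks  = (n + threads - 1) / threads;
    vec_add<<<blocks, threads>>>(d_a, d_b, d_c, n);

    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]);  // expect 3.0

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}

Porting real HPC codes means finding loops like this that can be split across hundreds of threads, which is why the software side tends to lag the hardware by a while, as DbD suggests.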
SchizoFrog 21st May 2010, 02:15 Quote
I guess this is the new topic for the next 5 years.... Temps. It used to be all about clock speeds but now all the comments are about temps. Temp is not everything. We have got used to this idea that everything 'needs' to be running at minimal temps just to perform and this is just not true. None of us need to use 3rd party coolers on our CPUs unless you are overclocking or 'that' worried about a little noise, but we all rush out to buy the latest coolers... and then replace those with the latest colour of the same thing!

Fermi runs hot, this is a matter of fact and design and the high temps they do run at do not hinder performance of the GPUs and the heat generated does not really cause any issues with any other components either... so get over it.

So let's focus on actual performance here and not silly numbers of clock speed, temp, noise levels or any other insignificant data... these cards work... and they kick arse while doing it.

Something else I would just like to add... all this recent hype about Eyefinity is a waste of space as it will never take off the way ATi hopes. But isn't it funny how the movie industry is all about 3D technology once again and nVidia has been advancing 'that' tech for the last couple of years now.
V3ctor 21st May 2010, 08:28 Quote
Quote:
Originally Posted by SchizoFrog

Fermi runs hot, this is a matter of fact and design and the high temps they do run at do not hinder performance of the GPUs and the heat generated does not really cause any issues with any other components either... so get over it.

If the components run very hot, there's a higher probability that the hardware fails. (simple)
Quote:
Originally Posted by SchizoFrog

So let's focus on actual performance here and not silly numbers of clock speed, temp, noise levels or any other insignificant data... these cards work... and they kick arse while doing it.

I could look at performance, but I don't like to have a heater in my PC, nor a jet-blowing fan... and I especially don't want to spend loads of money just on power consumption. Efficiency is the word right now, and comparing the GTX480 and the HD5870, I would choose the HD5870 any day: it's quieter, consumes much less power and doesn't have high temps (like the ones I had with my HD4870), and the difference is only a few fps... AND the GTX480 is more expensive!!
Quote:
Originally Posted by SchizoFrog

Something else I would just like to add... all this recent hype about Eyefinity is a waste of space as it will never take off the way ATi hopes. But isn't it funny how the movie industry is all about 3D technology once again and nVidia has been advancing 'that' tech for the last couple of years now.

I agree with you, Eyefinity is really too expensive for the gamer, only the ultra-high end will buy one of those Samsung 6-panel thingies...
But it's great for work, having a card that can output to up to 6 monitors with different info - that's a lot of money that companies save...
Domestic_ginger 21st May 2010, 10:28 Quote
Quote:
Originally Posted by SchizoFrog
I guess this is the new topic for the next 5 years.... Temps. It used to be all about clock speeds but now all the comments are about temps. Temp is not everything.

We've hit the 300W PCIe ceiling; once that goes it will get interesting. I think Fermi solo (and the dual cards in the past) has managed to max out the draw.

Now with that out of the way it's going to be about performance. Thankfully Fermi got panned for the simple fact that it was too hot and not fast enough to justify itself, so hopefully it will not be the prototype for cards in the future.
Evildead666 21st May 2010, 12:05 Quote
So IBM got the Tesla chips that Oak Ridge didn't want?

I bet IBM didn't pay full whack for them.....