bit-tech.net

AMD teases 12GB 'supercomputing' graphics card

AMD's FirePro S10000 is set to get an upgrade to 12GB of RAM, meeting Nvidia's Quadro K6000 head-on.

AMD has announced what it claims is the world's first 'supercomputing' graphics card, the FirePro S10000 12GB Edition - an upgrade to the 6GB model it announced back in November last year.

Designed for high-performance computing users, the FirePro S10000 sits very firmly at the top of AMD's GPU tree. Based on two 28nm Graphics Core Next GPUs, the original 6GB model boasts a 1.48 teraFLOP peak double precision compute rate and a 5.91 teraFLOP single-precision rate. Available in passive and actively cooled variants, the FirePro S10000 draws 375W of power per card and offers a 384-bit memory bus with 240GB/s of bandwidth per GPU for a total of 480GB/s.
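As a sanity check, the quoted bandwidth figures follow directly from the bus width and memory transfer rate. The sketch below assumes a 5GT/s effective GDDR5 transfer rate (a 1,250MHz memory clock) - that clock is an assumption for illustration, not a figure from AMD's announcement:

```python
# Sanity check on the FirePro S10000's quoted memory bandwidth.
# The 5 GT/s effective rate (1,250 MHz GDDR5 clock) is an assumption,
# not a figure from AMD's announcement.
bus_width_bits = 384
effective_rate_gt_s = 5.0  # GDDR5 transfers 4 bits per pin per clock

per_gpu_gb_s = bus_width_bits / 8 * effective_rate_gt_s  # bytes per transfer x GT/s
total_gb_s = per_gpu_gb_s * 2                            # two GPUs per board

print(per_gpu_gb_s, total_gb_s)  # 240.0 480.0
```

With that assumed clock, the arithmetic lands exactly on the 240GB/s per GPU and 480GB/s per board that AMD quotes.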

Unlike the Tesla cards rival Nvidia aims at the supercomputing market, AMD's FirePro S10000 is a fully-fledged graphics card as well as an accelerated co-processor. Each board includes at least one Mini DisplayPort and DVI output, with the actively-cooled version also offering an additional three Mini DisplayPort connections. As a result, the boards have already started to find a home in extreme high-end workstations as well as headless supercomputing and high-performance computing (HPC) clusters.

The 12GB model changes little of the original design, beyond doubling up on the memory to a whopping 12GB - 6GB per GPU. 'Our compute application customers asked for a solution that offers increased memory to support larger data sets as they create new products and services,' claimed David Cummings, senior director and general manager of AMD's professional graphics division, at the announcement. 'In response, we’re announcing the AMD FirePro S10000 12GB Edition graphics card to meet that additional memory demand with support for OpenCL and high-end compute and graphics technologies.'

As is usual for such an unveiling, AMD had plenty of industry support with figureheads from companies including Dassault Systèmes and CAPS extolling the benefit of the increased memory on the new model. Sadly, pricing was not announced.

AMD's presence in the GPU-accelerated supercomputing market has been overshadowed by rival Nvidia. With a 12GB S10000 board, however, that could change: Nvidia's HPC-targeted equivalent, the Tesla K20X, has just 6GB of GDDR5 memory and lower peak performance of 1.31 teraFLOPS double-precision and 3.95 teraFLOPS single-precision. As a single-GPU board, however, the Tesla K20X enjoys a significantly lower TDP at 235W.

While AMD may be positioning its 'supercomputing' S10000 as a Tesla K20X competitor, its admission that the video-output-enabled board may find a home in the high-end workstation market places it somewhat closer to Nvidia's Quadro K6000. Announced in July, the Quadro K6000 offers very similar specifications to the FirePro S10000: 12GB of GDDR5, a 384-bit memory bus, and claimed single-precision performance of 5.2 teraFLOPS. As with the Tesla K20X, however, it easily beats the S10000 on power draw with a TDP of 225W.

AMD still has a way to go to catch its rival in market share: of the 54 systems in the latest TOP500 supercomputer list to use co-processor acceleration, 39 used Nvidia GPUs against just three with AMD GPUs - leaving the company outnumbered even by Intel, which scored 11 wins with its Pentium-derived Xeon Phi Many Integrated Core (MIC) x86 co-processor cards.

The AMD FirePro S10000 12GB Edition is due to launch some time early in 2014.

29 Comments

edzieba 15th November 2013, 10:54 Quote
For those getting excited about the 'passively cooled' variant: this is only passive in that there are no fans on the card. The box you put it in has to have some pretty hefty forced-air cooling.
Dave Lister 15th November 2013, 11:02 Quote
So could one of these be used for gaming/everyday use? If so, would BT do some benchmarking with both the 6GB and 12GB models please?
Mister_Tad 15th November 2013, 11:14 Quote
Could? Sure.

Would? Not so much.

Being that it's likely to cost £4000+ per card and will perform no better (and likely worse) than a top-end gaming card means that it just doesn't make sense to use one of these exclusively for gaming.
GuilleAcoustic 15th November 2013, 11:15 Quote
Quote:
Originally Posted by Dave Lister
So could one of these be used for gaming/everyday use? If so, would BT do some benchmarking with both the 6GB and 12GB models please?

Gaming on those cards makes no sense. They use the same GPUs as gaming cards. The main differences are:

- ECC memory (not on low-end models)
- More memory (not always)
- Optimised drivers for viewport rendering
- Certified drivers for professional software
- Quad buffering on mid/high-end cards for stereoscopic display (only way to use 3D stereo under Linux)
- 30-bit pixel colour pipeline

Other than that, you won't get any benefit from them for gaming. You'll most likely get lower FPS, as the card runs at lower frequencies (while costing a lot more).
rollo 15th November 2013, 11:27 Quote
It'll be priced similarly to Nvidia's super high-end Tesla cards, which in theory could be used for gaming if you were a bit nuts.

Cost will be £4,000+ for the 6GB and probably closer to £5,500 for the top-end one. Depending on how it performs, it could be a useful investment.
Gareth Halfacree 15th November 2013, 12:24 Quote
Quote:
Originally Posted by rollo
Cost will be £4,000+ for the 6GB and probably closer to £5,500 for the top-end one. Depending on how it performs, it could be a useful investment.
You read the part of the article where it mentions that the S10000 6GB Edition launched last year, right? As in, we don't need to guess how much it costs, 'cos you can buy one right now? Spoiler: it's about £2,500.
Pookie 15th November 2013, 14:06 Quote
BitCoin Miner? I wonder how long it would take to earn its money back!
schmidtbag 15th November 2013, 15:15 Quote
Quote:
Originally Posted by GuilleAcoustic
Gaming on those cards makes no sense. They use the same GPUs as gaming cards. The main differences are:

- ECC memory (not on low-end models)
- More memory (not always)
- Optimised drivers for viewport rendering
- Certified drivers for professional software
- Quad buffering on mid/high-end cards for stereoscopic display (only way to use 3D stereo under Linux)
- 30-bit pixel colour pipeline

Other than that, you won't get any benefit from them for gaming. You'll most likely get lower FPS, as the card runs at lower frequencies (while costing a lot more).

Depending on what the core is based on, there's a possibility you could upload the firmware of its gaming counterpart. I'm not sure if all the RAM would be detected, but then again, I don't think spending $10,000 for gaming is a wise choice. Even if you have 4 monitors at 1080p, the core would likely be the bottleneck before the memory.

@Pookie
I figure this would be terrible for bitcoin. As far as I'm aware BC isn't memory intensive at all. What it wants are lots of high-frequency cores with short pipelines. It wouldn't surprise me if a 2GB GPU had equal performance to this 12GB.
rollo 15th November 2013, 15:46 Quote
Quote:
Originally Posted by Pookie
BitCoin Miner? I wonder how long it would take to earn its money back!

GPU mining in Bitcoin died a long time ago. ASIC miners are taking over, and once more get delivered the difficulty will skyrocket. Even the fastest GPU out there was below 0.1 bitcoins a day last I checked.
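The 'below 0.1 bitcoins a day' claim is easy to ball-park with the standard solo-mining expectation formula: expected blocks per day = hashrate × 86,400 / (difficulty × 2^32), times the block reward. The hashrate and difficulty values below are illustrative assumptions for a fast late-2013 GPU, not numbers from this thread:

```python
def btc_per_day(hashrate_hs, difficulty, block_reward=25.0):
    """Expected BTC/day for a solo miner: the miner's share of the
    network's expected daily blocks, via the standard difficulty formula."""
    expected_blocks = hashrate_hs * 86400 / (difficulty * 2**32)
    return expected_blocks * block_reward

# ~700 MH/s for a fast 2013 GPU and a difficulty around 5e8 are both
# assumptions; with them, a single GPU earns well under 0.1 BTC/day.
print(btc_per_day(700e6, 5e8))
```

Even with generous numbers, the result comes out orders of magnitude below 0.1 BTC/day, which is consistent with GPU mining having been overtaken by ASICs.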
Star*Dagger 15th November 2013, 18:31 Quote
"Even if you have 4 monitors at 1080p" 1080 is very low resolution. Most gamers who are serious have 3 30 inchers.

In any case, I have no interest in this card since it is made for the fine ladies and gentlemen who author content, not those of us who consume it.

Yours in Ultra High Resolution Plasma,
Star*Dagger

P.S. 10,000$ (pounds for those who use boutique currencies) is not a lot to spend on a PC. You can spend that much on monitors, cards and the steel hardware to hold up the screens. Add in a decent PC that handles everything from folding to gaming and you are easily over 10k$
GuilleAcoustic 15th November 2013, 18:37 Quote
didn't know that the resolution is what makes you a 'gamer'. Glad I'm using a 1280x1024 17" 4/3 screen to play, I wouldn't like to be called a gamer
raxonb 15th November 2013, 18:52 Quote
But can it play Crisis?

I've always wanted to say that :p
schmidtbag 15th November 2013, 18:54 Quote
Quote:
Originally Posted by Star*Dagger
P.S. 10,000$ (pounds for those who use boutique currencies) is not a lot to spend on a PC. You can spend that much on monitors, cards and the steel hardware to hold up the screens. Add in a decent PC that handles everything from folding to gaming and you are easily over 10k$

That comment right there completely voided your point on "1080p is a low resolution", because anyone who thinks 10k dollars/pounds is "not a lot to spend" on a PC (key word here is PC, not workstation) has serious priority problems and is way too rich for their own good. Even on websites where people spend months working on custom PCs, they don't spend much more than $2500 on the actual hardware. The fact that you think $10k will get you a decent PC means you either live in Brazil or you must be doing something very wrong. Setting aside Nvidia Titans and monitors, if I were given $10k to build a gaming+folding PC, I don't see how I could even breach that price point. I hope you realize there is such a thing as too much RAM, too many drives in RAID, too many GPUs, etc. Performance can and will drop once you get to a certain point, and you also waste an immense amount of power when idle.

Besides putting wealth into perspective, 1080p is, quite literally, high definition. In fact it's officially defined as FHD, as in full high definition. Anything higher than 1920x1200 is not "high" definition any more, but QHD or UHD.

Good job looking ridiculously pretentious.
Pete J 15th November 2013, 20:07 Quote
Whilst you'd be out of your mind to buy a card like this just for gaming, I remember reading a review on Tom's Hardware that showed AMD's top-end workstation-class GPUs ran games just as well as their gaming-orientated GPUs. On the other hand, Nvidia's top-end workstation GPUs have noticeably worse performance. Basically, if you're a super-serious (and well-off) CAD user who likes to play games as well, this card is perfect for you.

Oh, and always ignore Star*Dagger. He/she comes along once in a blue moon to make some asinine comment, then wanders off.
XXAOSICXX 16th November 2013, 08:47 Quote
Quote:
Originally Posted by raxonb
But can it play Crisis?

I've always wanted to say that :p

*coughs* Crysis *coughs*
Star*Dagger 16th November 2013, 21:23 Quote
Quote:
Originally Posted by GuilleAcoustic
didn't know that the resolution is what makes you a 'gamer'. Glad I'm using a 1280x1024 17" 4/3 screen to play, I wouldn't like to be called a gamer

4:3 is a horrible thing to do to any game. 17 inches is a good size for a tablet. And I am not sure that most games will ever work properly at ULD (Ultra Low Def) of 1280x1024.
If you can game in the top 10% with that rig you deserve a medal. Good job.
Quote:
Originally Posted by schmidtbag
That comment right there completely voided your point on "1080p is a low resolution", because anyone who thinks 10k dollars/pounds is "not a lot to spend" on a PC (key word here is PC, not workstation) has serious priority problems and is way too rich for their own good. Even on websites where people spend months working on custom PCs, they don't spend much more than $2500 on the actual hardware. The fact that you think $10k will get you a decent PC means you either live in Brazil or you must be doing something very wrong. Setting aside Nvidia Titans and monitors, if I were given $10k to build a gaming+folding PC, I don't see how I could even breach that price point. I hope you realize there is such a thing as too much RAM, too many drives in RAID, too many GPUs, etc. Performance can and will drop once you get to a certain point, and you also waste an immense amount of power when idle.

Besides putting wealth into perspective, 1080p is, quite literally, high definition. In fact it's officially defined as FHD, as in full high definition. Anything higher than 1920x1200 is not "high" definition any more, but QHD or UHD.

Good job looking ridiculously pretentious.

1920x1080 is classified by home entertainment clown companies as HD. Does that mean I will limit my Gaming Glory to such low resolution? Negative!

3 monitors at 1500$ is 4500$. 3 cards at 750$ each is 2250$ - that's 6750$ already. A decent desk is 500$, and a steel rack to hold the 3 screens in place, installed, is another 1000$. We are up to 8250$ now, and haven't even bought the case or the hardware in it.:(

While you can get a decent rig for 2500$ your video subsystems (which are your primary window into the Gaming Realms) will suffer severely.

Enjoy the show!

Yours in Elite Corrective Plasma,
Star*Dagger
GuilleAcoustic 16th November 2013, 21:52 Quote
Quote:
Originally Posted by Star*Dagger
4:3 is a horrible thing to do to any game. 17 inches is a good size for a tablet. And I am not sure that most games will ever work properly at ULD (Ultra Low Def) of 1280x1024.
If you can game in the top 10% with that rig you deserve a medal. Good job.



1920x1080 is classified by home entertainment clown companies as HD. Does that mean I will limit my Gaming Glory to such low resolution? Negative!

3 monitors at 1500$ is 4500$. 3 cards at 750$ each is 2250$ - that's 6750$ already. A decent desk is 500$, and a steel rack to hold the 3 screens in place, installed, is another 1000$. We are up to 8250$ now, and haven't even bought the case or the hardware in it.:(

While you can get a decent rig for 2500$ your video subsystems (which are your primary window into the Gaming Realms) will suffer severely.

Enjoy the show!

Yours in Elite Corrective Plasma,
Star*Dagger

Does your e-Peen feel good? Maybe you're trying to compensate for your lack of skill or retarded brain by growing your arrogance. I really pity you ...

EDIT: I really hope you're just trolling. If not ... you're the stupidest person I've ever met.
Gareth Halfacree 16th November 2013, 21:54 Quote
If it brings anything to the debate, my MONSTER GAMING RIG cost £500, plus another £100-ish for the single 1920x1200 monitor. Which has a broken power button now. It works, but sometimes spins off-axis, then you have to rotate it until the power symbol is the right way up before it'll actually switch it on or off. FEAR MY AWESOME HARDWARE!
GuilleAcoustic 16th November 2013, 22:03 Quote
Quote:
Originally Posted by Gareth Halfacree
If it brings anything to the debate, my MONSTER GAMING RIG cost £500, plus another £100-ish for the single 1920x1200 monitor. Which has a broken power button now. It works, but sometimes spins off-axis, then you have to rotate it until the power symbol is the right way up before it'll actually switch it on or off. FEAR MY AWESOME HARDWARE!

lol Gareth ... for a second I imagined your screen with a car ignition start lock xD (insert the key and turn it to power on the screen)
bawjaws 16th November 2013, 22:20 Quote
Don't worry, Spam*Dagger is basically a professional troll in his (thankfully) brief appearances here :D
Andy Mc 17th November 2013, 21:20 Quote
Quote:
Originally Posted by Pookie
BitCoin Miner? I wonder how long it would take to earn its money back!

As previously said above: nopes.

A scrypt crypto currency however... That would be interesting to see this card compute...
jb0 18th November 2013, 13:54 Quote
Quote:
Originally Posted by GuilleAcoustic
Quote:
Originally Posted by Gareth Halfacree
If it brings anything to the debate, my MONSTER GAMING RIG cost £500, plus another £100-ish for the single 1920x1200 monitor. Which has a broken power button now. It works, but sometimes spins off-axis, then you have to rotate it until the power symbol is the right way up before it'll actually switch it on or off. FEAR MY AWESOME HARDWARE!

lol Gareth ... for a second I imagined your screen with a car ignition start lock xD (insert the key and turn it to power on the screen)

And now I have a case mod idea!
Gareth Halfacree 18th November 2013, 14:25 Quote
Quote:
Originally Posted by GuilleAcoustic
lol Gareth ... for a second I imagined your screen with a car ignition start lock xD (insert the key and turn it to power on the screen)
I've actually got one of those. No, really: my Research Machines Ltd. 380Z. The 'ignition' is a key-based power switch, located in the bottom-right - as shown on my mate Andy's Flickr feed. Keys on computers were actually really common, although they were more often used as lock-outs to disable the keyboard - or, prior to that, the front panel - than as power switches.
GuilleAcoustic 18th November 2013, 17:02 Quote
Quote:
Originally Posted by Gareth Halfacree
I've actually got one of those. No, really: my Research Machines Ltd. 380Z. The 'ignition' is a key-based power switch, located in the bottom-right - as shown on my mate Andy's Flickr feed. Keys on computers were actually really common, although they were more often used as lock-outs to disable the keyboard - or, prior to that, the front panel - than as power switches.

That is one gorgeous machine! My missus would kill me if I had that kind of stuff at home. I'm more and more tempted to build an ARM-based daily-use computer. x86 architecture is really boring. I have on my short list:
  • Odroid-XU
  • Adapteva Parallella
  • Udoo
  • Wandboard
schmidtbag 18th November 2013, 17:18 Quote
Quote:
Originally Posted by GuilleAcoustic
That is one gorgeous machine! My missus would kill me if I had that kind of stuff at home. I'm more and more tempted to build an ARM-based daily-use computer. x86 architecture is really boring. I have on my short list:
  • Odroid-XU
  • Adapteva Parallella
  • Udoo

I personally own the Odroid-U2, and while it's an amazing device that performs superbly, you might not want to use it (or the XU) as a daily PC. Android makes a terrible PC OS, and I'm not sure if Windows RT works on the Odroid products. So that leaves Linux (which is fine - Linux is, IMO, the best OS for ARM), but the issue is GPU drivers - they're a pain to set up, and even if you get them to work, you're restricted in which desktop environments will actually take advantage of them. So far (to my knowledge), you're limited to Unity, GNOME 3, and KDE 4. Personally, I haven't tried any of those on my Odroid-U2, so I'm not sure what it takes to get them to work. The XU in particular won't really be worth getting unless you can take advantage of the big.LITTLE architecture, which most PC users can't.

But, I would love to be proven wrong. I haven't messed with big.LITTLE or OMAP5 (which is a similar idea to big.LITTLE) and I haven't bothered with the U2's GPU drivers in nearly a year. Depending on how much you're willing to spend, I think the Arndale would make a great ARM PC. I think it will still have the GPU problem but it's a lot more practical as a PC overall.
GuilleAcoustic 18th November 2013, 17:51 Quote
I've read about the GPU issue. The idea is to use it as a Linux dev platform, as well as for general-purpose use.

The GPU issue is that Linux mostly uses OpenGL, while those GPUs are OpenGL ES based. Maybe a Tegra 4 dev board would be better.

The Linux videos I saw about the XU look promising, but there's no info regarding the headache of getting it working (except restless nights).
raxonb 18th November 2013, 17:53 Quote
Quote:
Originally Posted by XXAOSICXX
Quote:
Originally Posted by raxonb
But can it play Crisis?

I've always wanted to say that :p

*coughs* Crysis *coughs*

After years of waiting I still get it wrong!
schmidtbag 18th November 2013, 17:57 Quote
Quote:
Originally Posted by GuilleAcoustic
I've read about the GPU issue. The idea is to use it as a Linux dev platform, as well as for general-purpose use.

The GPU issue is that Linux mostly uses OpenGL, while those GPUs are OpenGL ES based. Maybe a Tegra 4 dev board would be better.

The Linux videos I saw about the XU look promising, but there's no info regarding the headache of getting it working (except restless nights).

You're right, but Unity (with Compiz), GNOME 3, and KDE all support GLES, though it isn't as simple and straightforward as just installing something like KDE and hoping it will magically know when to use GLES - you have to tell it to. Tegra 4 would probably be the easiest solution; besides, I get the impression Tegra 4 isn't nearly as power-efficient, so it makes for a good PC plugged into a wall. Also, any Freescale-based chip should have good Linux GPU support.

I'm sure the XU works fine with Linux; the problem is utilizing the A7 cores.
GuilleAcoustic 18th November 2013, 17:59 Quote
Quote:
Originally Posted by raxonb
After years of waiting I still get it wrong!

Don't worry, I often mispell it too :D

@ schmidtbag: the i.MX6 quad core seems to have well-supported GPUs.