bit-tech.net

AMD Radeon HD 6990 Pictured

The pictures, snapped by 4Gamer.net, show a serious beast of a dual-GPU design from AMD.

AMD's latest flagship graphics card reference design, the dual-GPU Radeon HD 6990, has been pictured on display at the company's Asia Pacific technology event in Singapore.

Demonstrated at the event by Matt Skynner, the head of AMD's graphics business, the card looks like a beast, comprising a pair of GPUs and a whopping 4GB of GDDR5 memory.

AMD hasn't officially confirmed the specs of the pair of GPUs yet, but we presume they'll be based on the Cayman architecture used in AMD's Radeon HD 6950 and 6970 GPUs. As such, the card is likely to have a total of either 3,072 stream processors (two full 6970-class GPUs with 1,536 each) or 2,816 (two 6950-class GPUs with 1,408 each).

Pictures of the card snapped at the event by 4Gamer.net reveal a dual-slot design featuring a redesigned cooling system with a centrally-located fan and vents at the front and back, while the backplate has a single DVI port and four mini DisplayPort connectors for video output.

Pricing information for the card was not discussed at the event, although Skynner reportedly claimed that it would be officially launched some time in the first quarter of this year. The card demonstrated by Skynner was described as an engineering sample, although with the launch date so close it's unlikely that the final released product will vary much from what was on show at the event.


As a result, the final version of the card is almost guaranteed to include the CrossFireX connector of the engineering sample, although it's not yet clear whether the card - which, thanks to its dual-GPU design, draws significantly more power than the company's other reference boards - will retain the engineering sample's single 6-pin plus single 8-pin power connectors, or swap them for a pair of 8-pin connectors.
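For context, the connector choice sets the card's official power ceiling under the PCI Express specification: the x16 slot itself supplies up to 75W, a 6-pin connector up to 75W and an 8-pin connector up to 150W. A minimal sketch of that arithmetic (spec ceilings only, not the HD 6990's actual power draw, which AMD has not disclosed):

```python
# Rough PCI Express power-budget arithmetic for the two rumoured
# connector configurations. These are the official spec ceilings,
# not measured board power.
PCIE_SLOT_W = 75    # power available from the x16 slot
SIX_PIN_W = 75      # per 6-pin PCIe connector
EIGHT_PIN_W = 150   # per 8-pin PCIe connector

def power_ceiling(six_pins: int, eight_pins: int) -> int:
    """Total power the card may officially draw with the given connectors."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(power_ceiling(1, 1))  # 6-pin + 8-pin, as on the engineering sample: 300W
print(power_ceiling(0, 2))  # dual 8-pin: 375W
```

In other words, swapping to a pair of 8-pin connectors would buy the card an extra 75W of official headroom.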

Is AMD's latest flagship design going to be a sure-fire winner, or will the company have to price it carefully to compete with Nvidia's latest GPUs? Share your thoughts over in the forums.

54 Comments

Flanananagan 26th January 2011, 16:02 Quote
2011: The year when single Graphics Cards are the same size as popular games consoles from a decade earlier.
GravitySmacked 26th January 2011, 16:03 Quote
Gargantuan one might say.
Repo 26th January 2011, 16:09 Quote
I think I might have a spot of bother getting this into my Shuttle XPC case....
Bloody_Pete 26th January 2011, 16:11 Quote
As has been pointed out in the forums, a site stated that 'it's the same length as a piece of A4 paper', meaning it's about 300mm, or roughly the same length as the 5970.

Going by the length of the PCI-E connector I'd guess at 13".
megadriveguy 26th January 2011, 16:16 Quote
"Pricing information for the card was not discussed at the event" - well, it will be expensive, that's for sure.
the_kille4 26th January 2011, 16:24 Quote
But didn't the article at 4chan say that the power would be through two 8pins or another configuration with an added six pin for oc'ed cards? Instead of having just one six pin and one eight pin?

But that would also mean that the power draw of this card would be quite big with a size just over 300 km...
Altron 26th January 2011, 16:32 Quote
Interesting that they moved the GPUs to the edges. I guess the vrm circuitry is all in the middle, versus the prior designs that had a GPU near the back, a GPU in the middle, and the vrms for both at the front end.
Fingers66 26th January 2011, 16:32 Quote
Quote:
Originally Posted by the_kille4
But didn't the article at 4chan say that the power would be through two 8pins or another configuration with an added six pin for oc'ed cards? Instead of having just one six pin and one eight pin?

But that would also mean that the power draw of this card would be quite big with a size just over 300 km...

As the article says, it is not clear yet what power connection configuration it will be.

Also, a 300km card is rather a large card...:D
Zurechial 26th January 2011, 16:35 Quote
4GB of RAM on a graphics card.. I can't help but wonder what 32-bit XP would make of that.
I don't think I want to know.
Showerhead 26th January 2011, 16:35 Quote
What kind of resolution do you have to be running to notice a difference from 4GB of vRAM?
Tulatin 26th January 2011, 16:41 Quote
Based on the way this card vents, we may need to change the way we have airflow in cases. Maybe intake on the sides, and exhaust at the front, rear, and top?
Sketchee 26th January 2011, 16:59 Quote
I really fail to see the point in this card tbh.

I'm probably just being an ignorant twonk but where is the advantage of this compared to sli / xfire which would be considerably cheaper?

By advantage i mean an area where it will outperform a multi-card setup so drastically that it will be worth the undoubtedly astronomical pricetag
wuyanxu 26th January 2011, 17:10 Quote
i think the point of this card is so that low end P55/P67 motherboard owners can use crossfire without being limited by their motherboard.
earlydoors 26th January 2011, 17:12 Quote
4GB of RAM = pricey
Snips 26th January 2011, 17:22 Quote
I really don't know why, but I really do like these larger GPU cards. I don't own one myself and never have, as my three previous cards have been pretty normal and probably half the size of this one, so I've no personal experience of them and don't know whether one would even fit. Regardless of whether it's AMD or Nvidia, I do think they are more aesthetically pleasing to look at. They may be impractical, but they do look cool to me.
Cei 26th January 2011, 17:23 Quote
Quote:
Originally Posted by wuyanxu
i think the point of this card is so that low end P55/P67 motherboard owners can use crossfire without being limited by their motherboard.

Why on Earth would you have a low end board and then spend the serious amount of cash for the GPU? Surely people who would buy this card are rocking high-end setups anyway, and just want to have the ability to CrossFire a pair of pre-Crossfired cards for 4 GPUs...
wuyanxu 26th January 2011, 17:28 Quote
Quote:
Originally Posted by Cei
Why on Earth would you have a low end board and then spend the serious amount of cash for the GPU? Surely people who would buy this card are rocking high-end setups anyway, and just want to have the ability to CrossFire a pair of pre-Crossfired cards for 4 GPUs...
i5 2500k + a £100 motherboard + this card = some very good gaming performance.

a lower end motherboard doesn't mean less performance, it just means fewer of the features that you usually don't need anyway. so why spend more on features you don't ever need?
SlowMotionSuicide 26th January 2011, 17:51 Quote
Quote:
Originally Posted by wuyanxu
a lower end motherboard doesn't mean less performance, it just means fewer of the features that you usually don't need anyway. so why spend more on features you don't ever need?

For bragging rights? You underestimate the power of e-peen.
Skiddywinks 26th January 2011, 18:22 Quote
Quote:
Originally Posted by earlydoors
4GB of RAM = pricey

I dunno. The 1GB 6950 is only ~$20 cheaper than the 2GB version (and that is based on RRP, you can actually get 2GB versions that are only $10 more expensive). I think the two GPUs and the sheer amount of stuff (the amount of copper, components etc) on the PCB is going to be the real cost.
play_boy_2000 26th January 2011, 18:47 Quote
The only glaring problem I see with this is the single DVI port. Displayport still hasn't gone mainstream and as such you either need to hunt around for a supporting monitor, or shell out $20 or more for an adapter.
Nexxo 26th January 2011, 18:51 Quote
OMG IT'S HIOOJ!!! KILL IT! KILL IT!!! KILL IT BEFORE IT EATS US ALL!!!
Madness_3d 26th January 2011, 19:11 Quote
4GB of RAM = 2GB per GPU, mirrored in CF, so no different to a 6950 or 6970 :-/ Don't get what the fuss is about.
RichCreedy 26th January 2011, 19:16 Quote
Also the only way to get crossfire on a mini-ITX board.

Yes I know, it's a big card for a mini-ITX setup, and you would need to find a case to match, along with a pokey power supply, also not ideal for mini-ITX, but hey, that's what modding is about, isn't it?
j_jay4 26th January 2011, 19:26 Quote
The point of this card is so that AMD have something to compete against the 580. The red fanboys will be lapping it up.
Kovoet 26th January 2011, 19:44 Quote
it will be mine oh yes it will be mine
d3m0n_edge 26th January 2011, 20:00 Quote
Why price it strategically in competition to nVidia? If this card is what it may appear to be (for now), then you simply get what you pay for. But that's just me being impulsive.
greigaitken 26th January 2011, 20:27 Quote
I guess this will be £500 or £550.
Even if my mb couldn't crossfire/SLI, I'd buy a new one and 2 x 560 Tis, which will be faster, cooler, quieter and cheaper.
I'm sure there is some sort of single-slot niche for this card, but surely gamers get bigger e-peen if they have 2 graphics cards.
Instagib 26th January 2011, 20:54 Quote
Given that the 5970 was based on two 5850's, will this one be based on two 6950's? And if so will each gpu unlock?
Altron 26th January 2011, 21:25 Quote
Quote:
Originally Posted by Instagib
Given that the 5970 was based on two 5850's, will this one be based on two 6950's? And if so will each gpu unlock?

No... the 5970 is two 5870s at lower clocks. Both of the GPUs in the 5970 had all 1,600 shaders of the 5870, not the 1,440 shaders of the 5850. They were just underclocked compared to the 5870.
Quote:
Originally Posted by Zurechial
4GB of RAM on a graphics card.. I can't help but wonder what 32-bit XP would make of that.
I don't think I want to know.

It shouldn't make anything of it. The GPU is addressing it, not the CPU.
Quote:
Originally Posted by Showerhead
What kind of resolution do you have to be running to notice a difference from 4GB of vRAM?

Both AMD and Nvidia duplicate the VRAM across GPUs in CrossFire and SLI, so each GPU only has 2GB, the same as a reference 6950 or 6970. It might be a little high at the moment, but Eyefinity triple monitor setups are being bottlenecked by 1GB per GPU. They might have been fine with 1.25GB or 1.5GB like some Nvidia chips, but why not just do an even 2GB?
Zurechial 26th January 2011, 21:44 Quote
Quote:
Originally Posted by Altron

It shouldn't make anything of it. The GPU is addressing it, not the CPU.

As you and others have said, the 4GB of RAM on the 6990 is a 2x2GB pair, and so only exposes 2GB of addressable memory to the OS, if that's what you mean by the GPU addressing it.
XP 32-bit can address ~4GB of RAM in total, including GPU RAM, which would leave roughly 1.75GB of usable physical memory judging by how XP typically addresses memory; that was my point.

It's purely an academic thing, as nobody in their right mind would pair a 6990 with XP, but I'm curious to see just what would happen if XP was forced to deal with a graphics card that actually exposed 4GB of addressable RAM to the OS, or multiple cards that do so collectively; on top of the system RAM, since it usually addresses the graphics memory before the system, thus squeezing the system RAM into the remaining addressable range.

If it were left with no address space for the system RAM would it refuse to boot, or is it smart enough to reserve enough system memory for itself before addressing the graphics memory?
The notion reminds me of how 98SE behaves on systems with more than 512MB of system RAM.
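To put rough numbers on that scenario, here's a minimal back-of-the-envelope sketch, assuming the card exposed a full 2GB aperture and taking the other chipset/PCI reservations as an illustrative 256MB (both figures are assumptions, not measurements):

```python
# Back-of-the-envelope 32-bit address-space arithmetic. A 32-bit OS such as
# XP has a 4GB physical address space in total; memory-mapped devices
# (including the graphics aperture) are carved out of it before system RAM.
GiB = 1024 ** 3
MiB = 1024 ** 2

total_address_space = 4 * GiB
gpu_aperture = 2 * GiB        # assumed: card exposes a full 2GB aperture
other_mmio = 256 * MiB        # assumed: chipset, PCI devices, BIOS etc.

usable_system_ram = total_address_space - gpu_aperture - other_mmio
print(usable_system_ram / GiB)  # ~1.75 GiB left over for system RAM
```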
cgthomas 26th January 2011, 21:50 Quote
is it just me or is this card missing another Display / DVI port?
I can only see 5 - shouldn't a card like this one support 6 monitors for eyefinity?
play_boy_2000 26th January 2011, 21:51 Quote
Quote:
Originally Posted by Altron
It shouldn't make anything of it. The GPU is addressing it, not the CPU.
A quick google confirms that the CPU sees 256MB of the memory of a GFX card present in the system (not sure if additional SLI/xfire cards require 256MB each as well?) and the rest is somehow hidden.
http://www.gamasutra.com/view/feature/3602/sponsored_feature_ram_vram_and_.php?page=3

edit: correction of facts
leveller 26th January 2011, 22:44 Quote
At the point I got my 4870x2 it was the best card on the market. If this 6990 can earn that crown then it's likely to be on my next upgrade list.

Will Nvidia have a single card solution to compete?

Although I guess the actual benchmarks we want to see are this card vs Nvidia's equivalent, vs ATi's most powerful single card, vs Nvidia's most powerful single card, vs both ATi's and Nvidia's most powerful single cards in crossfire and SLI setups. A benchmark I look forward to immensely!
frontline 26th January 2011, 23:44 Quote
Overkill for most, but fascinating anyway. It will be interesting to see if the improvements already seen on 6xxx series crossfire results, when compared to the 5xxx series, are carried over or even bettered on the 6990.

The next few months look like they will be a great time to pick up a new card, particularly in the sub-£200 segment of the market.
Yslen 27th January 2011, 00:32 Quote
I'm thinking this will be a pretty epic card, actually. Everything I've seen and heard of 6000 series crossfire is good. The scaling is incredible and there seem to be far fewer of the traditional multi-GPU problems. I've been asking around the forums for the impressions of people who are now using such a setup and none of them have told me of any micro-stuttering or performance variability problems - just epic performance at a great price.

Also, guru3d tested the 6800 cards in crossfire and said "I can already tell you this though, this was the first time ever we had no issues with drivers, and yeah, in most cases the CrossfireX scaling was just extraordinary good."

A card like this would be great for eyefinity (obviously)
feedayeen 27th January 2011, 01:27 Quote
4 DisplayPorts, if those are 1.2, this card might just be able to support 12 monitors... assuming of course that they ever actually get around to releasing the DisplayPort 1.2 video splitter.
Tulatin 27th January 2011, 02:30 Quote
Quote:
Originally Posted by feedayeen
4 DisplayPorts, if those are 1.2, this card might just be able to support 12 monitors... assuming of course that they ever actually get around to releasing the DisplayPort 1.2 video splitter.

Or screens which allow you to daisychain displayport.

When you consider that angle, it would be pretty incredible to build a smaller system (12 x 20 x 32cm) which can power 12 displays. Granted, when you have a display array like that, I'm sure a computer tower wouldn't get in your way so much.
Enzo Matrix 27th January 2011, 06:42 Quote
This article:
http://www.hardwarecanucks.com/news/video/amd-shows-upcoming-radeon-hd-6990-flagship/

Answers the question of the power connectors: we will have one 8-pin and one 6-pin.

It also claims that 3,072 stream processors are confirmed.
penryn 2 hertz 27th January 2011, 08:10 Quote
I'd love to see how it competes against the dual GTX 560 EVGA are going to release. Interesting...
V3ctor 27th January 2011, 09:55 Quote
Where is the HDMI connector? At least ONE would be nice, instead of those displayports
wuyanxu 27th January 2011, 10:19 Quote
Quote:
Originally Posted by V3ctor
Where is the HDMI connector? At least ONE would be nice, instead of those displayports
DVI-HDMI converter?

A single normal sized DisplayPort would help, considering DisplayPort monitors come with a free normal sized DisplayPort cable.
Blarte 27th January 2011, 12:59 Quote
good to see the tone lowered Archandel ... <shock realisation> Are you Andy Gray ?
Waynio 27th January 2011, 16:00 Quote
Wow that looks like a really long gpu :D.
Bakes 27th January 2011, 21:33 Quote
Quote:
Originally Posted by play_boy_2000
The only glaring problem I see with this is the single DVI port. Displayport still hasn't gone mainstream and as such you either need to hunt around for a supporting monitor, or shell out $20 or more for an adapter.

They normally bundle a few adaptors.

Displayport *might* not be mainstream *yet* but it soon will be - there was a recent article explaining how Intel, AMD, Samsung, Lenovo, Dell, etc are all planning to migrate towards Displayport and HDMI for their products.
leveller 29th January 2011, 22:21 Quote
The gloves are off, and my palms are sweaty in anticipation to see who wins:

http://vr-zone.com/articles/confirmed-nvidia-releasing-geforce-gtx-590-in-february-with-dual-gf110/10954.html
Pete J 29th January 2011, 23:55 Quote
Quote:
Originally Posted by leveller
The gloves are off, and my palms are sweaty in anticipation to see who wins:

http://vr-zone.com/articles/confirmed-nvidia-releasing-geforce-gtx-590-in-february-with-dual-gf110/10954.html
The article expects both the 590 and the 6990 to be over $1000. What a load of tripe! If they're priced that high, they won't sell!
Waynio 30th January 2011, 01:14 Quote
Way too much for me, I'll sit happy with the 580 a nice powerful uncomplicated single gpu :D.
wuyanxu 30th January 2011, 10:56 Quote
Quote:
Originally Posted by Pete J
The article expects both the 590 and the 6990 to be over $1000. What a load of tripe! If they're priced that high, they won't sell!
too right.

I expect them to be less than 2x the price of their single-GPU counterparts.
e.g. if the 6990 is two 6970s then it will sell for less than £550
e.g. if the 590 is two 570s then it will sell for less than £600

It's multi-GPU on a stick, and multi-GPU setups have always been cheaper than a single faster GPU.
leveller 30th January 2011, 23:08 Quote
Quote:
Originally Posted by wuyanxu
too right.

I expect them to be less than 2x the price of their single-GPU counterparts.
e.g. if the 6990 is two 6970s then it will sell for less than £550
e.g. if the 590 is two 570s then it will sell for less than £600

It's multi-GPU on a stick, and multi-GPU setups have always been cheaper than a single faster GPU.

This sounds about right. And yes, they will sell!
Xir 31st January 2011, 12:00 Quote
Meh... the beefiest single core card wins...
Shame they don't go for that.
Siwini 6th February 2011, 11:57 Quote
No hdmi port?
TheQuadFather 8th February 2011, 21:47 Quote
Quote:
Originally Posted by Siwini
No hdmi port?

There will probably be a DP to HDMI converter included for a card in its price range.
Ahadihunter1 20th April 2011, 10:58 Quote
The point of the card is useless. Okay guys, it's official, AMD/ATI are after our money again...
chrismarkham1982 20th April 2011, 11:07 Quote
The point of both the Nvidia and AMD dual GPU cards is to be the fastest gaming cards on the market, therefore the point of these cards is not useless. Overpriced and just a bit too much for most people, maybe, but useless? Nope, definitely not.