bit-tech.net

New dual-GPU ATI Radeon 5970 spotted

The first look at ATI's forthcoming dual-GPU, DX11 graphics card.

ATI’s forthcoming dual-GPU graphics card, codenamed Hemlock, has been pictured in the labs of AlienBabelTech.com, which we were pointed to by the good folks at Engadget. The first surprise is that the site calls the card an ATI Radeon HD 5970, rather than a HD 5870X2. While the latter would be consistent with previous generation Radeons, the newer name makes more sense to us, as it’s clearer to the uneducated that a HD 5970 is superior to a HD 5870.

The pictured card definitely has two GPUs, as the cut-outs in the rear of the cooler show the two mounting points quite clearly. With the Radeon HD 5870 measuring an unwieldy 11in in length, it’s no surprise that this card measures an incredible 12.5in long – and it only just fits in AlienBabelTech’s Antec Twelve Hundred case. The length is a consequence of ATI using only one PCB in the HD 5970, rather than sandwiching two circuit boards together, as in some previous dual-GPU Nvidia cards.

AlienBabelTech’s source - the mysterious ‘DJ’ - says that ATI ‘is pretty confident that they will win this round of the GPU war with Nvidia’s Fermi architecture’ and that ‘ATI has something special up their sleeve, but are waiting for Nvidia to release their DX11 cards first’. DJ is clearly not a journalist then, as we all know that companies are singular.

It seems that AlienBabelTech posted performance numbers for the HD 5970, but was forced to take these down by AMD/ATI. The site was allowed to keep its preview images online. ATI was also kind enough to inform AlienBabelTech that the card it has is an engineering sample, and that ‘the board, the drivers, and the [v]BIOS are not finished… according to AMD, the leaked pictures are not the final card.’

Let us know what you think of the new card (and whether you think it’ll shrink at all) in the forums.

And now you can read our full review of the Radeon HD 5970

54 Comments

AngusW 2nd November 2009, 14:48 Quote
Wow that thing's a BEAST. It's massive.
mi1ez 2nd November 2009, 14:48 Quote
Sadly I don't have a house big enough to keep a video card of that size in! What's that? 13"? Maybe 14"?
Sifter3000 2nd November 2009, 14:49 Quote
Quote:
Originally Posted by The Article
it’s no surprise that this card measures an incredible 12.5in long
mi1ez 2nd November 2009, 14:50 Quote
OK, I scanned the article rather than reading it properly...
Singularity 2nd November 2009, 14:50 Quote
It really looks like an amazing card. Even if I'll never get to own it, it's still something awe-inspiring :D
Hugo 2nd November 2009, 15:06 Quote
Quote:
Originally Posted by Clive
‘ATI has something special up their sleeve, but are waiting for Nvidia to release their DX11 cards first[/i]’. DJ is clearly not a journalist then, as we all know that companies are singular.

Clive clearly isn't a real journalist either, as demonstrated by his broken italics tag and the placement of an inverted comma inside a full stop. :D
frontline 2nd November 2009, 15:08 Quote
There's clearly enough room for a 3rd GPU in there :)

Saying that, after trying crossfire on 2900, 3800 and 4800 series cards, i will probably stick with my single 5870. It is more than adequate for my native resolution and the games out there at the minute.
[USRF]Obiwan 2nd November 2009, 15:17 Quote
What with the secret 'sleeve', I guess it is physics/CUDA support which is faster than Nvidia's. That would probably cause a big laugh all over.

Hey, I can dream can't i?
proxess 2nd November 2009, 15:25 Quote
What a beast. Hopefully someone was able to grab the performance numbers.
mjm25 2nd November 2009, 15:35 Quote
yum. big fan of the 4870x2 (as i bought one) this is more likely to be my next upgrade now that i've got a proper job.

looked at the length and thought it was pretty similar to the 4870x2 (which is HENCH) but just realised that 10.5 inches is the length of that! so in other words FECK! that thing is long. you'd have to have a full tower case or be a bit of a dremel expert to fit that in. hopefully it'll shrink a bit when they get the final cards out.
mrb_no1 2nd November 2009, 16:20 Quote
i dont need one... but i want one
Panos 2nd November 2009, 16:34 Quote
I DO want one by the end of the year...........
Jasio 2nd November 2009, 16:45 Quote
Quote:
Originally Posted by mi1ez
Sadly I don't have a house big enough to keep a video card of that size in! What's that? 13"? Maybe 14"?

13.5" according to info that's available online. It will only fit in certain cases because of this restriction.
TWeaK 2nd November 2009, 16:51 Quote
This definitely isn't the finished article. According to ATi's plan for the 5 series (can't remember where I saw it, probably somewhere on here) they're all going to have the same 'Batmobile' cooler only they'll have different sizes of cooler for the different PCB lengths. I wouldn't be concerned with the performance figures either - they're not going to be indicative of the retail product. Unless of course the performance was through the roof, in which case, it can only get better.
cgthomas 2nd November 2009, 16:52 Quote
At first glance I thought "That's a very thin gpu!!" then I noticed the massive size makes it look quite thin in comparison to its length
dec 2nd November 2009, 16:56 Quote
i wonder if these will have sideport and if it will be enabled, unlike the 4870x2's
roblikesbeer 2nd November 2009, 17:12 Quote
Long card is long.
Cerberus90 2nd November 2009, 17:44 Quote
It's not really, it's just a piece of steel girder with some plastic stuck to it.

How does it not snap the motherboard in half!!!!
And it seems that they picked the smallest case possible to put it in, that makes it look even bigger.
greigaitken 2nd November 2009, 17:56 Quote
Many people are fine with £450 on a dual card but cry that they've got to spend £100 on a big case to house it. It's better buying a big case, so ati/nvidia get more space to play with. I like the idea of a separate graphics box that plugs into a pci express slot. This would make cooling those gpus so much easier. nvidia made one before for multiple SLI cards, but that was for pro rendering etc. Wouldn't it be nice to have a consumer version
Star*Dagger 2nd November 2009, 18:06 Quote
I am buying one once they are released, I'll let you know how it runs.

Yours in Eyefinity Plasma,
Star*Dagger
ChaosDefinesOrder 2nd November 2009, 18:09 Quote
lol @ CPU fan held on with a cable tie...
alwayssts 2nd November 2009, 18:26 Quote
Quote:
Originally Posted by [USRF]Obiwan
what with the secret 'sleeve'...

First of all, apologies for speculation and being long-winded, but hear me out.

More than likely the 5890, which will be clocked/priced to out-'compete', if not outperform, the GTS360 448sp, is what's up their sleeve. 975mhz-1ghz seems reasonable.

The way I figure it is this:

nVIDIA salvage parts start as 2 clusters disabled, then move to 1. IE, the 8800gts (96, then 112) and GTX260 (192 then 216). Two clusters this time around would equal 448sp, later moving to 480sp for a rev2/'375'-type part. Full Fermi is 16x32sp, or 512sp (1024 flops per clock).
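The salvage progression described above is simple cluster arithmetic; as a rough sketch (using the 16x32sp cluster layout the post assumes, which is itself speculation, not a confirmed spec):

```python
# Speculative Fermi shader counts from the cluster layout assumed
# above: 16 clusters of 32 shader processors each.
CLUSTERS = 16
SP_PER_CLUSTER = 32

full_part = CLUSTERS * SP_PER_CLUSTER            # 512sp (full Fermi)
salvage_rev1 = (CLUSTERS - 2) * SP_PER_CLUSTER   # 448sp (2 clusters disabled)
salvage_rev2 = (CLUSTERS - 1) * SP_PER_CLUSTER   # 480sp (1 cluster disabled)

# At 2 FLOPs per shader per clock (multiply-add), the full part
# matches the 1024 flops per clock quoted above
flops_per_clock = full_part * 2

print(full_part, salvage_rev1, salvage_rev2, flops_per_clock)  # 512 448 480 1024
```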

Also, Fermi likely keeps the same TMU/ROP/FLOP ratio as GT200, if you figure 48 ROPs, 128 TMUs and a 2.5/1 shader/core clock ratio - all of which seem likely for obvious advantageous reasons. Through this you can somewhat extrapolate performance. I figure 700c/1750s (or close) for the GTX380 and 640/1600 (or close) for the GTS360 as this mimics the disparity from last gen, putting them at 1.35 and 1.7 times the GTX285 spec...with the GTS360 being '80%' (more-or-less) of the GTX380. Yes, the architecture is different and is a variable. That being said, nVIDIA's architecture is likely to scale more linearly in performance compared to ATi's because most games are geared toward their higher pixel and texture to flop ratios, which like I said, will likely remain consistent, if not optimized by the new arch.

So, anyways... considering where the 5870 lies compared to the GTX285, I think an overclocked 5870 (5890) seems like the secret weapon. Think of it as GTX260 v. 4870 v. GTX260-216 v. 4890 v. GTX275 v. 4850x2 part 6. If there's a GTS360 480sp to take on the 5890, consider that part 7. 5950 is essentially a preemptive part 8, and I'll explain in a second.

The point of it all is, whatever nVIDIA wanted to charge for desktop Fermi, they won't be able to. Say it was $500 and $400. The 5970 will actually likely surpass Fermi at double-precision FP at the same price, granted at a higher TDP, but gaming won't even be close. The salvage part would be attacked from above by two highly salvaged chips on one PCB and below by one highly binned one. Between the rumored 40% smaller die size (plus lower RAM costs because of bus size) ATi will be able to be price competitive at every point for Fermi between the 5890 (GTS360 rev 1), 5950 (GTS360 rev 2/GTX380) and 5970 (FermiGX2? If it exists, it will likely be cut down to GTS360x2), if they need to be.

In other words, ATi's waiting until Nvidia's specs and prices are out to complete their lineup and price them out of the equation.

Consider this possible realistic price structure after the 5900, 5890, and Fermi launches:

5970: $500 (725-750mhz, full part)
5950: $400 (625-650mhz, 90% part)
5890: $350 (~1ghz)
5870: $300
5850: $230

*Worth noting is that with these specs, it allows for fine binning of Cypress chips, as none would be used in the same configuration in another card. 5950 would actually use the most salvaged chips, hence why it could afford the small price difference between it and the highest binned single chip.

Those speculative prices are pretty much exactly set up to reflect the difference in theoretical performance. While obviously performance doesn't scale linearly (it's actually closer to half most of the time in ATi's architectural case) and xfire certainly doesn't scale perfectly in all scenarios, it goes to show two things. One is that it correlates well with the law of diminishing returns of investment towards the high-end. The other is that it shows very well that when all is said and done (when the whole Evergreen family is launched and prices stabilize), ATi wants to say they offer you 1TF per $100, and that will likely be the case.
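As a back-of-the-envelope check of that 1TF-per-$100 claim, here's a quick sketch using theoretical single-precision throughput (shader count x clock x 2 FLOPs per clock). Only the HD 5870/5850 figures are public specs; the 5970 and 5890 clocks and all the prices are the speculative numbers from the list above:

```python
# Theoretical TFLOPS per $100 for the speculative price list above.
cards = {
    # name: (shaders, clock in MHz, speculative price in USD)
    "HD 5970": (3200, 737, 500),   # midpoint of the guessed 725-750MHz
    "HD 5890": (1600, 1000, 350),  # hypothetical ~1GHz part
    "HD 5870": (1600, 850, 300),   # public spec
    "HD 5850": (1440, 725, 230),   # public spec
}

for name, (shaders, mhz, price) in cards.items():
    # Each shader does a multiply-add, i.e. 2 FLOPs per clock
    tflops = shaders * mhz * 1e6 * 2 / 1e12
    print(f"{name}: {tflops:.2f} TFLOPS -> {tflops / price * 100:.2f} TF per $100")
```

All four land at roughly 0.9 TF per $100, which is consistent with the point above.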

So, I ask, where does Fermi fit in to this equation? Surely not where nVIDIA wants it to...Unless they REALLY JUST "DON'T CARE ABOUT GAMERS." I jest.

Apologies again for writing a whole article in the comments, just thought it might be worth an explanation of my speculation. While surely everything won't line up exactly how I forecast, I think it'll end up being close enough so the points are valid.

Feel free to pick me apart and call me crazy! :)
p3n 2nd November 2009, 19:22 Quote
ZZZZ no way two cores will perform as well as a single GPU in performance/watt/$$$
Goty 2nd November 2009, 19:52 Quote
Quote:
Originally Posted by p3n
ZZZZ no way two cores will perform as well as a single GPU in performance/watt/$$$

Well, considering the fact that a single GPU also won't be as fast (in most cases), I don't think that's much of a concern.
HourBeforeDawn 2nd November 2009, 19:55 Quote
how is the memory cooled? it looks like all it came in contact with was the GPUs - or is it just using thermal pads that simply touch the bottom of it?
Tulatin 2nd November 2009, 19:56 Quote
I really hope that one of the accessories included with this by the OEM is either one of those metal wings that branches back to a support bracket, or an adjustable stand to support it from the case floor.
johnnyboy700 2nd November 2009, 19:57 Quote
Just remember how much of a dual card's performance depends on its drivers
liratheal 2nd November 2009, 20:51 Quote
...gimme.
500mph 2nd November 2009, 20:51 Quote
The naming scheme leaves room for a 5990 :D
wuyanxu 2nd November 2009, 20:58 Quote
the naming scheme is a direct copy from the very badly named gtx295......

why not call this 5870x2?
Jetfire 2nd November 2009, 21:38 Quote
Jeez, we'll be buying external cases soon to house our graphics cards.

A 4-way SLI box sitting on top of your PC case anyone? :D
Fozzy 2nd November 2009, 23:36 Quote
I do find it funny that you are so quick to point out the original poster's faults in writing or "journalism" when almost every article I've ever read from bit-tech seems to skip any sort of revision and contains numerous mistakes that could be considered those of a novice..... just thought I'd point that out.

As for the GPU. I love the matted industrial effect as opposed to the glossy black that has been the norm. I'm very excited to see a performance review soon.

back to the bashing. I love bit-tech....just watch the flaming
Muunsyr 3rd November 2009, 00:27 Quote
That was a great post alwayssts. I read a while ago (and don't know how accurate it is) that nVidia will also be using GDDR5 chips this time around? If so, they could possibly use a lower speed (GDDR5, as compared to ATi's parts) in order to help keep costs down?
Mentai 3rd November 2009, 00:28 Quote
Quote:
Originally Posted by Fozzy
I do find it funny that you are so quick to point out the original poster's faults in writing or "journalism" when almost every article I've ever read from bit-tech seems to skip any sort of revision and contains numerous mistakes that could be considered those of a novice..... just thought I'd point that out.

As for the GPU. I love the matted industrial effect as opposed to the glossy black that has been the norm. I'm very excited to see a performance review soon.

back to the bashing. I love bit-tech....just watch the flaming

+1
SMIFFYDUDE 3rd November 2009, 00:47 Quote
13.5", I'd have to do without hard drives if I'm to get one in my case. Why don't they make them wider? My case is nothing special, but it still has a 3" gap between the side window and the gfx card.

I think 5870 X2 makes more sense; if you're in the market for this card you'll be educated. Nobody will be going to PC World and asking if the card will allow them to surf the internet.
Slizza 3rd November 2009, 01:07 Quote
Going to be an interesting year in the graphics department!
Think I will ride it out and upgrade my GTX 280 as late as possible, which could be a while yet seeing it still tanks all games.
Krayzie_B.o.n.e. 3rd November 2009, 05:44 Quote
what happened to the 4890 x2? and am i the only one thinking we are seeing 2nd generation directx 11 cards from ATI/AMD without so much as a whisper from NVIDIA. ATI is going ballistic right now with the video cards. (wish they'd apply this to the AMD side) 5970! this card is going to be a beast and a half. Can anybody tell me how NVIDIA is going to counter this onslaught of GFX card supremacy?

and what trick up their sleeve do they have waiting in cloak mode? (4890 x3) on a 45mm board.
l3v1ck 3rd November 2009, 08:00 Quote
Am I the only person here who refuses to even think about getting a dual GPU card. I just want a powerful GPU that doesn't need driver updates every other day to get the most out of it.
feedayeen 3rd November 2009, 10:17 Quote
Quote:
Originally Posted by l3v1ck
Am I the only person here who refuses to even think about getting a dual GPU card. I just want a powerful GPU that doesn't need driver updates every other day to get the most out of it.

You're not the only one. It'd be nice if bit-tech tested their X2 cards with a few more games that tend to be gimped by their drivers rather than games that are next to guaranteed to have frame rates in the 70s and 80s.
Claave 3rd November 2009, 11:10 Quote
Quote:
Originally Posted by HugoB
Clive clearly isn't a real journalist either, as demonstrated by his broken italics tag and the placement of an inverted comma inside a full stop. :D

:D
Claave 3rd November 2009, 11:17 Quote
Quote:
Originally Posted by TWeaK
This definitely isn't the finished article. According to ATi's plan for the 5 series (can't remember where I saw it, probably somewhere on here) they're all going to have the same 'Batmobile' cooler only they'll have different sizes of cooler for the different PCB lengths. I wouldn't be concerned with the performance figures either - they're not going to be indicative of the retail product. Unless of course the performance was through the roof, in which case, it can only get better.

Well, there's already a HD 5000 card without a batmobile cooler:
http://www.amd.com/uk/products/desktop/graphics/ati-radeon-hd-5000/hd-5750/Pages/ati-radeon-hd-5750-overview.aspx
nicae 3rd November 2009, 12:48 Quote
Quote:
Originally Posted by 500mph
The naming scheme leave room for a 5990 :D

And a 5999 :D
Turbotab 3rd November 2009, 14:45 Quote
12.5 inches long and barely fits in an Antec Twelve Hundred, they will have to bundle a tub of Vaseline with it!
A Crossfire setup with 2 of those beasts, is going to cause some airflow headaches.
I personally think ATI has been a tad weak in naming the card, given its length, they should have named it the 'Trojan Magnum XL'
leexgx 3rd November 2009, 15:35 Quote
this card is a sample card, not the final product, so it'll be longer than it should be; it'll most likely be an 11.5" card when it comes to retail as no card has been bigger than that before
bogie170 3rd November 2009, 17:01 Quote
How much longer is it from a 4870x2?

I'm not sure whether it will fit in my Antec P182 without having to remove the hard drive cage.
wuyanxu 3rd November 2009, 21:56 Quote
Quote:
Originally Posted by l3v1ck
Am I the only person here who refuses to even think about getting a dual GPU card. I just want a powerful GPU that doesn't need driver updates every other day to get the most out of it.
+1 to that.

what's the point of having a zillion frames per second with 2 FPS as the minimum due to an incompatible new game engine, when you can only see 60? isn't it better to have 60FPS with a steady minimum of above 30?
Tulatin 3rd November 2009, 21:58 Quote
The advantage to having a framerate far above 30 (or 60, in any case) is that it gives you some breathing room. I mean, if a card is only just capable of delivering 60 in all games, it'll choke during REALLY intensive moments. Damn smoke grenades!
thehippoz 4th November 2009, 15:23 Quote
cool- huang trying to squeeze one out and ati is all not this time sulu!

what if those lucid boards actually work though.. 2 low end cards that don't have to deal with sli/crossfire driver code, and beat nvidia and ati's money maker sounds like a better deal to me

hope it pans out anyway.. still waiting on the review when the boards are out
uz1_l0v3r 5th November 2009, 08:19 Quote
Quote:
Originally Posted by Slizza
Going to be an interesting year in the graphics department!
Think I will ride it out and upgrade my GTX 280 as late as possible, which could be a while yet seeing it still tanks all games.

Indeed. In my view, any dual-gpu solution is pretty much overkill in today's pc game market. Unless you play Crysis/Crysis Warhead ALL the time.
PandaMonster 5th November 2009, 16:19 Quote
Companies that are in direct competition with only one other company are singular? ROFL. *no fuc**** comment*

Do some homework.
TAG 8th November 2009, 07:17 Quote
We very nearly had a similar length card aaaaages ago; the voodoo5 6000, which unfortunately was never to find a home in our desktops.

Here it is alongside a 9800GX2 and a 3870X2
http://www.dvhardware.net/article26081.html


EDIT: on second thought maybe it wasn't quite that long - considering the 3870X2 is around 9.5", the voodoo5 6000 was probably only 10.5" :p
Still a beast for the time though, with a ridiculous heatsink by today's standards
HourBeforeDawn 8th November 2009, 07:26 Quote
Quote:
Originally Posted by TAG
We very nearly had a similar length card aaaaages ago; the voodoo5 6000 which eventually was never to see the light of day.

Here it is alongside a 9800GX2 and a 3870X2
http://www.dvhardware.net/article26081.html

now that is a great pic. ;)
TAG 8th November 2009, 07:46 Quote
and here it is in its backside-support, full-length glory
http://www.tweakpc.de/news/14076/ebay-3dfx-voodoo-5-6000-128mb-wird-versteigert/
Looks like this particular one was auctioned last year. Best bid at the time the page was published was 1347€
That's some nice collector's item :)

Now with that support bracket that card may well have reached 11.5" which is the max length we've seen so far for a graphics card.
It was referred to as a "full length" card, which I suppose is a standard coming from the server environment.

Considering there is actually a standard for a card's maximum length, can we expect desktop oriented products to follow these server standards?
I hope so since the mod I'm planning on my V2010 currently won't allow much more than 11.5" with the 14 hard drives in the front half of the case

These shots were described by ATI as those of an early engineering sample, but I doubt the design will change much (if at all) or to the point that they'll be able to fit 2x RV870 on a PCB the same size as a 5870 :(
Trin 8th November 2009, 10:00 Quote
This year is going to be interesting video card wise. I currently have a 3870x2, given to me by a friend, love this card. Would like to get a hold of one or two for crossfireX. Hope the 40nm process issues at TSMC get ironed out.