bit-tech.net

R680: AMD ATI Radeon HD 3870 X2

Comments 51 to 75 of 84

WhiskeyAlpha 29th January 2008, 20:27 Quote
Quote:
Originally Posted by legoman666
"05/02/08" oh lol, silly brits and their different way of writing dates. I'm used to MM/DD/YY instead of DD/MM/YY

Silly? Seems far more logical to me as it starts with the smallest value (day), then the next (month) and so on...(year). Still, I guess if there isn't a "u" to remove anywhere, just shuffle it up a bit.

But then again I'm a silly Brit ;)
mclean007 29th January 2008, 22:11 Quote
Quote:
Originally Posted by WhiskeyAlpha
Quote:
Originally Posted by legoman666
"05/02/08" oh lol, silly brits and their different way of writing dates. I'm used to MM/DD/YY instead of DD/MM/YY

Silly? Seems far more logical to me as it starts with the smallest value (day), then the next (month) and so on...(year). Still, I guess if there isn't a "u" to remove anywhere, just shuffle it up a bit.

But then again I'm a silly Brit ;)
At risk of going completely off topic, I'm totally with you on this one (although for total consistency, why not have the whole time/date field ordered most significant first - e.g. 2008/Jan/29/22:05:00? Still, middle significant, then least significant, then most significant makes no sense at all!). Have you or anyone else noticed the insidious creep of the American date format into UK publications - you can't pick up a copy of Metro without being assaulted by references to "January 29" or whatever. I have seriously thought about writing a short HTML parser to de-Americanify my browsing - revert all dates to proper format, correct the crazy Colonial spelling etc., and remove every last reference to "off of" ("off" will be fine), "gotten" (it's "got"), "in back of" ("behind"), "could care less" ("couldn't care less") and similar. That would be awesome.
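The date-rewriting half of that parser idea really is only a few lines. A minimal sketch in Python - the regex, the function name and the blanket assumption that every numeric date is US-style are illustrative only; a real version would need proper HTML parsing and locale detection, since "05/02/08" is ambiguous on its own:

```python
import re

# Swap US-style MM/DD/YY(YY) numeric dates to DD/MM/YY(YY).
# Assumes every match really is a US date, which is the hard part
# a real "de-Americanifier" would have to solve.
US_DATE = re.compile(r"\b(\d{1,2})/(\d{1,2})/(\d{2,4})\b")

def de_americanify(text: str) -> str:
    return US_DATE.sub(lambda m: f"{m.group(2)}/{m.group(1)}/{m.group(3)}", text)

print(de_americanify("Released 01/29/08"))  # -> Released 29/01/08
```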
willyolio 29th January 2008, 23:08 Quote
The CoD 4 benchmark seems a bit odd: Tech Report and Anandtech both found the 3870 X2 beating the GTX and Ultra by a fairly large margin at all resolutions.
LVMike 29th January 2008, 23:38 Quote
The Crysis graphs showing "3870 X2 512MB CrossFire" are a little confusing, as the other graphs read "3870 512MB CrossFire". It looks like you had two 3870 X2s that you somehow got into CF, rather than two single-GPU 3870s. You might want to change those graph labels.
Arkanrais 30th January 2008, 00:17 Quote
Any overclocking results? I may have missed something - I've been up for 30 hours, so my brain isn't functioning too well without caffeine.
trig 30th January 2008, 03:21 Quote
Maybe I'm too tired, but I just don't know how I feel about this yet. You can get a BFG GTS 512 for $299 plus a $30 MIR, and then there's this for $449. It still seems like, despite its performance (which I am fairly psyched about), the bang for the buck is not there. But of course, I never really thought it was there with the GTX or Ultra either.
Amon 30th January 2008, 03:39 Quote
Very pretty heatsink underside.
Hardware Canucks conducted a review of the same card and it ended up trumping nVidia's cards in the majority of their tests.

"ATI Radeon HD 3870 X2 512MB" appears in each graph on Page 7 - Crysis.
Tim S 30th January 2008, 07:18 Quote
Quote:
Originally Posted by willyolio
The CoD 4 benchmark seems a bit odd: Tech Report and Anandtech both found the 3870 X2 beating the GTX and Ultra by a fairly large margin at all resolutions.

Anandtech benchmarked a cutscene... not sure about Tech Report though. Our tests were done manually inside the game.
Quote:
Originally Posted by Anandtech
A surprisingly successful FPS on the PC, Call of Duty 4 also lacks any sort of in-game benchmark so we benchmark the cut scene at the beginning of the first mission. We start the frame rate counter as soon as we're in the helicopter and stop once the man in the chair gets shot.

Oh and while I'm at it... not sure why they're running FLYBYS?!
Quote:
Originally Posted by Anandtech
We used the built-in vCTF flyby in Unreal Tournament 3, using the -compatscale=5 switch to ensure the highest in-game quality settings were used. We ran each flyby for 90 seconds and are reporting the average frame rates.
Jeez, I can't remember the last time I ran a flyby demo... UT2003? Anything I've run here that isn't manual has at least been verified as representative of real gameplay. If it wasn't, I manually benchmarked it (take World in Conflict's built-in demo, for example).
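For anyone wondering what "manual" benchmarking boils down to: time a repeatable run-through and count frames. A rough sketch of the arithmetic in Python - render_frame() is a hypothetical stand-in for one frame of gameplay, and in practice a tool like FRAPS does the counting for you:

```python
import time

def benchmark_run(render_frame, duration_s: float = 90.0) -> float:
    """Average FPS over a fixed, repeatable manual run-through."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        render_frame()  # one frame of actual gameplay, not a flyby
        frames += 1
    return frames / (time.perf_counter() - start)

# e.g. avg = benchmark_run(lambda: None, duration_s=1.0)
```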
Spaceraver 30th January 2008, 07:39 Quote
I'd love to support AMD by buying this card, but unfortunately the driver suite/interface puts me off. The way nVidia has the "classic" control panel is still the best, imho.
Jipa 30th January 2008, 09:44 Quote
Woah, a bizarro card with two GPUs sorta... kinda... beats the ~year-old nVidia card. Merry Christmas to ATI fanboys. Most likely, with proper drivers (or are those an urban legend, btw?), it will prevail in an even larger number of benchmarks.
menemsis 30th January 2008, 11:09 Quote
Stupid & useless GPU

Waiting for the 9800 GX2
wuyanxu 30th January 2008, 14:05 Quote
It must be boring benching each game manually, one by one, but thanks so much for providing reliable data.

Seems like the gameplay results are the same as HardOCP's, where the more-than-a-year-old G80 core is still better. This dual-GPU-on-one-PCB card is good for benchmarking (as other reviews showed) but bad for gameplay, where most of the time it has worse minimum FPS than even the GTX.

I'm now not sure the 9800 GX2 is a good idea - just two G92s stuck together. There may be more shader processing power, but will the drivers keep up? Will nVidia's drivers offer better minimum FPS figures? (IMHO, minimum FPS is what we should be looking at, as it represents the stutter moments; if it's better than 20 FPS, it'd be fine.)
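To make the minimum-FPS point concrete, here is a small illustrative sketch (Python; the frame-time numbers are invented, and a per-frame log in milliseconds, like those exported by FRAPS-style tools, is assumed):

```python
def fps_stats(frametimes_ms: list[float]) -> tuple[float, float]:
    # Average FPS weights every frame by its duration; minimum FPS
    # is set by the single slowest frame - the stutter moment.
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    min_fps = 1000.0 / max(frametimes_ms)
    return avg_fps, min_fps

avg, worst = fps_stats([16.7, 16.9, 18.0, 55.0, 17.1])
print(f"avg {avg:.1f} FPS, min {worst:.1f} FPS")  # avg ~40, min ~18
```

The average here looks comfortably playable, but the one 55ms frame is exactly the kind of stutter a bare average hides.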
Tim S 30th January 2008, 14:31 Quote
Quote:
Originally Posted by wuyanxu
It must be boring benching each game manually, one by one, but thanks so much for providing reliable data.
No problem - it's more mindless than boring... but it's not as concentration-intensive as the best-playable testing used to be. When we changed back to apples-to-apples tests, we didn't want to drop the real-world aspect (indeed, everything we present, even if it's automated, is representative of gameplay performance) - it was almost just a change in the way the results were presented. ;)
Bladestorm 30th January 2008, 17:34 Quote
I'm certainly not seeing a good reason to abandon my plan of upgrading to an 8800 GT for my 1680x1050 gaming, and given this, I suspect nVidia won't be feeling too much pressure to rush out a new generation to beat it either (though some pressure is better than the none of a few months ago!).
bustamove 31st January 2008, 00:58 Quote
I thought you guys might be interested: I got one of these yesterday, slotted it in, ran 3DMark06 and got this...

http://i33.photobucket.com/albums/d96/noway_1961/38703dmark.png

Not bad for stock settings with immature drivers, I'd say. Also, from a first-person visual perspective, the images on the whole, and in game, are nice and crisp.

QX6700 @ 3.2GHz
HIS 3870 X2, all stock
2GB Crucial Ballistix 6400
Asus Commando

My last card was a BFG 8800 GTX OC2 version. I have to say I already prefer the 3870 X2; saying that, my experience with the GTX was not a happy one.
Amon 31st January 2008, 02:47 Quote
^ Good to hear a second opinion on the reviewed product from a customer. Thanks.
wuyanxu 31st January 2008, 08:54 Quote
Quote:
Originally Posted by bustamove

My last card was a BFG 8800 GTX OC2 version. I have to say I already prefer the 3870 X2; saying that, my experience with the GTX was not a happy one.

May I ask why?

Run more games; I'm interested in whether it will have compatibility issues with games. After all, it runs on the principle of CrossFire (i.e. software managing GPU power, not raw power like a single 8800 Ultra).
bustamove 31st January 2008, 12:29 Quote
Quote:
Originally Posted by wuyanxu
May I ask why?

Run more games; I'm interested in whether it will have compatibility issues with games. After all, it runs on the principle of CrossFire (i.e. software managing GPU power, not raw power like a single 8800 Ultra).

Hi, if you're asking why: my experience with a GTX was not a happy one. The card was RMA'd six times in five months. This is just my experience; I know many people who have had great experiences with GTXs.

My theory is that the retailer stocking my particular GTX got a faulty batch [chipset maybe]. The problem was, if I left my PC on 24/7 the card was great; if I powered off overnight, the card would not POST at all from a cold boot. Like I said, this happened six times in total, and at the end of it all the GTX cost me close to £400. Fortunately for me, the retailer refunded me, even though on the last one I was outside their refund period.

So I was a bit nervous about going nVidia again. [BTW: the issues described above were not related to my PSU or any other factor - the cards were dead.]

Back on topic with this 3870: I've played Crysis and CoD4. Crysis I played on default high-spec settings, and I ran the Crysis benchmark tool. In game I didn't experience any of the glitching reported in many reviews, though in saying that, I didn't play it for very long.

Here are some results from the benchtest with the hotfixed 8.1 Cat drivers:

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

1/30/2008 5:13:45 PM - XP

Run #1- DX9 800x600 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 57.08
Run #2- DX9 1024x768 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 53.345
Run #3- DX9 1024x768 AA=2x, 32 bit test, Quality: High ~~ Overall Average FPS: 37.56

The resolutions are on low-res settings due to running on a 17" monitor; I'll hook it up to a 24" later and get a better idea.

I'm not certain, but I think those averages are not bad considering the drivers are basically beta; I'm running the 8.1 drivers with a hotfix add-on.

http://i33.photobucket.com/albums/d96/noway_1961/atigpuz.png

I haven't played many other games yet because there is an issue with my ROM drive. I'll swap the drive over and get some results up later.

Here's a screen from Crysis:

http://i33.photobucket.com/albums/d96/noway_1961/Crysis2008-01-2921-51-08-18-1.jpg

I'm looking forward to CF being enabled with upcoming drivers, which will technically be QuadFire...

Need another card though... :(
Tim S 31st January 2008, 12:55 Quote
I don't think I made it clear enough in the review: the 3870 X2 is by no means a bad card - it's good to see ATI back. Its long-term success is going to come down to driver support, which is why I cannot widely recommend the 3870 X2 above a single-GPU card.

bustamove: sorry to hear about your bad experience with the GTXs - it could have been any number of factors triggering the cards to die. I've had similar undiagnosable situations, and that's when it sucks the most, because you know you're not getting what you should be getting. :(
bustamove 31st January 2008, 13:12 Quote
Thanks Tim. The retailer confirmed that each card was in fact dead and replaced each and every one with a new card; none of them could be repaired, and the GTXs were returned to BFG.
I should have said before that it was a BFG OC2 version GTX. I do believe there have been problems with that particular model [could be the factory overclock, maybe].

When I put my old X1950 XT in the GTX's place, I got POST immediately.

As a comparison, and as you rightly say Tim [for a single card], the GTX was getting 14,750 in 3DMark06 with the same setup as above. The card was at stock settings, 626/2,000MHz, which actually closely matches Ultra performance.

For what it's worth, and I have no way of proving it, the image quality in general seems more defined and crisp with this 3870 X2 than with the GTX I had.
The only reason I draw a comparison to the GTX is that I very recently had one in the system I'm using the 3870 X2 in.
wuyanxu 31st January 2008, 14:48 Quote
Quote:
Originally Posted by Tim S
Its long-term success is going to come down to driver support, which is why I cannot widely recommend the 3870 X2 above a single-GPU card.

That is precisely why I am against multi-GPU cards/setups.

Those 3DMark06 scores mean little versus gameplay; as long as one is happy with their gameplay experience, there's no reason to compare scores. And this is why I asked about other games: looking at bit-tech's Crysis minimum FPS, I wanted to know whether you get a good gameplay experience. Some games may experience glitches due to the driver not being well optimised (which comes back to the disadvantage of multi-GPU).
Tyinsar 31st January 2008, 16:28 Quote
@bustamove: It's great to hear from another person (in addition to the staff) with firsthand experience. ;)

Are you running XP, and if so, could you tell me if span mode is available on this card? (I think ATI calls it "stretch" - basically the system sees two monitors, at the same resolution, as one larger monitor.)

@Tim Smalley / Bit-Tech staff: Can you tell me what happens when you connect two monitors? I know that the 7950 GX2 ran in either SLI mode or single-GPU mode; if I wanted two monitors, it shut down the second GPU. :? What does the 3870 X2 do in the same situation?
Tim S 31st January 2008, 17:50 Quote
Quote:
Originally Posted by Tyinsar
@Tim Smalley / Bit-Tech staff: Can you tell me what happens when you connect two monitors? I know that the 7950 GX2 ran in either SLI mode or single-GPU mode; if I wanted two monitors, it shut down the second GPU. :? What does the 3870 X2 do in the same situation?
I'm sure I mentioned it in the text (sorry if I didn't, must've cut it out in a final edit)... but multi-monitor works fine in both 2D and 3D. It's much more transparent than the 7950 GX2 in that respect.
bustamove 31st January 2008, 19:16 Quote
Played CoD4 today and the image quality is truly fantastic! I only played in single-player mode - no glitching or sketchy frames at all;
very smooth gameplay.
Is anyone here aware of a benchtest for CoD4? Is there a command or something?

I really am impressed with this card so far. @Tyinsar: I'm going to hook it up to my 24" monitor in the morning; my rig is watercooled and it's a bit of a pain moving the PC around.
I'm not sure about the stretch thing, but I'll check it out.

I'm really happy I bought this card now!

Can I quote something someone said on another forum? I would be interested to see what you guys say about it.

Quote: "Both GPUs have 512MB each. It has 1GB total, but seeing how it's technically CF on a card, it has to store the same info in both GPUs' memory banks, resulting in it functioning like it was only 512MB."

Also, quote: "A LOT less latency than regular CrossFire.

You see, regular CrossFire has to do a majority of its communication through the chipset. Meanwhile, this card has its own splitter and does its communicating all on-card. As such, there's less latency all the way across the board.

ALL current multi-GPU setups have to duplicate data for each GPU in RAM. They have to, because both GPUs need the same textures and shader information, as they both work together to render the scene. Essentially they all run as if they had the same amount of RAM as the lowest card does. That's just how things have to work currently, and why I believe the R700 will have to have an external memory controller for the MCM package.

To sum it up, this card is like a lower-latency CF setup." Unquote.

Well, I kind of see what the guy is getting at, but it doesn't seem logical - i.e. how, at stock settings, does it perform so similarly to a single-card CF setup?

Any thoughts on this?
Tim S 31st January 2008, 19:25 Quote
CrossFire / SLI copies all frame data to both GPUs, and the scheduler chooses which frames each GPU renders. So yes, the card has two 512MB onboard memory partitions, but the data is the same in both partitions, so it's effectively just a 512MB card.

I don't think there's really a latency issue with standard CrossFire - if there were, we'd definitely know about it (you'd notice it when playing games). Of course, the fact that the data doesn't have to go via the CrossFire interconnect/chipset is a bonus... but since the GPUs are rendering three frames ahead anyway, it's not going to make any difference, IMO.
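To picture the scheduling and mirroring Tim describes, here's a toy model in Python - purely illustrative, not ATI's actual scheduler, and the numbers are made up:

```python
GPU_COUNT = 2
VRAM_PER_GPU_MB = 512
scene_assets_mb = 300  # textures + shaders, duplicated on every GPU

# Alternate-frame rendering: whole frames are handed out round-robin,
# so each GPU needs its own full copy of the scene data.
for frame in range(6):
    gpu = frame % GPU_COUNT
    print(f"frame {frame} -> GPU {gpu} ({scene_assets_mb}MB mirrored copy)")

# 2 x 512MB onboard, but mirrored rather than pooled:
print(f"usable VRAM: {VRAM_PER_GPU_MB}MB of {GPU_COUNT * VRAM_PER_GPU_MB}MB total")
```

The last line is the point from the quote above: the two partitions hold copies of the same data, so doubling the GPUs doesn't double the usable memory.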