bit-tech.net

ATI CrossFire - Hands on

Da Dego 26th September 2005, 14:49 Quote
Wow.

I really appreciate that you guys didn't just bollocks up the review by saying "Crossfire sucks because it can't beat a 7800 in SLI." Like some sites did. :p

It looks like, at the very least, crossfire is as good a solution as SLI, meaning that we're back to a battle of the individual cards that one chooses to use. This is nice, except that I don't like the crossfire "master card" BS. With SLI, you buy two identically specced cards, you plug them in. It works. With crossfire, you have to spend extra to get one of the cards, reducing the value of the solution. Maybe they'll fix this in the x1800?

By the way, can't wait for a genuine motherboard review. The "Get Bindi Busy" club needs to have a meeting again. Part of what will determine which way I lean if x1800 comes out comparable will be the mobo featuresets, since we now have to get mobos specific to our graphics card loyalty. Did I mention I hate that "feature"? Whichever company got off their high-horse enough to have their product work on any 2x16 PCIe board would be much more likely to get my business.
atanum141 26th September 2005, 14:58 Quote
Quote:
"Crossfire sucks because it can't beat a 7800 in SLI." Like some sites did.

Yeah, that disappointed me in the other reviews. This preview was much fairer, and it actually looks like ATI are (nearly) back in the game, which is a good sign. I'm not too sure about the X1800 though - it doesn't seem as good as it's supposed to be, but we shall see.

Excellent preview!!
Tim S 26th September 2005, 14:59 Quote
I don't think there will be a change with Radeon X1800 series.
Da Dego 26th September 2005, 15:02 Quote
Quote:
Originally Posted by bigz
I don't think there will be a change with Radeon X1800 series.
How do you mean? That the benefits of CS won't change, or that the 1800 still won't compete?

I'm betting that the latter is true, but that's because the thing has been in development since before the 7800...so they've spent all this time perfecting technology that is already older than what's out now.
RotoSequence 26th September 2005, 15:16 Quote
They've had a full nine months to get Super Tiling working since they started talking about it, six months since they first demoed it, and three months since one other major demonstration - and it's still not working with mainstream titles? :| It seems ATI is having problems getting into the game ;)

Right now, no matter how you look at it, with ATI demanding comparison only with the 6800 series, you can basically conclude that they have some glaring weaknesses.
PA!N 26th September 2005, 15:18 Quote
Interesting concept... but I wouldn't buy an ATI card because they don't (at the moment?) support Shader Model 3.0, and many of those " " effects are only possible on cards with SM 3.0!!!
specofdust 26th September 2005, 15:33 Quote
Yeah, I won't touch ATi either until they have SM3 implemented properly. I don't even really know fully what it does, and I don't care. But I do know that the best graphics effects are going to need it over the next few years, and that without proper implementation in a card I'll get crap performance. That's enough for me, someone who keeps a graphics card for about 3 years.
Tim S 26th September 2005, 15:40 Quote
Quote:
Originally Posted by Da Dego
How do you mean? That the benefits of CS won't change, or that the 1800 still won't compete?

I'm betting that the latter is true, but that's because the thing has been in development since before the 7800...so they've spent all this time perfecting technology that is already older than what's out now.
You still need master/slave. I believe that the 60Hz issue will be fixed though, as they can use a different transmitter on the master card along with the required transmitter on the slave in order to give enough bandwidth for high resolutions and high refresh rates.
Tim S 26th September 2005, 15:41 Quote
Quote:
Originally Posted by PA!N
Interesting concept... but I wouldn't buy an ATI card because they don't (at the moment?) support Shader Model 3.0, and many of those " " effects are only possible on cards with SM 3.0!!!
Not true, you can run HDR in SM2.0.
blackerthanblack 26th September 2005, 16:39 Quote
As I've said before 'ahem', I see no point in SLI/Crossfire unless you go for near top-of-the-range cards. As the article said, you can get a single card with equal or greater performance, and you have greater expansion potential for the future.

Oh, and probably with less hunger for power than two lower-spec cards. Save electricity, save the Earth. Yeh.
Bindibadgi 26th September 2005, 17:42 Quote
Quote:
Originally Posted by Da Dego
The "Get Bindi Busy" club needs to have a meeting again.
roffles :D It happened this morning in the form of:

Wil: I'm sending you some ****
Me: k

That's the cut down, "100 minute bible" version of what actually happened in the long and boring conversation.
Quote:

Part of what will determine which way I lean if x1800 comes out comparable will be the mobo featuresets, since we now have to get mobos specific to our graphics card loyalty. Did I mention I hate that "feature"? Whichever company got off their high-horse enough to have their product work on any 2x16 PCIe board would be much more likely to get my business.

I'll try to look into it for you.
Quote:

Not true, you can run HDR in SM2.0.

On selected titles, whereas if you own an SM3 card you can run HDR on EVERYTHING that supports HDR, in either SM2 or SM3.
PA!N 26th September 2005, 17:44 Quote
Quote:
Originally Posted by bigz
Not true, you can run HDR in SM2.0.
I'm not just talking about HDR... ever played Splinter Cell: Chaos Theory with SM 2.0 and right after with SM 3.0??? Quite a difference... huh :|
Highland3r 26th September 2005, 17:53 Quote
Quote:
Originally Posted by bigz
You still need master/slave. I believe that the 60Hz issue will be fixed though, as they can use a different transmitter on the master card along with the required transmitter on the slave in order to give enough bandwidth for high resolutions and high refresh rates.

The output of the master card (i.e. to the VDU) isn't limited to 1600*1200 @ 60Hz, but the internal link between the two cards IS limited. It's not a massive problem, as when rendering with AFR, 1600*1200 @ 120fps would be the maximum the cards could possibly output...
It's not the best of solutions, but it shouldn't limit the actual output to the display by a massive amount (unless you're after stupidly high fps).
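Quick back-of-the-envelope on that ceiling - just a sketch, assuming the slave-to-master link really is capped at a 1600*1200 @ 60Hz stream and that the master renders the other half of the frames itself in AFR (my assumptions, not anything ATI has published):

```python
# Sketch of the AFR frame-rate ceiling at 1600x1200 (assumed figures).
link_fps = 60      # frames/sec the slave card can push over the capped link (assumed)
master_fps = 60    # frames/sec the master card contributes on alternate frames (assumed equal)

afr_ceiling_fps = link_fps + master_fps
print(f"Rough AFR ceiling at 1600x1200: ~{afr_ceiling_fps} fps")  # ~120 fps, as above
```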
Bindibadgi 26th September 2005, 17:56 Quote
Anyone else think the preview mobo is slightly... unupgradable? You want to use a PCI card? Tough.

Also - anyone else read ATI Crossfire as being the pirate's best friend? "Oooh look, now I can play the game I downloaded before it hits the shelves with multi-GPU, WITHOUT having to wait for an NVIDIA profile to come out after it goes retail". :D;)

FYI: the Crossfire board articles might be a while once they get to me, because I'm gonna have to play every single game I own through, again, at 14x AA :D:D:D
Tim S 26th September 2005, 19:01 Quote
Quote:
Originally Posted by PA!N
I'm not just talking about HDR... ever played Splinter Cell: Chaos Theory with SM 2.0 and right after with SM 3.0??? Quite a difference... huh :|
That's related to Tone Mapping, not to Parallax Mapping, HDR or Soft Shadows. I don't think that there is *that* much difference and I've sat playing SC:CT side by side on an X850 XT PE and a GeForce 7800 GT. Same monitors, same settings. The difference wasn't that big.
Quote:
The output of the master card (i.e. to the VDU) isn't limited to 1600*1200 @ 60Hz, but the internal link between the two cards IS limited. It's not a massive problem, as when rendering with AFR, 1600*1200 @ 120fps would be the maximum the cards could possibly output...
It's not the best of solutions, but it shouldn't limit the actual output to the display by a massive amount (unless you're after stupidly high fps).

I actually believe it is due to the lack of a receiver chip on the slave (or standard) cards. The required Sil 1172 chip (which can do 1600x1200 at 75Hz) needs a Sil 1171 chip on the slave card in order to function correctly - the current hardware doesn't have that chip, so ATI had to use a slower chip on the master card to compensate.

To further expand on this, the current slave cards that many of you have now use the Sil 1161 transmitter chip to convert the contents of the RAMDAC into an image on the screen. In order to combine two display outputs, you have to use the same transmitter chip on the master card. The Sil 1162 receiver chip is required to compose the two Sil 1161 outputs into a single image. The Sil 1161/1162 transmitter/receiver combination is limited to 165MHz (1600x1200 at 60Hz).
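As a rough sanity check on where that 165MHz figure bites - a minimal sketch using the standard VESA totals for 1600x1200 (2160 x 1250 including blanking, quoted from memory, so treat the numbers as assumptions):

```python
# Pixel clock needed for a given display mode, including blanking.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Total pixels per frame (active + blanking) times refresh rate, in MHz."""
    return h_total * v_total * refresh_hz / 1e6

# 1600x1200 with standard VESA blanking is 2160 x 1250 total (assumed from memory).
print(pixel_clock_mhz(2160, 1250, 60))  # ~162 MHz - just squeezes under the 165MHz Sil 1161/1162 limit
print(pixel_clock_mhz(2160, 1250, 75))  # ~202.5 MHz - needs the faster Sil 1171/1172 pair
```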

You can't just slap a Sil 1171 transmitter and Sil 1172 receiver chip on the master card and expect it to work with the Sil 1161 transmitter on the slave card, because the two transmitters need to be identical. ATI made the decision to offer support for current X8x0 series owners and thus made the compromise of 1600x1200 at 60Hz.

The issue will be fixed with the X1800 series as far as I know, because they'll change the chips to the Sil 1171/1172 pair along with having dual dual-link DVI. The Sil 1171/1172 combination can output 1600x1200 at 75Hz - not ideal, but a damn sight better than 1600x1200 at 60Hz. There will still be the requirement for a dongle with the cards though, from what I understand.

Basically, the short version of the story is that they got caught with their pants down and never really gave a second thought to multiple consumer video cards working in tandem to improve gaming performance. Thus, they had to do a bodge job in order to 'make it work'.
Adnuo 26th September 2005, 19:49 Quote
About the 1600*1200 @ 60Hz restriction: no matter what the hardware on the card, won't you eventually run into a restriction lower than VGA? As DVI has less bandwidth than VGA, there's going to be a restriction right there in the way the cards interconnect. So it can still never be the full 2048*1536 or whatever the max on the video cards is, as the DVI out will restrict it, correct?

Then again, I could be totally off my rocker :/
Sathy 26th September 2005, 20:33 Quote
Surprised to see that nobody else thought about asking why the SC:CT and Doom 3 tests were run at such a low resolution. Would've expected at least an additional test at 1280x1024?

Who in their right mind plays at 1024x768 if they have that sort of hardware?

Other than that, the article was OK, even though a bit on the short side... left me feeling like I had read a preview...

Interesting to see what will happen with the X1800 cards and which of the rumours are actually true.
Cheap Mod Wannabe 26th September 2005, 21:30 Quote
"OH My God They managed to beat 6800 SLI" - first thing that pops to my mind.

But then if you look into the future, we can imagine the plans ATI might have. While nVidia is saying "Oh, this is the best best best, buy now" and then 3 months later laughs at your card, saying you have to upgrade to be cool... if ATI keeps this Crossfire going with a little price lowering, they could make a cheaper upgrade scheme of it. You buy one card + mobo etc., then six months later you add another card. Then some time later you change again.

It's a great idea on paper I think, but in real life.... mehhh, kinda like communism (no offense).

Problems would occur when new innovations leave your first Crossfire-compatible card underpowered, and instead of adding another card, you'd just buy a faster single card or something.

Well, it is getting interesting though. I am quite convinced that we'll have surprises... big surprises for Christmas.

MWUAAHAHAHA
unclean 26th September 2005, 21:57 Quote
The comment about CRTs:

"If you've got a CRT, forget about the high resolution - just stick it down at 1280x1024 at 85Hz (perfectly acceptable)"

is a bit silly, considering no one buys top-end hardware to have something that is merely "acceptable".

And I'd have thought the idea of a solution like Crossfire was to have everything maxed at 1600x1200 and 2048x1536, making the highest resolutions playable.

Having an Iiyama Pro514 22" which can easily do such resolutions at 85Hz, it seems odd to have to downgrade to a TFT to make the refresh rate a non-issue - by which time you're limited to 1280x1024 or, at best, 1600x1200, before getting into £600+ for the screen alone.
Tim S 26th September 2005, 22:03 Quote
it is perfectly acceptable from a refresh rate POV, not from a resolution POV in my opinion.

I think that refresh rate is something worth taking in to account when gaming with CrossFire - I'll be taking that in to account when completing future reviews around CrossFire I think. I don't think 60Hz is 'playable' for more than about 5 minutes - I'd get a headache if I ran 60Hz on a CRT for longer than that.
specofdust 26th September 2005, 22:34 Quote
60Hz isn't acceptable on a CRT. I haven't read the thread/article, but I guess that's all it can do at 1600x1200, which means lots of people who like high-end kit aren't going to buy it. A shame, ATi are really disappointing me these days :(
Adnuo 26th September 2005, 22:38 Quote
Quote:
Originally Posted by bigz
it is perfectly acceptable from a refresh rate POV, not from a resolution POV in my opinion.

I think that refresh rate is something worth taking in to account when gaming with CrossFire - I'll be taking that in to account when completing future reviews around CrossFire I think. I don't think 60Hz is 'playable' for more than about 5 minutes - I'd get a headache if I ran 60Hz on a CRT for longer than that.
Did you mean to switch those two? Makes more sense in my mind that way :)
Beatbox 26th September 2005, 23:05 Quote
Quote:
Originally Posted by Cheap Mod Wannabe
"OH My God They managed to beat 6800 SLI" - first thing that pops to my mind.
"Oh My God the *single* X850 XT beat the 6800 Ultra in Doom 3!" - is the first which pops into mine.

ATi never beat nVidia in Doom 3 ... never ...

OK, the CrossFire result is believable, it's new technology and anything's possible, and it's great to see ATi finally clawing back (and passing) nVidia's lead. But with the single card results...?? Either the ATi drivers you used or the reference mobo (I assume you used this for the single X850 figures too) is/are seriously fast! I'd like to see benchies of some single nVidia cards on that mobo sometime, if that's even possible of course.

All in all though, a very good and informative preview. ;)
sadffffff 27th September 2005, 04:27 Quote
I can't say that I'm too impressed with this article. Not at all. Things are run at very low resolutions, which seems to be the reason the ATi side is coming out on top in Doom 3... running at these low resolutions is really testing the motherboard and CPU more than anything. I don't see too many people running their new expensive setup at 1024 with 4xAA (once) and 8xAF... I know you said you weren't going all out on performance numbers, but this gives people an incorrect view of what Crossfire is. I have to suggest that people go to a site like AnandTech: their review is run at high resolutions, talks about the 1600x1200 60Hz limitation in greater detail, and also shows image quality improvements as bit did. The numbers over there give you an actual idea of what's going on. Frankly, I was baffled at the Doom 3 performance numbers till I went there.
RotoSequence 27th September 2005, 04:32 Quote
In bit-tech's defence, ATI didn't leave them a lot of options; they were given pretty specific instructions on how to perform the reviews, though I agree, more high-resolution gaming should have been done. The fact that Bit can't show a single 7800 GTX (which beats current Crossfire options hands down...) hurts a bit too. Really, what I think this Crossfire (pre)review needs is time.