bit-tech.net

Radeon HD 2600 XT vs. GeForce 8600 GT

Comments 1 to 25 of 30

antiHero 14th August 2007, 14:13 Quote
Very good read there!

I am actually looking into buying one of these cards, but now I have to consider otherwise. Is it really the case that a last-gen card gives you better value for money than the current gen? And what about the 512MB models of both cards?
xion 14th August 2007, 14:56 Quote
Great read, yet disappointing that these mid-range cards still fall behind my aged and cheaper X1950 Pro; no wonder people (read: technophobes) get confused with nomenclature and prices when upgrading...

On a tangent... anyone have an opinion on whether it'd be worth lumping another in for CrossFire, considering they're so cheap?
Seraphim Works 14th August 2007, 15:37 Quote
Ah, it's a refreshing change when 1680x1050 is used in reviews; it makes comparisons that little bit more tangible for me. Especially when I can see my X1950 Pro still holding up alright.

I would have thought that a 256-bit memory interface would have made it onto cards like the 2600 et al by now as well. Why are AMD and Nvidia still dragging these cards down with 128-bit?
Tim S 14th August 2007, 16:26 Quote
Quote:
Originally Posted by antiHero
Very good read there!

I am actually looking into buying one of these cards, but now I have to consider otherwise. Is it really the case that a last-gen card gives you better value for money than the current gen? And what about the 512MB models of both cards?
I don't think the 512MB cards are going to make much difference... and without playing tomorrow's games on these cards, it's impossible to gauge how well the X1950 Pro / 7900 GS will stand up. We'll be keeping an eye on things, though, without a doubt - BioShock is obviously the first candidate there. ;)
Quote:
Originally Posted by xion
Great read, yet disappointing that these mid-range cards still fall behind my aged and cheaper X1950 Pro; no wonder people (read: technophobes) get confused with nomenclature and prices when upgrading...

On a tangent... anyone have an opinion on whether it'd be worth lumping another in for CrossFire, considering they're so cheap?
I've got a second card in house, but I haven't had a chance to test them in CrossFire yet... I guess the competition for CrossFired 2600 XTs is an 8800 GTS 320MB - could be an interesting comparison. :)
Quote:
Originally Posted by Seraphim Works
Ah, it's a refreshing change when 1680x1050 is used in reviews; it makes comparisons that little bit more tangible for me. Especially when I can see my X1950 Pro still holding up alright.
Thanks, we decided to add it after the survey we did in the hardware forum a month or two ago - from the information gathered there, it looks like 1680x1050 is the next "1280x1024" and therefore it's very important to deliver the right information to our readers. ;)
Quote:
I would have thought that a 256-bit memory interface would have made it onto cards like the 2600 et al by now as well. Why are AMD and Nvidia still dragging these cards down with 128-bit?
I honestly don't know why we're still stuck with 128-bit memory interfaces on these cards - you'd hope to be able to play games with at least 2xAA on a mid-range card these days, but unfortunately that's not going to happen in most scenarios. It's a shame, because I think we've had a few good mid-range cards over the last few years - the 6600 GT, X800 GTO, 7600 GT and X1950 Pro. :(
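
As a back-of-the-envelope illustration of why the bus width stings once anti-aliasing is enabled (the 1400MHz effective clock below is a hypothetical example, not the exact spec of either card), peak memory bandwidth scales linearly with bus width:

    # Peak theoretical memory bandwidth: bus width (in bytes) x effective clock.
    # The 1400MHz effective clock here is an illustrative assumption only.
    def bandwidth_gb_per_s(bus_width_bits, effective_clock_mhz):
        return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

    print(bandwidth_gb_per_s(128, 1400))  # 22.4 GB/s on a 128-bit bus
    print(bandwidth_gb_per_s(256, 1400))  # 44.8 GB/s on a 256-bit bus

Since multi-sample AA multiplies the framebuffer traffic per pixel, the narrower bus is usually the first thing to saturate.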
Amon 14th August 2007, 16:31 Quote
For what they're worth, they perform reasonably well. BIG kudos to the bit-tech staff for including often-overlooked digital video playback performance testing; very useful for HTPCs. By contrast, there was no mention of fan noise.
Tim S 14th August 2007, 16:37 Quote
Quote:
Originally Posted by Amon
By contrast, there was no mention of fan noise.

It was covered on each of the product pages:

Sapphire HD 2600 XT GDDR4:

"Just like the reference card we received from AMD, Sapphire’s card comes with a single-slot cooling solution that’s similar to the one on the original Radeon X1950 Pro cards. It’s not silent, but it’s also not obtrusively loud, even under heavy load. Instead, it lets off a mild hum that will more than likely be drowned out in your PC’s chassis."

Asus EN8600 GT:

"The cooler bears resemblance to some of Zalman’s VF-series graphics card cooling solutions and the cooler’s 70mm fan is at least as quiet as what we’ve seen from Zalman, too. The heatsink itself is made of aluminium and has been anodised gold for effect."

;)
Amon 14th August 2007, 16:42 Quote
Ah thanks, Tim. I must've missed it when reading lightly =/
Tim S 14th August 2007, 16:54 Quote
Quote:
Originally Posted by Amon
Ah thanks, Tim. I must've missed it when reading lightly =/

no worries
Joeymac 14th August 2007, 17:28 Quote
Great review. Shame about the gaming performance... I've been thinking about getting a 2600 XT for an HTPC and am wondering if the H.264 and VC-1 acceleration works in VLC (or overlay video in general)... or is it just for PowerDVD?
jeff_vs 14th August 2007, 17:50 Quote
One thing that confuses me about the integrated audio in the ATI/AMD cards is how they react to other audio cards in the system.

As an HTPC user (with a 110" widescreen 720p projector in one room) I exclusively run SPDIF optical from the audio card to the 5.1 receiver. When Auzentech came out with its first DTS-Connect card I was one of the first in line, and I love the hell out of it. I can now play some older games on the big screen while getting the full effect of the surround sound through my SPDIF cable (I only play Descent III - lame, I know).

THE PROBLEM is the second TV I have set up in the kitchen, a 27" LCD with an HDMI input. I would like VERY MUCH to hook up the audio to this TV via HDMI, which, according to the spec, will accept an AC3 signal (though it will downmix it internally to two channels, of course). I can set the Auzentech card to Dolby Digital Live, which will (on paper) give me a signal that will work in both rooms.

Will these ATI/AMD internal audio controllers override the audio of the Auzentech, or work in conjunction with it? I have never seen a setup in Windows that allows two audio controllers to be used at once.

I have been keeping my eye on the new MSI NX8600 GTS Diamond Plus, which has onboard HDMI with an external SPDIF input to mix into the HDMI signal. This would be ideal, but it costs more than 512MB GDDR4-equipped 2600 XT cards that have higher GPU clocks.

I like the idea of having one audio controller that does Dolby Digital Live or DTS Connect, but I don't want to compromise on performance just to get a card that has an HDMI port fed by SPDIF from an external source. The HD video comparison favoured the 2600 XT over the 8600 GT, but the GTS isn't shown. Should I expect marginally better performance from it, but still worse than the 2600 XT?

Confused,
Jeff
Nature 14th August 2007, 18:05 Quote
AMD hath suck, whilst Nvidia blows..

Both of these cards, although new tech, should be under 50 pounds... less, even.

AMD can produce and manufacture cards and chips so cheaply! Why don't they just become "the Apple" of the PC world and mass-produce HTPCs?
Da Dego 14th August 2007, 20:35 Quote
Quote:
Originally Posted by jeff_vs
...
Will these ATI/AMD internal audio controllers override the audio of the Auzentech, or work in conjunction with it? I have never seen a setup in Windows that allows two audio controllers to be used at once.

I like the idea of having one audio controller that does Dolby Digital Live or DTS Connect, but I don't want to compromise on performance just to get a card that has an HDMI port fed by SPDIF from an external source. The HD video comparison favoured the 2600 XT over the 8600 GT, but the GTS isn't shown. Should I expect marginally better performance from it, but still worse than the 2600 XT?

Confused,
Jeff

Hey Jeff,

The best thing we can do is plug in a Creative X-Fi or another card and give it a go. I'll talk to Tim and Rich about this later, as it's a curious question.

The AMD 690G motherboards also have built-in HDMI-out, and it's used as an output source for the onboard sound. This means that using any other type of audio controller removes the ability to have audio out over HDMI (I have tested this personally, as I have one of the 690G boards).

Whether the same thing happens when using a video card as the HDMI output is a different question, as that would HAVE to use another sound card to generate the signal to begin with. If anything, your Auzentech would then be the output, and it would either "mirror" the signal by passing it through HDMI or shunt the S/PDIF signal.

What could be more interesting is the DRM implication. HDMI is, as Richard pointed out to me, a "protected path" that shouldn't allow the audio to go through anything BUT the HDMI if it can go through it at all. Anything less would be a failure in the DRM model, which could pose some interesting questions.

We'll take a look as soon as we can. Won't be today, but this will sit on our "list" ;) so keep watching.


<EDIT>
After a little thought and discussion with Rich, we both think it wouldn't happen. If you want proper HDCP (for Blu-ray or HD DVD requirements) you will have to go with the ATI card, which has an onboard sound controller. That means that all other controllers will be disabled because, as you mentioned, no OS has support for multiple sound cards receiving the same signal. In the meantime, the Nvidia card will tie up the S/PDIF output on your sound card in order to get any audio to the HDMI to begin with. So you're still faced with a "one or the other" situation.

That being said, we'll still give it a try. But I bet that after plugging in the video card, we'll have to change the default audio device to the AMD HDMI output (like I do on my 690G board).

If your soundcard has two or more SPDIF outputs that can be used simultaneously, you might have a winner there with the Nvidia board. But it wouldn't be HDCP-protected audio, and there's no telling how that will react with "proper" HD content.
</edit>
arcanes 14th August 2007, 20:35 Quote
I guess for a 22-inch 1680x1050 monitor, you need a high-end card to get playable frame rates. A shame :(

About the whole "Deep Color" thing HDMI 1.3 can pass through: it's just a marketing trick. HD DVD/Blu-ray DOESN'T support "Deep Color" and never will. The specifications of both formats just don't include the option to support a higher colour format, so you would have to find another way to play your HD movies in "Deep Color". That, by itself, makes the whole HDMI 1.3 "Deep Color" feature useless. And say you have got a format that supports "Deep Color" - well, it would help you to know that today nothing is actually produced in "Deep Color" (except for special uses), so there would be nothing to play in the superior "Deep Color" format.
Shame on the companies for misleading everyone.
Tim S 14th August 2007, 20:40 Quote
Quote:
Originally Posted by arcanes
I guess for a 22-inch 1680x1050 monitor, you need a high-end card to get playable frame rates. A shame :(

About the whole "Deep Color" thing HDMI 1.3 can pass through: it's just a marketing trick. HD DVD/Blu-ray DOESN'T support "Deep Color" and never will. The specifications of both formats just don't include the option to support a higher colour format, so you would have to find another way to play your HD movies in "Deep Color". That, by itself, makes the whole HDMI 1.3 "Deep Color" feature useless. And say you have got a format that supports "Deep Color" - well, it would help you to know that today nothing is actually produced in "Deep Color" (except for special uses), so there would be nothing to play in the superior "Deep Color" format.
Shame on the companies for misleading everyone.

We had a discussion about this before the article was published, and we've got both TVs and an HD DVD set-top box player that support Deep Colour in house. We're waiting for software and, FWIW, every studio we've spoken to has said that Deep Colour is on the roadmap.
Da Dego 14th August 2007, 21:16 Quote
Quote:
Originally Posted by Tim S
We had a discussion about this before the article was published, and we've got both TVs and an HD DVD set-top box player that support Deep Colour in house. We're waiting for software and, FWIW, every studio we've spoken to has said that Deep Colour is on the roadmap.
Deep Colour, blah blah blah. It's scientifically believed that the human eye tops out at around 10 bits per channel, and that's pretty much the upper bound for a very colour-sensitive person. Figure that men see colour worse than women to begin with, then factor in that ~8% of men are colour blind to some degree anyway, and you leave me wondering... why do we care? :)

"Truecolor" (24b+alpha) is statistically more than what about 90% of this forum can actually differentiate...which makes me really wonder why you think studios would work hard to bring "deep colour" in as anything more than a phantom marketing ploy, since not just the majority but surely all but a select sliver of the audience couldn't tell anyhow.
korhojoa 14th August 2007, 21:21 Quote
Okay, so, simple question:
someone who plays not-so-new games and just wants something that will play video smoothly - what should this person get?
Games in question: CS:S and WoW; currently has an X550, which struggles with both. Would the X1950 Pro or any of these newer ones be better?
Meanmotion 14th August 2007, 21:27 Quote
You're absolutely right, Brett - I couldn't care less about Deep Colour. In fact, there is nothing in HDMI 1.3 that is of real consequence for a graphics card. Even CEC has debatable merit, as you are unlikely to want your PC to turn off when you turn your TV off.
Quote:
Originally Posted by korhojoa
Okay, so, simple question:
someone who plays not-so-new games and just wants something that will play video smoothly - what should this person get?
Games in question: CS:S and WoW; currently has an X550, which struggles with both. Would the X1950 Pro or any of these newer ones be better?

What sort of resolution are you talking about? For 1280x1024 this new generation of cards will handle most things, but if you run at a higher resolution and don't care about upcoming games then the X1950 Pro would be your best bet.
jeff_vs 14th August 2007, 23:00 Quote
Quote:
Originally Posted by Da Dego

If your soundcard has two or more SPDIF outputs that can be used simultaneously, you might have a winner there with the NVidia board.
Thanks to the Auzentech having an onboard connector for the "Hoontech" (which, as you may recall, is an SPDIF daughterboard for the SB Live! series), I have a total of four SPDIF outputs: two optical, two coax. Hooking up the MSI would tie up one coax but leave the two optical free for the other things I have hooked up.

Interesting about the DRM, though. If it's any help, I run XP MCE and not Vista. I am not planning on getting an HD drive until either the Toshiba slimline, slot-loading HD DVD drive SD-T913A becomes available or the Blu-ray Panasonic UJ-215 comes down in price. Even when (or if) they do, I may wait out this whole "new media" battle.

And even at that, if or when that happens, I'm cheesed that XP MCE doesn't support playing HD content natively - I have to get PowerDVD 7.3 Ultra. Does Vista MCE play HD discs natively? That would be the ONLY way I would consider Vista, but that's a whole other story.....

Thanks,
Jeff
Tim S 14th August 2007, 23:56 Quote
Quote:
Originally Posted by korhojoa
Okay, so, simple question:
someone who plays not-so-new games and just wants something that will play video smoothly - what should this person get?
Games in question: CS:S and WoW; currently has an X550, which struggles with both. Would the X1950 Pro or any of these newer ones be better?

The X1950 Pro will be fine... the only times I'd recommend the 2600 XT or 8600 GT over the X1950 Pro are if:

1) you want to play back HD DVD/Blu-ray Disc movies
2) you really want DirectX 10 compatibility and of course that requires Windows Vista. If you're running Windows XP, you won't be able to run games in DX10 mode.
Tim S 15th August 2007, 00:19 Quote
Quote:
Originally Posted by Da Dego
Deep Colour, blah blah blah. It's scientifically believed that the human eye tops out at around 10 bits per channel, and that's pretty much the upper bound for a very colour-sensitive person. Figure that men see colour worse than women to begin with, then factor in that ~8% of men are colour blind to some degree anyway, and you leave me wondering... why do we care? :)

"Truecolor" (24b+alpha) is statistically more than what about 90% of this forum can actually differentiate...which makes me really wonder why you think studios would work hard to bring "deep colour" in as anything more than a phantom marketing ploy, since not just the majority but surely all but a select sliver of the audience couldn't tell anyhow.
When you bought your new TV, did you buy an LCD or a plasma, and why? I believe you bought the plasma because the colours are more vivid and the blacks are better. I think you'll see a similar impact with Deep Colour and its support for "up to" 36-bit colour.

The human eye is analogue and can therefore, in principle, process an infinite colour palette. Really, then, the question isn't how many colours the eye can detect - it's how many colours the brain can distinguish. That, ultimately, depends on the individual.

In this report, the reporter says he was unable to see any difference above 30-bit colour. While that may be the case for him, who's to say he isn't colour blind?

The same can be said about the human ear. Can someone with imperfect hearing hear noises at the same frequencies as someone with perfect, or near-perfect, hearing? No, of course they can't - but that doesn't mean there isn't sound (that other people can hear) at the frequencies which the person with imperfect hearing cannot hear.
Hugo 15th August 2007, 01:21 Quote
Saying Deep Colour is pointless is like saying lossless audio formats are pointless. I can tell the difference between FLAC and 320kbps MP3, and Rich would probably agree that the former sounds superior, but many would argue MP3 offers "all the detail you need."
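
To put that analogy in numbers (straightforward arithmetic, not a claim about these cards):

    # Raw CD audio PCM rate vs a 320kbps MP3.
    cd_kbps = 44100 * 16 * 2 / 1000  # sample rate x bit depth x channels
    print(cd_kbps)        # 1411.2 kbps of raw PCM
    print(cd_kbps / 320)  # ~4.4x the data rate of a 320kbps MP3

FLAC packs that raw stream down losslessly (typically to well under half the raw rate), whereas MP3 throws information away to hit 320kbps - the same keep-everything vs. good-enough trade-off being argued over with Deep Colour.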

The difference between "standard" 24-bit and Deep Colour in terms of colour depth should be like changing from 16-bit to 32-bit, for those of us who remember that transition (a lot, I hope, given the nature of the site). OK, so you may not be able to tell at every given moment that you're getting a slightly different shade of green than if you were watching a 24-bit source, but your overall perception will improve.

Now I'm off to play Counter-Strike 1.3 at 800x600 on a 30in 2560x1600 monitor (because no-one can see the difference in dot pitch anyway) with 16-bit colour...
Da Dego 15th August 2007, 03:24 Quote
Actually, archangel, though your comparison of 16-bit to 24-bit-plus may seem valid, it isn't. It comes down to statistics and Z-scores: differences between two and three standard deviations, for example, are far less detectable than those between one and two. Or, in this case, it's more like between 2-2.5 and 2.5-3.
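
A minimal sketch of that statistical point, assuming colour discrimination is normally distributed across the population (a simplifying assumption for illustration only):

    # Share of a standard normal population falling in each band above the
    # mean; the further out the band, the fewer people it contains.
    from statistics import NormalDist

    nd = NormalDist()
    print(nd.cdf(2) - nd.cdf(1))  # ~0.136: between 1 and 2 SDs
    print(nd.cdf(3) - nd.cdf(2))  # ~0.021: between 2 and 3 SDs
    print(1 - nd.cdf(3))          # ~0.001: beyond 3 SDs

In other words, each extra standard deviation of discriminating ability describes an audience roughly an order of magnitude smaller.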

As Tim mentioned, I did buy a plasma because the "colors are brighter." In that respect, he's dead on. However, a lot of that has to do with the brightness of each pixel being independent of the next, and the ability to protect each against "bleed" from its neighbours.

There are quite a few studies that (if anyone is truly bored) I can throw out here, as I did my homework before stating such bold things. However, I think maybe a look into digital colour and depth as a whole is in order, as too much discussion here is kind of crapping on this thread.

Would people be interested in this? Because I'd be happy to do one.

The way I read Tim's thoughts, the ideas of true vs. deep colour are little more than a side note to the generally poor performance of these cards, and what's important is that these cards don't even include the possibility if you DO want it. That, on top of all the other performance issues compared to both current- and last-generation hardware, makes both of these cards good ones to avoid.
Renoir 15th August 2007, 03:48 Quote
Brett, I'd love an article on the subject. It's clear that it's an important aspect of video for the future one way or another so it'd be good to have something that cuts through the marketing crap and gets to the nuts and bolts of the situation.
[cibyr] 15th August 2007, 05:28 Quote
Quote:
The Radeon HD 2600 XT GDDR4, on the other hand performs worse than the GeForce 8600 GT in four of the five titles we’ve tested with anti-aliasing, but fares a lot better against the 8600 GT when you disable anti-aliasing. On that front, the HD 2600 XT GDDR4 looks to have taken a leaf out of the Radeon HD 2900 XT’s book and seems to solidify the fact that there is something seriously wrong with AMD's anti-aliasing performance on its HD 2000-series of products.
I thought ATi didn't include dedicated MSAA resolve hardware, instead doing it with programmable shaders. Since DX10 requires the ability to do custom AA, ATi chose to leave out the MSAA hardware to make room for more shaders, while nVidia went with fewer shaders but kept the MSAA hardware. End result: nVidia kicking ATi's ass in almost everything with AA, except for Call of Juarez, which uses custom AA.

I could be wrong though, please correct me if I am :)
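
For anyone picturing the difference: a fixed-function resolve is essentially a hard-wired box filter over a pixel's sub-samples, while doing it in the shaders runs the same loop programmably - slower, but it permits custom filters of the sort Call of Juarez uses. A toy sketch of the idea (made-up sample data, not real GPU code):

    # Each pixel stores several sub-samples; the resolve step combines them.
    # Dedicated MSAA hardware hard-wires the plain average below, while a
    # shader-based resolve runs programmably and can use custom weights.

    def box_resolve(samples):
        # Fixed-function behaviour: a straight average of the sub-samples.
        n = len(samples)
        return tuple(sum(channel) / n for channel in zip(*samples))

    def weighted_resolve(samples, weights):
        # Only possible with a programmable resolve: arbitrary sample weights.
        total = sum(weights)
        return tuple(sum(w * s[i] for w, s in zip(weights, samples)) / total
                     for i in range(3))

    # Four hypothetical RGB sub-samples for one pixel on a red/blue edge:
    pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)]
    print(box_resolve(pixel))                     # (0.5, 0.0, 0.5)
    print(weighted_resolve(pixel, [1, 1, 2, 2]))  # biased towards the blue side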
Woodstock 15th August 2007, 10:04 Quote
Quote:
Originally Posted by Da Dego

Would people be interested in this? Because I'd be happy to do one.

Any article you guys are prepared to write, I'm happy to read...