bit-tech.net

ATI Radeon X1900XT 256MB

Comments 1 to 18 of 18

atanum141 13th October 2006, 12:56 Quote
Very interested in this card. Not all of us can spend £300+ on a gfx card; this may be my next purchase.
trig 13th October 2006, 15:53 Quote
why would someone buy this over the x1800xt?
Tim S 13th October 2006, 15:55 Quote
because it's faster?
DXR_13KE 13th October 2006, 16:15 Quote
this is a good GC, but... I'm still waiting for DX10... I won't buy this.
Charles1 13th October 2006, 23:10 Quote
enlighten me: DX10?
Hovis 14th October 2006, 00:32 Quote
When Vista launches there will be DirectX 10, and with it a whole slew of new effects that DirectX 9 cards, even the super-fast ones, can't do. With the first DX10 cards due soon, the price of DX9 cards will plummet in the coming months, if not weeks.

Buying a cut-rate DX9 card new, though, is for my money a bad move. What I would advise the budget-conscious folk of the world to do is wait until the new high-end cards come out, the G80s and such, and buy a second-hand top-end card from somebody fishing for a G80. Or get the budget card from that G80 generation. In my experience Nvidia's mid-range cards, the 7600, 6600 and so forth, are generally very good.

BTW in case I sound like an Nvidia fanbwoi, my current card is a X1900XT 512 and it's very nice thankyouverymuch.
trig 14th October 2006, 01:23 Quote
it's faster? 1450MHz memory on this vs 1500MHz on the X1800XT is faster? i must be bad at math... oh, not to mention the 625MHz core is the same as mine...
Tim S 14th October 2006, 02:41 Quote
Quote:
Originally Posted by trig
it's faster? 1450MHz memory on this vs 1500MHz on the X1800XT is faster? i must be bad at math... oh, not to mention the 625MHz core is the same as mine...
The X1800XT has a third of the X1900XT's pixel shading power. The two cards are based on different GPUs with different architectures: the X1900XT has 48 pixel shaders, while the X1800XT only has 16.

The X1800XT is as fast as a 7900 GT over a wide range of games, while the X1900XT is considerably faster than the 7900 GT in shader-intensive games.
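
To put rough numbers on that factor-of-three claim, here's a back-of-the-envelope sketch in Python using the specs quoted in this thread. Treating shader units multiplied by core clock as a proxy for shading throughput is a simplifying assumption for illustration, not how real-world performance is measured:

```python
# Crude proxy for pixel shading throughput: shader units x core clock.
# Specs are the ones quoted in this thread; the proxy itself is a
# simplifying assumption, not a benchmark.

cards = {
    "X1800XT": {"pixel_shaders": 16, "core_mhz": 625},
    "X1900XT": {"pixel_shaders": 48, "core_mhz": 625},
}

for name, spec in cards.items():
    throughput = spec["pixel_shaders"] * spec["core_mhz"]
    print(f"{name}: {throughput} shader-MHz")

# Identical core clocks, so the shader count alone sets the ratio.
ratio = cards["X1900XT"]["pixel_shaders"] / cards["X1800XT"]["pixel_shaders"]
print(f"X1900XT vs X1800XT shading power: {ratio:.0f}x")  # 3x
```

Since the memory clocks (1450 vs 1500MHz) barely differ, the shader count is what dominates in shader-heavy games.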
sadffffff 14th October 2006, 08:31 Quote
i still don't see why you stick 1280x1024 under a CRT gaming guise.
</pet peeve nitpicking>
Kasrkin Guard 14th October 2006, 13:10 Quote
Regardless of Vista being just around the corner, this would be a good card to have at a nice price for those of us who have fallen a bit behind in hardware and are upgrading soon (I hope to, anyway :D). Quite honestly, I'm in no rush to jump on Vista while there are still lingering issues left to fix, so I'll wait for the inevitable service pack, however long that takes. As for the card itself, even with half the memory it's a solid performer; from what I can see, the memory difference mainly shows at higher resolutions with AA/AF thrown in. ;)
LoveJoy 14th October 2006, 19:19 Quote
Finally ATI came up with something that can compete with GeForce.
GJ ATI :D
trig 14th October 2006, 21:34 Quote
thanks Tim... that's what i was looking for
JeffDM 16th October 2006, 04:28 Quote
Quote:
Originally Posted by sadffffff
i still don't see why you stick 1280x1024 under a CRT gaming guise.
</pet peeve nitpicking>

I agree, it's a dumb resolution for a CRT. Some LCDs have it as their native resolution, and many of those have the 5:4 aspect ratio needed to display it correctly. On a standard 4:3 CRT the resolution really should be 1280x960; otherwise the image is squeezed, and circles look like ovals, among other oddities.
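
The arithmetic behind that is easy to check. A purely illustrative Python sketch reduces each resolution to its aspect ratio and works out how much a 4:3 CRT distorts a 1280x1024 image:

```python
from fractions import Fraction

def aspect(w, h):
    """Reduce a w x h resolution to its simplest aspect ratio."""
    f = Fraction(w, h)
    return f"{f.numerator}:{f.denominator}"

print(aspect(1280, 1024))  # 5:4 - matches a native 5:4 LCD panel
print(aspect(1280, 960))   # 4:3 - matches a standard CRT

# Shown full-screen on a 4:3 CRT, a 1280x1024 image has each pixel
# drawn (4/3) / (5/4) = 16/15 times wider than it is tall, so circles
# come out as ovals roughly 6.7% wider than they should be.
stretch = Fraction(4, 3) / Fraction(5, 4)
print(f"pixel aspect on a 4:3 CRT: {float(stretch):.3f}")  # 1.067
```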
Tim S 16th October 2006, 09:48 Quote
Quote:
Originally Posted by JeffDM
I agree, it's a dumb resolution for a CRT. Some LCDs have it as their native resolution, and many of those have the 5:4 aspect ratio needed to display it correctly. On a standard 4:3 CRT the resolution really should be 1280x960; otherwise the image is squeezed, and circles look like ovals, among other oddities.
We have discussed this many times before, and it's because more people use 1280x1024 than 1280x960. While I don't disagree that 1280x960 is the correct resolution for a CRT, many people actually use 1280x1024 on a CRT despite it being the wrong resolution. Add to that the fact that over 50% of bit-tech's readers browse the site at that resolution, and it makes even more sense to use it (1280x960 isn't listed; it's part of the 'others' category, which has a 4% share).
Iago 16th October 2006, 10:10 Quote
I see it still sports the "awesome" X1900 series cooler...

My advice to those interested in this card, then, would be to avoid it like the plague or stick an aftermarket solution on it. I'm sick of that cooler (512MB X1900XT here). Bloody sick, I tell you. It's not that bad when idling, but it's simply unbearable when gaming.

Heck, I even have to be very careful when I turn the PC on in the mornings, because it's so bloody noisy at startup that it can easily wake my fiancee up (I get up an hour earlier to go to work), and that's from the other side of the apartment.

Now... that XFX 7950GT with passive cooling :drool: That is a card worth the price... I'm not buying a DX10 card until there's a passively cooled 8800GT.
Tim S 16th October 2006, 10:21 Quote
Quote:
Originally Posted by Iago
I see it still sports the "awesome" X1900 series cooler...

My advice to those interested in this card, then, would be to avoid it like the plague or stick an aftermarket solution on it. I'm sick of that cooler (512MB X1900XT here). Bloody sick, I tell you. It's not that bad when idling, but it's simply unbearable when gaming.

Heck, I even have to be very careful when I turn the PC on in the mornings, because it's so bloody noisy at startup that it can easily wake my fiancee up (I get up an hour earlier to go to work), and that's from the other side of the apartment.

Now... that XFX 7950GT with passive cooling :drool: That is a card worth the price... I'm not buying a DX10 card until there's a passively cooled 8800GT.
Yep, the X1900 cooler sucks and it's something we recommend changing. However, as stated, I understand why it's been left as-is (to keep costs down). :)
Iago 16th October 2006, 11:56 Quote
Quote:
Originally Posted by Tim S
Yep, the X1900 cooler sucks and it's something we recommend changing. However, as stated, I understand why it's been left as-is (to keep costs down). :)

Oops... I missed that paragraph :o

Still, most reviews don't do justice to the noise the X1900 series puts out (and I've read that the 7900GT cooler is really loud under load too). The performance is great, no doubt, but the noise levels are getting ridiculous. I bought mine knowing it would be somewhat loud under load, but not I'd-rather-stick-cotton-in-my-ears loud :D

Seriously, people... it can't be stressed enough... you really, really, really have to need that level of performance in Oblivion to put up with that kind of noise (not only loud, but with a really unpleasant rattle too).

Perhaps it's just that I'm getting old, but gaming is hardly a pleasant experience when you can't play at night without waking your family.
Bindibadgi 16th October 2006, 12:09 Quote
No, I totally agree. I HATE doing CrossFire testing in the lab because it involves TWO X1900s and their noisy-ass coolers. I can't wait until we change to using X1950s.