bit-tech.net

XFX GeForce 7950 GT 570M Extreme

Comments 1 to 25 of 38

samkiller42 14th September 2006, 20:36 Quote
Think I shall wait for DX10, and I'll try and get Vista to download :D

Sam
Veles 14th September 2006, 20:44 Quote
I have to agree, that XFX card looks very cool. I think more manufacturers should do things like this. I'm glad almost everyone is shying away from the horrible green PCBs of old but I think more should go the whole hog like XFX have done here. I want all my backplates made out of bolt gun metal now :p

I'm with Sam here though, I don't really see why they're releasing this one since it doesn't really have an improvement over the 7900 GTX, and I'm waiting for some good DX10 cards to come out to go with the 2x 24" monitors I'm planning on getting :p

First I need the money :p
Tim S 14th September 2006, 20:48 Quote
The big improvement over GeForce 7900 GTX is price. It's really a replacement for GeForce 7900 GT. ;)
samkiller42 14th September 2006, 21:14 Quote
OK then Tim, if you and XFX are happy to send one to me for £0 inc P&P then I'll quite happily take one :D. Other than that, DX10 it is.

Sam

What does happen to the products you review anyway?
TomH 14th September 2006, 21:23 Quote
Quote:
Originally Posted by samkiller42
What does happen to the products you review anyway?
Raffled off at Bit HQ? Bit-Bingo at the weekends? You didn't really think Bit employees paid for their hardware, did you? :p

I must say, I'm only really interested in two things about this card; the style/design is pretty neat -- as previously said, if all cards looked like this then the world would be a much smarter place to live (open to interpretation) :)

Also, does the HD output work? E.g. is there full resolution when using a DVI -> HDMI cable?

Edit: Maybe in future, seeing that a lot more people will begin using an HDTV as a secondary output, you could devote a small part of GFX card reviews to HDTV quality? Or is there really no discernible difference from card to card?
pjotero 14th September 2006, 21:28 Quote
Quote:
Originally Posted by samkiller42

What does happen to the products you review anyway?

They have to send them back (I think).

BTW, when are DX10 cards due? (I heard September a while back, but it's all gone a bit quiet about them.)
samkiller42 14th September 2006, 21:32 Quote
Quote:
Originally Posted by pjotero


BTW, when are DX10 cards due? (I heard September a while back, but it's all gone a bit quiet about them.)

http://www.bit-tech.net/news/2006/09/13/NVIDIAs_G80_will_be_a_little_late/

Thanks
Sam
DarkReaper 14th September 2006, 22:04 Quote
Am I the only one who winced at the idea of 100+ degree temperatures?
DXR_13KE 14th September 2006, 22:08 Quote
Quote:
Originally Posted by DarkReaper
Am I the only one who winced at the idea of 100+ degree temperatures?

That part was scary.

As for this card... I will wait until DX10.
Tim S 14th September 2006, 22:09 Quote
Quote:
Originally Posted by DarkReaper
Am I the only one who winced at the idea of 100+ degree temperatures?
90 degrees is typical for a high end GPU under load. The XFX isn't much hotter when you factor in decent airflow.

For what it's worth, CPUs are much cooler than GPUs. Most GPUs idle at above what I'd consider a respectable 'load' temperature for a CPU.
DarkReaper 14th September 2006, 22:30 Quote
Ah right, that makes me relax a bit. I'm used to reading about 40-60 degree core temps so to see something double that produces a double-take!
will. 14th September 2006, 22:33 Quote
I dunno if the readings on that ATI control panel are correct or not, but my GFX card rarely goes above 60... even after I tried to play Oblivion on it with settings all set to full. (1 frame per every 10 seconds!)

if nothing else, that thing looks funkylicious!
DeX 14th September 2006, 22:38 Quote
I know all I seem to do is complain about bit-tech reviews but it's only in an attempt to make them as great as possible.

Anyway, seeing as this card seems to be a new alternative for those that were previously looking to buy either a 7900 GT or a 7900 GTX, why don't you add the benchmarks for these older cards to the review as well? The problem I find with most bit-tech reviews is that you often compare products from different brands and manufacturers but almost never different models of products from the same brand. All we can tell from the review is that this card is faster and around the same price as the 7900 GT but it would be nice to be able to see that in the numbers. Since benchmarks tend to remain pretty similar in each new product release, I don't see that it can hurt to include the results from benchmarks made in previous reviews.

Anyway, other than that, great review as always. If I hadn't postponed my new system build until Vista+DX10 time I'd definitely be tempted to get one of these. It's right around my price range.
Tim S 14th September 2006, 22:54 Quote
I understand where you're coming from DeX and I agree with you - I would have LOVED to benchmark more cards, but I was in the office past 11pm two nights running to get this blighter out at the time I did (I finished benchmarking at 2am this morning). As I mentioned in the news story I wrote when NVIDIA announced GeForce 7950 GT and 7900 GS, they're expecting us to perform miracles in the time we're given but I'm not going to compromise the number of benchmarks run (as I feel that helps to add depth to our opinion). I got this card on Tuesday morning (or late Monday - I forget exactly) and worked solid for three days to get it done. (It's nice to be sitting at home eating real food tonight, btw ;))

The time it takes to run our benchmarks makes benchmarking lots of cards hard, unless I cut things down considerably or move away from what we're trying to achieve in our reviews. For what it's worth, there are many more benchmark runs done than the three results we average our frame rates on, as we're essentially investigating just how far we can push a card in each game and maintain acceptable gameplay. Typically, there's 30-40 minutes per resolution per card, depending on the game.

I've also recently changed platforms, which makes previous benchmarks somewhat 'invalid' for direct comparison purposes (since Core 2 is now the platform of choice). I don't think that will change again soon, so it'll make things fairly easy for me to re-benchmark cards I've already benchmarked with an updated driver. Re-benchmarking itself doesn't take the time; it's finding the caveats and sweet spots on a new platform that can, along with the time taken to find the sweet spots of completely new GPUs.

Hope this helps to explain why we're a bit short on the comparison side of things in this particular review - I do try my best to make things as relevant as possible. Manufacturers are really trimming us down to the bones though, but we're trying to spend as long as possible with a product to allow us to formulate a wide opinion. We should have more time with some future GPUs if what I'm being told is true - it seems like our voice is being heard. :)
DeX 14th September 2006, 23:05 Quote
Fair enough, if the benchmark systems change between reviews then you shouldn't include results from previous benchmarks. I just figured you'd have some standard system that didn't change quite as often as the graphics cards or whatever you're reviewing does. I appreciate that with new processors, motherboards and graphics cards arriving all the time this must be pretty difficult. As I've said before the amount of benchmarking and quality reviews you can create in such a short time amazes me so I wouldn't expect you to be able to complete more benchmarks of other cards than you do already.
Tim S 14th September 2006, 23:15 Quote
We'll be building up a backlog - I've got more graphics cards to review than anyone would ever want at the moment.

well, maybe that's not quite true...

:D
Nature 14th September 2006, 23:23 Quote
Aren't the G80 and the 64-pipe ATI card coming out in a couple of months? Patience, fellow nerds...
phide 14th September 2006, 23:38 Quote
Tim -

Why refer to the additional 256MB of memory as the "frame buffer"? Aren't the frame buffer and local video memory two completely different things?
Cheap Mod Wannabe 15th September 2006, 00:14 Quote
Señor Tim, maybe you could evaluate 7900 GT vs 7950 GT performance. How much of an improvement is there?

Thank you, I really do appreciate you working hard to bring us technology news that makes our current hardware feel obsolete =)
Tim S 15th September 2006, 00:34 Quote
Quote:
Originally Posted by phide
Tim -

Why refer to the additional 256MB of memory as the "frame buffer"? Aren't the frame buffer and local video memory two completely different things?
The frame buffer is the memory installed on a video card - it's where the frames are stored before they can be sent to the monitor. The amount of memory can sometimes dictate the maximum texture size and resolution that will deliver acceptable frame rates in game.

For example, G.R.A.W. can be quite choppy on 256MB parts if you're above 1280x1024, as the texture size (on medium quality textures) is quite large. Sometimes you'll be almost saturating a 256MB frame buffer at 1280x1024 with medium quality textures in G.R.A.W. In order to run high quality textures in that game, you need 512MB of frame buffer/video memory, because it physically uses more than 256MB at 1280x1024 or higher.
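To put rough numbers on that, here's a back-of-the-envelope sketch (not how drivers actually account for memory; the buffer count and per-pixel sizes are simplifying assumptions):

```python
# Rough illustration: how much of a 256MB card is left for textures
# after the screen-sized display buffers are accounted for.

MB = 1024 * 1024

def buffer_bytes(width, height, bytes_per_pixel=4):
    """Size of one screen-sized buffer at 32-bit colour."""
    return width * height * bytes_per_pixel

def texture_budget_mb(total_mb, width, height, num_buffers=3):
    """Memory left for textures after front, back and z buffers
    (all assumed screen-sized at 4 bytes/pixel -- a simplification)."""
    used = num_buffers * buffer_bytes(width, height)
    return (total_mb * MB - used) / MB

# At 1280x1024 the display buffers cost only ~15MB of a 256MB card,
# so it's the game's textures that saturate the memory, not the
# display buffers themselves.
print(texture_budget_mb(256, 1280, 1024))  # 241.0
```

The point the sketch makes is that the display buffers are a small fixed cost; it's texture data that grows with quality settings and blows past 256MB.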
Tim S 15th September 2006, 00:35 Quote
Quote:
Originally Posted by Cheap Mod Wannabe
Señor Tim, maybe you could evaluate 7900 GT vs 7950 GT performance. How much of an improvement is there?

Thank you, I really do appreciate you working hard to bring us technology news that makes our current hardware feel obsolete =)
I have some other 7950 GTs that I will be looking at soon (they didn't arrive in time for this article), so I can certainly accommodate that - the platform isn't going to change. ;)
r4tch3t 15th September 2006, 01:07 Quote
I see a lot of people going for EVGA cards at the moment due to their Step-Up thing. Get a fast video card now, and when DX10 comes it's not so much of a wallet buster, as you are only paying the difference. (But then DX10 cards would probably come out in 3 months 1 day.)
Great Review BTW
Tim S 15th September 2006, 01:09 Quote
Quote:
Originally Posted by r4tch3t
I see a lot of people going for EVGA cards at the moment due to their Step-Up thing. Get a fast video card now, and when DX10 comes it's not so much of a wallet buster, as you are only paying the difference. (But then DX10 cards would probably come out in 3 months 1 day.)
Great Review BTW
That's one of my feelings, too - it's a great thing to have at this moment in time, as you can play the games you want to play now at high res, and then get something even better when it comes out without 'losing' any money. :)
will. 15th September 2006, 01:59 Quote
I'll benchmark some of those cards for you... as long as I get keepsies!

(damnit, didn't see a second page, now my little funny comment makes no sense... booo :( )
phide 15th September 2006, 02:26 Quote
Quote:
Originally Posted by Tim S
The frame buffer is the memory installed on a video card - it's where the frames are stored before they can be sent to the monitor. The amount of memory can sometimes dictate the maximum texture size and resolution that will deliver acceptable frame rates in game.
Not from what I understand. My understanding of the term is as follows:

The frame buffer is an allotment of onboard storage dedicated to storing frame data for output to the display. The size of this frame buffer is dependent on bit depth and resolution, and is typically in the ballpark of 4-8MB. This frame buffer is also utilized for performing simple post-process effects such as color "grading" and frame buffer distortion. It is not the entire chunk of onboard storage - if it were, there would be no room for other buffers or for onboard texture storage. As the name implies, it is solely a buffer.

Is this not correct? When did we start referring to onboard video memory as the "frame buffer", and why would such a misnomer be typically accepted and used? Do nVidia and/or ATi specifically refer to onboard memory as the frame buffer?

I'm very curious about the proliferation of this term.

EDIT: Scoured around a bit more. Here are some definitions from a few folk:
Quote:
Originally Posted by CNET
This memory buffer stores rendered frames offscreen; they are then converted by the RAMDAC and displayed.
Quote:
Originally Posted by PCGuide
The memory that holds the video image is sometimes called the frame buffer.
Quote:
Originally Posted by The Tech Report
The frame buffer holds a bitmap of what you eventually see on the screen, which makes the amount of memory it takes up dependent on your screen resolution and color depth.
I see nothing here that would indicate the frame buffer encompassing texture storage and other buffers, such as the accumulation buffer, the back buffer and the z-buffer.
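For what it's worth, the strict frame buffer size those definitions describe is easy to check, a quick illustration assuming 32-bit colour and no multisampling:

```python
# Frame buffer in the strict sense: one full-screen bitmap
# at a given resolution and bit depth.

def frame_buffer_mb(width, height, bits_per_pixel=32):
    """Size in MB of a single screen-sized bitmap."""
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

# Typical desktop resolutions of the era, 32-bit colour:
print(frame_buffer_mb(1280, 1024))  # 5.0
print(frame_buffer_mb(1600, 1200))  # ~7.32
```

Those figures land squarely in the "4-8MB ballpark" mentioned above, which is a tiny fraction of a 256MB or 512MB card, supporting the distinction between the frame buffer proper and total onboard video memory.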