Published on 4th October 2012 by
Originally Posted by Spreadie
Quote: Originally Posted by Kodongo
You clearly haven't seen the XFX GT 640 2GB.
Is that a real card or a mock up?
Holy sh**, it's real.
Originally Posted by Elton
This at the very least shows very good integration between the chips. They are all Kepler-derived, and to me that's impressive. Even AMD needed three lines of chips; this is basically cutting the same chip, repeatedly.
Originally Posted by toolio20
If AMD would unfutz their wretched CCC/driver software and quit limiting the OC voltage on their cards they'd get some serious market share gains...too bad that'll never happen.
In the EVGA forums, Product Manager Jacob Freeman confirms that the EVBot functionality has been removed from the card "in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products." Voltage control, even via an external device like the EVBot module, is verboten, Freeman says. Link
Originally Posted by Baz
Defending this card for being OK if you dial the settings down is like saying that the high-end cards are pointless
Originally Posted by ShinyAli
Why did Nvidia bother releasing this when the HD 7770, which is a faster card all round for the same price, is already on the market? :?
Originally Posted by Paul2011
Wish cards like these didn't see the light of day; it's no use at all to anyone thinking about gaming at or over 1080p. 60% is far too high a mark since most new games will simply be a slideshow, let alone the big titles of next year. Nvidia can do a lot better, I'm sure; I just can't see the point of it.
Originally Posted by Sloth
That same complaint could be made about hundreds of components over the years. Simple reason is that the best isn't always the one that sells. For reasons such as brand loyalty, name recognition, or simple ignorance, people will still end up buying components which have strictly superior alternatives. Nvidia (or any company releasing such a product) are surely aware of this and know they'll still likely make money off of this card if they can advertise and make the card available enough.
Originally Posted by Baz
Hi guys,
Re: testing methodology, we have a choice whether to do apples to apples, as we've done, or apples to oranges (best playable). The latter takes a great deal longer, especially when you re-test a truckload of games on an all-new test rig as we have, and it provides less information regarding comparisons between the high and low end.
As such, we have a unified test methodology for GPUs; each has to tackle the same games, so we can fairly compare them. Of course, if you're willing to dial down settings then any game can run smooth at 1,920, but that's not really the point of buying a new GPU. I know if I spent £100 on a new GPU, I'd expect it to play most of my games to a half-decent level. Defending this card for being OK if you dial the settings down is like saying that the high-end cards are pointless - why not just buy a mid-range card and dial the settings down? Why bother with SLI or CrossFire? Just dial the settings down. It completely contradicts the push for performance and quality that bit-tech's ethos is all about.
I know it's silly to include the three-screen numbers - we kind of did it as part of the process - but this card CAN play at three screens; surely the revelation that it can't hack the frame rates is a useful conclusion (albeit an obvious one).
Originally Posted by Adnoctum
People complain about personalities and policies at [H]OCP, but I like the fact that their GPU reviews look at the gaming experience and they find the highest playable settings that a card can play a game at. Perhaps card A can only enable Bx AA and C texture quality at D resolution; what can the alternatives do?
Having said that, I wouldn't like all gaming sites to start doing their reviews the same way. I like hard numbers and easily digested graphics as well. Having both perspectives gives me a better understanding of how a card performs and what I could expect with a purchase.
Originally Posted by Valinor
What I gathered from what people said when they asked for a "best settings" rating was that they wanted to know something like "How much do I need to spend to get a good experience at resolution x?" or "What settings/resolution could I use if I spend y amount on a graphics card?" This is useful information, the sort of thing I've been asked several times by various people.
However, despite the potential usefulness of this settings-based rating, I still think that a min/average fps system (as bit-tech use atm) is the better solution. This is because a settings-based system would introduce a lot more subjectivity into the review. If you could get a game "playable" (which is itself a subjective term) at, say, 1680x1050 with mostly low settings but textures at medium, is that preferable to the same "playability" at 1280x1024 with all high settings? What if you could have either high textures or high shadows giving you a "playable" experience? Which one is "better" to have? You'd have the same problem across reviews; should I buy card x, which can play this game at 1366x768 at high settings, or card y, which can play that game at 1280x1024 on medium settings?
Originally Posted by Valinor
The point of that last paragraph was that introducing settings as a way to differentiate between graphics cards would in fact reduce clarity, as different people would have different priorities for their graphics settings. A straight-up min/average fps system makes it much easier to decide which card is best for a particular price point, as you can see that Card A performs better in the tests than Card B (although even with this system it can still depend on which games you play the most) and so is better value for money.
Originally Posted by Valinor
I guess what'd be nice is a round-up of graphics cards at some point, showing how much you'd need to pay for, say, maximum settings at a variety of resolutions at a given minimum frame rate. 25 or 30fps is usually what I see regarded as playable, but even then, would you rather see a recommended card that can provide smooth performance (60fps) at that resolution, or one that can just pass as playable (25fps)? You could then use this information to work either way (how much for these settings / what settings for this amount).
Oh yeah, and trying to find the best settings for a card would take much longer than their current testing; I guess that should be considered.
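For what it's worth, the min/average fps metric Valinor describes is straightforward to compute from per-frame render times. Below is a minimal illustrative sketch (not bit-tech's actual tooling; the function name and the sample frame-time data are made up for the example), showing why the minimum and the time-weighted average can tell different stories about the same run:

```python
def fps_metrics(frame_times_ms):
    """Return (minimum fps, average fps) from a list of per-frame render times in ms.

    Minimum fps comes from the single slowest frame; average fps is
    time-weighted (total frames / total elapsed time), not a mean of
    per-frame fps values.
    """
    per_frame_fps = [1000.0 / t for t in frame_times_ms]
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    return min(per_frame_fps), avg_fps

# Hypothetical run: mostly ~60fps frames with a single 40ms stutter frame.
times = [16.7] * 99 + [40.0]
minimum, average = fps_metrics(times)
print(round(minimum, 1))  # 25.0 - one stutter frame drags the minimum down
print(round(average, 1))  # the average barely notices it
```

This is the nub of the debate above: the average hides the stutter that the minimum exposes, which is why bit-tech reports both rather than a single settings-based "playable" verdict.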
© Copyright bit-tech