EVGA GeForce GTX 650 1GB review

Comments 26 to 47 of 47

fdbh96 4th October 2012, 16:50 Quote
This card is obviously not meant for the biggest games at the highest settings, so testing it on BF3 at ultra settings is a bit pointless. No one is going to buy the card and use it like that. Include those tests as well if you like, but include at least one benchmark in which the game is playable.
Paulg1971 4th October 2012, 18:50 Quote
I have a 20-inch screen which I'm happy with and play at 1600 x 900. With this card being aimed at the lower end of the market, why do you go with such big resolutions? In my opinion you should use lower-res options on cards like these and have 1920x1080 as the highest option.
toolio20 4th October 2012, 18:52 Quote
Awful. Just awful.

Pretty sure this is intentional, as if Nvidia only really wants you to buy the GTX 670 with its fat and healthy profit margin, and so they're just kind of phoning in the rest of their lineup.

And perhaps rightfully so. I mean, this card appeals to NO ONE (except a few errant BTers, apparently). Genuine gamers won't even give this a look, and budget/low-end folk who want a casual experience are better off just sticking with IGP (e.g. after RMAing a GPU I was stuck with no card for a while and found out the Batman: Arkham games run pretty smoothly at high settings at 720p on a lone i5-2500K - sort of a shocker).

If AMD would unfutz their wretched CCC/driver software and quit limiting the OC voltage on their cards they'd get some serious market share gains...too bad that'll never happen.
leexgx 4th October 2012, 18:52 Quote
Quote:
Originally Posted by Spreadie
Quote:
Originally Posted by Kodongo
You clearly haven't seen the XFX GT 640 2GB.

http://imageshack.us/a/img404/6741/gt640ncdf3.jpg
LMFAO

Is that a real card or a mock up?


[edit] holy Sh**, it's real.
http://www.youtube.com/watch?v=EIJLJ1Yy9IY

:):):)

When that card came out I was like "lol", the size of that box is overkill (2GB as well).

The 650 is not too good, but it should be OK as long as the settings are not high.
Narishma 4th October 2012, 20:10 Quote
I agree with the others. It's pointless to review such a card using very high resolutions and ultra settings...
kirk46 4th October 2012, 20:23 Quote
It would be nice for you to test the folding performance of the GPUs you review :)

The folding section seems to get overlooked these days :(
Baz 4th October 2012, 21:28 Quote
Hi Guys

Re: testing methodology, we have a choice of whether to do apples to apples, as we've done, or apples to oranges (best playable). The latter takes a great deal longer, especially when you're re-testing a truckload of games on an all-new test rig as we have, and it provides less information regarding comparisons between the high end and the low end.

As such, we have a unified test methodology for GPUs; each has to tackle the same games, so we can fairly compare them. Of course, if you're willing to dial down settings then any game can run smoothly at 1,920 x 1,080, but that's not really the point of buying a new GPU. I know if I spent £100 on a new GPU, I'd expect it to play most of my games to a half-decent level. Defending this card for being OK if you dial the settings down is like saying that high-end cards are pointless - why not just buy a mid-range card and dial the settings down? Why bother with SLI or CrossFire? Just dial the settings down. It completely contradicts the push for performance and quality that bit-tech's ethos is all about.

I know it's silly to include the three-screen numbers; we kind of did it as part of the process, but this card CAN output to three screens, so surely the revelation that it can't hack the frame rates is a useful conclusion (albeit an obvious one).
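
For what it's worth, the apples-to-apples harness boils down to something like this rough sketch - run_benchmark() here is a made-up stand-in for actual capture tools, not real code from our rig:

Code:
import random

# Every card faces the exact same game/settings matrix, so the numbers
# stay directly comparable across the whole stack.
CARDS = ["GTX 650", "HD 7770", "GTX 660", "GTX 670"]
TESTS = [
    ("Battlefield 3", "1920x1080", "Ultra, 4xAA/16xAF"),
    ("Battlefield 3", "5760x1080", "Ultra, 4xAA/16xAF"),
    ("Skyrim", "1920x1080", "Ultra, 8xAA/16xAF"),
]

def run_benchmark(card, game, resolution, settings):
    """Placeholder: pretend to run the game and return (min_fps, avg_fps)."""
    random.seed(f"{card}|{game}|{resolution}")  # deterministic fake numbers
    avg = random.uniform(15, 90)
    return round(avg * 0.7), round(avg)

for card in CARDS:
    for game, resolution, settings in TESTS:
        min_fps, avg_fps = run_benchmark(card, game, resolution, settings)
        print(f"{card:8} | {game} @ {resolution} ({settings}): "
              f"min {min_fps}fps, avg {avg_fps}fps")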
Kodongo 4th October 2012, 22:19 Quote
Quote:
Originally Posted by Elton

This at the very least shows very good integration between the chips. They are all Kepler-derived, and to me that's impressive. Even AMD needed three lines of chips; this is basically just cutting the same chip repeatedly.

Kepler is the name of this generation of Nvidia chips, which includes:
GK104 - GTX 690, GTX 680, GTX 670, GTX 660 Ti
GK106 - GTX 660
GK107 - GTX 650, some GT 640s, some GT 630s
GF108 (a Fermi chip) - some GT 640s, some GT 630s
Quote:
Originally Posted by toolio20

If AMD would unfutz their wretched CCC/driver software and quit limiting the OC voltage on their cards they'd get some serious market share gains...too bad that'll never happen.

AMD has gained much more from their drivers than Nvidia this generation. Catalyst 12.7 helped to close the gap between the 7970 and the 680, and propelled the 7970 GHz Edition above the 680.

Also, most Radeons can be overvolted to at least 1.3V in software. Conversely, Nvidia has not allowed overvolting on their Kepler cards, going as far as to stop companies selling certain SKUs: they have forced EVGA to stop selling their EVBot and forced MSI to lock down voltage on their Power Edition cards.
Quote:

In the EVGA forums, Product Manager Jacob Freeman confirms that the EVBot functionality has been removed from the card "in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products." Voltage control, even via an external device like the EVBot module, is verboten, Freeman says. Link

nVidia Says No to Voltage Control

What do you think of nVidia locking down voltage?
blackworx 4th October 2012, 22:21 Quote
Quote:
Originally Posted by Baz
Defending this card for being OK if you dial the settings down is like saying that high-end cards are pointless

I understand and agree with the need for a unified methodology and comparable results, but I'm not sure that that's the logical conclusion of the "apples to oranges" argument. The ability to answer the "we all know it can't do that, but what can it do?" question is still useful.
Sloth 4th October 2012, 22:32 Quote
Quote:
Originally Posted by ShinyAli

Why did Nvidia bother releasing this when the HD 7770, which is a faster card all around for the same price, is already on the market :?
That same complaint could be made about hundreds of components over the years. The simple reason is that the best product isn't always the one that sells. For reasons such as brand loyalty, name recognition, or simple ignorance, people will still end up buying components which have strictly superior alternatives. Nvidia (or any company releasing such a product) are surely aware of this and know they'll still likely make money off this card if they can advertise and make the card available enough.
VaLkyR-Assassin 4th October 2012, 23:04 Quote
I like the size of the card, and the performance would be great for my needs - I need to replace a recently deceased 7900 GS in an old PC that is only used for playing older LAN games (CoD 4 is the newest game played!). The only issue this card has is price - if it were around £60, I'd snap it up. There is a market for these cards - replacing dead cards where power is not required - just not at these prices. I'd happily pay over £200 for a new card in my main PC, of course. For now, the ATi 6670 offers the best price/performance in the range I'm looking at. I do love the design of that EVGA card though.... :P
ssj12 5th October 2012, 06:46 Quote
Quote:
Originally Posted by Paul2011
I wish cards like these didn't see the light of day; it's of no use at all to anyone thinking about gaming at or above 1080p. 60% is far too high a mark, since most new games will simply be a slideshow, let alone the big titles of next year. Nvidia can do a lot better, I'm sure; I just can't see the point of it.

You do know that there are still a ton of gamers not playing at 1080p yet, right? Only 28% of Steam users report using 1080p, and barely anyone uses higher for a single monitor. I have no idea how many people use multi-monitor, but I doubt it's half as many as single monitor.

http://store.steampowered.com/hwsurvey/
Adnoctum 5th October 2012, 08:10 Quote
Quote:
Originally Posted by ShinyAli

Why did Nvidia bother releasing this when the HD 7770, which is a faster card all around for the same price, is already on the market :?

It would look really odd for Nvidia not to release a card they had previously announced, even if it is overpriced and uncompetitive. Not that this matters too much; it is more important to have something at the price point than nothing. Having something there muddies the purchasing decision and prompts customers to price-creep up to a GTX 660.

To be frank, the manner in which Nvidia has released the card (it just arrived without much attention being drawn to it, overshadowed by the heavily pushed GTX 660 cards), combined with the lack of distinctive custom SKUs available or publicly announced, and the uncompetitive pricing, tells me that neither Nvidia nor their partners care very much about putting effort into the GTX 650. Most will probably end up in some OEM "gaming" machine.
Quote:
Originally Posted by Sloth
That same complaint could be made about hundreds of components over the years. The simple reason is that the best product isn't always the one that sells. For reasons such as brand loyalty, name recognition, or simple ignorance, people will still end up buying components which have strictly superior alternatives. Nvidia (or any company releasing such a product) are surely aware of this and know they'll still likely make money off this card if they can advertise and make the card available enough.

Never underestimate the irrationality of slavish devotion to a brand, or the power of marketing to overcome a testable reality.
The reality is that this card is overpriced for the performance offered, and there are no attractive or distinct SKUs either. All the cards available or publicly planned (and I've been watching) are essentially the same reference board, with dual-slot cooling and an external power connector, meaning there is little reason to choose a GTX 650 over a higher-performing HD 7770 at the same price or cheaper.

And if anyone trots out the tired old canard of "Better nVidia drivers" I think I'll have to scream, rip off my clothes and run down the street waving a tired 9800GT in naked annoyance and frustration.
fdbh96 5th October 2012, 08:26 Quote
Quote:
Originally Posted by Baz
Hi Guys

Re: testing methodology, we have a choice of whether to do apples to apples, as we've done, or apples to oranges (best playable). The latter takes a great deal longer, especially when you're re-testing a truckload of games on an all-new test rig as we have, and it provides less information regarding comparisons between the high end and the low end.

As such, we have a unified test methodology for GPUs; each has to tackle the same games, so we can fairly compare them. Of course, if you're willing to dial down settings then any game can run smoothly at 1,920 x 1,080, but that's not really the point of buying a new GPU. I know if I spent £100 on a new GPU, I'd expect it to play most of my games to a half-decent level. Defending this card for being OK if you dial the settings down is like saying that high-end cards are pointless - why not just buy a mid-range card and dial the settings down? Why bother with SLI or CrossFire? Just dial the settings down. It completely contradicts the push for performance and quality that bit-tech's ethos is all about.

I know it's silly to include the three-screen numbers; we kind of did it as part of the process, but this card CAN output to three screens, so surely the revelation that it can't hack the frame rates is a useful conclusion (albeit an obvious one).

I wasn't saying to test all the cards at lower settings, but maybe in this case test the 7770 vs the 650 on medium settings, for example. It would only need to be in one game, really.
Harlequin 5th October 2012, 09:04 Quote
It's a GT 640 with GDDR5 :P - they sold them for ages as OEM :D

Personally, I think that with a screen under 21", e.g. 1680x1050 or lower, it would do `ok`.
xaser04 5th October 2012, 10:06 Quote
Quote:
Originally Posted by Adnoctum
People complain about personalities and policies at [H]OCP, but I like the fact that their GPU reviews look at the gaming experience and find the highest settings at which a card can play a game. Perhaps card A can only enable Bx AA and C texture quality at D resolution; what can the alternatives do?

Having said that, I wouldn't like all gaming sites to start doing their reviews the same way. I like hard numbers and easily digested graphics as well. Having both perspectives gives me a better understanding of how a card performs and what I could expect with a purchase.

The irony here is that this is exactly how Bit used to review cards back in the 8800 GTX (and earlier) days.

TBH, I would prefer it if they actually went back to this method. Tailor the review to the card; don't just throw it through some 60-second "benchmark" and comment on the outcome.

There is very little value to be gained from throwing an entry-level card through a 1080p "max settings" benchmark and commenting that it isn't fast enough. Don't tell me what I already know; tell me what the card can actually do.

EDIT: For what it's worth, my old temporary HD 7770 managed around 45FPS at 1680x1050 on "high" settings (no MSAA) in BF3. Perfectly playable for the most part. The GTX 650 should be comparable.
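
The [H] approach is basically a search: walk down a ladder of presets until the card holds a playable minimum frame rate. A rough sketch of the shape of it - the preset ladder, fps figures and measure_min_fps() are all invented for illustration:

Code:
# Find the highest settings a card holds at a playable minimum frame rate,
# working down a preset ladder from most to least demanding.
PRESETS = [
    ("1920x1080", "Ultra, 4xMSAA"),
    ("1920x1080", "High, no MSAA"),
    ("1680x1050", "High, no MSAA"),
    ("1680x1050", "Medium, no MSAA"),
]
PLAYABLE_MIN_FPS = 25

def measure_min_fps(card, game, resolution, settings):
    # Placeholder figures purely for illustration; a real run would capture
    # frame times in-game.
    fake = {"Ultra, 4xMSAA": 18, "High, no MSAA": 27, "Medium, no MSAA": 41}
    bonus = 6 if resolution == "1680x1050" else 0
    return fake[settings] + bonus

def highest_playable(card, game):
    for resolution, settings in PRESETS:
        if measure_min_fps(card, game, resolution, settings) >= PLAYABLE_MIN_FPS:
            return resolution, settings
    return None  # unplayable at every preset tested

print(highest_playable("GTX 650", "BF3"))  # -> ('1920x1080', 'High, no MSAA')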
Sutura 5th October 2012, 10:29 Quote
I like how they put a high-airflow bracket on all of their cards, even though the GTX 650 doesn't really need it. I like the card; for me it's not pointless. It depends a lot on the resolution used and the game, and since I only play SC2, the card is pretty much valid for me. I hope someone makes a low-profile version of it like this: http://www.palit.biz/palit/vgapro.php?id=1393 . The Sapphire low-profile 7750 has held the crown there for quite some time, even back in the 6xxx generation. And make it dual-slot - enough of these single-slot LP cards :D. I don't care how pointless it may look to slap a dual-slot cooler on a ~50W TDP card; we are enthusiasts after all :) have some fun here and there.
Spreadie 5th October 2012, 12:52 Quote
Were all the Nvidia cards retested with the 306 driver?

Despite Nvidia's claims of 5, 10, 15% increases in some games, I noticed the 670 didn't show any improvement in BF3.
GuilleAcoustic 5th October 2012, 13:12 Quote
http://www.ldlc.com/informatique/piece/carte-graphique-interne/c4684/p1e48t3o0a1+fv121-7746,8109.html

GTX 650 and HD 7770 prices in ascending order... not very good for the GTX 650, since they sell for exactly the same price.
Valinor 5th October 2012, 17:26 Quote
What I gathered from what people said when they asked for a "best settings" rating was that they wanted to know something like "How much do I need to spend to get a good experience at resolution x" or "What settings/resolution could I use if I spend y amount on a graphics card". This is useful information, the sort of thing I've been asked several times by various people.

However, despite the potential usefulness of this setting-based rating I still think that a min/average fps system (as bit-tech use atm) is the better solution. This is because having a settings-based system would introduce a lot more subjectivity into the review. If you could get a game "playable" (which is itself a subjective term) at, say, 1680x1050 with mostly low settings but textures at medium, is that preferable to the same "playability" at 1280x1024 with all high settings? What if you could have either high textures or high shadows giving you a "playable" experience? Which one is "better" to have? You'd have the same problem across reviews; should I buy card x which can play this game at 1366x768 at high settings, or card y which can play that game at 1280x1024 on medium settings?

The point of that last paragraph was that introducing settings as a way to differentiate between graphics cards would in fact reduce clarity, as different people would have different priorities for their graphics settings. A straight-up min/average fps system makes it much easier to decide which card is best for a particular price point, as you can see that Card A performs better in the tests than Card B (although even with this system it can still depend on which games you play the most) and so is better value for money.

I guess what'd be nice is a round-up of graphics cards at some point showing how much you'd need to pay for, say, maximum settings at a variety of resolutions at a given minimum frame rate (25 or 30fps is usually what I see regarded as playable, but even then, would you rather see a recommendation for a card that can provide smooth performance (60fps) at that resolution, or one that can just pass as playable (25fps)?). You could then use this information to work either way (how much for these settings / what settings for this amount).
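
Something like this sketch is the sort of thing I'm picturing - the prices and fps figures below are made up purely to illustrate, not real review data:

Code:
# Given each card's minimum fps at a resolution and its price, report the
# cheapest card that clears a chosen playability bar.
cards = [
    # (name, price in GBP, min fps at 1920x1080 in the chosen game)
    ("GTX 650", 95, 20),
    ("HD 7770", 95, 24),
    ("GTX 660", 180, 41),
    ("GTX 670", 320, 62),
]

def cheapest_playable(cards, min_fps_required):
    playable = [c for c in cards if c[2] >= min_fps_required]
    return min(playable, key=lambda c: c[1]) if playable else None

print(cheapest_playable(cards, 25))  # "just playable" bar -> GTX 660
print(cheapest_playable(cards, 60))  # "smooth" bar -> GTX 670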

Oh yeah, and trying to find the best settings for a card would take much longer than their current testing, I guess that should be considered.

Now, time to get back to what I'm meant to be working on ;)
fluxtatic 6th October 2012, 05:48 Quote
Got to agree with Adnoctum here. Of course this thing wasn't going to be able to handle AAA titles with everything cranked up at 5760x1080. However, [H] would at least show me what sort of compromises I'd be looking at if, for some reason, I was trying to play AAA titles with this card.

Seems like maybe there should be a second suite of tests for low-end cards. If I'm looking to buy a $100 GPU, show me what sort of performance I'll get in TF2, Portal 2, Dirt 3, etc. Portal 2 and Dirt 3 are still commonly used for benchmarks, even when it gets a little silly for the high-end cards (if a card is in the neighborhood of 200fps, maybe the game should be replaced in that benchmark suite). But these are likely the sort of games consumers are looking to play on a $100 card.
Adnoctum 6th October 2012, 21:06 Quote
Quote:
Originally Posted by Valinor
What I gathered from what people said when they asked for a "best settings" rating was that they wanted to know something like "How much do I need to spend to get a good experience at resolution x" or "What settings/resolution could I use if I spend y amount on a graphics card". This is useful information, the sort of thing I've been asked several times by various people.

However, despite the potential usefulness of this setting-based rating I still think that a min/average fps system (as bit-tech use atm) is the better solution. This is because having a settings-based system would introduce a lot more subjectivity into the review. If you could get a game "playable" (which is itself a subjective term) at, say, 1680x1050 with mostly low settings but textures at medium, is that preferable to the same "playability" at 1280x1024 with all high settings? What if you could have either high textures or high shadows giving you a "playable" experience? Which one is "better" to have? You'd have the same problem across reviews; should I buy card x which can play this game at 1366x768 at high settings, or card y which can play that game at 1280x1024 on medium settings?

I have no issue with the kind of pure benchmark-number-based review such as this one. I think it is needed to generate data in order to empirically rank cards in a form that is easily viewed and easily analysed. It lets you see at a glance the basic difference between cards for a given benchmark.

But, if I am being completely honest, I think it is really only half a review, and only half of what is needed to make a fully informed purchase.

Take a look at this review (because we are here, not as a criticism): on page 3 the numbers say that the GTX 650 averaged 25fps with a minimum of 20fps in Battlefield 3 at 1920x1080. Now, my experience of FPS games (though not BF3) is that this is marginally playable for entertainment, if not competitively (in a single-player FPS I could live with 20-25fps).
But is it? Those two figures can hide a multitude of sins that would make a game unplayable: repetitive stuttering, infrequent periods of sustained low (20fps) frame rates, predictable slow-downs in areas of high activity.

The subjectivity of "playable" is the easiest to overcome: you state up front what "playable" means. You say that for a given type of game, "playable" is the ability for the game to be played with no perceived stuttering, slow-downs or hang-ups that interrupt gameplay. It may be that the requirements for "playable" are different for a competitive FPS (such as BF3) compared to an action RPG (Skyrim) or an RTS (SC2).
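
To make that measurable, the min/avg figures could be supplemented with frame-time statistics that actually catch stutter. A rough sketch of what I mean, with invented sample data:

Code:
# Min/avg fps can hide stutter; per-frame times expose it. The two runs below
# share the same 40fps average, but only one of them is smooth to play.
def playability_stats(frame_times_ms, stutter_threshold_ms=50.0):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    p99 = sorted(frame_times_ms)[min(n - 1, int(n * 0.99))]  # 99th percentile
    stutters = sum(1 for t in frame_times_ms if t >= stutter_threshold_ms)
    return avg_fps, p99, stutters

smooth = [25.0] * 200                # steady 40fps throughout
spiky = [20.0] * 190 + [120.0] * 10  # mostly 50fps, with visible hitches
for label, run in (("smooth", smooth), ("spiky", spiky)):
    avg, p99, st = playability_stats(run)
    print(f"{label}: avg {avg:.0f}fps, 99th pct {p99:.0f}ms, {st} stutter frames")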

The level of technology and capability between the two brands is largely comparable and easily graphed at the moment, but what if one maker implements something significantly better? I'm using this as an easily explained example, not reality: the AA used in this article was MSAA. What if the GTX 650 could implement FXAA with no performance impact, and the resulting image was no different than if it were using MSAA? And what if AMD didn't have their MLAA, or it was inferior (in performance or quality)?

Because neither FXAA nor MLAA was used during the benchmarks, only MSAA, there is no opportunity to examine how implementing FXAA impacts the GTX 650's performance. Not only that, but there is no analysis of how much of an impact MSAA itself has on the card. There are no benchmarks for 0x, 2x, 4x or 8x. Would turning the MSAA down make BF3 @ 1920x1080 a smooth experience with the GTX 650? That is a reasonable question to ask of a review, and if I were reading an [H]OCP review I would know.
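
The missing AA sweep would also be simple to present; something like this sketch, where bench_avg_fps() and its fps values are invented purely to show the format:

Code:
# Benchmark the same scene at each AA mode, so the review can show what
# MSAA actually costs the card relative to running with AA off.
AA_MODES = ["off", "FXAA", "2xMSAA", "4xMSAA", "8xMSAA"]

def bench_avg_fps(card, game, resolution, aa_mode):
    # Placeholder numbers; a real pass would capture frame times in-game.
    fake = {"off": 38, "FXAA": 36, "2xMSAA": 31, "4xMSAA": 25, "8xMSAA": 17}
    return fake[aa_mode]

baseline = bench_avg_fps("GTX 650", "BF3", "1920x1080", "off")
for aa in AA_MODES:
    fps = bench_avg_fps("GTX 650", "BF3", "1920x1080", aa)
    print(f"{aa:>7}: {fps}fps ({100 * fps / baseline:.0f}% of no-AA performance)")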
Quote:
Originally Posted by Valinor
The point of that last paragraph was that introducing settings as a way to differentiate between graphics cards would in fact reduce clarity, as different people would have different priorities for their graphics settings. A straight-up min/average fps system makes it much easier to decide which card is best for a particular price point, as you can see that Card A performs better in the tests than Card B (although even with this system it can still depend on which games you play the most) and so is better value for money.

It still wouldn't be possible to show what a card does at every resolution and every possible setting to suit a single need, but that is not what is being suggested. All that is suggested would provide a bit more depth of analysis and a bit of clarity about the capability of the card, so that we can infer from the available data, where possible, which card would satisfy our requirements.
If I wanted to know how the GTX 650 would do playing BF3 at 1920x1080/4xAA/16xAF/Ultra then this review is great! It even tells me how it does at 5760x1080, in case I ever want to play BF3 on three monitors with this card. Unfortunately, it does nothing to tell me what would happen at 0xAA or 1680x1050.
Quote:
Originally Posted by Valinor
I guess what'd be nice is a round-up of graphics cards at some point showing how much you'd need to pay for, say, maximum settings at a variety of resolutions at a given minimum frame rate (25 or 30fps is usually what I see regarded as playable, but even then, would you rather see a recommendation for a card that can provide smooth performance (60fps) at that resolution, or one that can just pass as playable (25fps)?). You could then use this information to work either way (how much for these settings / what settings for this amount).

Oh yeah, and trying to find the best settings for a card would take much longer than their current testing, I guess that should be considered.

To do this takes time. You actually have to watch what is happening instead of running a benchmark and compiling the numbers it spits out into a spreadsheet. Which is why [H]OCP doesn't compare cards to every other card out there. For the Nvidia GTX 670 launch, [H] compared it to the HD 7950 and the GTX 580 and tested it in 5 or 6 games. For the HD 7970 GHz launch, they compared it to the vanilla HD 7970 and the GTX 680. You may have noticed that these are the competitors for the card in question, with not an irrelevant HD 6770/GTX 550 Ti in sight.

They don't give us useless information about irrelevant cards. If you want to see how the GTX 650 compares to the GTX 690, then open up the GTX 690 review and compare the benchmarks. What would have been useful is the inclusion of the GTX 550 Ti/560. Does it beat the GTX 550 Ti, which I can buy for $20 less, in BF3 @ 1920x1080? From this review I would never know, although the answer is "No, it doesn't".

I'm not complaining, but I suspect that the data for all the other cards was lying around from previous articles, and throwing it in costs nothing and makes the review look more complete. Otherwise there would be no reason to test them all again just for this article.

I hope my post doesn't come across as combative; I've just been enjoying the conversation about testing in this thread. Your post made some compelling points and a lot of sense.

Similarly, I don't really have much to be unhappy about with the article, and no complaints. I got from it what I need to make an informed purchase of a graphics card: I'll either get the HD 7750 I was thinking about, or I'll hold on until the end of the year, when prices will be cheaper and there will be chatter about AMD's mid-range replacement.
I'm actually grateful to bit-tech for the review, because many sites haven't even bothered, which is a bit disturbing when you think about it. The GTX 650's price/performance level should be the meat of lower-mainstream gaming.
It just seems that reviews are a little shallower than they used to be, which I will admit may be a result of rose-tinted glasses, as reviews used to rip into settings to find out how AA impacted performance at different resolutions, and they examined how good the image quality was.