bit-tech.net

EVGA GeForce GTX 650 1GB review

Comments 1 to 25 of 47

blackworx 4th October 2012, 11:02 Quote
How could you give only 60% to something so cute, you heartless old meanie!
Combatus 4th October 2012, 11:19 Quote
Quote:
Originally Posted by blackworx
How could you give only 60% to something so cute, you heartless old meanie!

I have to admit it's one of the cutest things I've seen in our lab for a while!
rollo 4th October 2012, 11:23 Quote
that card looks tiny compared to some of the big cards out there.
Paul2011 4th October 2012, 11:35 Quote
Wish cards like these didn't see the light of day; it's no use at all to anyone thinking about gaming at or above 1080p. 60% is far too high a mark, since most new games will simply be a slideshow, let alone the big titles of next year. Nvidia can do a lot better, I'm sure; I just can't see the point of it.
Spreadie 4th October 2012, 11:45 Quote
It is cute, but largely pointless. Also, why does such a low power card need a dual slot cooler?

The third graph on the Witcher 2 page needs the resolution changing - it's showing the same as graph 2.
GuilleAcoustic 4th October 2012, 11:45 Quote
Quote:
Originally Posted by Paul2011
Wish cards like these didn't see the light of day; it's no use at all to anyone thinking about gaming at or above 1080p. 60% is far too high a mark, since most new games will simply be a slideshow, let alone the big titles of next year. Nvidia can do a lot better, I'm sure; I just can't see the point of it.

Again, not everyone plays the latest big title at ultra settings. This card is small, easy on the power supply and doesn't give off much heat. For people willing to play games with less "high end" graphics, it's perfect.

Since the heat level is low, even under load, it could happily sit inside a very small case. It's enough to play some CS:S or TF2.

I agree it's no use for BF3 or Crysis, but that's not the targeted market. I'm glad there are affordable little cards for people who just want to play a little. That aside, the HD7770 is one of the best bang for buck right now (but it is 15°C hotter under load and consumes 20 more watts).

It's all about what you want to use it for.

EDIT:

@BT staff: Again, why did you only test with BF3, Skyrim and Crysis at ultra settings? I know you're going to say "it's part of our standard test procedure and it's been like that for years now", but nobody will be surprised that the frame rate is low and unplayable.

Why didn't you try to show us the best compromise between performance and eye candy? Why no medium-quality test? Why no less demanding games, like Street Fighter 4 for example?

Sorry to sound harsh, but this review is useless and irrelevant to me. You should really tailor reviews to the targeted consumers. You don't review a high-end card the same way you review an entry-level one.
blacko 4th October 2012, 11:51 Quote
i want to take it home and give it a cuddle and tell it everything will be ok....

Hopefully Nvidia will issue a TI version like the previous generation.
Ripitup121 4th October 2012, 11:58 Quote
indie gaming rig material at best, but DAMN it's MotherF**kin cute
damien c 4th October 2012, 12:06 Quote
This would be perfect for a LAN rig, where you can really get away with not having ultra settings, and it's also a bit better for the host's electricity bill.

Could also be used in a media PC for some light gaming as well.
Shirty 4th October 2012, 12:08 Quote
I think a Ti version priced halfway between this and the 660, with performance to match, will be a very desirable little card.

Off topic slightly, but what would be a good card for my mum's PC? She games fairly lightly at 1280 x 1024 using a 5450, but it does struggle with games like The Sims 3 with the detail turned up. I assume the 650 would be overkill, but what would be a good compromise? I know nothing of low-end cards.
ShinyAli 4th October 2012, 12:16 Quote
Quote:
Originally Posted by Paul2011
Wish cards like these didn't see the light of day; it's no use at all to anyone thinking about gaming at or above 1080p. 60% is far too high a mark, since most new games will simply be a slideshow, let alone the big titles of next year. Nvidia can do a lot better, I'm sure; I just can't see the point of it.

Can't see the point? Why, now you can run the Aero desktop in all its wondrous glory!

Some of the test results are odd. At idle the AMD Radeon HD 7870 runs at 6°C and the GTX 650 at 14°C, but under load the GTX 650 runs at 29°C and the HD 7870 at 40°C. So either the GTX 650's cooling at idle is poor or its cooling under load is excellent; I guess it's better to have better cooling under load.

It overclocked well, but who's going to bother? It looks like the type of card some PC manufacturers would use and then call their PC an "entry-level gaming machine". It'll run Myst and Broken Sword games at max settings.

Why did Nvidia bother releasing this when the HD 7770, which is a faster card all round for the same price, is already on the market? :?

Graphics cards... can't live with em, can't live without em :(

Oh, and I'm not AMD biased; I've just bought an MSI N560GTX-Ti Twin Frozr II
Zurechial 4th October 2012, 12:26 Quote
The impression I'm getting is that this card (and cards like it) are a false economy for the average consumer in this day and age.

Now that both AMD and Intel CPUs have built-in GPUs that can handle light gaming workloads (very light in the Intel case), why would anyone bother with a card like this?
If crappy, low-resolution graphics in modern games or middling settings in older games are all you want, then an on-die GPU from AMD or Intel makes more sense and would work out cheaper in a new build, too.
If you want anything approaching decent performance and visuals, buy a discrete GPU that doesn't suck.

Or have I missed something here, some market segment that these cards are actually still useful for at this point?
Surely the "my CPU is too old to have a decent GPU on-die and I can't afford to wait or spend more on upgrading for decent visuals so I'll drop 100 on a crappy, low-end graphics card" segment can't be that large?
GuilleAcoustic 4th October 2012, 12:27 Quote
Quote:
Originally Posted by ShinyAli
Some of the test results are odd. At idle the AMD Radeon HD 7870 runs at 6°C and the GTX 650 at 14°C, but under load the GTX 650 runs at 29°C and the HD 7870 at 40°C. So either the GTX 650's cooling at idle is poor or its cooling under load is excellent; I guess it's better to have better cooling under load.

The Zero Core technology from the HD7000 series is why the HD7870 runs cooler than the GTX650 at idle.
Quote:
Originally Posted by Zurechial
Or have I missed something here, some market segment that these cards are actually still useful for at this point?
Surely the "my CPU is too old to have a decent GPU on-die and I can't afford to wait or spend more on upgrading for decent visuals so I'll drop 100 on a crappy, low-end graphics card" segment can't be that large?

For cases limited to 180mm of card length (like many ITX cases), this card is nice. The Trinity IGP is nice, but still offers half the performance of an HD7750. Intel's IGP can't even handle half-decent fps with detailed SketchUp models. Both the HD7750 and GTX650 are nice for small-footprint rigs.
Guinevere 4th October 2012, 12:31 Quote
Quote:
Originally Posted by Paul2011
Wish cards like these didn't see the light of day, its no use at all to anyone thinking about gaming at or over 1080p. 60% is far too high a mark since most new games simply will be a slideshow, let alone the big titles of next year. Nvidia can do a lot better im sure, i just can't see the point of it.

Did you not read the article?
Quote:
the GTX 650 can handle games at 1920 x 1080 with levels of detail that are impressive for a sub-£100 card

I think this cute and wubbly ickle card has done its mummy proud by being able to stand up and play tough like the big boys do.

Sure it's not going to give you massive amounts of AA at 1080 or a high minimum FPS but it'll still play BF3 / Skyrim if you crank the settings down.

27FPS in Skyrim at 1080p on ultra isn't bad for a little pre-school card like this.

Isn't this 650 a 'slightly' better spec than the 650M, and even a bit better than the tweaked 650M Apple uses in the retina MacBook? I play games quite happily on my rMBP, and while I can't game at 2880x1800 or 2560x1440 with all the eye candy on, it's fine if you crank the res / settings down a little.
Paradigm Shifter 4th October 2012, 12:32 Quote
I agree with GuilleAcoustic, for the most part. I know you have a 'standard' benchmark suite, but all that review really showed is that a GTX650, the 'entry level' 600 series card, is useless at playing demanding games at demanding settings. What a shocker. :|

On the card itself, however... I'd be more interested in a single-slot GTX650, or possibly a dual-slot passive design if the TDP was low enough. It would also be interesting to see how useful it is at CUDA work, or as a pure PhysX processor if there was a single-slot version available.
Hustler 4th October 2012, 12:58 Quote
They're bringing out a 650 Ti as well next week, apparently, with 768 CUDA cores and about 10% more bandwidth.

Crazy...just too many cards these days.
blackworx 4th October 2012, 13:07 Quote
Quote:
Originally Posted by damien c
Could also be used in a Media pc that could be used for some light gaming as well.

I'd go for Sapphire's passively cooled 7750 for that. Costs more but 100% silent and great temps.

EDIT: And powered directly from the slot with no 6-pin power supply required.
Adnoctum 4th October 2012, 13:38 Quote
Quote:
Originally Posted by blackworx
I'd go for Sapphire's passive 7750 for that. Costs more but 100% silent and great temps.

I've been wanting to upgrade my mITX legacy gaming machine from the current 9800GT to something from this generation and have been waiting for this so I could make an informed choice. And waiting. Then some more waiting.
I wish I hadn't because, for me, this is all kinds of fail:
* It has an external PCIe power connector - with the kind of low power draw I was expecting, this shouldn't have been an issue. The competition in the HD7750 doesn't need it.
* Currently there is no passive option available - Once again, there is a passive HD7750 from Sapphire.
* It is too expensive - the TWIMTBP program tax strikes again. For the third time, the passive Sapphire HD7750 Ultimate is either the same price (as a Galaxy GTX650) or cheaper (than Asus, Gigabyte, etc).

The only thing that is stopping me pulling the trigger on the Sapphire is the thought that perhaps the HD87xx is just around the corner (within 4 months or so).
mrbungle 4th October 2012, 14:47 Quote
It would be far more useful to test the card at lower settings to see if it could get a playable frame rate out of, say, BF3.

Most cards would struggle with ultra settings.
Noob? 4th October 2012, 15:05 Quote
Quote:
Originally Posted by mrbungle
It would be far more useful to test the card at lower settings to see if it could get a playable frame rate out of, say, BF3.

Most cards would struggle with ultra settings.

I'd agree with you on this, mate.
Adnoctum 4th October 2012, 15:30 Quote
People complain about personalities and policies at [H]OCP, but I like the fact that their GPU reviews look at the gaming experience and find the highest playable settings at which a card can run a game. Perhaps card A can only enable Bx AA and C texture quality at D resolution; what can the alternatives do?

Having said that, I wouldn't like all gaming sites to start doing their reviews the same way. I like hard numbers and easily digested graphs as well. Having both perspectives gives me a better understanding of how a card performs and what I could expect from a purchase.
Kodongo 4th October 2012, 15:41 Quote
Quote:
Originally Posted by Spreadie
It is cute, but largely pointless. Also, why does such a low power card need a dual slot cooler?

You clearly haven't seen the XFX GT 640 2GB.

http://imageshack.us/a/img404/6741/gt640ncdf3.jpg

Double dissipation and double slot cooler to handle the vast amount of heat coming off of that goliath GPU as well as cooling the 2GB of memory being pushed to its absolute limit.

GuilleAcoustic 4th October 2012, 15:56 Quote
Quote:
Originally Posted by Kodongo
You clearly haven't seen the XFX GT 640 2GB.

http://imageshack.us/a/img404/6741/gt640ncdf3.jpg

Double dissipation and double slot cooler to handle the vast amount of heat coming off of that goliath GPU as well as cooling the 2GB of memory being pushed to its absolute limit.

It's a low-profile card; the dual-slot / dual-fan design makes sense if you don't want a noisy, fast-spinning 40mm fan.
Spreadie 4th October 2012, 16:01 Quote
Quote:
Originally Posted by Kodongo
You clearly haven't seen the XFX GT 640 2GB.

http://imageshack.us/a/img404/6741/gt640ncdf3.jpg
LMFAO

Is that a real card or a mock up?


[edit] holy Sh**, it's real.
http://www.youtube.com/watch?v=EIJLJ1Yy9IY

:):):)
Elton 4th October 2012, 16:43 Quote
The first thing I thought: AWWWW SO CUTE! But then again, it's made for very low power usage. Hell, they managed to fit an entire mobile version with hardly anything cut down into laptops.

This at the very least shows very good integration between the chips. They are all Kepler-derived, and to me that's impressive. Even AMD needed three lines of chips; this is basically just cutting down the same chip... repeatedly.