bit-tech.net

Gigabyte GeForce GTX 780 GHz Edition Review

Comments 1 to 14 of 14

Corky42 14th March 2014, 11:16 Quote
How strange for a factory OC'ed card not to OC the memory, it seems they really missed a trick there.
Dogbert666 14th March 2014, 11:27 Quote
The memory chips are only rated for 6Gbps, and as there are twelve of them that's twelve potential points of failure as soon as you start overclocking to anything meaningful, and instability caused by memory isn't always immediately apparent, so that's probably the reasoning. However, I have to say I still agree with you - from what I've seen the GTX 780 memory always overclocks pretty nicely, so it would have been nice to see them running here at around 6.5GHz or so.
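To put those figures in context, here is a quick sketch of the bandwidth arithmetic behind the 6Gbps rating and the suggested 6.5GHz overclock (the twelve-chip, 384-bit bus layout is the GTX 780's reference configuration; the function name is just for illustration):

```python
# GDDR5 bandwidth arithmetic for a GTX 780-class card.
# Twelve 32-bit GDDR5 chips make up the card's 384-bit memory bus.
CHIPS = 12
CHIP_WIDTH_BITS = 32
BUS_WIDTH_BITS = CHIPS * CHIP_WIDTH_BITS  # 384 bits

def bandwidth_gbps(effective_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given effective data rate per pin."""
    return BUS_WIDTH_BITS / 8 * effective_rate_gbps

stock = bandwidth_gbps(6.0)  # chips rated at 6Gbps -> 288.0 GB/s
oc = bandwidth_gbps(6.5)     # the suggested 6.5GHz effective -> 312.0 GB/s
print(f"stock: {stock:.0f} GB/s, at 6.5Gbps: {oc:.0f} GB/s "
      f"(+{(oc / stock - 1) * 100:.1f}%)")
```

So the 6.5GHz figure mentioned above would be worth roughly an extra 8% of peak bandwidth over the rated 288 GB/s.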
Stanley Tweedle 14th March 2014, 12:04 Quote
1,176MHz is so yesterday. I get more than that on my 680.
Panos 14th March 2014, 12:41 Quote
Having bought the PNY 780 XLR8 OC, which runs at the same (slightly higher) speeds, I can say that all these factory GHz 780s are total monsters for their price.

Also, the review shows how well the Nvidia drivers have matured on BF4; a few months ago, all Nvidia cards were taking a beating from the AMD ones in that game.

Great review as always :)
Bindibadgi 14th March 2014, 13:00 Quote
Quote:
Originally Posted by Corky42
How strange for a factory OC'ed card not to OC the memory, it seems they really missed a trick there.

It's easy to jack up the volts and OC a GPU. OCing memory means qualification of new parts, BIOS adjustments, testing...
Shirty 14th March 2014, 13:42 Quote
Quote:
Originally Posted by Stanley Tweedle
1,176MHz is so yesterday. I get more than that on my 680.

Likewise, my 770 boosts to over 1200, and the memory is currently comfortable at a few Hertz shy of 8GHz (@QDR) :p

Still noticeably slower than this card, but numbers!!
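For anyone puzzled by the "(@QDR)" note above: GDDR5 transfers four bits per pin per command-clock cycle, so monitoring tools may report either the command clock or the quadrupled effective rate. A quick sketch of the conversion (the 1,753MHz figure is the GTX 770's reference memory command clock; the 1,990MHz value is just an illustrative overclock):

```python
# GDDR5 is quad-pumped: effective data rate = 4 x the command clock.
def effective_mhz(command_clock_mhz: float) -> float:
    """Effective (QDR) data rate in MHz for a given GDDR5 command clock."""
    return command_clock_mhz * 4

print(effective_mhz(1753))  # GTX 770 reference: 7012 MHz effective (~7GHz)
print(effective_mhz(1990))  # an overclock landing just shy of 8GHz effective
```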
Gurdeep14 15th March 2014, 11:54 Quote
Could you guys also include the 1080p benchmark of Valley for the overclocking test? I'd like to compare this to my EVGA 780 Classified Hydro Copper, running at 1.3GHz, but I am only using a 1080p display.
Lenderz 15th March 2014, 23:57 Quote
Just got myself a Gainward GTX 780 Phantom Goes Like Hell edition, it's clocked to a similar level, and I'm really impressed with the power for the cost, besting a Titan for £380.

These factory OC 780's are absolute beasts.
AlienwareAndy 16th March 2014, 09:14 Quote
Didn't EVGA use different, better chips for their Classified cards? Just wondering... I'm sure I read they used better memory.
DystopianDream 8th April 2014, 12:07 Quote
When did bit-tech stop including 1080p benches? I know 1440p and 4K are all the rage, but the vast majority are still using 1080/1200p and it would be nice to have a comparison to what we currently have. We know most of the new £250+ cards can do 1080p/60Hz in most games, but what about those with 120/144Hz monitors?

Not including 1080p seems a bit of an oversight; 4K benches are nice to have but irrelevant for the vast majority of users, as most 4K displays cost more than a whole midrange system, whereas those looking to upgrade (like myself) would like a yardstick to compare against. I love that you have minimum as well as average benches, which makes the omission seem odder still.

In the review it even says
Quote:
We test at 1,920 x 1,080 (1080p) and 2,560 x 1,440, as well as at 5,760 x 1,080 (AMD Eyefinity/Nvidia Surround) and 3,840 x 2,160 (4K) with higher end cards.

:(

Edit:
Steam Hardware Survey
1920 x 1080 = 33.12% of users (largest single group)
2560 x 1440 = 0.96%
2560 x 1600 = 0.14%
5760 x 1080 = 0.07%

http://store.steampowered.com/hwsurvey

I know this is an enthusiast site, but be real
AlienwareAndy 8th April 2014, 12:13 Quote
Quote:
Originally Posted by DystopianDream
When did bit-tech stop including 1080 benches? I know 1440 and 4K are all the rage but the vast majority are still using 1080/1200p and it would be nice for a comparison to what we currently have. We know most of the new £250+ cards can do 1080/60Hz in most games, but what about those with 120/144hz monitors?

Not including 1080 seems a bit of an oversight as 4k benches are nice to have but irrelevant for the vast majority of users, as most 4k displays cost more than a whole midrange system, whereas those looking to upgrade (like myself) would like a yardstick to compare to. I love that you have minimum and average benches as well, the omission seems odd.

In the review it even says
:(


I think the general consensus makes sense, i.e. you don't really need a GTX 780 or a card of that stature for 1080p. That's what the 280X/770 are for, I guess?
Dogbert666 8th April 2014, 13:24 Quote
Quote:
Originally Posted by DystopianDream
When did bit-tech stop including 1080 benches? I know 1440 and 4K are all the rage but the vast majority are still using 1080/1200p and it would be nice for a comparison to what we currently have. We know most of the new £250+ cards can do 1080/60Hz in most games, but what about those with 120/144hz monitors?

Not including 1080 seems a bit of an oversight as 4k benches are nice to have but irrelevant for the vast majority of users, as most 4k displays cost more than a whole midrange system, whereas those looking to upgrade (like myself) would like a yardstick to compare to. I love that you have minimum and average benches as well, the omission seems odd.

Steam Hardware Survey
1920 x 1080 = 33.12% of users (largest single group)
2560 x 1440 = 0.96%
2560 x 1600 = 0.14%
5760 x 1080 = 0.07%

http://store.steampowered.com/hwsurvey

I know this is an enthusiast site, but be real

I'm well aware that 1080p is the most popular resolution, but you're making it seem as if we've stopped including it altogether. In truth, the only cards for which we don't include it are board partner versions of the highest-end cards, such as the GTX 780 or R9 290X and higher. However, when a new reference SKU is launched, no matter how high-end, we'll always include 1080p results:

http://www.bit-tech.net/hardware/graphics/2014/02/26/nvidia-geforce-gtx-titan-black-review/1

http://www.bit-tech.net/hardware/graphics/2014/04/08/amd-radeon-r9-295x2-review/1

If you're looking at buying a custom version of such a high end card, then chances are you're already familiar with the 1080p performance of the reference model, and if you're not then you can always read the original review for said card. With custom cards, the aim is to see how much better it is than the reference card, and in the case of such high end cards 1440p is enough to assess single screen performance. Admittedly, it doesn't take a massive amount of time to run the 1080p tests and generate the graphs etc., but even so it doesn't seem necessary in this case - with lower end custom cards we would of course include 1080p as it would be more relevant.
BD Hopkins 8th April 2014, 15:48 Quote
Why don't you include the test results of previous cards in the same class?
At the 780 level, you've tested the PNY GeForce GTX 780 XLR8 OC and the Asus ROG Poseidon GTX 780.

You've used the same test system for all of them, so you can't claim that it's apples and oranges.
You don't publish ambient testing temps, so it's not an issue of temperature testing mismatches.

Why, it's almost like your review is designed primarily to make the card look good, keeping the focus on how much better than a stock card it is, and how close it gets to a more expensive, higher-tiered card. Some might call this an advertisement.

We know a factory OC'ed card is going to be better than a stock card, dudes. Help us choose between them.
DystopianDream 23rd May 2014, 00:34 Quote
Quote:
Originally Posted by Dogbert666
I'm well aware that 1080p is the most popular resolution, but you're making it seem as if we've stopped including it all together. In truth, the only cards where we don't include it are board partner versions of the highest end cards, such as the GTX 780 or R9 290X or higher. However, when a new reference SKU is launched, no matter how high end, we'll always include 1080p results:

http://www.bit-tech.net/hardware/graphics/2014/02/26/nvidia-geforce-gtx-titan-black-review/1

http://www.bit-tech.net/hardware/graphics/2014/04/08/amd-radeon-r9-295x2-review/1

If you're looking at buying a custom version of such a high end card, then chances are you're already familiar with the 1080p performance of the reference model, and if you're not then you can always read the original review for said card. With custom cards, the aim is to see how much better it is than the reference card, and in the case of such high end cards 1440p is enough to assess single screen performance. Admittedly, it doesn't take a massive amount of time to run the 1080p tests and generate the graphs etc., but even so it doesn't seem necessary in this case - with lower end custom cards we would of course include 1080p as it would be more relevant.

Very true, thanks for explaining the logic. I looked at a couple of other reviews before posting and must have, by chance, looked only at non-reference ones.