
Gainward GeForce GTX 560 Ti 2048MB Phantom Review

Comments 26 to 50 of 53

Redbeaver 8th November 2011, 18:57 Quote
Need more Gainward availability in Canada :(

/sigh
kt3946 8th November 2011, 20:32 Quote
I stopped reading bit-tech for their GFX card reviews simply because they take AGES to come up with a new group of games to test.

Not to mention, only a goofball would test a 2GB card capped at 1920x1080 resolution. 2GB+ is meant for use at 1920x1200+ (try gaming on a 2560x1440 Dell U2711 at native with only 1GB... blech!). Higher resolutions are key here.

Too bad they stopped listening to their users a long time ago...
Cei 8th November 2011, 20:48 Quote
But would a 560Ti have enough grunt to play modern games at 2560x1600 anyway?

I think the point Bit-Tech were making is that it's odd to have 2GB of VRAM on a card that struggles to get good performance past 1920x1200 anyway.
Ljs 8th November 2011, 21:30 Quote
Quote:
Originally Posted by t.y.wan
Why would you not get a 6950 DirectCU II... It's even 1 pound cheaper... and the performance is WAY WAY better - -" (Price from overclock.net not overclockers.co.uk)...

If you are looking for quiet, just get a better case ~_~"

Me. I like quiet. I have an R3 and would still pick a quiet cooler any day over an extra ~4fps or whatever a factory OC would give.
Bede 8th November 2011, 22:10 Quote
I can't believe this wasn't tested at 2560x1440. The only real reason for 2GB is very high resolutions, and it would be interesting to see if the 560 Ti has the grunt (when equipped with 2GB of VRAM) to keep up with more expensive cards at that res.

Tbh I don't see why the standard test doesn't include a 27" or 30" monitor; many people have them, and it's a useful way of distinguishing cards.
Combatus 8th November 2011, 23:04 Quote
Quote:
Originally Posted by kt3946
I stopped reading bit-tech for their GFX card reviews simply because they take AGES to come up with a new group of games to test.

Not to mention, only a goofball would test a 2GB card capped at 1920x1080 resolution. 2GB+ is meant for use at 1920x1200+ (try gaming on a 2560x1440 Dell U2711 at native with only 1GB... blech!). Higher resolutions are key here.

Too bad they stopped listening to their users a long time ago...

There's little point testing a mid-range card at super-high resolutions because in games such as Bad Company 2 and Arma II, it's simply not going to cut the mustard, however much RAM it has. Even a GTX 580 can barely achieve playable framerates here; a GTX 590 is what you should be looking at, and even then the minimum drops to just over 40fps with double the RAM of the Phantom. What we conducted was a real-world test, pitting the card against the tests we'd expect it to face. Finding that the extra RAM did make a difference but that the minimum frame rate was still unplayable would be pointless.

We're holding off updating our games until we've found a suitable set of benchmarks and the next-gen cards have landed, at which point we'll be updating our drivers too - interesting hardware in the graphics market is understandably scarce at the moment and it takes a heck of a lot of work to retest everything. It's not something we can do every time a new game drops.
leexgx 9th November 2011, 03:10 Quote
2GB is only needed for three-way screen setups; the 560 only needs 1GB, as it won't be able to run at a screen res high enough to even use all of the VRAM up.
kt3946 9th November 2011, 03:27 Quote
Quote:
Originally Posted by Combatus

There's little point testing a mid-range card at super-high resolutions because in games such as Bad Company 2 and Arma II, it's simply not going to cut the mustard, however much RAM it has. Even a GTX 580 can barely achieve playable framerates here; a GTX 590 is what you should be looking at, and even then the minimum drops to just over 40fps with double the RAM of the Phantom. What we conducted was a real-world test, pitting the card against the tests we'd expect it to face. Finding that the extra RAM did make a difference but that the minimum frame rate was still unplayable would be pointless.

Wait a second here, when did a MINIMUM of 40fps at maximum detail at a rez of 1920x1080 suddenly become 'not good enough'?? What are you guys on over there?? A 40fps minimum is MORE than enough. Generally, you want an average somewhere around 30-40 for good gameplay. It'd be great to have 60fps in everything, but it isn't necessary to still nail the headshot. If a card manages a range of 20-60fps, that should be PLENTY to at least enjoy it. It may not be enough for professional competition, but in those cases, they're dialing down the fluffy graphics anyway.

I take any ARMA test with a VERY large grain of salt. That game has such a poorly optimized graphics engine that, even though it's already three years old and tops out at DirectX 9 graphics, it manages to bring a GTX 580 down to 50fps?? In my line of work, we'd never be able to release code that bad.

Stick with the optimized and heavily used engines: Unreal, Gamebryo, Hero Engine, Id Tech, Frostbite, etc.
Quote:
Originally Posted by Combatus

We're holding off updating our games until we've found a suitable set of benchmarks and the next gen cards have landed. At which time we'll be updating our drivers too - interesting hardware in the graphics market is understandably scarce at the moment and it takes a heck of a lot of work to retest everything. It's not something we can do every time a new game drops.

Agreed. Everyone is predominantly sitting on their hands waiting for the next releases, but that doesn't mean you can't move forward with the games list. You don't have to retest EVERYTHING; just add a new game or two to the list and drop some others that are old DX9 engines no one really concerns themselves with anymore...
Combatus 9th November 2011, 11:32 Quote
Quote:
Originally Posted by kt3946


Wait a second here, when did a MINIMUM of 40fps at maximum detail at a rez of 1920x1080 suddenly become 'not good enough'?? What are you guys on over there?? A 40fps minimum is MORE than enough. Generally, you want an average somewhere around 30-40 for good gameplay. It'd be great to have 60fps in everything, but it isn't necessary to still nail the headshot. If a card manages a range of 20-60fps, that should be PLENTY to at least enjoy it. It may not be enough for professional competition, but in those cases, they're dialing down the fluffy graphics anyway.

I take any ARMA test with a VERY large grain of salt. That game has such a poorly optimized graphics engine that, even though it's already three years old and tops out at DirectX 9 graphics, it manages to bring a GTX 580 down to 50fps?? In my line of work, we'd never be able to release code that bad.

Stick with the optimized and heavily used engines: Unreal, Gamebryo, Hero Engine, Id Tech, Frostbite, etc.

Agreed. Everyone is predominantly sitting on their hands waiting for the next releases, but that doesn't mean you can't move forward with the games list. You don't have to retest EVERYTHING; just add a new game or two to the list and drop some others that are old DX9 engines no one really concerns themselves with anymore...

Sorry, I think you misunderstood me there - 40fps is what the GTX 590 3GB manages at those resolutions. That is all. The minimum we consider to be playable is 25fps, but it seems highly unlikely that the Phantom, which costs less than a third as much, will manage a playable framerate.

ArmA isn't a tough game to run just because it's not as optimised as more mainstream games - the game worlds are enormous and very detailed, the AI is very hardcore, a whole load of stuff really. It also scales with more powerful hardware, which makes it a good benchmark - other tough-to-run games such as Crysis and FSX don't scale nearly as well.

Unfortunately, adding a single game to our tests means re-testing a considerable number of graphics cards - some of our tests are manual too, meaning we have to play through them scores of times, with several runs to make sure the results are consistent. At the moment there are over 30 individual test results for graphics cards at every resolution (90 tests in total if we do all resolutions) - dropping a new game in would mean running all of these at around 10 mins apiece, plus reinstalling Windows when we switch from AMD to Nvidia testing. This doesn't include using the game to find a consistent manual benchmark to start with, either. In short, it's not something that can quickly be done every time a game is released, and especially not when there are new generations of cards just around the corner.
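
(A rough back-of-the-envelope check of the workload described above; the per-test time and test counts are from the post, while the resolution count and reinstall overhead are assumptions for illustration.)

```python
# Back-of-the-envelope estimate of the retesting workload described above.
# tests_per_resolution and minutes_per_test come from the post; the
# resolution count and reinstall time are illustrative assumptions.
tests_per_resolution = 30
resolutions = 3                 # e.g. 1680x1050 / 1920x1080 / 2560x1600
minutes_per_test = 10
minutes_per_reinstall = 60      # Windows reinstall when switching AMD/Nvidia

total_minutes = (tests_per_resolution * resolutions * minutes_per_test
                 + minutes_per_reinstall)
print(f"~{total_minutes} minutes, roughly {total_minutes / 60:.0f} hours of benchmarking")
# -> ~960 minutes, about 16 hours, before even finding a repeatable
#    manual benchmark run in the new game.
```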
Bede 9th November 2011, 12:14 Quote
Quote:
Originally Posted by Combatus
There's little point testing a mid-range card at super-high resolutions because in games such as Bad Company 2 and Arma II, it's simply not going to cut the mustard, however much RAM it has. Even a GTX 580 can barely achieve playable framerates here; a GTX 590 is what you should be looking at, and even then the minimum drops to just over 40fps with double the RAM of the Phantom. What we conducted was a real-world test, pitting the card against the tests we'd expect it to face. Finding that the extra RAM did make a difference but that the minimum frame rate was still unplayable would be pointless.

We're holding off updating our games until we've found a suitable set of benchmarks and the next-gen cards have landed, at which point we'll be updating our drivers too - interesting hardware in the graphics market is understandably scarce at the moment and it takes a heck of a lot of work to retest everything. It's not something we can do every time a new game drops.

Fair enough, thanks for answering :) I still think that a 27" or 30" screen should be included in the tests; I know it creates more work, but it's another way of distinguishing between cards and very useful for people who have, or are considering, a large screen.
Redbeaver 9th November 2011, 15:09 Quote
Quote:
Originally Posted by Hustler
Tech Report just had an article pulled, presumably for breaking an NDA...

'New 560 Ti launches 29th November....1,280MB VRAM and 448 CUDA cores.

...so it may well be worth waiting till the end of this month if anyone is thinking about buying a 560..as the new specs will make it roughly the same speed as a 570, and supposedly the launch price will be the same as the regular 560.

now this is interesting..........
jon 9th November 2011, 16:45 Quote
"Disappointingly, the Phantom posted similar results to other stock-speed cards we've tested."

Disappointingly? No, *expected* -- it's a stock speed card, you said that at the beginning. To expect it to perform better is a bit ridiculous. What I want to know is why more concentration wasn't given to the overclock performance. With a cooler like that (7C idle and 37C under load -- very nice) this card's niche is in overclocking -- perhaps not in getting the best OC, but keeping the chips cool during that OC. How about checking noise, thermals, and power draw on the card once the OC has been set? Then show a comparison of performance against its stock settings, and relate that to the cooling differences with other cards at OC settings.

To have done all that testing of the card at stock speeds was both foolish and a waste of time -- it was set to reference specs when you got it; you should have known it was going to give you reference performance.

I hope there's a follow-up article coming ...
[USRF]Obiwan 10th November 2011, 09:29 Quote
All the benchmarks are a complete waste of time: all the cards manage to get 30+ frames, most of them do 60+ frames, 30fps is already smooth, and more than 60fps is only noticed by a small number of placebo people.

Only a handful of games manage to get low(er) scores, like BF BC2 and ArmA II, where ArmA II is an old, very badly written DX9 game.

I can play BF3 at 1920 res with everything on 'ultra' smoothly. And that is on a quad-core AMD with a GTX 460. Maybe the 16GB of RAM helps a lot, who knows?
[USRF]Obiwan 10th November 2011, 09:37 Quote
Quote:
Originally Posted by Redbeaver
Quote:
Originally Posted by Hustler
Tech Report just had an article pulled, presumably for breaking an NDA...

'New 560 Ti launches 29th November....1,280MB VRAM and 448 CUDA cores.

...so it may well be worth waiting till the end of this month if anyone is thinking about buying a 560..as the new specs will make it roughly the same speed as a 570, and supposedly the launch price will be the same as the regular 560.

now this is interesting..........


What NDA?

http://news.softpedia.com/news/Nvidia-s-GTX-560-Ti-448-Core-Get-Detailed-Expected-on-Nov-29-233176.shtml

http://www.guru3d.com/news.html#14526

http://www.techpowerup.com/154808/Everything-You-Need-To-Know-About-GeForce-GTX-560-Ti-448-Cores.html

http://www.maximum-tech.net/nvidia-gtx-560-ti-448-core-specs-leaked-coming-on-nov-29-6900/
Anfield 10th November 2011, 23:48 Quote
Quote:
Originally Posted by kt3946
only a goofball would test a 2GB card capped at 1920x1080 resolution. 2GB+ is meant for use at 1920x1200+ (try gaming on a 2560x1440 Dell U2711 at native with only 1GB... blech!). Higher resolutions are key here.

The GTX 560 Ti is a mid-range card; the fact that it has the same amount of memory as a high-end card doesn't change that.
Since it's a mid-range card, it makes perfect sense to test it at a mid-range resolution.

For resolutions above 1920x1080 you don't want a GTX 560 Ti, regardless of whether it has 64KB or 2GB of memory.
kt3946 11th November 2011, 02:28 Quote
Quote:
Originally Posted by Anfield
The GTX 560 Ti is a mid-range card; the fact that it has the same amount of memory as a high-end card doesn't change that.
Since it's a mid-range card, it makes perfect sense to test it at a mid-range resolution.

For resolutions above 1920x1080 you don't want a GTX 560 Ti, regardless of whether it has 64KB or 2GB of memory.

You're completely missing the point. Regardless of whether it has a 560 chip in it or a 520, the ONLY reason you'd have 2GB on a card is to handle higher-resolution frame buffers and/or multiple monitors at higher rez.

You don't need a 2GB card for 1920x1080; you barely even need 1GB. You *could* use the extra GB for texture cache, but frankly, systems are fast enough that most engines stream that data off of disc instead (see Rage/Unreal 3/Crytek, etc.), so most use the extra memory for things like extra back-buffers and higher levels of AA, all to deal with higher-resolution displays.
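
(To put illustrative numbers on that reasoning - a minimal sketch only, assuming simple double-buffered colour buffers; real VRAM use adds textures, render targets and driver overhead on top, so these are lower bounds.)

```python
# Raw colour-buffer maths behind the "more VRAM is for high res / high AA"
# argument. Buffer counts and MSAA levels are illustrative assumptions.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2, msaa=1):
    """Front + back colour buffers; MSAA multiplies per-pixel sample storage."""
    return width * height * bytes_per_pixel * buffers * msaa / 1024**2

for (w, h), msaa in [((1920, 1080), 4), ((2560, 1600), 4), ((5760, 1080), 4)]:
    print(f"{w}x{h} @ {msaa}x MSAA: ~{framebuffer_mb(w, h, msaa=msaa):.0f}MB")
# 2560x1600 has roughly twice the pixels of 1920x1080, so every per-pixel
# buffer roughly doubles with it; a triple-monitor span triples it.
```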

Not testing a 2GB card at higher resolutions is akin to not testing a car set up for rally racing on a dirt track just because it has a 4-banger in it...

And as for the other comments regarding having to go retest 16 cards each time?

Really?

You go and retest EVERY card for EVERY test EVERY time? Unless the drivers specifically state in the change-log that they've added optimizations for a particular game, the changes don't generally amount to much for older games. In fact, it's been shown (gah, can't find the site that showed it) that newer drivers are commonly SLOWER for older cards than previous revisions, which really results in skewed data over time...

Gah.
Anfield 11th November 2011, 05:56 Quote
Quote:
Originally Posted by kt3946
You're completely missing the point. Regardless of whether it has a 560 chip in it or a 520, the ONLY reason you'd have 2GB on a card is to handle higher-resolution frame buffers and/or multiple monitors at higher rez.

I only said it doesn't make sense to test low-end/mid-range GPUs at high res; I did not say it makes sense to build a video card with a GTX 560 Ti GPU and 2GB of memory.
It's the same BS from the card manufacturers as usual: they slap on more RAM for marketing purposes, but there's just no practical use for it on those particular cards. If you want a more extreme example, HIS does a 2GB Radeon 6450...

On a GPU that is fast enough to actually produce playable framerates in the latest games at high quality settings - a GTX 580, for example - it can make sense to increase the amount of memory, and of course such cards should then also be tested at the highest res/quality settings possible.
kt3946 12th November 2011, 03:18 Quote
Quote:
Originally Posted by Anfield
I only said it doesn't make sense to test low-end/mid-range GPUs at high res; I did not say it makes sense to build a video card with a GTX 560 Ti GPU and 2GB of memory.
It's the same BS from the card manufacturers as usual: they slap on more RAM for marketing purposes, but there's just no practical use for it on those particular cards. If you want a more extreme example, HIS does a 2GB Radeon 6450...

On a GPU that is fast enough to actually produce playable framerates in the latest games at high quality settings - a GTX 580, for example - it can make sense to increase the amount of memory, and of course such cards should then also be tested at the highest res/quality settings possible.

You're absolutely correct that it's all marketing hyperbole, and I agree with you. However, the point here is that Bit-Tech should be SHOWING that as part of the test. The ONLY reason you need 2GB is higher rez. The problem is, because they stopped at 1920x1080, there's no way to SHOW that the 2GB is effectively useless on a knee-capped card like this as a single-card solution.

Given what they've shown so far, one could draw the obviously incorrect conclusion that "Hey, this card comes with 2GB and it's as fast as the other 1GB cards, so it must be better!!" and then go and purchase the superfluous 2GB.

No, it won't hurt anything, but it'd be pointless, and if it cost a bit more than any other card, it'd be kind of a waste.

Not to mention, there's also the fact that two of these 560s with 2GB in an SLI setup actually work pretty well at higher rez. Even though they may not actually TEST the SLI setup, it could easily be extrapolated from the individual card tests if they bothered to test at appropriate resolutions (2560x1440, 2560x1600, etc.). As a single card the frame rates would be too low, but in an SLI setup they would add at roughly 85% scaling, and one could make the determination that this would work.
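
(A minimal sketch of that extrapolation, assuming the ~85% scaling figure above; the fps numbers themselves are made up for illustration.)

```python
# Hypothetical SLI extrapolation: estimate two-card performance from a
# single-card result using an assumed ~85% scaling for the second GPU.
def sli_estimate(single_fps, scaling=0.85):
    return single_fps * (1 + scaling)

single_min_fps = 24   # made-up single-card minimum at 2560x1600
print(f"Estimated two-way SLI minimum: ~{sli_estimate(single_min_fps):.0f}fps")
# -> ~44fps: unplayable on one card, comfortable on two - which is why
#    single-card numbers at high res would still be useful data.
```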

Unfortunately, they didn't. So now we'll have to wait, which is where the disappointment creeps in with their testing regimen. They're so laser-focused on the mid-range aspects of this card that they fail to realize it's actually useful in OTHER situations (multi-monitor setups for one, in addition to SLI setups, CUDA work, etc.).

Sigh. This site was originally based around modders, with a focus on finding creative ways to use components to their utmost.

Since the buyout, it's just turned into another PC Mag review site. No thinking outside the box anymore...
Noob? 14th November 2011, 13:27 Quote
I want one, :drool:.

The Phantom models look so..
sazza6969 21st November 2011, 16:32 Quote
Call me old-fashioned, but I thought DIRT 2 was a DX10 game, not DX11.

My EVGA GTX 580 with a Core i7 920 @ 3.8GHz does not come anywhere near those fps at 1920x1080 in DIRT 3.

So has somebody dropped a blob?...... DIRT 2 or DIRT 3 with the wrong fps?
south side sammy 22nd November 2011, 22:52 Quote
Why do all these reviews make it look like the extra RAM is there for more fps instead of smoother gameplay at higher settings?
nathandanielmorris 23rd November 2011, 23:15 Quote
What colour is the PCB? It looks black in the bit-tech photos, but all of the unboxings on YouTube and retailer websites show it as red.
south side sammy 24th November 2011, 02:01 Quote
Quote:
Originally Posted by andrew8200m
Proof??

http://forums.overclockers.co.uk/showthread.php?t=18344290

:)


Never saw RAM make that much difference in raw fps numbers on any site until I read this.... ????
Siwini 24th November 2011, 08:17 Quote
A VGA port? Get out of here... I mean, COME ON.