
GeForce GTX 560 sneak peek


The GeForce GTX 560 will play Duke Nukem Forever at 1,920 x 1,080, apparently.

Nvidia has disclosed a few details about its imminent GeForce GTX 560 GPU, including its launch date. There’s also a video showing a GTX 560 card in action, playing soon-to-be-released games Alice: Madness Returns, Duke Nukem Forever and Dungeon Siege III.

Nvidia says it designed the GTX 560 ‘because we wanted gamers to be able to play modern games at 1,920 x 1,080, the way they were meant to be played.’ Nvidia cites two facts from the latest Steam Hardware Survey: that the most popular graphics card is the GeForce 9800 GT and that the most popular screen resolution is 1,920 x 1,080.

However, as the company points out, ‘users running 9800 GTs and even older graphics cards would have to make compromises in performance, graphical settings or both to play games at such a high resolution.’

According to Nvidia, the GeForce GTX 560 will be positioned between the GeForce GTX 460 and the GeForce GTX 560 Ti in terms of price and performance, but more detailed information will have to wait until it launches on 17 May. The company claims that the new chip will enable gamers to play the latest games at 1,920 x 1,080, even with features such as PhysX and 3D Vision enabled. What's more, Nvidia says you'll also be able to pair up two GTX 560s to run games at 5,760 x 1,080 using Nvidia Surround.

In addition to the aforementioned video, Nvidia has also provided some detailed screenshots showing the difference that PhysX makes to Alice, Duke Nukem Forever in 3D (you’ll have to take its word for it) and Dungeon Siege III running across three screens in Surround mode. Here’s a taster of what Duke Nukem Forever looks like, if you haven’t seen it before:


Are you excited by the prospect of the GeForce GTX 560, or have you already bagged yourself a decent DX11 card? Let us know in the forums.

52 Comments

PlayedStation 13th May 2011, 15:49 Quote
Am I missing something here?
billysielu 13th May 2011, 15:49 Quote
Everyone's calling the 560 Ti a 560... this is just going to cause confusion.

-1 clarity
CrapBag 13th May 2011, 15:55 Quote
Why not just call it a 555 Ti or something? If it's gonna perform somewhere between the 550 Ti and the 560 Ti then surely that makes more sense.

Sits back and waits for everyone to start jumping up and down about another Nvidia naming conundrum.
alialias 13th May 2011, 15:55 Quote
Just got a 560 Ti Twin Frozr, no regrets.
Jaybles 13th May 2011, 15:56 Quote
So what the hell does the Ti bit mean???
will_123 13th May 2011, 15:57 Quote
confuzled.........
jon 13th May 2011, 15:59 Quote
+1 alialias
Claave 13th May 2011, 16:09 Quote
I'd love to explain, but this is from Nvidia:
'Please feel free to share the sneak peek with your readers, but for those of you currently preparing hardware reviews, keep in mind that all other GeForce GTX 560 information remains under embargo until May 17th.'

Sorry.
CrapBag 13th May 2011, 16:10 Quote
Quote:
Originally Posted by Jaybles
So what the hell does the Ti bit mean???

Ti stands for Titanium if I recall correctly; as for the relevance of that, I'll pass...
Lance 13th May 2011, 16:10 Quote
Quote:
Originally Posted by CrapBag
Why not just call it a 555 Ti or something? If it's gonna perform somewhere between the 550 Ti and the 560 Ti then surely that makes more sense.

Sits back and waits for everyone to start jumping up and down about another Nvidia naming conundrum.

Because this way PC World can sell their shitty PCs with 560!!!! stamped on them, without people noticing it's not a 560 Ti.
Cei 13th May 2011, 16:12 Quote
Ti was an old nomenclature that signified a faster card, back in the era of the 4-series cards. So the 560 Ti is faster than a 560, but it'll lead to confusion.
SMIFFYDUDE 13th May 2011, 16:13 Quote
Struggling to see the point of this confusing card unless it's a lot cheaper than the Ti.
Paradigm Shifter 13th May 2011, 16:21 Quote
Quote:
Originally Posted by Cei
Ti was an old nomenclature that signified a faster card, back in the era of the 4-series cards. So the 560 Ti is faster than a 560, but it'll lead to confusion.

GeForce 3 era, too.

GeForce 3 was replaced with GeForce 3 Ti 200 and Ti 500. Just to add to the confusion, the Ti 200 wasn't as fast as the 'vanilla' GeForce 3.

But seriously... when has common sense come into GPU naming schemes? (Or CPU naming schemes either, for that matter...?)
schmidtbag 13th May 2011, 16:23 Quote
@billysielu agreed, it already confused me. When I saw this article title I was thinking "...they already released the GTX 560".

Also, I don't agree with 1920x1080 being the way games are "meant to be played", especially considering Nvidia tends to put less memory on its cards than AMD/ATI. 1920x1080 is a nice resolution, but I don't understand why people are acting like it's mandatory by today's standards. To me, the lowest acceptable resolution is 1280x960; I currently use 1680x1050 and I'm very satisfied with it.
rollo 13th May 2011, 16:29 Quote
Difficult to buy a monitor that isn't 1920x1080 these days, so it's kinda irrelevant unless you're gaming on a 19-inch monitor.
Pete J 13th May 2011, 16:35 Quote
Well, I suppose it's better than calling it the 'GTX 560 Ti SE'.

Nvidia still need to take the guy who names their cards outside and beat him though.
xaser04 13th May 2011, 16:38 Quote
Quote:
Originally Posted by article
According to Nvidia, the GeForce GTX 560 will be positioned between the GeForce GTX 460 and the GeForce GTX 560 Ti in terms of price and performance, but more detailed information will have to wait until it launches on 17 May. The company claims that the new chip will enable gamers to play the latest games at 1,920 x 1,080, even with features such as PhysX and 3D Vision enabled. What's more, Nvidia says you'll also be able to pair up two GTX 560s to run games at 5,760 x 1,080 using Nvidia Surround.

I love the way this is written. A mildly overclocked GTX460 (which is all the GTX560 is, albeit "refined") could do this as well. A pair of GTX460s in SLI could also run Nvidia Surround quite happily.

Pricing this card will be difficult, what with GTX560 Tis available for under £170 (OCUK today only, as an example) and HD6870s available for just over £150 (Ebuyer).

I assume it will be priced to match a HD6870, but tbh you would be better off just spending a bit more and getting the GTX560 Ti when it's on offer, or buying the equally performing and more efficient HD6870.
Cei 13th May 2011, 17:09 Quote
Quote:
Originally Posted by Paradigm Shifter
GeForce 3 era, too.

GeForce 3 was replaced with GeForce 3 Ti 200 and Ti 500. Just to add to the confusion, the Ti 200 wasn't as fast as the 'vanilla' GeForce 3.

But seriously... when has common sense come into GPU naming schemes? (Or CPU naming schemes either, for that matter...?)

Actually, thinking about it, pretty sure there was a GeForce2 Ti that was the first NVIDIA card to use the name. Still ridiculous though.
knuck 13th May 2011, 17:12 Quote
Quote:
Originally Posted by Cei
Actually, thinking about it, pretty sure there was a GeForce2 Ti that was the first NVIDIA card to use the name. Still ridiculous though.

yes

The first three GF2s were the MX, the GTS and the Ultra. They later released a Ti and a Pro, I believe.

nVidia and ATi have been terrible at naming their products for over a decade. There's nothing new.

The only series that were named correctly were the GF3 (Ti200 and Ti500) and the 9x00 (from ATi).

The first GeForce was the GeForce 256, and noobs couldn't tell which one was better because they had no idea what DDR and SDR meant. The only way they could know was by the price.

The second GeForce was even more confusing because it existed in five different versions (as stated above), including two that came out later and sat somewhere in the middle, performance-wise.

The third GeForce was named correctly, but only because there were just two different cards.

The fourth GeForce was ridiculous. The Ti4200, 4400 and 4600 were fine, but the MX420 was actually a GeForce 2 MX, and the MX440 and MX460 were just superclocked versions. Then came the 4800 and 4800SE, which were actually just a Ti4400/4600 but with an 8X AGP port.

The fifth GeForce was ridiculous as well, with the introduction of the "50s" in names as well as LE and Ultra. Oh, and the cards sucked.

The sixth GeForce was just as bad, with the 6200, 6600 and 6800 as well as the LE, GT, Ultra and TurboCache versions. Oh, and newbs had to know whether they needed an AGP or a PCI-E card (but nVidia isn't to blame there, obviously).

ATi is just as bad. The days when a 9500 Pro was faster than a regular 9600 were confusing as hell. Then there was the Radeon 9000, which was just a Radeon 7500, which itself was just a regular Radeon DDR with higher clocks. Oh, and the Radeon 9200, which was just a renamed Radeon 8500. Both the 9000 and 9200 had the series 9 name, but neither supported DX9 (only 7 and 8 respectively).

Radeon X800, anyone? Or would you rather have an X800GTO? X800XT? X800XL? I'm pretty sure an X800GTO was actually faster than a regular X850, but I bet nobody knew that except us hardware geeks.

Oh, and who doesn't remember the Radeon X1950XTX? The more Xs you have, the more powerful your card is!


I don't think I need to go on, as most of you know the rest quite well. I like being nostalgic and remembering all those cards. I typed all this from memory though, so don't be too harsh if I made a mistake ;)
thehippoz 13th May 2011, 17:32 Quote
When's ATi coming out with the shrink?
Madness_3d 13th May 2011, 17:45 Quote
I thought bit-tech had gone back in time!
thetrashcanman 13th May 2011, 17:58 Quote
I hate Nvidia's naming scheme, it's as bad as Intel's, sheesh.
Lazy_Amp 13th May 2011, 18:10 Quote
I guess they're scared about putting out a card that has a name ending in 5 after the dreadful 465.
urobulos 13th May 2011, 18:14 Quote
This is terrible. When I looked at the article my first thought was "wtf bit-tech? Did someone put up an archived article by mistake?" Whoever works in Nvidia's (and ATI's, for that matter) marketing department and comes up with the naming schemes should really reconsider their ideas. Who the hell at the company thought this made sense????
B1GBUD 13th May 2011, 18:48 Quote
Oh the old days, how I fondly remember those.

I had the GeForce 3 Ti500, the big buzz about that was the nfiniteFX™ engine.... great I thought... limitless power... then the GeForce 4 Ti4600 came out... which had 2 nfiniteFX™ engines.... 2 x Infinity anyone?
r3loaded 13th May 2011, 19:09 Quote
They should have called the 560 Ti the 565 and left this new card named the 560 as it is. I think the 465 might have tainted the 65 suffix too much for them to do that, though.
Adnoctum 13th May 2011, 19:15 Quote
Quote:
Originally Posted by thehippoz
When's ATi coming out with the shrink?
According to the rumour sites, AMD has taped out its 28nm Fusion designs. I'm sure the graphics side (Southern Islands?) won't be too far behind. I suppose you can expect them by the end of the year, barring some show-stopping issue in the process (the design itself is pretty mature).
OCJunkie 13th May 2011, 19:20 Quote
Pfff, Nvidia and Intel have the worst naming schemes EVER. It's just designed to create confusion so they can sell steps-backward cards as higher-end products. Glad to be on AMD's side on this one...
Adnoctum 13th May 2011, 19:41 Quote
The Nvidia naming line-up is awfully crowded to be slotting in new models: 520, 530, 540, 550 Ti, 560, 560 Ti, 570, 580, 590. We'd better hope there's no refresh in the works! :)

The "Ti" zombie-like resurrection was a stupid move on Nvidia's part, IMO. Of course the conspiracy-laden explanation for Nvidia's naming is that it is on purpose: A confusing scheme exploits the suckers...I mean, consumers.
The more Nvidia mucks around with their naming scheme* the more I like AMD's, although their upcoming CPU naming scheme is promising to be the biggest, most confusing naming s***-storm of all time... >>OF ALL TIME!<<

*
Dipweasel 1: I've got a Gigabyte GeForce GTX 560 Ti 1GB Super Overclock 1000MHz! w00t!
Dipweasel 2: Pfft! Loser! I've got an EVGA GeForce GTX 560 Ti Maximum Graphics Edition Crysis 2! It's big in Japan!
Cei 13th May 2011, 20:03 Quote
Quote:
Originally Posted by Adnoctum

The more Nvidia mucks around with their naming scheme* the more I like AMD's

Right, with the AMD system of a 6870 being slower than a 5870 and so forth? Sorry, but AMD are just as bad with naming. It should be that a 6870 is superior to a 5870, a 6850 to a 5850, and so on, as it was with the 4xxx series to 5xxx. But no, the 6xxx series just ballsed that up.
Jaybles 13th May 2011, 20:12 Quote
Quote:
Originally Posted by Paradigm Shifter

But seriously... when has common sense come into GPU naming schemes? (Or CPU naming schemes either, for that matter...?)

Anyone thinking the latest AMD GPU names???
Adnoctum 13th May 2011, 20:29 Quote
Quote:
Originally Posted by Cei
Right, with the AMD system of a 6870 being slower than a 5870 and so forth?
Anyone over 11 years of age should know that you don't use naming schemes to compare performance between generations. How old are you?

Quick question: If I had a HD5850 should I get a HD6770 to replace it? It seems a sensible upgrade, because 6770 is a higher number than 5850. Also, my forehead hurts. Do you think I should stop hitting it on the desk?
Evildead666 13th May 2011, 20:49 Quote
What worries me more is the fact that they are showcasing these three games.
What have they paid to get 'good enough' performance out of the 'basic' 560?

These aren't TWIMTBP games, are they?
urobulos 13th May 2011, 21:25 Quote
The naming for the 4k and 5k series from AMD made perfect sense. 2xx Nvidia cards and 4xx, 465 excepted, also followed clear logic. So apparently both companies can do it when they feel like it. This is just silly.
jon 13th May 2011, 21:25 Quote
They've got some odd logic here.

According to this article, nVidia is seeing a discrepancy between the % of people using GF9800 cards and the people gaming at 1920x1080, claiming that the former are obviously using the latter, and are having to "make compromises in performance, graphical settings, or both to play games at such a high resolution." Let's assume they're using Dec 2010 stats, in order to justify their rationale. If we convert all percentages to whole numbers to simplify the math (everything is scalable after that), then the percentage of DX10 card users in Dec 2010 was 72.37%, or 72 people (out of 100). 5.9% (call it 6%) of those people owned GF9800 cards. 0.06 * 72 = 4.32 ... let's round it all the way up to 5, just to help out nVidia here. That means 5 people out of 100 are making the compromises nVidia says they are ... if we assume those are the same people gaming at 1920x1080!

Nowhere does it say those 5 people (5% of the total user base) are the same people as those gaming at 1920x1080 ... in fact, they can't be, because a full 21% of the user base is gaming at 1920x1080!
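
Or, to put the same sums in a few lines of Python (a rough sketch only; the figures are the survey numbers quoted above, taken at face value):

```python
# Back-of-the-envelope version of the arithmetic above, using the quoted
# Dec 2010 Steam survey figures (assumed, not independently verified).
dx10_share = 0.7237         # DX10-class card owners, as a fraction of all users
gf9800_within_dx10 = 0.059  # GeForce 9800 owners, as a fraction of DX10 users
res_1080p_share = 0.21      # users gaming at 1,920 x 1,080, out of everyone

gf9800_total = dx10_share * gf9800_within_dx10
print(f"9800 owners, share of all users: {gf9800_total:.2%}")      # ~4.27%
print(f"1080p gamers, share of all users: {res_1080p_share:.2%}")  # 21.00%
# Even rounding the 9800 crowd up to 5%, they'd cover barely a quarter of
# the 21% gaming at 1080p - the two groups cannot be the same people.
```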

So, either nVidia is REALLY concerned about the 5% of the total user base in Steam ....

or this is just more marketing bull***t to help them get rid of chips that didn't make the 560 Ti grade.

You decide. :)
jon 13th May 2011, 21:26 Quote
Never try to edit your post directly on the comments page ... :)
Cei 13th May 2011, 21:47 Quote
Quote:
Originally Posted by Adnoctum
Anyone over 11 years of age should know that you don't use naming schemes to compare performance between generations. How old are you?

Quick question: If I had a HD5850 should I get a HD6770 to replace it? It seems a sensible upgrade, because 6770 is a higher number than 5850. Also, my forehead hurts. Do you think I should stop hitting it on the desk?

Wow, somebody is feeling a bit patronising today, aren't they? I never even compared a 5850 to a 6770. For that, I'd personally expect the 5850 to be faster, but the 6770 to have features (be that power draw, or whatever) that the other didn't. Lo and behold, that's what happens.

Yet what happens when you go for like-for-like? The assumption would be that a 6850 is faster than a 5850. This isn't the case. Equally, a 6870 would be assumed to be faster than a 5870, but no, you need a 6890.

AMD/ATi used to have it that the x870 was the fastest single card in a generation, which then got supplanted by the x890 in mid-cycle. The 4xxx series and 5xxx series followed this perfectly, and it all made sense. Yet when the 6xxx series came along, you could buy a new GPU that was slower than its equivalent model in the old generation. Sense was not made.
Sloth 13th May 2011, 22:03 Quote
Shame they didn't do exactly as r3loaded said and replace the Ti with 5s on the end. It'd be the easiest naming scheme yet!
Quote:
Originally Posted by Cei
Right, with the AMD system of a 6870 being slower than a 5870 and so forth? Sorry, but AMD are just as bad with naming. It should be that a 6870 is superior to a 5870, a 6850 to a 5850, and so on, as it was with the 4xxx series to 5xxx. But no, the 6xxx series just ballsed that up.
Rule of thumb: add 100 to the 5000 series card. It's still the same scheme, just moved up, likely because the 900 range was only being used for one card. Move it up and you have more cards with 9 in their name; looks faster.

It's mildly frustrating when looking between generations, but the numbers only ever indicate where the card is intended to sit in the product line-up. If you're upgrading purely on product names and not looking at the performance, you're already doing it wrong. A quick glance at a benchmark and you'd quickly see something has changed.
CrapBag 13th May 2011, 22:08 Quote
Quote:
Originally Posted by Sloth
Shame they didn't do exactly as r3loaded said and replace the Ti with 5s on the end. It'd be the easiest naming scheme yet!

Rule of thumb: add 100 to the 5000 series card. It's still the same scheme, just moved up, likely because the 900 range was only being used for one card. Move it up and you have more cards with 9 in their name; looks faster.

It's mildly frustrating when looking between generations, but the numbers only ever indicate where the card is intended to sit in the product line-up. If you're upgrading purely on product names and not looking at the performance, you're already doing it wrong. A quick glance at a benchmark and you'd quickly see something has changed.

Exactly. Bitching at naming/numbering schemes is just an excuse for fanboys to ride in on their white horses and start pointing fingers.

Geez, they're big companies, they want to sell products, so they pull a few marketing tricks here and there. Big deal.
adidan 13th May 2011, 22:35 Quote
Quote:
Originally Posted by Cei
GeForce2 Ti
I loved that card!
Adnoctum 13th May 2011, 23:00 Quote
Quote:
Originally Posted by Cei
Wow, somebody is feeling a bit patronising today aren't they.

I concede to the reality of your observation.
Everybody has been pushing my buttons today, which may have more to do with a general dissatisfaction at having to return to my normal routine after a break than with the elevated level of interpersonal and environmental chaos I found upon my return... but I can't be certain. Correlation and causation and all that.
Quote:
Originally Posted by Cei
Yet what happens when you go for like-for-like? The assumption would be that a 6850 is faster than a 5850. This isn't the case. Equally, a 6870 would be assumed to be faster than a 5870, but no, you need a 6890.

Once you realise that it's actually a made-up descriptive system for marketing purposes that bears no resemblance to benchmarked results, you don't feel so bad about it.
I understand your point, and to a certain degree I agree, but AMD had a difficult decision to make. They had a good naming system in place, but there was a requirement to reposition the line-up to accommodate the entry-level Fusion GPUs (you'll notice AMD says the APUs include a named GPU, e.g. the E350 includes a HD6310 GPU) as well as an entirely new tier of GPU (the HD68xx), and this pushed the discrete cards up. It was either make a new system (everyone would scream) or modify the existing one (everyone would scream).
Quote:
Originally Posted by Cei
AMD/ATi used to have it that the x870 was the fastest single card in a generation, which then got supplanted by the x890 in mid-cycle. The 4xxx series and 5xxx series followed this perfectly, and it all made sense. Yet when the 6xxx series came along, you could buy a new GPU that was slower than its equivalent model in the old generation. Sense was not made.

But a look at the price points of the HD5870 and HD6870 would show anyone that they were not equivalent-class cards. Everyone was screaming (me included) that the price/performance gap between the HD5770 and the HD5850 was too large (let's face it, it was a yawning chasm!). Now there are comfortable gaps between the HD6770, HD6850 and HD6870.

It isn't perfect, but it beats Nvidia's seemingly random assortment of letters/numbers. It seems that the GTX is creeping lower and lower. Now we have a GTX550. Whatever happened to GTS at the mid-range, and GT at the bottom? GTX doesn't mean anything any more, and the Ti never did mean anything. :(
slothy89 14th May 2011, 00:46 Quote
I fail to see the point of this card. You have the 460, which is a bargain these days; two of those will smash pretty much anything, with similar performance to a 580.
The 560 Ti is a step up from the 460, and the 550 performs just below the 460 if I recall. And I mean just.
Is there really a big enough performance gap between the 460 and 560 Ti to warrant this extra tier? I'm pretty sure the Gainward 460 Goes Like Hell edition performs within about 1-2% of the 560 Ti as it is...

nVidia just needs a way to get rid of its low-grade GPUs, but calling it the 560? That's just downright confusing. Much the same as the 460SE was, and I believe this is gonna be the same type of difference. GTX555 would have been more appropriate, so at least the uninformed can tell there's a difference.

PC hardware naming conventions are getting beyond a joke...
Fizzban 14th May 2011, 01:42 Quote
So Intel and AMD were having a beer and Nvidia wanted to join them, but they were refused as none of their line-up was confusing enough... enter the 560. You know, the newer, shitter version with essentially the same name as a previous, superior version.
Noisiv 14th May 2011, 08:11 Quote
It never gets old...
Everyone and their grandma have an expert opinion about a naming scheme as perceived by a brain-dead zombie noob.

A scheme which looks like this:
GTX 450 < GTX 550 Ti < GTX 460 < GTX 560 < GTX 560 Ti

Now will the zombie noob please stand up and, instead of giving advice to poor, poor Nvidia, kindly explain what's so confusing to him? Other than 190 AD being more recent than 1900 BC.

Because I have a hunch everything would be so much clearer for him if he just looked at the product prices, learned to Google, or asked anyone with half a working brain to assist him with his purchase.
stoff3r 14th May 2011, 11:37 Quote
I do like 1920x1080, so let's just make it standard for gaming; that way we'll at least have one standard, and one less thing for the software makers to think about.

I wonder when we'll see the first benches of BF3, I gotta know what to buy.
do_it_anyway 14th May 2011, 12:14 Quote
Quote:
Originally Posted by Noisiv
It never gets old...
Everyone and their grandma have an expert opinion about a naming scheme as perceived by a brain-dead zombie noob.

A scheme which looks like this:
GTX 450 < GTX 550 Ti < GTX 460 < GTX 560 < GTX 560 Ti

Now will the zombie noob please stand up and, instead of giving advice to poor, poor Nvidia, kindly explain what's so confusing to him? Other than 190 AD being more recent than 1900 BC.

Because I have a hunch everything would be so much clearer for him if he just looked at the product prices, learned to Google, or asked anyone with half a working brain to assist him with his purchase.
I think I agree with this. Although I've read it twice and am still not entirely sure what Noisiv is saying, I *think* his point is that it's not that difficult to understand.

The funny thing is that the "older" PCers hanker after the naming schemes of four-plus years ago, which were supposedly so much more understandable.
But they weren't. Back then we had the 8800GTS, 8800GTX, 8800GTX Ultra, 8800GT and 8800GS. And that's BEFORE Nvidia went a bit daft and released the 9800GT (which was an 8800GT) and the 9800GTX+, which then became the GTS 250.

When we buy a car, we go out and buy, say, a Golf. We can buy the 2.0 in various "flavours" such as the Golf 2.0 Twist, Golf 2.0 S, Golf 2.0 Match, Golf 2.0 GT, Golf 2.0 R and Golf 2.0 GTi. All of them are named Golf, all have a 2.0 engine, and all of them cost different amounts and have different levels of performance and kit. And you wouldn't expect someone to go out, buy a Golf S and complain they thought they were getting 0-60 in under 8 seconds with leather as standard.
The letters after the name tell you what to expect from that car, just like the letters after the card name tell you whether you are getting a top-level or mid-level version of that card.

It's only odd because we haven't had that naming for a while. Nvidia went for mid numbers such as the GTX465, which confused everyone, and now they've changed back and people are still crying foul.
At first I too thought it was silly to name another card the 560, but in retrospect is it really that bad?
Enzo Matrix 14th May 2011, 15:41 Quote
Quote:
Originally Posted by Paradigm Shifter
But seriously... when has common sense come into GPU naming schemes? (Or CPU naming schemes either, for that matter...?)
About the time of the ATI HD 3000 series... but only on the ATI side of things. And only up to and including the 5000 series.
Noisiv 14th May 2011, 16:40 Quote
Pregnant women don't make this kind of drama when it comes to giving names.
Altron 15th May 2011, 19:56 Quote
Quote:
Originally Posted by Cei
Quote:
Originally Posted by Adnoctum
Anyone over 11 years of age should know that you don't use naming schemes to compare performance between generations. How old are you?

Quick question: If I had a HD5850 should I get a HD6770 to replace it? It seems a sensible upgrade, because 6770 is a higher number than 5850. Also, my forehead hurts. Do you think I should stop hitting it on the desk?

Wow, somebody is feeling a bit patronising today, aren't they? I never even compared a 5850 to a 6770. For that, I'd personally expect the 5850 to be faster, but the 6770 to have features (be that power draw, or whatever) that the other didn't. Lo and behold, that's what happens.

Yet what happens when you go for like-for-like? The assumption would be that a 6850 is faster than a 5850. This isn't the case. Equally, a 6870 would be assumed to be faster than a 5870, but no, you need a 6890.

AMD/ATi used to have it that the x870 was the fastest single card in a generation, which then got supplanted by the x890 in mid-cycle. The 4xxx series and 5xxx series followed this perfectly, and it all made sense. Yet when the 6xxx series came along, you could buy a new GPU that was slower than its equivalent model in the old generation. Sense was not made.

There was no mid-cycle refresh of the 5xxx - the 5890 never existed. There was also a 4860, which had no 5xxx equivalent.

They haven't been able to make up their minds with the dual-GPU cards either. The first two generations, 3870X2 and 4870X2, make perfect sense. Then they decided to make the x9xx into the dual card, so a HD5870X2 became a HD5970. But then they got tired of that and made a 6970X2 into a 6990. Personally, I think it would be simpler to just retain the X2 on the end, but maybe it was triggered by nVidia, who switched from a 9800GX2 (to denote a dual 9800) to a GTX 295 (to denote a dual 275). Perhaps AMD felt that they needed an x9xx card because nVidia did. The 6790 also kinda bugs me, as it's really a crippled 6870, not the same chip as the 6750 or 6770.

I personally wasn't a fan of nVidia moving the naming around the mid-range. The 460 got confusing with the 460 1GB, the 460 768MB and the 460 SE 1GB. Now the "Ti" has been revived. I thought the GT/GTS/GTX scheme made sense without any suffixes.

The OCD engineer in me believes in a very consistent naming scheme, but in reality, AMD and nVidia are concerned with selling cards. If there is a gap in their lineup, and they are losing business, they will find a way to fill it. Just look at how the 460 shook things up for ATI. The 5830 was crappy, and there was nothing between a $150-170 HD5770 and a $270-300 HD5850. The GTX460 just swept in and filled that huge gap. And, of course, within a couple months, AMD rolls out the HD6850 and HD6870 right in that gap. More variety can be a good thing. That critical $175-250 price range had nothing but the crippled HD5830 and GTX465 last generation, and now with a combination of price drops and new cards, there are the 460, 470, 560, 6850, 6870, and 6950 all in that price range, and all good cards.
slothy89 16th May 2011, 05:18 Quote
Most people are complaining about the naming conventions; tbh I'm not worried about that. What I don't get is why we need yet another card in this performance/price range.
As I said in my previous post, is the gap between a factory OC 460 and the reference 560 Ti really big enough to warrant another card in the middle? I need to re-read a review I found that compared a GTX 460 OC'd @ 800/4000 to the stock 560 Ti; from memory it may have been as little as a 3-4% difference, e.g. 57fps vs 60. Would you really notice that?
Elton 16th May 2011, 07:56 Quote
It's become an arms race for every frame, literally every frame.
CowBlazed 17th May 2011, 01:49 Quote
The 560 Ti is much faster than a 460 1GB now. I was surprised myself, but look at any recent reviews.

The 460 is actually looking really weak compared to the cards in its price range. It's riding off word of its success alone at this point, selling for $150-180, when a Sapphire 5850 mops the floor with one for $135. Even the 6850 is generally better at everything but HAWX and one or two other Nvidia games.