bit-tech.net

What’s next for the GeForce 500-series?

Posted on 10th Nov 2010 at 14:44 by Clive Webster with 56 comments

Now that we’ve seen the speed of the GeForce GTX 580 1.5GB, our attention naturally turns to the future. Unfortunately, while we know some interesting things about the forthcoming ATI Radeon HD 6900-series, we’re still not allowed to tell you anything about it, so I’ll focus instead on what I think the rest of the GeForce 500-series might have in store.

There are two worrying aspects to the GeForce 400-series if you’re an ATI employee or supporter – many of the GPUs have had disabled units, and they’ve proved to be roughly 25 per cent overclockable. Add to this the new fp16 capabilities of the GF110 re-vamp that we should assume the whole range will use and you’re looking at some frightening numbers for the Radeon clan.

If we say that every GeForce 500-series GPU will have proportionally the same frequency increase as the GTX 580 1.5GB has over the GTX 480 1.5GB, and if each GeForce 500-series has one extra SM over its 400-series counterpart (as the GTX 580 1.5GB does), we could see the GeForce GTX 570 1.3GB being 29 per cent faster than the GTX 470 1.3GB. That should be enough to overhaul the Radeon HD 5870 1GB in most tests, but that’s not where things are really interesting – when we move to the mid-range, ATI could be truly scuppered.

If we stick to the same conjecture – one extra SM and proportionally the same extra frequency – we see a GeForce GTX 560 GPU that’s up to 38 per cent faster than the GeForce GTX 460 GPU. What’s more, as the chip would be the same size as the GF104 of the GTX 460, a GTX 560 would cost the same, giving us cards that cost £150 to £180. If this conjecture is even remotely true, the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison.
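The working behind these figures can be sketched as a back-of-the-envelope model – a toy sketch that assumes performance scales simply with SM count times core clock, using the GTX 580/480 core clocks of 772MHz and 700MHz. It deliberately ignores the fp16 gains mentioned above, so it lands below the 29 and 38 per cent figures:

```python
# Toy scaling model: assume performance is proportional to
# (SM count x core clock). All figures are illustrative
# assumptions, not confirmed specifications.

def speedup(old_sms: int, new_sms: int, clock_ratio: float) -> float:
    """Fractional performance gain under the SM x clock model."""
    return (new_sms / old_sms) * clock_ratio - 1

# GTX 580 vs GTX 480 core clocks: 772MHz vs 700MHz
clock_ratio = 772 / 700

# Hypothetical 'GTX 570': the GTX 470's 14 SMs plus one, same clock bump
print(f"'570' vs GTX 470: {speedup(14, 15, clock_ratio):.0%}")  # 18%

# Hypothetical 'GTX 560': the GTX 460's 7 SMs plus one, same clock bump
print(f"'560' vs GTX 460: {speedup(7, 8, clock_ratio):.0%}")  # 26%
```

On this crude model the gains come out at roughly 18 and 26 per cent; the larger figures above also bake in the fp16 texture filtering improvements of the GF110-style revamp.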

So, what can we expect from the GeForce 500-series? It could well be seen as one of Nvidia’s most brilliant GPU ranges since the GeForce 8-series. If so, ATI will have failed to take full advantage of the DX11 monopoly it was gifted late last year.

Click to see my (fairly rough) working for the above speculation.

56 Comments

Discuss in the forums
rehk 10th November 2010, 15:57 Quote
I predict wallets suddenly becoming lighter if any of this is even slightly true.
wuyanxu 10th November 2010, 16:00 Quote
wow, everywhere i look, people are praising the 500 series. is it that good? so good that it makes 400 cards obsolete?

of course, by the time a full lineup has been released, it'd probably be one year since 500's corresponding card came to market.
okenobi 10th November 2010, 16:01 Quote
And when is this mythical 560 due to arrive? I'm assuming it's a ways off yet....
Snips 10th November 2010, 16:01 Quote
I know I'm biased towards the green team but even I can't believe AMD would let such an excellent head start slip through their fingers. I know you guys must have the cards in hand but can't you just let slip the potential of the AMD cards?
flipman 10th November 2010, 16:03 Quote
"wow i never saw this coming" – so let's wait and see if it makes that much difference; AMD is near the release of their cards
mrbens 10th November 2010, 16:57 Quote
Nvidia do seem to be impressing at the moment. I'm so impressed by my 460 and cannot wait 'till around March time to buy another for SLI.

C'mon Clive!:
Quote:
if you’re an ATI employee of supporter – many of the the GPUs
TWeaK 10th November 2010, 17:07 Quote
Quote:
Originally Posted by Snips
I know I'm biased towards the green team but even I can't believe AMD would let such an excellent head start slip through their fingers. I know you guys must have the cards in hand but can't you just let slip the potential of the AMD cards?

In their defense, most of their advantage was lost due to TSMC screwing up 32nm. That was the cat they were going to let out of the bag after Nvidia responded, instead TSMC threw the bag in a river.
new_world_order 10th November 2010, 17:14 Quote
Quote:
Originally Posted by rehk
I predict wallets suddenly becoming lighter if any of this is even slightly true.

ROFL!!!!!!!!!!!!!!!!!!! :)
memeroot 10th November 2010, 17:32 Quote
indeed - looking forward to them... but WHEN!
Lord-Vale3 10th November 2010, 17:34 Quote
That's interesting. nVidia seems to have jumped right back on its feet.
chrisb2e9 10th November 2010, 18:02 Quote
This is how it goes. One company has the lead for a while, and then the other one will. And back, and forth. So on, etc.
Xtrafresh 10th November 2010, 18:03 Quote
granted, the 580 is a good card that has truly unlocked the potential of the Fermi architecture, but i'm not a big fan of this kind of wild conjecture. For instance, the GF104 is a completely different design than the GF100 (of which the GF110 is a direct derivative). There is no basis at all to believe that it has as much overhead left to be unlocked as the GF100, and for a start, the GTX 460 doesn't have any unused SMs like the GTX 480, does it?

Going even further, you could say that the 6870 is the successor of the 5770, and it has a 70% performance increase (min framerate, 19x12, here: http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/7), so this means that the 6970 will also feature a 70% performance hike over the 5870, obliterating nVidia till kingdom come.

Of course, there's a million things wrong with all that, just as your ludicrous math has no merit to it whatsoever. I'd expect to see these kinds of posts from nVidia fanboys, but you are supposed to be a respected journalistic outlet. Yeah fine, it's a blog, and some speculation is fine, but please don't become the next Charlie.
WarrenJ 10th November 2010, 18:13 Quote
Personally, I would like to see a dual-GPU card à la the GTX 295. Maybe 2 460s or 2 470s? Are they on their way yet?
borandi 10th November 2010, 18:18 Quote
Quote:
Add to this the new fp16 capabilities of the GF110 re-vamp

The newer fp16 capabilities were on the GF104 and 106 already. Only the GF100 cards didn't have them.
Evildead666 10th November 2010, 18:52 Quote
This seems a bit too much AMD bashing for a blog on the 500 series.
"...ATi (they're gone btw) could be truly scuppered."
"...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison."

Your PR is as strong as Nvidia's, and you state you have info about the 6900 series as well, which could influence some people (insider knowledge and all that).

So basically you're giving free PR to Nvidia, whilst trashing AMD's new 6xxx lineup.

And yet this is 'speculation' ?

I'm not pro- any side, but i do take exception to someone's personal speculative blog being on the front page of what should be a respectable tech site.
Being respected is to not be biased. If Bit Tech is leaning to one side or the other, put THAT on the front page, and change your name to NvNews or something.

This should not be on the Front page.
schmidtbag 10th November 2010, 19:25 Quote
i think now that nvidia has finally built that supercomputer they can finally spend their money and attention on their video cards. imo, the gtx500 series is just them milking their old architecture for what its worth but i'm sure the gtx600 series will be much faster, with much less power consumption, much less heat, much less noise, and probably smaller. although i currently use ati, i've always been an nvidia fan - they've always tried really hard to be the best and they just couldn't afford it this past year, but i'm sure they'll come back.

although i'm glad how well ati/amd have been doing (they really needed it) i'm REALLY disappointed in their hd6000 series so far. it's barely better; i sure hope the hd69xx cards will live up to expectations.


i do feel that bit-tech is a little biased towards nvidia. from the article i read yesterday about the gtx580, it did do great but it also wasn't #1 in every test, but bit-tech was acting as though it was #1 in all tests. i checked guru3d.com, and it was positioned #2 in almost every test.
Phoenixlight 10th November 2010, 21:11 Quote
"...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison..."
Not cool.
fingerbob69 10th November 2010, 21:29 Quote
"...the Radeon HD 6850 1GB and HD 6870 1GB will look laughably underpowered in comparison..."
Not cool.

But Not right.

What's more, the 580 is not a new chip in terms of new architecture. It's a mostly fixed 480. If we want to start getting our knickers in a twist over the name on the box, à la 68xx v 67xx, then this card should rightly be called the GTX 485.
leveller 10th November 2010, 21:29 Quote
This is the bit of the article that has my attention:

"Unfortunately, while we know some interesting things about the forthcoming ATI Radeon HD 6900-series, we’re still not allowed to tell you anything about it"

Which can be taken either way. I hate secrets :(
jason27131 10th November 2010, 21:33 Quote
lol. I used to read bit-tech because it was unbiased. Now, according to what I'm seeing in reviews and this specific blog, it seems like it's going green. Not cool man.
Kúsař 10th November 2010, 22:09 Quote
Hmmm...and I thought 460 was "final" version of mid-range fermi GPU just like 580 is final version of high end fermi.

I can't see AMD losing against nVidia unless the 6970 performs worse than the 6870. It's a new architecture and with a few months' tweaks we're looking at a fine battle between the GT5# and HD6# series...
And given the efficiency of 6800 GPUs, AMD can afford some price cuts on these...
Waynio 10th November 2010, 22:26 Quote
Hmmm, nvidia coming back with a vengeance, finally. I was expecting all this from what was said about fermi, but it was delayed & didn't live up to the hype nvidia were hinting at & instead was noisy & hot.
The best gpu upgrade I ever had though was the 8800gtx. If this turns out true I might be selling my 5870's to buy 1 & put the rest towards aluminium :D.

But

Blah, it's a blog, take with a pinch of salt until results roll in when available ;).
Snips 10th November 2010, 22:47 Quote
What about the Nvidia bashing evildead666?
PopcornMachine 10th November 2010, 23:18 Quote
Thanks for the biased speculation and no actual information, designed to fuel pointless arguments.
The boy 4rm oz 11th November 2010, 00:12 Quote
I was planning on getting a GTX470 for my upcoming build. If they release a GTX570 I will be extremely interested. I would like a GTX580 but the $699AUD asking price is a bit much, although it is less than I paid for my 8800GTX lol.
Evildead666 11th November 2010, 01:17 Quote
Quote:
Originally Posted by Snips
What about the Nvidia bashing evildead666?

My 8800's were great cards. ;)
Not pro or bashing anyone, cards maybe, but not the company itself.
we all have ups and downs, and seeing the prices the gtx470 and 460 are going for now, it's a great price war, and a win-win for us, the buyers ;)

just not wanting to see such a thing on the front page.
If it was in the blog section, I wouldn't have seen it lol.

He admits he slants to Nvidia, I get it, just not on the front page.
memeroot 11th November 2010, 02:12 Quote
bit tech bias... lol....

did you read the reviews of the 4xx series?

now there has been a price drop, nvidia are competing again (though probably not at much profit), and if they can up the count and clocks on their cards as the report says, then ati will have to stop making quite so many $$$ and drop the prices of their small die cards to compete

this is good
Krayzie_B.o.n.e. 11th November 2010, 02:24 Quote
Wow did I miss the Nvidia TWIMTBP screen splash before I read this article?

Is the GTX 580 a good card? Yes it is, as it runs cooler, quieter, uses less Watts and has a huge performance boost (about time). Do your speculations make any sense? NO, because the GTX 460's GF104 isn't missing any SMs, as it is a fully utilised design unlike the GF100.

Rumor has it the GTX 570 will use only 480 shaders, thus making it a cooler, quieter, more efficient GTX 480. No word on a GTX 560, although I'm more interested in Zotac's GTX 460 x2, which can perform on par with a GTX 580 (see GTX 460 SLI benchmarks) but for a lower price (2x GTX 460 1GB, $420)

As far as AMD goes, an HD 6870 CF beats a GTX 580 (see benchmarks). AMD only needs the HD 6970 and the 6950 to stay within 8 frames of their GTX 580 and 570 counterparts while releasing at a lower price point, as this has been AMD's strategy for some time. The HD 6990 will take care of the rest at the extreme price level, just like the HD 5970 is still the fastest now.

It's all about the $150 to $300 sweet spot in video cards and AMD has realized this (HD 6850 6870) and so has Nvidia (GTX 460 and soon to be GTX 470).
AMD or Nvidia, you can't go wrong, as both companies are at war for gamers' wallets and it's a win-win situation for all PC gamers.
Anakha 11th November 2010, 02:45 Quote
Quote:
Originally Posted by Snips
I know you guys must have the cards in hand but can't you just let slip the potential of the AMD cards?
No. They are under NDA, a legally binding agreement. If they break that NDA, not only do they get sued into oblivion (And CustomPC with them), but they also don't get any early looks at future AMD hardware. Asking them to do so is just plain stupid. Have a little patience.
Quote:
Originally Posted by fingerbob69
What's more, the 580 is not a new chip in terms of new architecture. It's a mostly fixed 480. If we want to start getting our knickers in a twist over the name on the box, à la 68xx v 67xx, then this card should rightly be called the GTX 485.
Going by your definition, a 68xx is "a mostly fixed 57xx". This still makes the GTX580 newer, as Fermi was delayed by a long six months (as all the ATi/AMD fanboys know full well, as they made the most of that time to muck-rake NVidia).
Quote:
Originally Posted by Kúsař
Hmmm...and I thought 460 was "final" version of mid-range fermi GPU just like 580 is final version of high end fermi.
The 460 was the first mid-range card they produced on Fermi.
480 = High end v1, 460 = Mid-Range v1.
Therefore 580 = High End v2, 560 = Mid-Range v2.
Quote:
Originally Posted by Kúsař
I can't see AMD losing against nVidia unless the 6970 performs worse than the 6870. It's a new architecture and with a few months' tweaks we're looking at a fine battle between the GT5# and HD6# series...
And given the efficiency of 6800 GPUs, AMD can afford some price cuts on these...
Given the performance gap between the 5870 and the 5890, it's possible to estimate the performance difference between the 6870 and the 6970 (Given that they are, essentially, GPU refreshes). That %age increase puts the 6970 still underperforming the 580 at this point, though it'll be a close-run thing.
Quote:
Originally Posted by PopcornMachine
Thanks for the biased speculation and no actual information, designed to fuel pointless arguments.
This is a blog entry. This makes it one person's opinion. Given that he has experience in the field, and the capability to do basic math (As shown by the included tables showing his calculation), this goes from "Biased speculation" to "Educated guess". AMD/ATi fans might not like it, as it knocks them from the pedestal they've been trying to cling to since Fermi came out, so let's try spelling this out:

IF NVidia refreshes their mid-range cards (Highly likely, as they've done so in the past) AND
IF the performance difference between it and the newly-released high-end card is similar to the difference between the 460 and 480 (Quite possible), AND
IF NVidia release it at the same price point (Which they are likely to do, as the manufacturing costs are the same),
THEN the "560" will wipe the floor with all currently released cards at its price point (Namely, the 6870 and 6850).

Admittedly, there are 3 big IFs in there, but all those IFs are highly likely to happen.

ATi/AMD have shown their hand for mid-range, and at the moment they have a winner. NVidia have not answered that play yet, and they're playing their cards close to their chest, but what they have shown for the high-end indicates a very strong mid-range hand. We'll soon find out just how ATi/AMD are going to answer NVidia's high-end challenge, and shortly after that we will probably hear NVidia's mid-range response. As it stands at the moment, though, if NVidia disabled an SM, downgraded the RAM, and called it good (As it has a history of doing, look at the 470, 9600 and so on) you get an idea of what a 560 would look like, and that is mighty tempting, and more powerful than the 68xx's. Which is where this blog post is coming from, IMHO.

I am not an NVidia rep, nor am I an ATi/AMD rep, nor do I have a vested interest in this fight (My machine is still running an Athlon 64 x2 5600+ with 2GB RAM, there's no way I could keep up with cards like these).
Skiddywinks 11th November 2010, 03:03 Quote
The 69xx series is not a refresh. It is considerably different to both the 58xx series and the 68xx series. Maybe not significantly, but definitely considerably.
Krayzie_B.o.n.e. 11th November 2010, 03:58 Quote
The GTX 460 is not a disabled GTX 480, as the 460 (GF104) ran cooler, quieter and more efficiently than the GTX 480 (GF100), so stop saying that.
Sure, it is based on the Fermi architecture, but obviously it went through a different engineering process to achieve its performance, versus your idea of just turning off SMs.

Sorry, but just disabling the GTX 580 down to a GTX 560 doesn't always work; just look at the horror show that is the GTX 465, and on the AMD side see the HD 5830.
The engineering process of the GTX 460 got Nvidia to the GTX 580. Sure, Nvidia could disable some SMs to achieve a GTX 570, but a GTX 560 may or may not be feasible, especially for that price bracket.

I don't see a GTX 560 wiping the floor with an HD 6870 – beating it in performance by 2%, sure, but not at the $230 price point the HD 6870 sits at, which will of course drop if needed should a GTX 560 arrive.
The GTX 570 is the key for Nvidia: if they can get it to perform slightly better than a GTX 480 for around a price of $350, then it will garner huge sales figures.

Nvidia having to refresh their line-up within 8 months of release shows they took a beating, but all questions of their impending return to dominance will be answered when the HD 6970 is released.
thehippoz 11th November 2010, 04:20 Quote
I would think the 6970 will beat the 580 and less power- probably better scaling too

but if bit has it in house.. and they are still blowing the green goblin
Krayzie_B.o.n.e. 11th November 2010, 04:41 Quote
Quote:
Originally Posted by thehippoz
I would think the 6970 will beat the 580 and less power- probably better scaling too

but if bit has it in house.. and they are still blowing the green goblin

I seriously don't see the HD 6970 beating the GTX 580 – maybe coming up 3 to 6% short but selling at a cheaper price. The HD 6990 will be AMD's ultimate card that surpasses the GTX 580, which hasn't surpassed the HD 5970 except maybe in Far Cry 2 and Metro 2033 with DoF and tessellation on.
rickysio 11th November 2010, 06:50 Quote
Oh wow, did I miss the memo about the comments being taken over by a fanboy parade today?
[USRF]Obiwan 11th November 2010, 09:26 Quote
Just as i want to buy a 460 you tell me there is a 560 coming for the same price.. Damn you!
dispie 11th November 2010, 09:59 Quote
this is going a bit fast,

i just installed my 2x 460 cards 3-4 months ago and the new ones are already under way and so much faster, damn you indeed.

although there is still much to overclock on my overclocked 460 cards, and i don't see me needing any more power than these 2 give at the moment, this is still going way too fast.
Xtrafresh 11th November 2010, 10:00 Quote
Quote:
Originally Posted by Anakha
No. They are under NDA, a legally binding agreement. If they break that NDA, not only do they get sued into oblivion (And CustomPC with them), but they also don't get any early looks at future AMD hardware. Asking them to do so is just plain stupid. Have a little patience.
No need to get rude man, have a little patience with your fellow human being. Also, i disagree with you. NDAs are all fine and dandy, but the industry is using them to effectively shut down all investigative journalism. Asking a journalist to do a little more than play along with the NDAs going around (a LOT at the moment) is very healthy imho.
Quote:

The 460 was the first mid-range card they produced on Fermi.
480 = High end v1, 460 = Mid-Range v1.
Therefore 580 = High End v2, 560 = Mid-Range v2.
FAIL. It doesn't work like that; this is exactly the logic that the article tries to apply, and also the logic that nVidia wants to put out there by naming this card the 580 instead of the 490. You are buying into the nVidia marketing machine, and we all know what they are shovelling.

THE GTX460 ALREADY HAS MOST CHARACTERISTICS OF A GTX580. THEREFORE, YOU CANNOT APPLY THE 480 -> 580 HIKE TO A SPECULATED 460 -> 560.

Ahem. Oh sorry, capslock got stuck there. And the text seems to be printed in bold too. How odd...
Quote:

Given the performance gap between the 5870 and the 5890, it's possible to estimate the performance difference between the 6870 and the 6970 (Given that they are, essentially, GPU refreshes). That %age increase puts the 6970 still underperforming the 580 at this point, though it'll be a close-run thing.
Wat.

First of all the 5890 does not exist. I'm not sure which performance gap you have in mind, but if you do want to make a forced analogy (which would be false for various reasons) the only valid one would be the gap between 5870 and 5770. This gap is around 70%. That would put the 6970 at 70% above the 6870, and that is well above the GTX580.

To recap:
First, you apply false logic, (results from the past are no guarantee for the future).
Second, while applying this logic you fill it with nonexistent data.
Third, even then, you fail at math and come to the wrong conclusion.

In short:
wat
Quote:
This is a blog entry. This makes it one person's opinion. Given that he has experience in the field, and the capability to do basic math (As shown by the included tables showing his calculation), this goes from "Biased speculation" to "Educated guess". AMD/ATi fans might not like it, as it knocks them from the pedestal they've been trying to cling to since Fermi came out,
Yeah, let's play this game. No wait, let's not. It's a blog entry, ok, but it's also a front page article that comes together with the GTX580 review. I was hoping/expecting to read about some genuine analysis, not wild conjecture that is easily proven false (the GTX460 has no SMs left to unlock, and already most of the architecture that gives the GTX580 a lead over the GTX480).
I am not a fanboy, i have as much disdain for Charlie's ramblings as i have for this piece of prose. I just don't like it when a site that i previously went to for reviews and analysis starts filling their pages with drivel.
Quote:
so let's try spelling this out:

IF NVidia refreshes their mid-range cards (Highly likely, as they've done so in the past) AND
IF the performance difference between it and the newly-released high-end card is similar to the difference between the 460 and 480 (Quite possible), AND
IF NVidia release it at the same price point (Which they are likely to do, as the manufacturing costs are the same),
THEN the "560" will wipe the floor with all currently released cards at its price point (Namely, the 6870 and 6850).
Yeah, let's spell it out. None of those IFs are likely to happen. Ready? GO!

IF NVidia refreshes their mid-range cards (There are no signs they will do so before 28nm, so this chance is at best 50/50) AND
IF the performance difference between it and the newly-released high-end card is similar to the difference between the 460 and 480 (possible, but only if nVidia pulls NEW tricks out of the hat, see the part where my caps lock button got conveniently stuck above. Let's be generous, put this at 30%), AND
IF NVidia release it at the same price point (they won't as they will want to see ROI from their R&D on a new card, but let's say 50/50 again just for fun),
THEN the "560" will wipe the floor with all currently released cards at its price point (Namely, the 6870 and 6850 (AMD might cut the prices significantly, but they probably won't, so this has a likelihood of 80%)).
Quote:

Admittedly, there are 3 big IFs in there, but all those IFs are highly likely to happen.
50% * 30% * 50% * 80% = 6% ;)
Quote:

ATi/AMD have shown their hand for mid-range, and at the moment they have a winner. NVidia have not answered that play yet, and they're playing their cards close to their chest, but what they have shown for the high-end indicates a very strong mid-range hand. We'll soon find out just how ATi/AMD are going to answer NVidia's high-end challenge, and shortly after that we will probably hear NVidia's mid-range response. As it stands at the moment, though, if NVidia disabled an SM, downgraded the RAM, and called it good (As it has a history of doing, look at the 470, 9600 and so on) you get an idea of what a 560 would look like, and that is mighty tempting, and more powerful than the 68xx's. Which is where this blog post is coming from, IMHO.
And here is where it really goes wrong. The GTX460 is not the same chip as a GTX480, which is the ONLY reason why nVidia can put it at a good pricepoint. If the GTX560 were a GF110 chip, instead of a hypothetical GF114, there's no way they could compete at 6850/6870 prices.
Quote:
Originally Posted by rickysio
Oh wow, did I miss the memo about the comments being taken over by a fanboy parade today?
Trollbait article.

EDIT for GTX560 speculations above:
Try googling for GTX560. Aside from the Black Ice watercooling radiator, you won't find a thing, not even speculation. You guys are all feeding on what the clever naming people of nVidia are shovelling.
Snips 11th November 2010, 10:12 Quote
Wow, that's dedication Xtrafresh. Shame no one will read it.
memeroot 11th November 2010, 10:36 Quote
@Xtrafresh

" I was hoping/expecting to read about some genuine analysis, not wild conjecture that is easily proven false (the GTX460 has no SMs left to unlock"

I count that the 460 has 48 shaders locked

quote and picture,

http://www.techpowerup.com/reviews/MSI/GeForce_GTX_460_Cyclone_OC_1_GB/

NVIDIA's GeForce Fermi (GF) 104 GPU comes with 384 shaders (CUDA cores) in the silicon but NVIDIA has disabled 48 of them to reach their intended performance targets and to improve GPU harvesting. Unlike with GF100, the GF104 has more populated streaming multiprocessors (SMs), 48 cores per SM vs. 32 cores per SM on GF100.
fingerbob69 11th November 2010, 11:07 Quote
But as you point out MEMEROOT, there was a reason for that: chip yields.

It's taken 8 months for nVidia to fix the 480, starting from a point where they knew on release it would need fixing. I'm not saying the 460 is similarly broken, but would nVidia have already started on a respin to fix yields for the 460?


I think more likely is that they'll bump up the clocks so that what is currently an OC'd 460 becomes a standard 560... as that card does currently sit between the 6850/6870.
Xtrafresh 11th November 2010, 11:10 Quote
Quote:
Originally Posted by memeroot
@Xtrafresh

" I was hoping/expecting to read about some genuine analysis, not wild conjecture that is easily proven false (the GTX460 has no SMs left to unlock"

I count that the 460 has 48 shaders locked

quote and picture,

http://www.techpowerup.com/reviews/MSI/GeForce_GTX_460_Cyclone_OC_1_GB/

NVIDIA's GeForce Fermi (GF) 104 GPU comes with 384 shaders (CUDA cores) in the silicon but NVIDIA has disabled 48 of them to reach their intended performance targets and to improve GPU harvesting. Unlike with GF100, the GF104 has more populated streaming multiprocessors (SMs), 48 cores per SM vs. 32 cores per SM on GF100.
Oh right, i was basing myself on a quick look here at this pic from anandtech:
http://images.anandtech.com/reviews/video/NVIDIA/GTX460/fullGF104.jpg

Most of the issues above still stand though. All this speculation about a GTX560 is complete BS, because a new design (GF114) would be needed, not a toned-down version of the GF110.
memeroot 11th November 2010, 11:20 Quote
one of the big changes they made was in the transistors used across the GF110 compared to the GF100; this could be cross-applicable to the GF104, making a GF114 and enabling the activation of the SMs

I don't think however you will see the same increase in clocks as between the GF100 and GF110 (as the GF104 already clocks high)
xaser04 11th November 2010, 11:59 Quote
Quote:
Originally Posted by Xtrafresh
granted, the 580 is a good card that has truly unlocked the potential of the Fermi architecture, but i'm not a big fan of this kind of wild conjecture. For instance, the GF104 is a completely different design than the GF100 (of which the GF110 is a direct derivative). There is no basis at all to believe that it has as much overhead left to be unlocked as the GF100, and for a start, the GTX 460 doesn't have any unused SMs like the GTX 480, does it?

Going even further, you could say that the 6870 is the successor of the 5770, and it has a 70% performance increase (min framerate, 19x12, here: http://www.bit-tech.net/hardware/graphics/2010/11/09/nvidia-geforce-gtx-580-review/7), so this means that the 6970 will also feature a 70% performance hike over the 5870, obliterating nVidia till kingdom come.

Of course, there's a million things wrong with all that, just as your ludicrous math has no merit to it whatsoever. I'd expect to see these kinds of posts from nVidia fanboys, but you are supposed to be a respected journalistic outlet. Yeah fine, it's a blog, and some speculation is fine, but please don't become the next Charlie.

The GF104 in the 460 has one SM disabled. Either this is due to yields or to prevent straight-up in-house competition with the much-more-expensive-to-produce 470.

A '560' with 384 shaders and a core clock of 800MHz could make a refresh of the 470 pointless (unless the latter has an SM increase to 480 shaders plus a core clock increase – basically making it a 480).
memeroot 11th November 2010, 12:22 Quote
@xaser04

I think that's what the article suggests will happen
xaser04 11th November 2010, 13:34 Quote
Quote:
Originally Posted by memeroot
@xaser04

I think that's what the article suggests will happen

Indeed, I missed that at first glance.

It's a shame though that we will see an entire 'new' line of GPUs that aren't much faster than the 'previous gen' models.
Action_Parsnip 11th November 2010, 15:50 Quote
These are all very good points, but all the competing ATI GPUs already released consume less power. In fact they will always have die size and power consumption on their side. The GTX 460, for instance, is in its current form already rather power hungry for the performance; they're very overclockable NOW, but what happens with extra units enabled? They'll be less overclockable and consume more power again. Yes, we're not in the GF100 league of power consumption and it's not excessive, but going forward the power numbers will only get worse, and likely disproportionately so in relation to performance growth.

Although it's not the be-all and end-all of things, as much as GTX 580 reviews state 'wait for Cayman to launch', conclusions on future GF104 derivatives will say that they are hotter and larger chips than competing ATI products. Manufacturing cost and performance per watt will be lucrative avenues to compete on until 28nm is ready next year.
Fractal 12th November 2010, 11:36 Quote
As has been pointed out several times already; this blog is mere speculation based on some flawed premises. It shouldn't be on the front page as it is mixed in with articles and so could be easily confused with one.

Whether or not AMD is letting the DX11 lead slip through its fingers, Bit-tech is letting its journalistic integrity slip through its fingers with such trash. Next thing you know they'll start reporting celebrity gossip.
rickysio 12th November 2010, 12:29 Quote
do_it_anyway 12th November 2010, 13:31 Quote
Quote:
Originally Posted by leveller
This is the bit of the article that has my attention:

"Unfortunately, while we know some interesting things about the forthcoming ATI Radeon HD 6900-series, we’re still not allowed to tell you anything about it"

Which can be taken either way. I hate secrets :(

True, but if you look at the GTX580 review, they say it's awesome, BUT you would be a fool to buy one right now. They recommend waiting until AMD release their new cards. I would read this as "AMD has something good up its sleeve, just wait and see".

Now, to all those thinking that BT are being biased, the fact they recommend NOT going out and buying one yet is (IMO) proof that they are still unbiased.
G0UDG 12th November 2010, 15:13 Quote
Hear, hear, well said do_it_anyway, I think you're spot on with your comment. The fact that BT recommended holding off till AMD shows its hand is proof they are unbiased – precisely the reason why I buy PC upgrades based on both PC reviews and comments from forum members who own and have used the products. We all know from experience we can trust BT reviews and advice, certainly from my point of view at least
general22 14th November 2010, 15:58 Quote
Wow it's a blog post that even mentions that it's conjecture and you fanbois turn it into a shitfest with misinformation flying around.

Why don't you idiots stop pledging your soul to a commercial entity and just enjoy the tech.
new_world_order 14th November 2010, 16:06 Quote
Quote:
Originally Posted by xaser04
Indeed, I missed that at first glance.

It's a shame though that we will see an entire 'new' line of GPUs that aren't much faster than the 'previous gen' models.

My thoughts exactly.
new_world_order 14th November 2010, 16:08 Quote
Quote:
Originally Posted by G0UDG
Hear, hear, well said do_it_anyway, I think you're spot on with your comment. The fact that BT recommended holding off till AMD shows its hand is proof they are unbiased – precisely the reason why I buy PC upgrades based on both PC reviews and comments from forum members who own and have used the products. We all know from experience we can trust BT reviews and advice, certainly from my point of view at least

Just noticed your avatar was Donald Sutherland from Kelly's Heroes, great movie :)
memeroot 14th November 2010, 16:50 Quote
"It's a shame though that we will see an entire 'new' line of GPUs that aren't much faster than the 'previous gen' models."

and with a cheapo 2nd hand GX2, history repeats itself.....

having said that, the new gen for me is getting interesting: the 580 is nice, new GX2 cards soon – wallet begging to be opened.
Zaim 14th November 2010, 17:00 Quote
Quote:
Originally Posted by WarrenJ
Personally, I would like to see a dual-GPU card à la the GTX 295. Maybe 2 460s or 2 470s? Are they on their way yet?

I agree. Their last dual card was the 295, which was good.
phinix 15th November 2010, 22:37 Quote
That's right - where is the dual Fermi? I would say it will be a dual GTX470, or à la GTX570...
I would buy it, definitely...