bit-tech.net

Nvidia says Fermi is great, but who knows?

Posted on 18th Jan 2010 at 12:28 by Clive Webster with 71 comments

You’ve probably seen a fair bit of Fermi coverage around the interwebs today, and you might even be wondering why we’ve not posted something. We were at the same briefing as these other sites, after all. If you’ve read the coverage elsewhere carefully, you’ll notice the odd dig about not seeing benchmarks or hardware from Nvidia – and that’s why we’ve not posted a big extravaganza alongside Keir Graham’s mod.

While the briefing last Sunday was very informative about Nvidia's thoughts on a variety of topics (not just Fermi), there was little opportunity to test Fermi hardware. Nvidia is just getting the first chips back from the fab, and is still assessing the yields and quality of the silicon in order to set the final clocks, so we still think the actual launch is six to eight weeks away. That means that any article we posted today would essentially say, ‘Nvidia thinks forthcoming Nvidia product will be great’, with the implication, ‘so hold off on buying an HD 5850 or 5870.’

That just doesn’t seem fair – sure, if you can wait a couple of months then do so, but if you need an upgrade now, those cards are very good (if you can find stock). After all, there was no mention in the briefing of how much a Fermi card will cost.

But I don’t want to bash Nvidia too much in this blog, instead I wanted to talk about the atmosphere of those briefings last week. Despite a few self-critical and honest comments regarding Fermi’s lateness and how much Nvidia would have liked to have had a product when Windows 7 launched, the general mood was one of calm enthusiasm and quiet pride. It was the feeling you get handing in a piece of work that you’ve absolutely nailed – that essay that you just know you aced, the report you’ve been working on for weeks that’s going to get you noticed by Them Upstairs, the mod you drop onto the forums that you know will get even the pros in a fluster. Nvidia, I think, reckons it’s on to an absolute winner with Fermi.

Whether this is true remains to be seen – even if the claims of being twice as fast as a GTX 285 and having eight times the geometry processing power are true, the price of the 3bn-transistor GPU could still make Fermi uncompetitive. With clock speeds yet to be defined, and only a couple of Nvidia-approved tests to go on, we’ll also avoid making any assumptions or postulations about how much faster or slower a GF100-equipped card will be than a Radeon HD 5870 for the moment.

While Nvidia had a lot of positive things to say about Fermi, we’ll wait until we have hardware to test in order to temper that enthusiasm with our own findings and objective analysis. We’re therefore using the Fermi briefing of last Sunday to get more of a head start on our technical understanding than we’ve been used to in the past.

71 Comments

wormy 18th January 2010, 13:33 Quote
and my girlfriend is the most beautiful woman in the world...but I'm not gonna let you see her...
AlexB 18th January 2010, 13:39 Quote
Quote:
Originally Posted by wormy
and my girlfriend is the most beautiful woman in the world...but I'm not gonna let you see her...

That's OK, I can do it with my eyes closed.
wormy 18th January 2010, 13:40 Quote
Quote:
Originally Posted by AlexB
That's OK, I can do it with my eyes closed.

keep your hands where I can see them!
yakyb 18th January 2010, 13:42 Quote
Quote:
Originally Posted by AlexB
Quote:
Originally Posted by wormy
and my girlfriend is the most beautiful woman in the world...but I'm not gonna let you see her...

That's OK, I can do it with my eyes closed.

eyes closed, bag over the head, same net result
andrew8200m 18th January 2010, 13:46 Quote
It's all good to see and I am sure Fermi will be a very good card indeed, but to say "hold off buying ATI parts" is just out of order. I'm still waiting as I have a hole burning in my pocket. Do I get 2x 5970, or do I get 3x Fermi cards? Hell... do I wait and get 2x Fermi dual-chip cards when they are released? I don't know! Life's full of complications at the minute; it's about time Nvidia pulled their fingers out and got this all sorted.

Andy
wormy 18th January 2010, 13:49 Quote
Quote:
Originally Posted by andrew8200m
It's all good to see and I am sure Fermi will be a very good card indeed, but to say "hold off buying ATI parts" is just out of order. I'm still waiting as I have a hole burning in my pocket. Do I get 2x 5970, or do I get 3x Fermi cards? Hell... do I wait and get 2x Fermi dual-chip cards when they are released? I don't know! Life's full of complications at the minute; it's about time Nvidia pulled their fingers out and got this all sorted.

Andy

Let me save your pockets by taking that money out of them and I...ahem...promise that you will get a significant return on it in due course.

Legal stuff - all of the above by me is a lie, and it's more likely that I'll spend the dough on a pair of swanky new 24" monitors to give me a three-screen Eyefinity setup, having decided a few months ago that I might as well get a 5850 and be done with worrying about what NVIDIA might produce one day, as yet unannounced.
wuyanxu 18th January 2010, 14:40 Quote
the price of Fermi will be the determining factor in whether it is a success. if they can manage a product that beats the 5870 and release it at £240, undercutting the 5870 by a lot, then it'd be a winner.

but then, it's nVidia we are talking about, with Fermi 2 being the same GPU as fermi 1, just a redesigned cooler
Tyrmot 18th January 2010, 14:42 Quote
Fair enough. No point doing their advertising for them till some benchmarks are available, after all. Even if you've got a wad of cash burning a hole in your pocket, I don't see that it can hurt to wait a couple of months and see what the competition can offer.
licenced 18th January 2010, 15:08 Quote
Quote:
Originally Posted by wuyanxu
the price of Fermi will be the determining factor in whether it is a success. if they can manage a product that beats the 5870 and release it at £240, undercutting the 5870 by a lot, then it'd be a winner
If you had a product that beat the competition on performance would you sell it for 20% less as well, especially if the competition's product was still hard to find on shelves? There's no way nVidia will undercut ATI unless they need to match them on a price-to-performance level.
Xir 18th January 2010, 15:43 Quote
Well...if Fermi's good...the price of the 5870 will go down.
if Fermi's THAT good...I might buy one.
Either way, I win :D
Claave 18th January 2010, 16:09 Quote
Quote:
Originally Posted by andrew8200m
Its all good to see and I am sure fermi will be a very good card indeed but to say "hold off buying ati parts" is just out of order.
Andy

Just to be clear, Nvidia never said 'don't buy ATI parts' - it's well aware that doing so would be out of order. But by saying its forthcoming product will be excellent, that's the implication (as I said in the blog).
thehippoz 18th January 2010, 17:00 Quote
just hope the yields are ok.. that's a lot of transistors
H2O 18th January 2010, 17:34 Quote
Quote:
Originally Posted by wuyanxu
the price of Fermi will be the determining factor in whether it is a success. if they can manage a product that beats the 5870 and release it at £240, undercutting the 5870 by a lot, then it'd be a winner.

but then, it's nVidia we are talking about, with Fermi 2 being the same GPU as fermi 1, just a redesigned cooler

There is no chance that the top-end versions (GTX360/GTX380) will be that low. TSMC is having some problems with its fab process for these cards, and since the die is so gigantic, the number of good chips that Nvidia gets back is very small compared to the HD58xx chips.

There have been reports that the production is so complicated, Nvidia is having problems getting chips with all 512 CUDA cores in working order. We do not know if they are getting any back at all, and if they are, how many. As a result, the GTX360 will probably be using defective chips with only 448 cores enabled.

The general consensus on the XS forums is that the GTX380 will be around $650.
Teelzebub 18th January 2010, 17:42 Quote
Was reading this on another site about Fermi, quite interesting:
--------------------------------------------------------------------------------

I've got screen shots of the benchmarks but they aren't really indicative of final performance.

The drivers and firmware on the cards are very beta and sure to show decent improvement before launch. All the GPU itself needs, hardware-wise, is a final fan and packaging. Fan and packaging can be finalized in a week, so it's drivers and software that need polish.

Figure that for the first time they have parallel geometric processing and have to invent the way to use it correctly. Then think how long it has taken programmers to make use of the cores in quad-core CPUs. Nvidia is doing that in a very short time frame without the benefit of the computing community in general contributing code.

That's a pretty daunting task.

Given the 80(ish)% increase in performance over the GTX-285 with rough drivers and firmware, and crappy game support for the new things Fermi brings to the table, we are probably looking at a 100% performance increase in raw FPS.

Once game developers dip into the cornucopia that is Fermi we'll likely see further gains.

I know if I were in game development I'd be drooling to get Fermi firing on all 8 cylinders.

A contact tells me that nothing will currently show off Fermi's Tessellation like Unigine's new bench, so maybe Unigine had a clue about Fermi's design.

Rumors still abound about poor yield at the foundry but I was told that "Fermi will be surprisingly affordable" so I'm not sure if yield problems are an issue.

Nvidia didn't provide a slide deck on APEX, which is their in-house programmer GUI for PhysX that they just give away. We saw them put PhysX on a moving figure in about 4 minutes, so it makes it easy to include PhysX in games. That might be a response to OpenCL on the horizon. They already support OpenCL, but here's the hitch.

Game developers might not accept OpenCL as readily as PhysX because if something blows up, who do they go to? It's open-source code. If PhysX buggers up, Nvidia sends an optimization team to support the in-game implementation.

Personally I like both camps, but let's face facts. 58xx is a good solid GPU, but it was a 'throw twice as many components at the GPU, based on the same design we already had' solution - albeit a very well executed one. The saving grace is ATI is due a core revamp, and Tessellation and DX11 should drag on long enough to give them time to develop a new core.

That's if the Geometry gamble Nvidia made pays off. There's the off chance that developers will look at it and say most games are console ports and we have no use for that. In which case Fermi still has an 80 - 100% performance increase.

In the meantime, at launch, they will have the new core revamp and take back the performance crown with a generational leap in core technology over ATI.

Gotta love competition, it's good for everyone.
thehippoz 18th January 2010, 17:42 Quote
I can see that easy.. nvidia had issues with the 280 on release too, with a batch of silicon that ran too hot.. for $650 in today's economy it better come with some tits
D-Cyph3r 18th January 2010, 17:49 Quote
Nvidia SAYS a lot, that's their problem.
thehippoz 18th January 2010, 17:55 Quote
Quote:
Originally Posted by Bumsrush
The saving grace is ATI is due a core revamp and Tessellation and DX11 should drag long enough to give them time to develop a new core.

I wouldn't be surprised if they keep ahead of Fermi with the head start they've had.. OpenCL is the only way to go

lol yeah dcyph3r exactly
dec 18th January 2010, 22:03 Quote
you know how they say people who drive massive cars are just trying to make up for something? maybe fermi is all "hey look i got 3 billion transistors!"

lol jk. they need to seriously get those cards out and push prices down.
Bede 19th January 2010, 01:29 Quote
Anyone else sometimes feel that having both ATI and Nvidia using TSMC, a crap manufacturer which simply cannot perform as required, is a bad thing?
Xir 19th January 2010, 09:40 Quote
wellll...ATI should have the AMD-knowledge behind it now.
And AMD, even if their designs aren't quite up to speed, has good manufacturing processes...
Errr...I mean Global Foundries has good manufacturing processes.

Process transfer isn't as easy as Intel makes it look :D
Remember when AMD tried to run its process at Chartered? Started building their own fab in Dresden 6 months later. :D
frontline 19th January 2010, 14:25 Quote
Maybe there will be some games out that actually need this much power by the time it is released... Or, more likely, we will be playing Mass Effect 2, BioShock 2, COD whatever_version_is_next, Battlefield: Bad Company 2, the latest Orange Box engine game and various other sequels/console ports on hardware that is more than up to the task already.

But at least it should drive down prices of the 5800 series towards 4800 series levels (hopefully).
Star*Dagger 20th January 2010, 09:08 Quote
nv-who?!?

Are those the guys who made my 8800GTX (now long retired), that I replaced with a 4870x2, which I am replacing with a 5970 and 3 27" monitors.

Try again another time nvidia, you are aced for the foreseeable future.
technogiant 21st January 2010, 20:40 Quote
TSMC have now solved the 40nm process problem, so hopefully availability will be good from both camps regardless of die size.
H2O 22nd January 2010, 03:31 Quote
I wouldn't hold my breath. From what we have heard, GF100 will run hot, use lots of power, and there is the possibility that even the A3 silicon is not good enough for long term production (there are rumors of Nvidia going to B1). There is also a member on XS who seems to have extensive inside knowledge on GF100 (he was correct about the deep-dive NDA release date, among other things) and he says that GF100 is not as good as everyone is saying. I hope he is wrong, but it appears that GF100 will not shine until a revision or a shrink to the 28nm node.
Krayzie_B.o.n.e. 22nd January 2010, 06:07 Quote
HD 5000 or Fermi.... how about who cares!!

My twin XFX HD4890's @ 1GHz can run everything under the sun absolutely fine. I do have an itch to upgrade but there's one problem...

Are there any games out to justify an upgrade? NO!! Seems as of late everything is a damn 360 port. I got a nice system that I refuse to play 360 ports on.

I'll upgrade when the PC-specific games are released. Other than that I might as well buy a PS3 with my GPU money.
Elton 22nd January 2010, 06:40 Quote
I have an itch to upgrade.

My HD4850 is slowly not doing well.
Star*Dagger 24th January 2010, 22:33 Quote
Quote:
Originally Posted by Elton
I have an itch to upgrade.

My HD4850 is slowly not doing well.

Upgrade to a 5870, or a 5970 if you can find and afford one.
crazyceo 27th January 2010, 09:31 Quote
(D-Cyph3r 18th January 2010, 16:49 Nvidia SAYS a lot, that's their problem.)

(Star*Dagger 20th January 2010, 08:08 nv-who?!?

Are those the guys who made my 8800GTX (now long retired), that I replaced with a 4870x2, which I am replacing with a 5970 and 3 27" monitors.

Try again another time nvidia, you are aced for the foreseeable future.)

Really guys?

AMD has been the biggest bullshit artist for the last decade, constantly overstating its own importance and then conveniently not being available for comment when proved wrong. The recent interview with Richard Huddy was a classic steaming pile that again will not be taken as gospel by anyone worth a carrot.

How can you hail the recent ATi release as all conquering when NOBODY knows exactly what Fermi holds?

If you are building a system today I would recommend a GTX260, which can be picked up for around £110 and will play most games at decent rates. Then wait for Bit-Tech benchtests in a few months before deciding which way to go.
Elton 27th January 2010, 13:05 Quote
Quote:
How can you hail the recent ATi release as all conquering when NOBODY knows exactly what Fermi holds?

Simply because it currently is conquering. And that Fermi isn't out yet.
crazyceo 27th January 2010, 15:36 Quote
When some are recommending spending the cash NOW - and let's be honest, the higher-end cards aren't exactly cheap - isn't it prudent to wait a mere 2 months before spending £300+ on cards that may or may not be the best today? They are assuming it will remain "ALL CONQUERING!" with no proof whatsoever.

Or you can keep those blinkers on!
Elton 28th January 2010, 00:36 Quote
Quote:
Originally Posted by crazyceo
When some are recommending spending the cash NOW - and let's be honest, the higher-end cards aren't exactly cheap - isn't it prudent to wait a mere 2 months before spending £300+ on cards that may or may not be the best today? They are assuming it will remain "ALL CONQUERING!" with no proof whatsoever.

Or you can keep those blinkers on!

Well, prudence isn't very well known to many. Personally I agree; I'm waiting for Fermi, not because it might win but because everything will be cheaper.

As far as it goes though, you do have to admit ATi is currently dominating. It's unrealistic to say that the RV8xx is all-conquering, as that's ludicrous, but you do have to admit that it is in the lead.
crazyceo 28th January 2010, 08:53 Quote
It currently has no competition, so yes. The sad thing about it is that it really hasn't jumped ahead by any huge margin. When the 8800GTX came out, it was just a massive leap forward which took quite a few years to realistically catch up to. Even Nvidia struggled to follow it up because it was so good. Have ATi leaped forward anywhere near as well as that? Honestly, no.

Every generation since has only really been a little step forward. Hopefully, Fermi (isn't that a pasta?) can kick start the gfx market with another huge leap forward and get the competition thinking again.
Elton 28th January 2010, 08:57 Quote
Yeah it is true, the HD58xx is undoubtedly not that big of a jump, but it's still quite competitive..

It's faster than the 2900XT by about 3 times, which if you look at it is quite impressive. Although, admittedly, it took them a while.

And Fermi was a physicist, not pasta, although Fettuccine is a pasta.
barndoor101 28th January 2010, 09:26 Quote
Quote:
Originally Posted by crazyceo
How can you hail the recent ATi release as all conquering when NOBODY knows exactly what Fermi holds?

If you are building a system today I would recommend a GTX260, which can be picked up for around £110 and will play most games at decent rates. Then wait for Bit-Tech benchtests in a few months before deciding which way to go.

because the latest ATi release has been selling as fast as it can be produced since it was released, I'm not sure what counts as 'conquering' but that comes close. Nvidia have had the entire market ripped out from under them - at the top end the HD5870 is twice as fast as the GTX285, and Nvidia is struggling to make a profit with its lower-end cards.

Comparing a product which isn't even released yet to one which has been selling like hotcakes is foolish. ATi's product is out there for you to buy; Nvidia's is hiding behind NDAs and crappy slideshows. Even sites (like BT) who were arguably Nvidia fans have started to doubt some of the stuff coming from them.

Also, where is this mythical 100-quid GTX260? Cheapest I found was £130 on Scan. You can buy an HD4890 for this price and get +20% performance.
crazyceo 28th January 2010, 10:04 Quote
I remember CustomPC doing a benchtest on all the current gfx cards before the 5xxx came out. It went through all the old models from both companies and then made a recommendation on whether to upgrade or not. Since I owned an 8800GTX I was keen to see whether it was worth the money to upgrade. I just checked, and it was this time last year: issue 65 (Feb 2009). It recommended that I would need to spend a serious amount of cash to gain a worthwhile performance increase.

One year on and I don't really think it's worth the immediate upgrade. Yes, I will benefit from reduced power consumption, and a new card would be a lot cooler. However, unless it's another huge step forward, is it really worth it?

At the very least, I'll hang on and see what Fettuccine brings.
crazyceo 28th January 2010, 10:07 Quote
"Also, where is this mythical 100-quid GTX260? Cheapest I found was £130 on Scan. You can buy an HD4890 for this price and get +20% performance."

It was an offer on Aria a few weeks ago. Fanboy quotes aside
barndoor101 28th January 2010, 10:35 Quote
Quote:
Originally Posted by crazyceo
"Also, where is this mythical 100-quid GTX260? Cheapest I found was £130 on Scan. You can buy an HD4890 for this price and get +20% performance."

It was an offer on Aria a few weeks ago. Fanboy quotes aside

A one-day offer does not a cheap card make. Cheapest there is now 145 quid.

What I take issue with is that you are comparing a product that has introduced a new version of DX, is making money, has review scores etc, with a product that hasn't been released yet - then saying the released product doesn't push the boundaries enough.

It's like saying Crysis wasn't advanced enough because Duke Nukem Forever was just around the corner...
Elton 28th January 2010, 11:09 Quote
I think Crazyceo is just addressing the crazed fanboys who really think that the HD5xxx will still be the all-time dominators even after Fermi comes out.

Of course it's speculation, and now, with a dwindling supply of GT200 chips (and an odd surge in RV7xx chips), you have terrible prices.
barndoor101 28th January 2010, 11:21 Quote
Quote:
Originally Posted by Elton
I think Crazyceo is just addressing the crazed fanboys who really think that the HD5xxx will still be the all-time dominators even after Fermi comes out.

Of course it's speculation, and now, with a dwindling supply of GT200 chips (and an odd surge in RV7xx chips), you have terrible prices.

problem is that in this industry, if you wait for the Next Big Thing™ you would never spend any money. All you can do is take a snapshot of what is currently the best item for your current price range.
crazyceo 28th January 2010, 12:01 Quote
Christ get off the high horse and read the posts for a change.

I haven't compared anything with future yet-to-be-released products. It was the other ATi fanboys claiming ATi will still be "ALL CONQUERING!" after Fermi is released.

Yes, you are right, the current crop of cards do not go far enough. DX11? Who's going to immediately benefit from that? DX10 has only just got to grips with the community. Making money? AMD made money last year due to Intel paying them off; without it, they made losses! Review scores? That just proves my point that the new 5xxx is just a very small step away from the 4xxx series. If you purchased a high-end 4xxx last year, are you going to spend all that money again on a high-end 5xxx this year? Obviously no.

As to the £110 GTX260, Aria had them running for a while but popularity obviously took over and they sold that batch out. Be quicker next time.

Just because YOU own one, doesn't make it the greatest.
memeroot 28th January 2010, 12:25 Quote
Only 3D really seems to offer the benefits that I'm looking for with the new batch of cards... Would rather have that and the power benefits of the ATI cards, but there doesn't seem to be that option at the moment.

Hence waiting to see what Fermi offers.
barndoor101 28th January 2010, 13:15 Quote
Quote:
Originally Posted by crazyceo
Christ get off the high horse and read the posts for a change.

I haven't compared anything with future yet-to-be-released products. It was the other ATi fanboys claiming ATi will still be "ALL CONQUERING!" after Fermi is released.

Yes, you are right, the current crop of cards do not go far enough. DX11? Who's going to immediately benefit from that? DX10 has only just got to grips with the community. Making money? AMD made money last year due to Intel paying them off; without it, they made losses! Review scores? That just proves my point that the new 5xxx is just a very small step away from the 4xxx series. If you purchased a high-end 4xxx last year, are you going to spend all that money again on a high-end 5xxx this year? Obviously no.

As to the £110 GTX260, Aria had them running for a while but popularity obviously took over and they sold that batch out. Be quicker next time.

Just because YOU own one, doesn't make it the greatest.

dude, just chill. The only reason ATi fanboys are saying that the 5000 series are all-conquering is because right now they are. The HD5870 is twice as fast as the next fastest single-GPU card (GTX285).

Just remember that the last time there was a massive leap forward (like 3x the performance of last-gen) was the 8800 (3 yrs ago), but Nvidia haven't been able to pull off that trick since.

For someone building a PC right now (that they want to use for gaming), the HD58xx series is what you are forced to get. Fermi might be released in March, but when is general availability? 1-2 months down the line? Also, don't expect a huge leap forward when compared to the 58xx series. Latest I heard was a 15-20% increase at best (with a chip almost twice as big and right on the edge of the ATX power envelope).

I have 2x HD4890s in CF. I have considered getting an HD5870 to replace them, just because I think the 2 cards are noisy, and for Eyefinity. But now I might just wait for the 58xx series refresh.
Elton 28th January 2010, 13:25 Quote
Quote:
Just remember that the last time there was a massive leap forward (like 3x the performance of last-gen) was the 8800 (3 yrs ago), but Nvidia haven't been able to pull off that trick since.

Actually no one has.
barndoor101 28th January 2010, 14:00 Quote
Quote:
Originally Posted by Elton
Actually no one has.

That's my point - no-one has been able to make such a huge leap forward, so to expect it from every new product generation is asking too much.
crazyceo 28th January 2010, 21:16 Quote
Quote:
Originally Posted by barndoor101
Quote:
Originally Posted by Elton
Actually no one has.

That's my point - no-one has been able to make such a huge leap forward, so to expect it from every new product generation is asking too much.

"Also, don't expect a huge leap forward when compared to the 58xx series. Latest I heard was a 15-20% increase at best"

Based on what evidence, when no one is quoting facts about the structure of Fermi, because no one knows?

Why can't we expect every new release to push the limits like the 8800GTX did? That's the yardstick they all have to aim for. Have ATi done it in the last 4 years? No! Have Nvidia recaptured that excellence? On the current product base, clearly no! Will Fermi recapture it? No one, including you, has any idea - just ATi fanboy rumours and wishes that it won't.

That's the problem with the current ATi range: it doesn't push enough past the 4xxx series to warrant the upgrade for most people.

As a community, we have to demand that ATi and Nvidia push those boundaries and not just rush out the next replacement with DX11 or an HDMI port or two and a changed number at the front.

Why is that asking too much?

You can settle for far less if you like, but I'm not parting with £400+ on a card that gives me nothing more than an HDMI port.
barndoor101 28th January 2010, 21:55 Quote
Quote:
Originally Posted by crazyceo
"Also, don't expect a huge leap forward when compared to the 58xx series. Latest I heard was a 15-20% increase at best"

Based on what evidence, when no one is quoting facts about the structure of Fermi, because no one knows?

Why can't we expect every new release to push the limits like the 8800GTX did? That's the yardstick they all have to aim for. Have ATi done it in the last 4 years? No! Have Nvidia recaptured that excellence? On the current product base, clearly no! Will Fermi recapture it? No one, including you, has any idea - just ATi fanboy rumours and wishes that it won't.

That's the problem with the current ATi range: it doesn't push enough past the 4xxx series to warrant the upgrade for most people.

As a community, we have to demand that ATi and Nvidia push those boundaries and not just rush out the next replacement with DX11 or an HDMI port or two and a changed number at the front.

Why is that asking too much?

You can settle for far less if you like, but I'm not parting with £400+ on a card that gives me nothing more than an HDMI port.

Then don't. No-one is forcing you to.

But think about it this way: how much R&D money do you think it takes to create a product 3-4 times better than the previous generation? Do you think these companies will spend that much EVERY generation? Of course they won't (which is why you find 8800 tech in the current gen of Nvidia cards). It is in their interests to deliver an evolutionary change as opposed to a revolutionary one, simply because some people (like most who frequent these forums) will stay at the bleeding edge and will pay good money to have that extra edge, as insignificant as it is. Then there are people like yourself who haven't upgraded since the 8800 - no extra money going into Nvidia's pockets on the 200 series.

maybe they could recoup their R&D spending every revolutionary generation, but they would make more money with the evolutionary path.
Elton 28th January 2010, 22:13 Quote
Quote:
That's the problem with the current ATi range, it doesn't push enough past the 4xxx series to warrant the upgrades to most people.

You could say the same for the X1950s ---> HD2xxx, the 9800Pro users ----> X800XTs...

It's unreasonable to expect a company to make a product 3x faster than their current one, and to be honest, the HD5xxx is 2x faster than the previous gen. Take a look - the only reason it doesn't seem so is that we've had X2 cards of late.
crazyceo 28th January 2010, 23:13 Quote
Quote:
Originally Posted by barndoor101
Quote:
Originally Posted by crazyceo
"Also, don't expect a huge leap forward when compared to the 58xx series. Latest I heard was a 15-20% increase at best"

Based on what evidence, when no one is quoting facts about the structure of Fermi, because no one knows?

Why can't we expect every new release to push the limits like the 8800GTX did? That's the yardstick they all have to aim for. Have ATi done it in the last 4 years? No! Have Nvidia recaptured that excellence? On the current product base, clearly no! Will Fermi recapture it? No one, including you, has any idea - just ATi fanboy rumours and wishes that it won't.

That's the problem with the current ATi range: it doesn't push enough past the 4xxx series to warrant the upgrade for most people.

As a community, we have to demand that ATi and Nvidia push those boundaries and not just rush out the next replacement with DX11 or an HDMI port or two and a changed number at the front.

Why is that asking too much?

You can settle for far less if you like, but I'm not parting with £400+ on a card that gives me nothing more than an HDMI port.

Then don't. No-one is forcing you to.

But think about it this way: how much R&D money do you think it takes to create a product 3-4 times better than the previous generation? Do you think these companies will spend that much EVERY generation? Of course they won't (which is why you find 8800 tech in the current gen of Nvidia cards). It is in their interests to deliver an evolutionary change as opposed to a revolutionary one, simply because some people (like most who frequent these forums) will stay at the bleeding edge and will pay good money to have that extra edge, as insignificant as it is. Then there are people like yourself who haven't upgraded since the 8800 - no extra money going into Nvidia's pockets on the 200 series.

maybe they could recoup their R&D spending every revolutionary generation, but they would make more money with the evolutionary path.

BHAAHHHHHH!

Sorry just pretending to be a sheep like you!

It would be cutting edge if was good. sadly it isn't! Let's just wait for Fermi and then decide and not be a ATi fanboy like you!

The companies SHOULD be spending the money from their successes on the R&D for the next generation of cards. Nvidia didn't, and thus the GTX 200 range, although good, just wasn't great. ATi had been playing catch-up for a few years after getting spanked by Nvidia and therefore had the time to develop an OK card, but again it wasn't great.

I could put my hand in my pocket today and happily go for a three-way setup, but I won't until I see how the next generation pans out. You go follow the rest of your sheep and potentially waste your money.

I'll wait and see what Fermi brings to the table.
thehippoz 28th January 2010, 23:34 Quote
ati said in that interview with bt.. they won't be ahead of fermi at launch but they will be later in the year.. they've had their card out since last september.. but unlike nvidia, I doubt they've been sitting on their hands the whole time expecting to milk the competition

really both companies are money milkers.. since they were caught price fixing already, I don't see how you can like one over the other.. just go with price to performance (not to mention features) - it'd be kinda lame to buy nvidia right now..

you gotta admit nvidia's dropped the ball this round.. they're only just releasing fermi and ati has had all this time to prepare a counter - they said it themselves.. I'd like to see the leobeater card myself

I'm probably getting the 5850 when the price drops some.. pretty sure they won't have much to compete with that card price-wise
barndoor101 29th January 2010, 14:14 Quote
Quote:
Originally Posted by crazyceo


*incoherent babble*

not being a sheep, just being a realist. i realise that these companies' first loyalty is to their creditors and investors, and they have to make as much money as possible. perhaps you should remove your head from your arse, then you might see this too.
Elton 30th January 2010, 03:25 Quote
Quote:
The companies SHOULD be spending the money from their successes on the R&D for the next generation of cards. Nvidia didn't and thus the GTX200 range although good just wasn't great. ATi had been playing catch up for a few years after getting spanked by Nvidia and therfore had the time to develope an OK card but again it wasn't great.

This is where your argument somewhat fails. You can't realistically expect any, I repeat, any company to use all of its resources on one project: that's first of all risky, secondly expensive, thirdly it makes the investors unhappy, and finally it is just unrealistic. Not to mention that the G80/G92 chips were good enough that almost nothing could catch up to them, but look at the time it took to get there...

Late 2005 to 2007 for the G80, and it only took about a year for the G80 to be replaced by the GT200. If they hadn't replaced the G80, they would've been stomped on (well, even more) by the RV7xx chips.

Also, in this market top performance doesn't win anything, and in fact isn't even the goal; the goal is to one-up the opposing team just slightly, or enough to convince consumers to purchase your product. If we had G80-like advances every generation, well frankly, we wouldn't need new GPUs for a while.
thehippoz 30th January 2010, 07:15 Quote
they could have stayed on top like intel does over amd if they'd kept going.. I think they were riding on their success as the fastest single-GPU cards for quite a while and milking that.. I really hope fermi is all they say it is - but I've learned not to believe it until I see it..

same with ati.. remember the hype after the 8800gtx was released - I put off buying until that pos from ati released, I was a big fan of the older x800 - even us ati fanboys jumped ship over to the g80, as they had fixed the image quality issues and managed to make ati's new card look clueless

price-wise we were looking at 600+ on the gtx.. considering how good it was at the time, I bought one and it kept its value long enough to sell it for 350 on ebay and go with the 9800gtx on release.. that card was a downgrade in certain respects, like supersampling

think that's what buyers into nvidia thought about the g92.. it was nothing but a cheap way for them to make a sub-par 8800gtx.. it wasn't until the 200 series that we saw a real jump up

can say nvidia cards hold their value.. I've been through 3 generations now and expect to sell off this 260 for around 80 bucks - but I'm probably going over to that 5850, unless fermi is affordable or so far over-the-top ridiculous (like the g80 was) that it's worth the price

not really likin nvidia right now
IanW 30th January 2010, 08:14 Quote
Quote:
Originally Posted by Cutter McJ1b
Nvidia says Fermi is great, but who knows?

And I say "Ship or GTFO!":|
crazyceo 30th January 2010, 14:33 Quote
Quote:
Originally Posted by barndoor101
Quote:
Originally Posted by crazyceo


*incoherent babble*

not being a sheep, just being a realist. i realise that these companies' first loyalty is to their creditors and investors, and they have to make as much money as possible. perhaps you should remove your head from your arse, then you might see this too.

White Noise, nothing more. Realist? Again you bring nothing to the conversation.
crazyceo 30th January 2010, 14:41 Quote
"This is where your argument somewhat fails. You can't realistically expect any, I repeat, any company to use all of its resources on one project: that's first of all risky, secondly expensive, thirdly it makes the investors unhappy, and finally it is just unrealistic. Not to mention that the G80/G92 chips were good enough that almost nothing could catch up to them, but look at the time it took to get there..."

Once it was there, it took over two years for anyone to even come close to it. Don't you think in that time they could have come up with something better?

It's naive to think that just because they did it once they couldn't do it again, or that ATi couldn't release a product making the same level of impact. It's almost as if both companies are just making do, since there really isn't anything out there software-wise to task them.

Both companies are making money (ATi especially after the Intel handout), so why not push the boundaries?

Otherwise, every release will still be held up against the 8800GTX and asked whether it made that level of impact. Unless Fermi can, the question will go on being asked of both companies.
barndoor101 30th January 2010, 18:10 Quote
Quote:
Originally Posted by crazyceo
White Noise, nothing more. Realist? Again you bring nothing to the conversation.

Ah my mistake. I didn't realise that 'bringing something to the conversation' meant calling anyone who disagreed with you a fanboy and then insulting them.

Till nvidia release Fermi, me and a lot of other people will be skeptical about anything they say, based on their past deeds.
Elton 30th January 2010, 23:37 Quote
Quote:
Once it was there, it took over two years for anyone to even come close to it. Don't you think in that time they could have come up with something better?

Remember that the G80 was a revolutionary chip, much like the RV3xx chips. The GT200 was an evolution or refinement of the G80, as was the G92, although the G92 really shouldn't count since it's more or less a die shrink.
Quote:
It's naive to think that just because they did it once they couldn't do it again, or that ATi couldn't release a product making the same level of impact. It's almost as if both companies are just making do, since there really isn't anything out there software-wise to task them.

Well, look at their histories: every DX iteration there's usually a revolutionary card followed by cards that refine the process (from the shader pipeline method, refined up until the G70 and RV5xx, to the shader cores that were created with the G80/RV6xx series and are still being refined to this day). From the RV3xx, which marked DX9, to the G80, these were revolutionary for sure, but in between most people didn't need to upgrade every gen, rather every other gen.
Quote:
Both companies are making money (ATi especially after the Intel handout), so why not push the boundaries?

Because the high-end market isn't the most profitable; the big money lies in price/performance for OEMs and mid-to-low-range consumers.
Quote:
Otherwise, every release will still be held up against the 8800GTX and asked whether it made that level of impact. Unless Fermi can, the question will go on being asked of both companies.

Well, if you look back, many GPUs were held against the 9800 PRO, which even today is quite formidable. And until we find a new revolutionary method for GPUs, they will be compared to the 8800GTX, because it was, well, a damn good performing product for the shader core method.
crazyceo 31st January 2010, 14:53 Quote
Quote:
Originally Posted by barndoor101
Quote:
Originally Posted by crazyceo
White Noise, nothing more. Realist? Again you bring nothing to the conversation.

Ah my mistake. I didn't realise that 'bringing something to the conversation' meant calling anyone who disagreed with you a fanboy and then insulting them.

Well, stop it then!

Till nvidia release Fermi, me and a lot of other people will be skeptical about anything they say, based on their past deeds.
barndoor101 31st January 2010, 15:20 Quote
quoting fail. i haven't called you a fanboy in the slightest, yet you persist in doing it.

you say you wanted AMD to push the boundaries. how could they have done this when the intel settlement happened AFTER the HD58xx shipped? you can't reinvent the wheel every generation when you have no money (although the gfx part of AMD - ATi - is the part which makes the most money).
Horizon 31st January 2010, 16:11 Quote
Well, if it does outperform the 5800 series, they have to get the pricing right, considering it runs hot and treats the PSU like an all-you-can-eat buffet. ATi is going to resort to the obvious dick move of dropping the price on the 5800 cards, but it depends on Nvidia whether they lower prices a little or through the floorboards. I'd like a 5870 for $299-250, but one for $250-199 I'd like even more.
Bindibadgi 31st January 2010, 16:31 Quote
Quote:
Originally Posted by Horizon
Well, if it does outperform the 5800 series, they have to get the pricing right, considering it runs hot and treats the PSU like an all-you-can-eat buffet. ATi is going to resort to the obvious dick move of dropping the price on the 5800 cards, but it depends on Nvidia whether they lower prices a little or through the floorboards. I'd like a 5870 for $299-250, but one for $250-199 I'd like even more.

DICK MOVE?

They are entitled to charge whatever they want to remain competitive. It's a free market economy. If they took the initiative to control the market early, well, good for them. How will it play out in the long term? We will see. ATI is well within its rights to make a little money now - just as Intel does with its performance lead. Do you think it really costs them $300 more to make an i7 960 versus an i7 930?

The fact that you can get a 1TFlop product of multi-billion transistors, from whichever company, at that price is astonishing - it would have been inconceivable five years ago. Without profit, these companies won't be competitive and won't make the future products that keep PC gaming alive.
Horizon 31st January 2010, 17:00 Quote
Quote:
Originally Posted by Bindibadgi
DICK MOVE?

They are entitled to charge whatever they want to remain competitive. It's a free market economy. If they took the initiative to control the market early, well, good for them. How will it play out in the long term? We will see. ATI is well within its rights to make a little money now - just as Intel does with its performance lead. Do you think it really costs them $300 more to make an i7 960 versus an i7 930?

The fact that you can get a 1TFlop product of multi-billion transistors, from whichever company, at that price is astonishing - it would have been inconceivable five years ago. Without profit, these companies won't be competitive and won't make the future products that keep PC gaming alive.

You're right, that was uncalled for.
Bindibadgi 31st January 2010, 17:24 Quote
Quote:
Originally Posted by Horizon
You're right, that was uncalled for.

:) +rep for being cool
crazyceo 31st January 2010, 20:53 Quote
Quote:
Originally Posted by barndoor101
quoting fail. i havent called you a fanboy in the slightest, yet you persist in doing it.

you say you wanted AMD to push the boundaries. how could they have done this when the intel settlement happened AFTER the HD58xx shipped? you cant reinvent the wheel every generation when you have no money (although the gfx part of AMD - ATi is the part which makes the most money).

Spoken like a true fanboy. Big Yawn!
barndoor101 31st January 2010, 22:24 Quote
Why thank you for that thoughtful and intelligent post.

Exactly how are you meant to create a revolutionary new product when you have no money? Try following your own advice and actually say something worthwhile instead of mouthbreathing as normal.
Quote:
Originally Posted by crazyceo
White Noise, nothing more. Realist? Again you bring nothing to the conversation.
Bindibadgi 1st February 2010, 00:42 Quote
Any more flames, troll or baiting gets people thrown out. Clear?

/polishes banhammer ready.
barndoor101 1st February 2010, 01:02 Quote
Sorry boss. I just don't like being called a fanboy whilst trying to make points.
Elton 1st February 2010, 02:53 Quote
Not to worry bindi, he's nowhere near as bad as the warrior guy on the Console gaming dying thread..
crazyceo 1st February 2010, 12:53 Quote
Quote:
Originally Posted by barndoor101
Why thank you for that thoughtful and intelligent post.

Exactly how are you meant to create a revolutionary new product when you have no money?

Thanks for the compliment.

So are you saying now that it's OK to put a new product on the shelves that is pretty much the same as the last, but with a few tweaks and gimmicks? You must concede that the difference between the ATi 4xxx series and the 5xxx series is very small.

I'm not just picking on AMD here; Nvidia failed completely with their 9xxx series, then did OK with their GTX 2xx series, but again it wasn't fantastic.

They both have money, and had money to develop these products, and that was before this industry hit the recession.
barndoor101 1st February 2010, 13:30 Quote
Quote:
Originally Posted by crazyceo
Thanks for the compliment.

So are you saying now that it's OK to put a new product on the shelves that is pretty much the same as the last, but with a few tweaks and gimmicks? You must concede that the difference between the ATi 4xxx series and the 5xxx series is very small.

I'm not just picking on AMD here; Nvidia failed completely with their 9xxx series, then did OK with their GTX 2xx series, but again it wasn't fantastic.

They both have money, and had money to develop these products, and that was before this industry hit the recession.

it was meant in sarcasm but never mind, let's keep this civil.

the differences between the HD48xx and 58xx are fairly small, but graphics isn't the only industry which uses evolutionary changes between generations.

Intel for example uses a tick-tock system, where they first introduce revolutionary new tech (e.g. nehalem) on the tock, then refine it on the tick (32nm clarkdale). Same went for Conroe -> Wolfdale. (no idea why it's the wrong way round)

one could argue that the HD58xx is a refinement of the HD48xx: a smaller process (40nm instead of 55nm), less heat, better power efficiency, and a few new features (DX11, eyefinity).

one thing i just thought of is this: it is slightly unfair to directly compare a fully matured product (i.e. the HD4890) with a new product (HD5870), as the drivers haven't properly matured yet (the same will happen for Fermi). this part pisses me off - why can't we have complete drivers at product launch? but then I come from a programming background and remember it is an iterative process.
chumbucket843 3rd February 2010, 00:48 Quote
Quote:
Originally Posted by barndoor101

one could argue that the HD58xx is a refinement of the HD48xx, using a smaller process tech (40nm instead of 55nm), less heat, more power efficient, a few new features (DX11, eyefinity).
since when have we had a refresh for a new API? usually chips have to be redesigned to support them. sure, evergreen was designed for time to market, but the performance just isn't that great and nvidia can't even get their dx11 part launched. to top it off, larrabee was cancelled. it looks like 2010 is going to be a boring year.