bit-tech.net

GT200: Nvidia GeForce GTX 280 analysis

bowman 24th June 2008, 10:35
'I can't help but feel this is a strange position to be in with the release of a completely new architecture because, generally speaking, the new generation of hardware completely outclasses everything that's gone before.'

That's because it isn't a new architecture; it's the old architecture with rearranged deck chairs and more of everything. It's more of the same.

Nvidia is resting on its laurels, taking chances with huge chips based on two-year-old tech - ATI is going to whop them this generation. I hope you include the 4850 and 4870 in your games tests.
Tim S 24th June 2008, 10:45
Sorry, I missed some emphasis on 'new'. It's new in the sense that it's changed, but it's a tweak of a tried, tested and successful architecture.
bowman 24th June 2008, 11:02
Sorry for the crassness. It's not directed at you, it's directed at Nvidia. I loved my 7800 and I love my 8800, but they made me expect similar gains for the next generation, which just didn't happen.
RotoSequence 24th June 2008, 11:12
I just wish this article had more in the way of gaming benchmarks - and I'd really like to see how the 48X0s compare to it. As things stand, while well written, the base of this article is about 50% Nvidia marketing FUD. :(
Tim S 24th June 2008, 11:19
Quote:
Originally Posted by RotoSequence
I just wish this article had more in the way of gaming benchmarks - and I'd really like to see how the 48X0s compare to it. As things stand, while well written, the base of this article is about 50% Nvidia marketing FUD. :(

I hear you about the gaming benchmarks - it just wasn't feasible in any reasonable timeframe. I'd made the mistake of thinking earlier in the year that there wouldn't be product launches the weekend after Computex and was busy getting married. :)

Anyway, most of the article was actually based on the roundtable and email discussions I had with the engineers and technical marketing team. The architecture was essentially a known quantity before the marketing presentations commenced and I started asking questions of the architects, the CUDA team, the PhysX team, the driver team, etc. Many didn't get answered, but most of them did - I've got about 10 that are outstanding at the moment and I hope to get them answered in a call later today.
Andy Mc 24th June 2008, 11:23
Quote:
Originally Posted by Tim S
Ageia has ported the PhysX API to CUDA and released a public driver that enables GPU-accelerated PhysX – the actual port to CUDA took about a month.

Download link for it didn't happen.
Tim S 24th June 2008, 11:24
Quote:
Originally Posted by Andy Mc
Quote:
Originally Posted by Tim S
Ageia has ported the PhysX API to CUDA and released a public driver that enables GPU-accelerated PhysX – the actual port to CUDA took about a month.

Download link for it didn't happen.

Sorry, missed it out - there are some links here: http://www.xtremesystems.org/forums/showpost.php?p=3075957&postcount=1
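
For a flavour of what a port to CUDA involves: physics is mostly independent per-body arithmetic, which maps onto one CUDA thread per object. Below is a minimal sketch of that shape - a toy particle integrator, not Ageia's actual code; the struct, kernel name and launch numbers are invented for illustration.

    #include <cuda_runtime.h>

    struct Particle { float3 pos; float3 vel; };

    // One thread integrates one particle - the same shape of per-body
    // arithmetic a physics API can farm out across the GPU's SPs.
    __global__ void integrate(Particle* p, int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        p[i].vel.y -= 9.81f * dt;        // gravity
        p[i].pos.x += p[i].vel.x * dt;   // advance position
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }

    int main()
    {
        const int n = 65536;
        Particle* d_p;
        cudaMalloc(&d_p, n * sizeof(Particle));
        cudaMemset(d_p, 0, n * sizeof(Particle));

        // 256 threads per block, enough blocks to cover every particle.
        integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
        cudaDeviceSynchronize();

        cudaFree(d_p);
        return 0;
    }

The <<<blocks, threads>>> launch is the only CUDA-specific syntax; everything inside the kernel is plain C, which goes some way to explaining how a port could take only about a month.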
Xtrafresh 24th June 2008, 11:48
Tim, those benchmarks you have lined up... will they be a next-gen shootout (including the 4870), or will that be a later article?

I'm kinda waiting for an upper mid-range card that would outperform my 8800 GTS 640 enough to be a reasonable upgrade. So far I'm just not getting the bang-for-buck, and I'm really anxious to see what the GTX 260 and the 4xxx series will do.

If you want, you can even borrow my whip to make your staff work harder :D
Bindibadgi 24th June 2008, 11:58
Quote:
Originally Posted by Xtrafresh

If you want, you can even borrow my whip to make your staff work harder :D

Unless it's got Harry's name at the top (in future), Tim does all the graphics writing and benchmarks entirely by himself. This article was 10k words alone :P

Joe and I have other things to do :P
Woodstock 24th June 2008, 12:09
First off, congratulations Tim, and good luck.

Secondly, I'm eagerly awaiting the gaming benchmarks.
frontline 24th June 2008, 12:36
Nice article - I look forward to seeing some gaming benchmarks, along with some in-depth info on the 4xxx series cards from ATI (bearing in mind Overclockers already have the 4850 in stock).

But looks like you've been a busy chap :)
wuyanxu 24th June 2008, 12:40
Very nice to see the G80 is included (unlike other sites, who didn't include the card). The G80 is probably what most users will be comparing against; since the G92 was so cheap, the GTX 280 is targeted at a completely different audience.

Because the G80 was such a HUGE step forward, people are expecting the same with this one. But in actual fact, Nvidia first made cost-effective improvements to the G80 through the G92, so that it could produce the 9800 GX2, just as the 7900 GX2 followed the 7800 GTX.
If we set aside the multi-GPU solutions, you'd see it's actually a large step forward from the G92/G80 architecture - just look at all those figures.

And in terms of idle power consumption, idle heat output and ultra-high-resolution gaming, the GTX 280 beats the 9800 GX2 without question - these are the most important points for the market the GTX 280 is aimed at. The 9800 GX2 only has an effective 512MB and it WILL run into memory problems at 30-inch resolutions (a rough sum after this post shows why), so to game at that resolution the only solutions would be 8800 GTXs in SLI or a GTX 280.
We all know multi-GPU hasn't matured yet; one still needs to mess around with profiles and compatibility issues to get at its full potential. So, setting multi-GPU aside, we can see the GTX 280 is a HUGE achievement.

Thanks for the article - looking forward to seeing the gaming benchmarks. I'd be grateful if you could colour multi-GPU solutions differently from single-GPU solutions, as this would clearly show the GTX 280 is a HUGE step forward.
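
A quick back-of-the-envelope sum on that 512MB point (round numbers, not figures from the article): a 30-inch panel is 2,560 x 1,600, about 4.1 million pixels. With 4x multisample AA, a 32-bit colour buffer plus a 32-bit depth/stencil buffer comes to roughly 4,096,000 pixels x 4 bytes x 4 samples x 2 buffers ≈ 131MB - before the resolve buffer, any render targets or a single texture are counted. That's why an effective 512MB gets tight at that resolution.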
amacieli 24th June 2008, 13:14
I'm personally fine with Nvidia deciding to concentrate more on CUDA this time around rather than chasing the enormous performance jump that some wanted to see. Pretty much every game I have (even a certain 'graphics demo' game that is popular to mention for being heavy on resources) runs very well on my overclocked 8800 GTS, and I think there's more value to be had, now, in getting things like video encoders/decoders accelerated, FAH, etc.
zoot2boot 24th June 2008, 14:55
Quote:
Originally Posted by amacieli
I'm personally fine with Nvidia deciding to concentrate more on CUDA this time around rather than chasing the enormous performance jump that some wanted to see. Pretty much every game I have (even a certain 'graphics demo' game that is popular to mention for being heavy on resources) runs very well on my overclocked 8800 GTS, and I think there's more value to be had, now, in getting things like video encoders/decoders accelerated, FAH, etc.

I couldn't agree less. I can see why Nvidia are investing in the technology; if they want to become a platform company then some of it is vital to them. It is, however, totally irrelevant to me. They're just subsidising their R&D for single-chip solutions and flogging it off to gullible gamers. Honestly, does any of the stuff it can do, bar push big frame rates, sound remotely useful to you?
Cupboard 24th June 2008, 15:38
Quote:
Originally Posted by zoot2boot
*snip* Honestly, does any of the stuff it can do, bar push big frame rates, sound remotely useful to you?

Video encoding, yes. And if there's development in that area, then more apps will become "CUDA-enabled", helping speed them up too.

The graphics cards seem pretty sweet for games too - they're really what the 9-series should have been, IMO.
Not entirely sure where my card should have fitted in with the naming, though.
Redbeaver 24th June 2008, 16:17
I think for my 8800 GTS 640 upgrade I'll vote for the $229 9800 GTX+... fast, cheap, single-chip.

Or maybe a 4870 when it's out, if it's cheaper...
Goty 24th June 2008, 16:29
Quote:
Originally Posted by Redbeaver
I think for my 8800 GTS 640 upgrade I'll vote for the $229 9800 GTX+... fast, cheap, single-chip.

Or maybe a 4870 when it's out, if it's cheaper...

The 4870 will cost more than the 9800 GTX+, but why not just go for a 4850, which is within 1% performance-wise but about 10% cheaper?
Tim S 24th June 2008, 16:46
Quote:
Originally Posted by zoot2boot
Quote:
Originally Posted by amacieli
I'm personally fine with Nvidia deciding to concentrate more on CUDA this time around rather than chasing the enormous performance jump that some wanted to see. Pretty much every game I have (even a certain 'graphics demo' game that is popular to mention for being heavy on resources) runs very well on my overclocked 8800 GTS, and I think there's more value to be had, now, in getting things like video encoders/decoders accelerated, FAH, etc.

I couldn't agree less. I can see why Nvidia are investing in the technology; if they want to become a platform company then some of it is vital to them. It is, however, totally irrelevant to me. They're just subsidising their R&D for single-chip solutions and flogging it off to gullible gamers. Honestly, does any of the stuff it can do, bar push big frame rates, sound remotely useful to you?

I dunno, video transcoding, image manipulation and Folding are pretty widespread. :)
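
To put a concrete shape on the image-manipulation point: per-pixel work is embarrassingly parallel, so a CUDA kernel can simply assign one thread per byte of the image. A toy brightness-adjust sketch, purely illustrative (the kernel name, image size and delta are all invented):

    #include <cuda_runtime.h>

    // Each thread brightens one byte of the image, clamping at 255 -
    // trivially parallel, which is why this class of work suits a GPU.
    __global__ void brighten(unsigned char* img, int n, int delta)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;

        int v = img[i] + delta;
        img[i] = (unsigned char)(v > 255 ? 255 : v);
    }

    int main()
    {
        const int n = 1920 * 1080 * 3;   // one 1080p RGB frame
        unsigned char* d_img;
        cudaMalloc(&d_img, n);
        cudaMemset(d_img, 0, n);

        brighten<<<(n + 255) / 256, 256>>>(d_img, n, 40);
        cudaDeviceSynchronize();

        cudaFree(d_img);
        return 0;
    }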
frontline 24th June 2008, 17:12
You could build a cheap quad-core rig just for folding purposes for the same price as the GTX 280 :)
Hugo 24th June 2008, 17:21
Quote:
Originally Posted by frontline
You could build a cheap quad-core rig just for folding purposes for the same price as the GTX 280 :)

Or you could build a dirt cheap system with a GTX 280 and get significantly better performance than a quad-core offers.
Andy Mc 24th June 2008, 17:55
Tim, thanks for the link. Sucks that it's not available for the 9800 GX2 yet...
frontline 24th June 2008, 18:08
"Or you could build a dirt cheap system with a GTX 280"

Hmm, clearly 'dirt cheap' means different things to different people.
Tim S 24th June 2008, 18:19
Quote:
Originally Posted by frontline
"Or you could build a dirt cheap system with a GTX 280"

Hmm, clearly 'dirt cheap' means different things to different people.

Of course, you could build a system with an Nvidia/ATI card in it for Folding... but the performance wouldn't be as good as a GTX 280's. That's kind of a given, though.
jfreak 24th June 2008, 18:30
Is there a chance that gaming will be moved to the GPU entirely? Maybe down the road?

Sounds to me like if game developers can use the GPU for game-related, non-graphical calculations such as AI and other logic, it would seriously reduce the need for new, faster CPUs. I mean, most office apps and such run with processing power to spare on CPUs that are two or three years old.

I think Intel might have something to think about, and AMD/Nvidia might have an amazing opportunity on their hands.

I think things are getting exciting again, and things like OpenCL/CUDA are good foundation shakers.
Tim S 24th June 2008, 18:39
There's some AI stuff coming to GPUs - both ATI and Nvidia have talked about it and demoed it... and yes, it's impressive. The latest demo I saw was of 3,000 characters (each individually rendered) moving around with dynamic pathfinding that is dependent on what else is happening in the environment - think fire, enemies, artillery, etc.
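
For a rough idea of how that maps onto hardware: per-character AI can give each agent its own thread. The toy sketch below only steers each of 3,000 agents away from its nearest hazard each frame - real demos do proper dynamic pathfinding, and every name and number here is invented.

    #include <cuda_runtime.h>
    #include <math.h>

    // One thread per character: find the nearest hazard (fire, enemy,
    // artillery...) and step directly away from it. 3,000 agents become
    // 3,000 independent threads.
    __global__ void steer(float2* agents, const float2* hazards,
                          int nAgents, int nHazards, float speed, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= nAgents) return;

        float bestD2 = 1e30f;
        float2 away = make_float2(0.0f, 0.0f);
        for (int h = 0; h < nHazards; ++h) {
            float dx = agents[i].x - hazards[h].x;
            float dy = agents[i].y - hazards[h].y;
            float d2 = dx * dx + dy * dy;
            if (d2 < bestD2) { bestD2 = d2; away = make_float2(dx, dy); }
        }

        float d = sqrtf(bestD2) + 1e-6f;   // avoid divide-by-zero
        agents[i].x += away.x / d * speed * dt;
        agents[i].y += away.y / d * speed * dt;
    }

    int main()
    {
        const int nAgents = 3000, nHazards = 32;
        float2 *d_a, *d_h;
        cudaMalloc(&d_a, nAgents * sizeof(float2));
        cudaMalloc(&d_h, nHazards * sizeof(float2));
        cudaMemset(d_a, 0, nAgents * sizeof(float2));
        cudaMemset(d_h, 0, nHazards * sizeof(float2));

        steer<<<(nAgents + 255) / 256, 256>>>(d_a, d_h, nAgents, nHazards,
                                              2.0f, 1.0f / 30.0f);
        cudaDeviceSynchronize();

        cudaFree(d_a);
        cudaFree(d_h);
        return 0;
    }

Because no agent's update depends on another's result within a frame, there's no synchronisation needed - which is exactly the property that lets this sort of AI scale across a GPU's stream processors.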