bit-tech.net

Nvidia predicts 570x GPU performance boost

Nvidia's CEO claims that GPU technology will increase to 570 times current performance levels within six years, while CPU tech will lag behind.

If you're a big fan of beefy graphics cards but just wish that they were a teensy bit faster, hold on to your hats: Nvidia is predicting that GPU performance is going to increase a whopping 570-fold in the next six years.

According to TG Daily, Nvidia CEO Jen-Hsun Huang made the prediction at this year's Hot Chips symposium hosted by Stanford University. During his speech, Huang claimed that while the performance of GPU silicon is heading for a monumental increase in the next six years - making it 570 times faster than the products available today - CPU technology will find itself lagging behind, increasing to a mere 3 times current performance levels.
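For scale, Huang's headline figures can be converted into an implied annual growth rate. A quick back-of-the-envelope check (steady compounding over the six-year window is our assumption, not anything Huang stated):

```python
# What Huang's numbers imply as a yearly growth rate, assuming
# steady compound growth over the six-year window.
gpu_total, cpu_total, years = 570, 3, 6

gpu_per_year = gpu_total ** (1 / years)   # ~2.88x every year
cpu_per_year = cpu_total ** (1 / years)   # ~1.20x every year

print(f"GPU: {gpu_per_year:.2f}x per year, CPU: {cpu_per_year:.2f}x per year")
```

In other words, the claim amounts to GPUs nearly tripling in performance every year for six years running, while CPUs gain about 20 per cent a year.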

Huang claimed that the improvements in GPU technology wouldn't be wasted on mere gaming, either: with the CUDA and OpenCL technologies allowing programmers to run general computing tasks using the graphics card's GPU, the performance boost could lead to the development of true real-time language translation technology and ever more convincing forms of augmented reality, along with the more traditional supercomputing tasks of signals analysis and energy exploration.
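The GPGPU model behind CUDA and OpenCL boils down to writing one small kernel function that runs independently at every index of a large array, which is what lets a GPU execute thousands of instances at once. A minimal sketch of the idea in plain Python (the function names and the sequential `launch` loop are illustrative only; a real CUDA or OpenCL kernel would be compiled and dispatched to the GPU):

```python
# Data-parallel "kernel": a tiny function applied independently at every
# index. On a GPU, all indices would run concurrently.
def saxpy_kernel(i, a, x, y, out):
    out[i] = a * x[i] + y[i]   # classic SAXPY: out = a*x + y

def launch(kernel, n, *args):
    # Stand-in for a GPU launch; here we simply loop over the indices.
    for i in range(n):
        kernel(i, *args)

x = [1.0, 2.0, 3.0, 4.0]
y = [10.0, 10.0, 10.0, 10.0]
out = [0.0] * 4
launch(saxpy_kernel, 4, 2.0, x, y, out)
print(out)  # [12.0, 14.0, 16.0, 18.0]
```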

The improvements would also open the doors for true, real-time ray tracing - ushering in a new level of realism for games and improving the times taken to render photo-realistic CGI scenes for films.

Obviously, Huang has reason to push GPU technology as the way forward: as the CEO of a company with no CPU offerings - unlike rival AMD, which offers both CPUs and GPUs after its acquisition of ATI - it's in his best interest to ensure that developers start working on OpenCL/CUDA-based GPGPU applications now, in order to take advantage of the performance gap between CPUs and GPUs that he predicts.

Do you believe that Huang is on the money with his predictions, or is he simply talking up his company's core competency? Do you have faith that the CPU isn't going to find itself the laggard in the performance race? Share your thoughts over in the forums.

42 Comments

schwabman 27th August 2009, 09:41 Quote
So that's what, a 57,000% increase? Huang, I am not buying what you are selling!
V3ctor 27th August 2009, 09:47 Quote
...and then comes ATi with 569 1/2x for half price... :D:D
yakyb 27th August 2009, 09:50 Quote
umm any idea why it's exactly 570x
13eightyfour 27th August 2009, 10:06 Quote
Quote:
Originally Posted by yakyb
umm any idea why it's exactly 570x

It's probably the same number of times they can rebrand the same card as something new. :D
GFC 27th August 2009, 10:16 Quote
Yea.. and let me guess, it's gonna be the size of the moon, right?
DaMightyMouse 27th August 2009, 10:20 Quote
Quote:
Originally Posted by GFC
Yea.. and let me guess, it's gonna be the size of the moon, right?

Probably will need a nuclear reactor just to power it...
bogie170 27th August 2009, 10:31 Quote
And 2 Nuclear Reactors for SLI :P
biebiep 27th August 2009, 10:39 Quote
Bullcrap.
Nvidia's raytracing acceleration that they use in demos is soooooo ugly compared to real raytracing.

They'll need that 570x just to be at the same level as what a CPU CAN do.
l3v1ck 27th August 2009, 11:01 Quote
It's gone up a lot in the last five years, but not 570 times. So why would we expect it to go up that much in the next five years?
RotoSequence 27th August 2009, 11:04 Quote
Quote:
Originally Posted by l3v1ck
It's gone up a lot in the last five years, but not 570 times. So why would we expect it to go up that much in the next five years?

Because they think it will sell more product NOW, of course.
xaser04 27th August 2009, 11:09 Quote
I wonder if these cards will simply be rebadged 8800GT's with clever *cough* marketing?!
Zero_UK 27th August 2009, 11:10 Quote
No no no no... everyone and Bit-tech, you've misread the data.

He's on about PRICE increases... NOT performance. Performance will just increase 0.1% from the next-gen rebranded cards that allow a slightly better overclock.

;)
damienVC 27th August 2009, 11:22 Quote
Quote:
ever more convincing forms of augmented reality

Hmmm - that sounds interesting! Something I've been trying to achieve for years through various (legal) mind-altering methods...!
Ross1 27th August 2009, 11:26 Quote
Nvidia's marketing/PR/spin department is one of the most odious around. This, their incredibly annoying naming/rebranding of GPUs, their constant catfights with Intel (which isn't just words; see there being no SLI on P35/P45)... it doesn't do anyone any good.
PQuiff 27th August 2009, 11:59 Quote
Hmm... I think all the large companies like Nvidia, MS, Sony etc should have someone outside with a big whip. Any time an employee tries to leave, they should get whipped back into the building. Thus, no more of these rapidly escalating stupid comments would come out. And we might actually get some good kit, not just the rebranded stuff we get now.

I remember when the first GeForce (the GeForce 256, to be exact) came out. I nearly pooped myself, it was such a good card, and light years ahead of anything else out there. I had just bought a Number Nine graphics card, but chucked it in the bin for the new GeForce. Can't see me doing something like that for this gen's stuff.

Bit of subjective stuff here.
Quake - Jun 96
Quake 3 - Dec 99 - say 100% better (gfx-wise)?
Half-Life 2 - Nov 04 - say 200% better (gfx-wise than Quake)?
Unreal Tournament 3 - Dec 07 - 250-300% better (gfx-wise than Quake)?

That's over 10 years... and it's a bit of a dodgy comparison and highly subjective... but I'd love to see how that equates to the graphics cards' speed, memory etc. You'd probably find that a graphics card that is 500% faster won't lead to mind-blowing images or games performance.

Still, fingers crossed.
Denis_iii 27th August 2009, 12:35 Quote
Quote:
Originally Posted by titanium angel
Quote:
Originally Posted by yakyb
umm any idea why it's exactly 570x

It's probably the same number of times they can rebrand the same card as something new. :D

lmao classic
Denis_iii 27th August 2009, 12:38 Quote
Quote:
Originally Posted by Ross1
nvidias marketing/PR/spin department is one of the most odious around. this, their incredibly annoying naming/rebranding of gpus, their constant catfights with intel (which isnt just words, see there being no sli on p35/p45).... it doesnt do anyone any good.

and their crappy mobile drivers,
at least for my M7900GTX
l3v1ck 27th August 2009, 12:38 Quote
Anyone who bought an 8800GTX when it first came out must be feeling pretty smug right now. They're still selling what are effectively die-shrunk versions several years later.
Let's hope the early DX11 cards offer the same jump in performance over the DX10 cards that the early DX10 cards did over DX9 cards. The leaked specifications for ATi's upcoming cards certainly seem hopeful.
Comet 27th August 2009, 14:31 Quote
What the NVIDIA CEO is saying reflects the "behind the doors" war that is happening. The hardware market is changing drastically.
AMD/ATI was in bad shape and NVIDIA was reigning just a couple of years ago. But AMD made one hell of a move that is shaking things up.
It bought ATI. It is now the first chip maker that has the "tools" to combine the CPU and the GPU in one single integrated platform.
Those in the industry know well that this is the way to go. CPUs are becoming more parallel, but GPUs show a lot of promise in AI and physics processing.
What you really need is a better fusion between the two. Intel is openly stating that their next CPUs will be able to suit many gamers' needs. And developers are also giving clues that what Intel is saying is not a lie.
INTEL is attacking NVIDIA and AMD's market.
NVIDIA is trying to catch up, but it knows it has to have a foot in other platforms as they emerge, and here goes "THEIR OWN CPU" combined with a GPU.
NVIDIA is investing heavily in Atom-based PCs because that's an area where a GPU can still make a huge difference and they can increase profits.
Let's be clear about one thing. However the market evolves, you can't have a single processor doing AI, physics, graphics, gameplay and all that heavy stuff.
You need a hybrid solution. That's what AMD did. It upped the stakes and sent a clear warning to competitors. They sent the message:
"HELL with you. We're moving forward. We're building a hybrid platform. No more depending on third parties."
Developers are sending the message that, as GPUs and CPUs come together, it doesn't make sense for them to talk to each other in a different way.
Future platforms will share a "common" language, so to speak, and true parallelism regardless of the core technology. What this means is that highly parallel workloads such as physics, graphics and AI will be interchanging threads between cores in both the CPU and the GPU as if they were one.
Now imagine this. Imagine that you want to build a platform for cloud computing. You've got a hybrid GPU/CPU chip that interchanges threads. You add another hybrid GPU/CPU that shares the same interchange capability.
The two of them together can do the same stuff, interchanging work between them as if they were one. The more you add, the more efficient the system gets. And you don't lose processing power.
They're all interchanging threads with each other; there is no unused core, as happens in the solutions available today.
That's what these guys are aiming for.
When the NVIDIA CEO states that GPUs will gain 500x and the CPU only a fraction of that, what he really wants to say is: "We've got loads of experience in parallel processing. Our future solutions will handle both traditional CPU and GPU tasks with a breeze."
Star*Dagger 27th August 2009, 14:56 Quote
I'll take the 570x speed increase. At first I thought this article's title was referring to a new Nvidia card. But 500+ times over 6 years sounds about right.

To the above guy with his comparisons of Quake versus 2007 games: you are off by at least an order of magnitude, maybe two. Have you played both extensively, and do you know the tech behind them both?

Yours in Accurate Plasma,
Star*Dagger
SBS 27th August 2009, 14:59 Quote
http://tinyurl.com/clmvh7
sear 27th August 2009, 15:39 Quote
He's talking out of his ass. 570 is as arbitrary a number as any.
B1GBUD 27th August 2009, 16:20 Quote
Quote:
Originally Posted by l3v1ck
Anyone who bought an 8800GTX when it first came out must be feeling pretty smug right now. They're still selling what are effectively die-shrunk versions several years later.
Let's hope the early DX11 cards offer the same jump in performance over the DX10 cards that the early DX10 cards did over DX9 cards. The leaked specifications for ATi's upcoming cards certainly seem hopeful.


Yay!! I win.... I have two of those puppies
cheeriokilla 27th August 2009, 17:39 Quote
Quote:
Originally Posted by SBS
http://tinyurl.com/clmvh7

LOL!!!!!!!

ATI+AMD FTW!
l3v1ck 27th August 2009, 17:53 Quote
Quote:
Originally Posted by Comet

It bought ATI. It is now the first chip maker that has the "tools" to combine the CPU and the GPU in one single integrated platform.
2nd. Intel has done CPUs and graphics for ages. Okay, their graphics are rubbish, but maybe Larrabee will fix that. Either way, I'd expect Intel to have a CPU/GPU on one die long before AMD does.
aggies11 27th August 2009, 18:23 Quote
While this is largely marketing speak and hyperbole, performance only has to double a little over nine times in six years, which doesn't sound that implausible.

I think it's more focused on GPUs' internal tech going massively parallel and, due to the nature of the specialized hardware, being able to scale better than general-purpose CPUs. But there are a lot of caveats down that particular road.
TSR2 27th August 2009, 19:30 Quote
570 times speed increase... and in the same timeframe Windows will become 571x more bloated and slow.
thehippoz 27th August 2009, 19:47 Quote
can of whoopass
dec 28th August 2009, 05:09 Quote
That's a big and oddly specific number. SBS beat me to that pinch of salt. I don't think it's impossible to get that much more GPU power in 6 years, but let's think: GT 870 = GT 790 = GTS 650 = GTS 570 = GTX 460 = GTX 380 = GTX 285. That's 6 times more computing power, right?
Marc5002 28th August 2009, 10:04 Quote
More like a 2x gap at the GTX 400. Each generation usually jumps by a higher gap of about 50%, like from the HD 4870 1GB to the HD 5870 1GB. There's a rumour of the GTX 380 being 3x the power of the GTX 280, but that hasn't been proven in any benchmark; it's just been running around hardware forums.
l3v1ck 28th August 2009, 10:50 Quote
Let's take a look at a rough improvement in the last five years shall we.
http://www.gpureview.com/show_cards.php?card1=33&card2=605

Texture fill.... 51840 / 2600 = 20 x performance in five years
Pixel fill.... 20736 / 2600 = 8 x performance in five years
Bandwidth.... 158.98 / 19.84 = 8 x performance in five years.

Not exactly close to 570 times is it? He'd be lucky to get 57 times in five years.
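The ratios above can be sanity-checked in a couple of lines (the raw figures are taken at face value from the gpureview link):

```python
# Five-year improvement ratios from the figures quoted above.
texture_fill = 51840 / 2600    # ~19.9x
pixel_fill   = 20736 / 2600    # ~8.0x
bandwidth    = 158.98 / 19.84  # ~8.0x

for name, ratio in [("texture fill", texture_fill),
                    ("pixel fill", pixel_fill),
                    ("bandwidth", bandwidth)]:
    print(f"{name}: {ratio:.1f}x in five years")
```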
dyzophoria 28th August 2009, 21:07 Quote
Why stop at 570? He should have gone with 1,000 times, or 1,000,000 times. On any account I doubt it will happen, lol. He should really just shut up, IMHO; less talk, more innovation. Now go go go.
AWowzer 29th August 2009, 00:20 Quote
There is one reason why I disagree with this statement. Consoles.

GPU hardware power has increased rapidly over the last ten years because software has pulled it along. There was always one game that the best graphics card of the day could not run quickly, and this process pulled graphics hardware along. The most recent example being Crysis. It's taken over two years for graphics hardware to catch up and be able to run it really smoothly at the highest detail settings and resolution.

The problem is that now, for the first time, consoles are only two years behind PCs in power, when it used to be five. So titles for the PC can be released for consoles. The money is in consoles because of PC piracy issues, and consoles are now the lead dev platform. The most demanding graphics engine the PS3 and 360 can handle is about Unreal Engine 3 level. The PS3 and 360 are going to be around for another two years, maybe. And the PS4 and 720 will probably have the power of today's PCs.

So I would argue that speeds might only be 5x in 6 years.
AWowzer 29th August 2009, 00:30 Quote
The only thing that might cause speed increases to be 500x in 6 years' time is if the Compute function in Windows 7 and Larrabee allow very basic real-time raytracing to start on PCs.

All graphics cards are optimised for rasterisation these days. But the Compute function in Windows 7 would allow graphics cards to use the power of their parallel processing for the number crunching required to deliver basic real-time raytracing, and to aid the CPU in these calculations.

Larrabee has a software renderer instead of a hardware one, allowing it to be optimised for whatever the game engine demands, be it rasterisation or raytracing.

If basic real-time raytracing can start to roll on PC, that would restart the software-pulling-hardware process, and hardware would be required to advance at rapid rates again (unlike the stagnation that exists now, where Nvidia can re-release the 8800GTX five times with just die shrinks) to deliver the power to do real-time raytracing at high resolutions.
AWowzer 29th August 2009, 00:41 Quote
The consoles, despite slowing down graphics, have brought a big benefit through the war between the manufacturers to bring the next amazing thing to market. Huge advancements have been made in input, emotion and face recognition technology.

Imagine what it could be like in 6 years. Wearing specs that project a 3D 720-degree (360 around the vertical and the horizontal) image. Walking on a machine with the top of a large trackball just at the surface, so you would stay in the same spot no matter which direction you went. Full face, body and emotion recognition. It would be like a clunky holodeck. Computer, load ten Rachel Stevens, please.
Elton 29th August 2009, 02:46 Quote
Quote:
Originally Posted by AWowzer
The only thing that might cause speed increases to be 500x in 6 years' time is if the Compute function in Windows 7 and Larrabee allow very basic real-time raytracing to start on PCs.

All graphics cards are optimised for rasterisation these days. But the Compute function in Windows 7 would allow graphics cards to use the power of their parallel processing for the number crunching required to deliver basic real-time raytracing, and to aid the CPU in these calculations.

Larrabee has a software renderer instead of a hardware one, allowing it to be optimised for whatever the game engine demands, be it rasterisation or raytracing.

If basic real-time raytracing can start to roll on PC, that would restart the software-pulling-hardware process, and hardware would be required to advance at rapid rates again (unlike the stagnation that exists now, where Nvidia can re-release the 8800GTX five times with just die shrinks) to deliver the power to do real-time raytracing at high resolutions.

You make it sound easy. Raytracing is by no means a perfected method, and it will bring in nice graphics for sure, but look at it realistically: would it be profitable to make cards 500x more powerful in 5 years? Hell no. They're going to drag it out as long as possible. GPU companies and Intel are here for the money, not performance; performance is a side effect of competition.
Mithyx 30th August 2009, 05:38 Quote
Quote:
Originally Posted by l3v1ck
...Texture fill.... 51840 / 2600 = 20 x performance in five years
Pixel fill.... 20736 / 2600 = 8 x performance in five years
Bandwidth.... 158.98 / 19.84 = 8 x performance in five years.
...

20 x 8 x 8 = 1280! He's underestimating the improvement by a long shot!

I should be in marketing :)
s1n1s 31st August 2009, 04:30 Quote
Might do. Technology is improving faster and faster, so I can't see why not.
tejas 31st August 2009, 16:45 Quote
Nvidia will be fine. Intel Larrabee will annihilate AMD
Dragunover 27th November 2009, 22:20 Quote
I doubt larrabee is capable of doing anything except sucking excretory orifices
gavomatic57 28th November 2009, 14:16 Quote
Larrabee is not looking good, and AMD's CPUs are a millstone around AMD/ATI's neck - they're a generation behind and not all that competitive on price. Fermi is shaping up to be pretty powerful too, so it's anyone's game right now.
TheSoloEngineer 21st February 2010, 13:18 Quote
I just wish I could tell which comments are Corporate American comments and which are actual member comments.

Those promoting Intel are the only obvious corporate misinformation I can see.