bit-tech.net

I hope PC gaming graphics have plateaued

Posted on 26th Mar 2009 at 12:14 by Mark Mackay with 48 comments

In October last year, in an interview with CVG, Valve’s vice president of marketing, Doug Lombardi, stated that he believes PC gaming graphics are close to being as good as they’ll get.

Lombardi told CVG that he believes that ‘graphics have started to top-out. We've got really great-looking games but what we want are more intelligent, more visceral games and the multi-core processors are going to be the way that we get there on PC.’

Some of you might feel that this is nonsense; others might feel that it’s true but you wish it wasn’t because you want to be dazzled by graphical feats of unimaginable excellence. Personally, not only do I believe that Lombardi’s statement holds a good deal of truth but I also actually hope that he’s right.

The reason is something that happened the other day in the labs. Antony was testing an awesome watercooled rig built around an overclocked Core i7-920 and an overclocked Nvidia GTX 295. I watched in awe as the machine played Crysis Warhead in DirectX 10 mode at 2,560 x 1,600 with 4x AA at a 24fps minimum frame rate.

If Crysis is indeed the pinnacle of PC graphics then in a few years, or maybe less, we’ll be able to play our games at the native resolution of a 30in display with all settings on max using mid-range hardware. There are a few rich people out there who can afford such luxury, but I’d never seen a game that looked so good being played so smoothly on such a large display and OMG do I want one!

In my opinion, the often reality-checking detail in Crysis with all the sauce turned on looks just as good as Left 4 Dead at max settings, even though the latter game is built on a graphics engine that is the better part of five years old and will run on a pocket watch. As Joe wrote in his Graphics versus Presentation blog, how good a game looks isn’t about how technical the graphics are.

For this reason, and the fact that I want my future mid-range gaming PC to run my games on a 30in display, I really hope that graphics advancements slow down some and give the hardware a chance to catch up.

48 Comments

proxess 26th March 2009, 13:28 Quote
With the advancements of the last few years I've been completely left in the dust. I also believe what's been said is true and hope it is.
GFC 26th March 2009, 13:34 Quote
I personally think the 3D age is over; it's the AI and physics age now. Here we go, multi-core!!
UrbanMarine 26th March 2009, 13:48 Quote
It's the "what can consoles do?" age.
Xtrafresh 26th March 2009, 14:02 Quote
As soon as the XboX 1080 and the Playstation 4 and the Weeeeeeeeeeee! come out, there will be another push for more advanced graphics to help sell those consoles.

I see your argument, but I think you'll go down in history with all the other people who think technology has stopped.
Bauul 26th March 2009, 14:14 Quote
I'd much rather devs concentrated on creating a game that's styled really well than one that just has technically good graphics. I loved looking at BioShock far more than Crysis just because Art Deco covered in blood will always win over Jungle Island covered in blood.
Cutter McJ1b 26th March 2009, 14:17 Quote
Quote:
Originally Posted by Cutter McJ1b in his blog
I really hope that graphics advancements slow down some and give the hardware a chance to catch up
Quote:
Originally Posted by Xtrafresh

I see your argument, but I think you'll go down in history with all the other people who think technology has stopped.

Sorry, I think you must have commented on the wrong article there. Don't worry, it's easily done.
sandys 26th March 2009, 14:29 Quote
Sony and Nvidia seem to be pushing the 3D gaming angle; I reckon this, along with better AI/physics, is going to make gaming much better.
BadHead 26th March 2009, 14:43 Quote
No. Graphics have not plateaued. If they have, how will Nvidia, ATI and any other graphics card manufacturer that cares to join the fray continue to sell graphics cards? People will just save up for an Nvidia 295 or an ATI 4870 X2 and never buy another graphics card - ever. That would not be good news for the graphics card manufacturers - or us.

Graphics have a bit of a way to go before they are indistinguishable from real life. But it is possible that in a few years time, we could be playing games that are that good. Then there's 3D Graphics. Then there's holographics. Then there's...Who Knows?
Turbotab 26th March 2009, 14:50 Quote
I hope not; if you look at the best off-line rendered digital Hollywood productions, there are still many areas for improvement. I do believe that as graphics become increasingly realistic, they will become subservient to gameplay and level design. After all, as you flee a terrifying monster or drive Laguna Seca at 200mph, how much time do you have to admire incredibly intricate details?

As we enter the nano-technology era, it would be great to achieve truly portable gaming computers, visual output device included. Imagine how we will laugh in 10 years' time at the size and weight of our ATX cases and TFT monitors! From your comments, can we infer that you do not think DirectX 11 will bring a major improvement in graphics? And lastly, you are the editor of Bit-Tech - suggest to NEC that you require a long-term test of their 30in panel. It's common practice in motoring journalism; you have to suffer for your art :)
ChaosDefinesOrder 26th March 2009, 14:51 Quote
Quote:
Originally Posted by BadHead
No. Graphics have not plateaued. If they have, how will Nvidia, ATI and any other graphics card manufacturer that cares to join the fray continue to sell graphics cards? People will just save up for an Nvidia 295 or an ATI 4870 X2 and never buy another graphics card - ever. That would not be good news for the graphics card manufacturers - or us.

Compare the performance of the 8800 GTX, 9800 GTX, GTX 280 and GTX 285: sure, they do increase with each iteration, but the performance jump has become smaller each time.

In this respect, I do agree with the article; the technology behind graphics cards does seem to be slowing down somewhat. The question, however, is whether this is because game technology has slowed down, or whether Nvidia's relative dominance of the graphics performance sector has meant that they haven't tried as hard between iterations any more.

It's also certainly true that recent games have deviated from the previous "pretty = better" and instead gone more towards "better = better", which, despite being a rather superfluous statement on its own, actually means that they're focussing on making the games more playable and more enjoyable - focussing on other areas of gameplay than just graphics. This can only be a good thing, as a pretty game that needs nuclear power and a small loan to play properly doesn't necessarily have the enjoyment value (e.g. Crysis) of something that can run on a four-year-old machine (e.g. Source engine games).
Xtrafresh 26th March 2009, 15:16 Quote
Quote:
Originally Posted by Cutter McJ1b
Sorry, I think you must have commented on the wrong article there. Don't worry, it's easily done.
*re-reads article*

so wait... your point then is that you'd like 2015's Atom platform to play games at full/full/full/max/max settings?

To me, it seems rather like you are hoping that no further advancements in software technology are made for games, so you can enjoy high-end gaming on mid-range hardware. It sure sounds nice, but unless I'm wrong *again* about what you are trying to say, I think you'll be disappointed.

If the software side of graphics stops, manufacturers will also stop going bigger. This means that that 2015 GPU (the ATI 9870, or the Nvidia GTX1495+GSOX2-9600GT) will be no more powerful than today's generation of cards, just smaller and more economical. Plus, they will still be high-end. The only way for mid-range cards to get to that level of performance is either to play old games (Homeworld for the win!) or for hardware and software to get seriously de-synched. You seem to be hoping for that de-synch, but the problem with that is that nobody will buy high-end, and prices of mid-range will go up to the point that they are high-end again.
Sebbo 26th March 2009, 15:29 Quote
PC gaming graphics are still rasterisation-based. We will not truly come near a plateau until we're using ray tracing for everything you see on screen. On top of that, a great physics engine will be required. Just look at any body of moving water, or a flame, smoke and dust clouds etc. in a current game - they're all still a long way from being as good as they could be, and won't achieve this until physics engines really start to handle it all.

All that said though, maybe rasterised graphics are reaching their plateau, but not PC gaming graphics as a whole.
Cutter McJ1b 26th March 2009, 15:31 Quote
Quote:
Originally Posted by Xtrafresh

so wait... your point then is that you'd like 2015's Atom platform to play games at full/full/full/max/max settings?

I'd have to re-read my blog again but I'm pretty sure I didn't say any of the above.
Quote:
Originally Posted by Xtrafresh
To me, it seems rather like you are hoping that no further advancements in software technology are made for games
Quote:
Originally Posted by Cutter McJ1b in his blog
I really hope that graphics advancements slow down some and give the hardware a chance to catch up
Quote:
Originally Posted by Xtrafresh
I'm wrong *again*

Quote for truth
zimbloggy 26th March 2009, 15:32 Quote
Well, it is unlikely that game AI is going to grow as fast as shaders and high-res textures do. The simple reason is that good graphics are easier to do and benefit hardware manufacturers more than better AI does.
Skiddywinks 26th March 2009, 16:14 Quote
I do hope that graphics progress does at least slow down and let the hardware take the lead again, but I definitely do not hope progress stops altogether any time soon.
Xtrafresh 26th March 2009, 16:18 Quote
Quote:
Originally Posted by Cutter McJ1b
I'd have to re-read my blog again but I'm pretty sure I didn't say any of the above.

Quote for truth
Oi, please don't be such a git about it :(

If it's not your desire to sit in first class with third-class tickets, I'm seriously questioning what it is that you ARE trying to say here...
Quote:
Originally Posted by blog
we’ll be able to play our games at the native resolution of a 30in display with all settings on max using mid-range hardware. (...) OMG do I want one!
Quote:
Originally Posted by blog
the fact that I want my future mid-range gaming PC to run my games on a 30in display, (...)
So maybe I'm just wrong about you wanting things to stop, while in fact you want software to slow down, but that doesn't negate my point that as soon as mid-range hardware is able to serve you all the possible candy, no card will ever be positioned above that, making the card high-end again, and I bet you can count on pricing to work accordingly.
Major 26th March 2009, 16:37 Quote
I agree with the L4D + Crysis comment a little: they need to make the most of what they have, and not add and add and add to try and make a game look better. Making things "too" realistic is not the way to go at the moment; just try and make a game look nice.
Cutter McJ1b 26th March 2009, 16:46 Quote
Quote:
Originally Posted by Xtrafresh
Oi, please don't be such a git about it :(

Well, to be fair, if you're going to put words in people's mouths then you've got to be prepared to take responsibility for doing so.
Quote:
Originally Posted by Xtrafresh
as soon as mid-range hardware is able to serve you all the possible candy, no card will ever be positioned above that, making the card high-end again, and I bet you can count on pricing to work accordingly.

It depends whether the exponential increase in the price/performance ratio of integrated circuits is enough to help GPU speed vastly outstrip gaming requirements. If games graphics plateaued roughly where they're at now, then the cost of manufacturing a future chip capable of running games smoothly with full sauce at 2,560 x 1,920 with 16x AA would be very low, and the cards thus unlikely to cost large sums of money - especially considering how fierce competition is in the GPU market.

That's also before taking into consideration that rasterisation won't last forever. But that's a whole other ball park, as there are so many unknowns. I liked Sebbo's comment.
Quote:
Originally Posted by Sebbo
All that said though, maybe rasterised graphics are reaching their plateau, but not PC gaming graphics as a whole.

A few years of affordable 30in rasterised gaming prior to ray tracing or whatever comes next would be a blessing indeed. Maybe gamers would love it so much that they never bothered to spend the cash to be early adopters of ray tracing, and the technology was a flop because we were all loving our 54in displays too much. Though then that would depend on what the game devs were up to. The debate rages on.
wuyanxu 26th March 2009, 16:49 Quote
But the problem is that we NEED software to push hardware developers into releasing newer products; otherwise the hardware companies will try to maximise shelf-time by simply rebranding every few months (Nvidia) with no real development.

Before Crysis came out, the 8800 GTX was sitting at the top for a loooooong time, and people were saying it was enough. Only after the release of Crysis was more affordable hardware released.
UncertainGod 26th March 2009, 16:52 Quote
I think we are approaching the end of the rasterised 3D graphics era and that is why improvements in engines are getting ever-decreasing returns; the move to ray tracing must begin soon.
Sir Digby 26th March 2009, 17:03 Quote
Personally, I think that there will still be improvements in how good games look - but only relating to the effects, so we'll be getting better lighting/better flames etc.

I suspect that the actual polygon count will not rise a great deal because of the exponential increase in the time it takes to make higher poly-count models.
Xtrafresh 26th March 2009, 17:08 Quote
Quote:
Originally Posted by Cutter McJ1b
It depends whether the exponential increase in the price/performance ratio of integrated circuits is enough to help GPU speed vastly outstrip gaming requirements. If games graphics plateaued roughly where they're at now, then the cost of manufacturing a future chip capable of running games smoothly with full sauce at 2,560 x 1,920 with 16x AA would be very low, and the cards thus unlikely to cost large sums of money - especially considering how fierce competition is in the GPU market.
Aha! You want prices to go down in general, making high-end graphics more affordable and less exotic! Well, wish granted - it's already happening :D

The 4870 1GB is at $180 now, and it is a decidedly high-end card, perfectly capable of delivering playable rates at 30in (which is 2560x1600 btw, not 1920). It's a price point that only got you mid-range stuff two years ago. Generally, I think we can say that prices for similar performance levels have halved in the last year, mostly thanks to the 4870 (introduced at $299) and ATI generally competing again.

I can understand the "want!"-factor of high-end graphics, and I too am hoping for prices to fall even further; I simply disagree with your causality. I'm of the conviction that making software better and better will get increasingly expensive, while making hardware better will get increasingly cheap. Since the two are each other's incentive to innovate (a game with no GPU that can run it is pointless, and a GPU that no game will ever need is pointless too), I'm hoping for some more software innovations. Both the Source engine and the CryEngine are getting a bit dated, and there hasn't been a game release based on a new cutting-edge engine for years.
Quote:
A few years of affordable 30in rasterised gaming prior to ray tracing or whatever comes next would be a blessing indeed. Maybe gamers would love it so much that they never bothered to spend the cash to be early adopters of ray tracing, and the technology was a flop because we were all loving our 54in displays too much. Though then that would depend on what the game devs were up to. The debate rages on.
Lol, I've also been foaming at the mouth since the first time I saw that Westinghouse 3840x2400 screen.
cheeriokilla 26th March 2009, 17:11 Quote
"Lombardi told CVG that he believes that ‘graphics have started to top-out. " Tell that to John Carmack, I think he differs with his Mega Texture tech, Sparse Voxel Octrees and his interest in ray casting....
Sir Digby 26th March 2009, 17:17 Quote
Quote:
Originally Posted by cheeriokilla
"Lombardi told CVG that he believes that ‘graphics have started to top-out. " Tell that to John Carmack, I think he differs with his Mega Texture tech, Sparse Voxel Octrees and his interest in ray casting....

My understanding of the MegaTexture tech is that it doesn't make the game look much better - but it makes developing and editing levels much easier...
Xtrafresh 26th March 2009, 17:21 Quote
Quote:
Originally Posted by Sir Digby
My understanding of the MegaTexture tech is that it doesn't make the game look much better - but it makes developing and editing levels much easier...
...which will make it easier for developers to make better-looking levels :D
Cutter McJ1b 26th March 2009, 17:28 Quote
Quote:
Originally Posted by Xtrafresh
AHA! you want prices to go down in general, making high-end graphics more affordable and less exotic! Well, wish granted, it's already happening :D

Yes it is. But when the GeForce 6800 Ultra came out it was about £400, and when the 8800 GTX came out, that was about £400 too. This is because games were becoming more and more demanding and the GPUs were more advanced in an attempt to keep up.

What I'm saying is...
Quote:
Originally Posted by Cutter McJ1b in his blog
I really hope that graphics advancements slow down some and give the hardware a chance to catch up
Quote:
Originally Posted by Xtrafresh
i'm hoping for some more software innovations

Me too. Just one more time now:
Quote:
Originally Posted by Cutter McJ1b in his blog
I really hope that graphics advancements slow down some and give the hardware a chance to catch up
nicae 26th March 2009, 18:09 Quote
Quote:
Originally Posted by Sebbo
All that said though, maybe rasterised graphics are reaching their plateau, but not PC gaming graphics as a whole.

qft.

It will soon be like Photoshop: the program gives you the possibility to do almost anything and is "plateaued" (efficiency not accounted for). The problem is that most designers don't know what they're supposed to want to do.
Soon, graphics hardware will be enough for our rasterization needs, but our graphics developers must want the right things.

I will give a real-world example: Company of Heroes, with a handful of sprites (!), produces the most impressive explosions I've seen in a game (I would post a YouTube vid if I weren't at work).
Another example is Hollywood animations. They create fur, metals and clouds with the most surprising fidelity. The awe-inspiring visuals stun you! ...Until they try to make a human. It's just mentally hard to produce one, to capture all those skin folds, all those muscles that you subconsciously know of from your life experiences but can't list on a sheet of paper. It's a real challenge that makes "artistic license" a crude excuse for making a game with a comic style like Team Fortress 2. If you're going to draw humans and it's going to look wrong, let's at least make it a styled "wrong"!
Don't get me wrong - I love its style. But if we were able to draw photorealistic humans with ease, we would see way fewer comic-styled games. Those that aren't comic-styled look bad. You just didn't look so critically at them until now. :P

As for evolution, I would forecast 15 to 20 years until graphics plateau in the way the article says. Like Sebbo said, we still need to make the jump to ray-tracing, and I would crudely place that as "half-way there".
In minor steps, we would also need to progress into higher resolutions (up to tiny pixels on wall projections), 3D illusions, angled displays (full domes?), etc. Is it actually a matter of display limitations?
In a later moment we could see more "futile" features that consume lots of compute power, such as rendering independently from the POV (e.g. for spectators watching from other angles on some wacky omnidirectional display... or something like that)... Ah, if the future were easier to predict! :P

I wouldn't include physics in graphics, but I guess it does mix up in some ways. For instance, volumetric clouds are considered graphics nowadays, but they're really physics. They're particles and you need to calculate how they move around, disperse and diffuse light. Same thing for water. It's a material, so it's physics, but it refracts light, and light is done by graphics. Well... light IS a part of physics, eh?
In a short line, physics still has more than a decade to go, for sure.

But if you do want to stand ground in rasterization, we really shouldn't expect many large steps forward. More particles here, sharper textures there, livelier shadows here, bloomier HDR there and so forth. Shiny materials already have bump maps, movement already causes motion blur, round surfaces get Phong highlights... it's pretty much done. But rasterization is still well below ray-tracing.

One way is to look at a game screen and compare it with a photo of a similar environment. Then ask yourself: what's missing to make it look better?
If I do that with Crysis and a jungle photo, I would say "not much".
Then again, if I do that with Left 4 Dead and a photo of a zombie, I would say "WTFOMGBBQ! A ZOMBIE!!" :D
Xtrafresh 26th March 2009, 18:51 Quote
Quote:
Originally Posted by Cutter McJ1b
Yes it is. But when the GeForce 6800 Ultra came out it was about £400, and when the 8800 GTX came out, that was about £400 too. This is because games were becoming more and more demanding and the GPUs were more advanced in an attempt to keep up.
Correct, but we have all advanced a full generation. The 8800 GTX launched what, 30 months ago? The GTX 280 was introduced at roughly £300 if I'm correct, and so was the 4870 X2. Isn't this the progress you are looking for?

This will be my last try:
I understand what you are saying, I just disagree. I think it's important for the industry as a whole to keep moving forward, and that this will ultimately yield the best results for us consumers, both in price and performance. I can't say it more plainly than that, and as I sense we are both getting mildly annoyed, I'll stop there. I would like to keep this from spiralling into one of those threads that counts down to Hitler. :p
Cutter McJ1b 26th March 2009, 19:00 Quote
When the Radeon 4870 X2 was released, some e-tailers sold it as low as £350 but more were at £400. No one had it at £300. The GeForce GTX 295 is £400.
Quote:
Originally Posted by Xtrafresh

I think it's important for the industry as a whole to keep moving forward

/facepalm
TurtlePerson2 26th March 2009, 19:00 Quote
Graphics on PCs are linked to the consoles now. When there's another generation of consoles there will be another generation of PC graphics. Until then, I'm enjoying playing games maxed out at 1080p on my 4830.
Xtrafresh 26th March 2009, 19:52 Quote
Quote:
Originally Posted by Cutter McJ1b
When the Radeon 4870 X2 was released, some e-tailers sold it as low as £350 but more were at £400. No one had it at £300. The GeForce GTX 295 is £400.
The GTX 295 is an abomination. The 4870 X2 was quoted here on Bit-tech (I have reliable sources :D) as being £327 when it was first reviewed, and even that thing was ridiculously overpowered unless gaming at 30in (which is still 2560x1600).
Cutter McJ1b 26th March 2009, 20:11 Quote
Quote:
Originally Posted by Xtrafresh
The GTX 295 is an abomination

The 295 is the very card that was making Crysis Warhead look so amazing on a 30in display, and it costs £400.
Quote:
Originally Posted by Xtrafresh
The 4870 X2 was quoted here on Bit-tech (I have reliable sources :D) as being £327 when it was first reviewed

The 4870 X2 was out before then.
Quote:
Originally Posted by Xtrafresh
gaming at 30in (which is still 2560x1600).

Corrected. But we're digressing and the point of my blog still stands which is...
Quote:
Originally Posted by Cutter McJ1b in his blog
I really hope that graphics advancements slow down some and give the hardware a chance to catch up
Tim S 26th March 2009, 20:14 Quote
Quote:
Originally Posted by UncertainGod
I think we are approaching the end of the rasterised 3D graphics era and that is why improvements in engines are getting ever-decreasing returns; the move to ray tracing must begin soon.

We're not approaching the end of raster graphics IMO. We may be approaching an age where rasterisation isn't used exclusively for every effect, but there is little point in ray tracing something that doesn't reflect light and isn't also curved. It's very easy to reflect light on a flat surface with rasterisation and it's also very efficient so there's little point changing the way you do something to see no visual benefit.

Where I think we're going is towards a hybrid raster/ray tracing rendering model because using ray tracing exclusively would take us back more than a few years - ray tracing vs rasterisation is not black and white and it never will be, that's for sure. There'll be shades of grey for many years to come.
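To make that hybrid model concrete, here's a minimal C++ sketch - purely illustrative, with made-up names rather than any real engine's API - of the split being described: rasterise everything as usual, then spend ray-tracing work only on the curved reflectors where raster tricks stop being convincing.

```cpp
#include <vector>

struct Surface {
    bool reflective = false;
    bool curved = false;
    // geometry, materials etc. elided
};

struct Framebuffer { /* colour and depth targets, elided */ };

void rasterise(const Surface&, Framebuffer&) { /* conventional raster pass */ }
void traceReflections(const Surface&, Framebuffer&) { /* secondary rays */ }

void renderHybrid(const std::vector<Surface>& scene, Framebuffer& fb) {
    // Pass 1: rasterisation draws every surface - fast, and for most
    // surfaces there is no visual gain to be had from tracing rays.
    for (const Surface& s : scene) {
        rasterise(s, fb);
    }
    // Pass 2: ray trace only the curved reflectors, where flat-plane
    // reflection tricks break down and rays actually buy image quality.
    for (const Surface& s : scene) {
        if (s.reflective && s.curved) {
            traceReflections(s, fb);
        }
    }
}
```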
jrs77 26th March 2009, 21:56 Quote
What I find silly with today's games is that the only thing that advances is the graphics.

Admittedly, there are a very few titles that have shown excellent physics as well lately, but that doesn't solve the real problem of the whole gaming market...

most games are damn boring to play!
nicae 26th March 2009, 22:00 Quote
Quote:
Originally Posted by Tim S
We're not approaching the end of raster graphics IMO. We may be approaching an age where rasterisation isn't used exclusively for every effect, but there is little point in ray tracing something that doesn't reflect light and isn't also curved. It's very easy to reflect light on a flat surface with rasterisation and it's also very efficient so there's little point changing the way you do something to see no visual benefit.

Where I think we're going is towards a hybrid raster/ray tracing rendering model because using ray tracing exclusively would take us back more than a few years - ray tracing vs rasterisation is not black and white and it never will be, that's for sure. There'll be shades of grey for many years to come.

Good point. I wonder how efficient hybrid models will be. Most likely, ATI and NVIDIA optimization efforts will be crucial in accelerating ray-tracing penetration.
thehippoz 26th March 2009, 22:33 Quote
Actually, I loved Mass Effect on the PC; when you left the grain effect on and forced 8xQ AA and 16x AF it was like playing a movie... it's awesome IMO.

I know a lot of peeps turned it off... try it at 8xQ and it looks as good as Crysis as far as immersion goes (IMO) - felt like I was playing a Hollywood flick =]
Elton 28th March 2009, 00:10 Quote
I want the games to actually be fun. I mean, sure, Crysis was fun, but after a while it just got boring (until installing new weapons, that is).

And FC2, not even worth mentioning.

All these games look great, but the lack of intriguing presentation and the lack of any innovation really kills it.
Horizon 28th March 2009, 01:58 Quote
Wow, I don't think I've ever seen this before: an author trolling in their own blog.

"I hope PC gaming graphics have plateaued"

What the title pretty much states is that you hope CG stops progressing, flatlining like the top of a plateau.

Crysis at max/DX10 and Left 4 Dead at max, side by side, are night and day. Screenshot both of them next to each other. Just look at the draw distance.

"The 4870 X2 was out before then."

The 4870 was released mid-August; bit-tech reviewed it at the beginning of September - it was reviewed two weeks after the fact.
knuck 28th March 2009, 02:17 Quote
I want two more things and then i will accept a plateau

1) actual ground textures, not that blurry crap that we still get in 2009

2) characters that don't look like they're made of shiny plastic

otherwise I am satisfied
Zurechial 28th March 2009, 03:19 Quote
Thus the difference is highlighted between Custom PC and bit-tech.
Can't say I care much for seeing that kind of response on here from an article author.

Whether or not Xtrafresh got the wrong end of the stick about your article, there's no need to be an ass about it and I think Xtrafresh deserves kudos for not responding in kind.

It may not be my place to say as a nobody-user, but that kind of behaviour isn't bit-tech, by my reckoning.
naokaji 28th March 2009, 14:30 Quote
I too hope graphics advancement will slow down, not so much for the sake of getting cheaper high-end hardware, but to give the game devs time to focus on other things like refined controls, better sound quality (anyone ever noticed how horribly unrealistic weapons sound, for example?), quality control (what good is a game if it's full of bugs but has shiny graphics?) and so on...
Anakha 28th March 2009, 15:58 Quote
I think that the jump to ray-tracing will be coming soon. With the number of cores on a 295 parallelising a scene, you can get a reasonable frame rate out at a mid-range resolution (1680x1050, 30fps).

The main problem that's holding up RT at the moment is the memory bandwidth requirements. The "ray tracing algorithm" is very simple indeed, just needing a little FP math; the problem is that it takes a LOT of lookups to RAM, and with lots of cores processing at once, RAM speed, latency and bandwidth are going to be the limiting factors.

Get those licked, and it's just a matter of throwing more cores at it 'til it's done, and those cores don't have to be complex at all (they're all doing exactly the same instructions again and again): a little FP math, a lot of reading and writing to RAM, possibly firing off a few more threads (for reflection) and you're done.
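As a rough illustration of that kernel (a toy sphere tracer with hypothetical names, not code from any real renderer), note how little floating-point arithmetic there is compared with the per-object memory fetches inside the loop:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

struct Sphere { Vec3 centre; float radius; };

// Distance along the ray to the nearest sphere hit, or -1.0f on a miss.
// 'dir' is assumed to be normalised.
float trace(Vec3 origin, Vec3 dir, const std::vector<Sphere>& scene) {
    float nearest = -1.0f;
    for (const Sphere& s : scene) {       // the RAM traffic: scene data, per ray
        Vec3 oc = sub(origin, s.centre);
        float b = dot(oc, dir);           // ...while the FP math stays tiny
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0.0f) continue;        // ray misses this sphere entirely
        float t = -b - std::sqrt(disc);
        if (t > 0.0f && (nearest < 0.0f || t < nearest)) nearest = t;
    }
    // A full tracer would recurse here with reflection rays - more of the same
    // independent work, which is why simple cores in bulk suit it so well.
    return nearest;
}
```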
Ending Credits 28th March 2009, 21:55 Quote
This is something that came to my attention recently.

http://img26.imageshack.us/img26/8017/25002363.jpg

http://www.horde3d.org/screenshots/alfred.jpg

Both of these were done with a free graphics engine called Horde3D, and the top one apparently gets 60fps on a 7600 GT (with real-time shadows).

There's also the Leadwerks engine (not free, though).

http://www.leadwerks.com/post/island_shot2.jpg

I'm considering using the first one in a project I'm working on. (The only language I really know well enough is Blitz3D, which is DX7.)

Anyway, the point is that everything is getting pretty good now.
djDEATH 30th March 2009, 17:49 Quote
I remember thinking this to myself when the PS1 was my main gaming platform: "this looks really good, who will ever need any more than this?" And lo and behold, each new piece of hardware raises the bar.

There is one example I can put forward here: ambient occlusion.

It's currently available through an Nvidia beta driver, and when you apply it to Crysis, for example, it drops the frame rate down even further but adds an extra level of reality that really is impressive. So new types of graphical niceness are appearing all the time, making older games look better (think AA on old games that didn't initially support it) - and now ambient occlusion. Sure, things look fine as they are, but compare them to the graphics in Jurassic Park (made in 1992 or 1993 or something) and it shows just what rendered images COULD look like. I think Nvidia and ATI are constantly striving to get to that point eventually - to be able to render like that in real time on end-user hardware - and until we get there, this discussion will continue.
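As a heavily simplified, purely illustrative sketch of the screen-space idea (real implementations run as a GPU post-process shader; the names and constants here are made up), ambient occlusion darkens each pixel in proportion to how much nearby geometry crowds it in the depth buffer:

```cpp
#include <algorithm>
#include <vector>

// Occlusion factor for the pixel at (x, y): 1.0 = fully lit, lower values
// darken creases and corners where nearby geometry sits closer to the camera.
float ambientOcclusion(const std::vector<float>& depth, int w, int h,
                       int x, int y, int radius = 4) {
    float centre = depth[y * w + x];
    float occlusion = 0.0f;
    int samples = 0;
    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            int sx = std::clamp(x + dx, 0, w - 1);
            int sy = std::clamp(y + dy, 0, h - 1);
            // A neighbour nearer the camera than us occludes us a little.
            float diff = centre - depth[sy * w + sx];
            if (diff > 0.0f) occlusion += std::min(diff, 1.0f);
            ++samples;
        }
    }
    return 1.0f - occlusion / static_cast<float>(samples);
}
```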

Kinda along the same lines: after watching the F1 at the weekend, I noticed how little detail standard TV really gives you, and although we "care" about jaggies and texture filtering in games, we forget that a lap of Washington in Race Driver: GRID actually looks better already than real footage of real cars racing around a real track broadcast on TV.
nicae 30th March 2009, 20:35 Quote
Why don't you describe Ambient Occlusion a bit more to us? :)
The_Beast 31st March 2009, 00:19 Quote
Games now do look very good, and the AI could be better, so they should start working on better gameplay.
cyrilthefish 31st March 2009, 00:44 Quote
Quote:
Originally Posted by ChaosDefinesOrder
Compare the performance of the 8800 GTX, 9800 GTX, GTX 280 and GTX 285: sure, they do increase with each iteration, but the performance jump has become smaller each time.
The more cynical amongst us might put this down to recent games largely being developed for consoles and then ported to PCs, meaning the fixed console hardware is the limiting factor.

Certainly seems that way to me, though I'll happily concede that GPUs have hit the same clockspeed/heat ceiling that CPUs did recently, hence the slowdown while they work around that (with the first point being a factor too: they don't have much need to vastly improve performance at the moment).
DbD 31st March 2009, 16:08 Quote
+1

Graphics are mostly pegged to what a PS3 or 360 can do, due to most games being multiplatform. PC gamers don't really get more than higher resolutions and sharper textures. It does mean that things like Nvidia's 3D Vision (which makes existing engines look better) in some ways have more of a future than DX11 (which requires a new engine to make use of it) in the short term.