bit-tech.net

A bluffer's guide to Shader Models

Comments 1 to 17 of 17

blackadder1000 25th July 2005, 14:11 Quote
Very good article. Really makes sense of the confusing graphics world.
Well done bit-tech!
RTT 25th July 2005, 14:39 Quote
excellent :D
mclean007 25th July 2005, 15:05 Quote
Mr. Haz, he's our hero. Nice one guv.
Da Dego 25th July 2005, 15:52 Quote
Excellent job on spelling out the basics!
Decemoto 26th July 2005, 02:15 Quote
Agreed, that's really cleared a lot of stuff up for me. One question, though: why do you refer to the next Windows as Longhorn, then Windows Vista, and then Longhorn again? It could be a bit confusing for someone who hasn't read up on the situation! A simple search and replace would sort it, but it's up to you. Again though, great work on the article. I don't usually look into the technical details of hardware too much, but this was just the right level for me, telling me about the things that are important without bogging me down in overcomplication. Nice one!
Darkedge 27th July 2005, 14:33 Quote
Hmm, so HDR is only available with shader model three, is it? You got your charts from Nvidia, didn't you... the Radeon 9800s can do HDR. Nvidia has done a fine job of brainwashing people that shader model 3 is essential - show me one game that uses a shader anywhere near long enough that it will only run on 3.0.
Nvidia didn't do 2 because ATi did the DX work on it with MS - 99% of actual games programmers will admit that shader model 3 is not needed at the moment.
Tim S 27th July 2005, 15:01 Quote
Quote:
Originally Posted by Darkedge
Hmm, so HDR is only available with shader model three, is it? ... the Radeon 9800s can do HDR
HDR with a floating point blend is not possible on a Radeon 9800, because the Radeon 9800 does not support a 16-bit Floating point frame buffer. You can do bloom and use less effective methods of HDR without an FP16 frame buffer though.
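In Direct3D 9 terms this is a question about the frame buffer format rather than the shader version, and an application can query it directly. A minimal sketch of that check (illustrative only, not code from the article):

Code:
// Sketch: ask whether the card can create an FP16 render target and blend
// into it after the pixel shader - the capability being discussed here.
#include <d3d9.h>

bool SupportsFP16Blend(IDirect3D9* d3d)
{
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                                   // display format
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE,
        D3DFMT_A16B16G16R16F);                             // FP16 surface
    return SUCCEEDED(hr);
}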
Quote:
Nvidia has done a fine job of brainwashing people that shader model 3 is essential - show me one game that uses a shader anywhere near long enough that it will only run on 3.0.
Shader Model 3.0 is not essential, but it makes little sense buying a card that doesn't play every available game with all details (including HDR) turned on IMHO. Show me a game that is shipping and supports HDR on cards without the ability to do an FP16 blend... Half-Life 2 will support HDR without FP16 frame buffers, but the technology is not shipping at the moment.
Quote:
Nvidia didn't do 2 because ATi did the DX work on it with MS - 99% of actual games programmers will admit that shader model 3 is not needed at the moment.
Errr, NVIDIA had shader model 2.0 compliant parts in the GeForce FX - they were ******** slow, but they were compatible with shader model 2.0. They chose to add in some extra features to take it to Pixel Shader 2.0a, but the architecture was a Shader Model 2.x architecture.

From the dev's POV, Shader Model 3.0 is not needed for shader instruction lengths, but it is a hell of a lot more efficient and easier to program than Shader Model 2.0 using HLSL.
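To illustrate the point, here is a rough sketch (not taken from any shipping game) of the same HLSL lighting loop fed to both compiler profiles: ps_3_0 can keep a loop whose count is only known at run time as real flow control, while ps_2_0 has to unroll loops at compile time and will typically refuse this shader outright.

Code:
// Sketch: compile one HLSL lighting loop for ps_2_0 and ps_3_0.
// The shader and helper below are illustrative, not from the article.
#include <d3dx9.h>      // D3DXCompileShader - link against d3dx9.lib
#include <cstring>

static const char kShader[] =
    "int    numLights;                                                    \n"
    "float3 lightDir[8];                                                  \n"
    "float4 lightCol[8];                                                  \n"
    "float4 main(float3 n : TEXCOORD0) : COLOR                            \n"
    "{                                                                    \n"
    "    float4 c = 0;                                                    \n"
    "    for (int i = 0; i < numLights; i++)                              \n"
    "        c += lightCol[i] * saturate(dot(normalize(n), lightDir[i])); \n"
    "    return c;                                                        \n"
    "}                                                                    \n";

bool CompilesFor(const char* profile)   // "ps_2_0" or "ps_3_0"
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(kShader, (UINT)strlen(kShader),
                                   NULL, NULL, "main", profile, 0,
                                   &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
    return SUCCEEDED(hr);   // expect true for ps_3_0, typically false for ps_2_0
}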
Quote:
You got your charts from Nvidia, didn't you...
So you are saying that we wrote this article because NVIDIA asked us to? The information for the chart on page 5 can be found on Microsoft's website:
http://www.microsoft.com/whdc/winhec/partners/shadermodel30_NVIDIA.mspx
Da Dego 27th July 2005, 16:40 Quote
:(...Darkedge, didn't we have a similar conversation in another thread with you, in regards to HDR being a post processing effect? Or am I mixing that up?

I won't bother going through all the facts, since I think bigz spelled it out fine in the post above. But I'll recap the highlights from most of the other HDR threads we've had since its discussion back when Far Cry 1.3 was released...

Bloom != HDR
HDR == Post processing
true HDR (as it is currently defined) requires SM3 because SM3 compliance means there's an FP16 buffer.
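To put the post-processing point in concrete terms, the usual FP16 pipeline renders the scene into a high-precision target and then tone-maps it back down in a full-screen pass. A rough Direct3D 9 sketch, with the scene and quad helpers left as hypothetical stand-ins:

Code:
// Sketch of an FP16 HDR frame (illustrative only). RenderScene,
// DrawFullScreenQuad and the tone-map shader are hypothetical helpers,
// and a real application would create the render target once, not per frame.
#include <d3d9.h>

void RenderScene(IDirect3DDevice9* device);                                    // hypothetical
void DrawFullScreenQuad(IDirect3DDevice9* device, IDirect3DPixelShader9* ps);  // hypothetical
extern IDirect3DPixelShader9* g_toneMapShader;                                 // hypothetical

void RenderHDRFrame(IDirect3DDevice9* device, UINT width, UINT height,
                    IDirect3DSurface9* backBuffer)
{
    // 1. The 16-bit floating point render target - the part a Radeon
    //    9800-class card cannot provide.
    IDirect3DTexture9* hdrTex  = NULL;
    IDirect3DSurface9* hdrSurf = NULL;
    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT, &hdrTex, NULL);
    hdrTex->GetSurfaceLevel(0, &hdrSurf);

    // 2. Render the scene into the FP16 target; any alpha blending here is
    //    the FP16 blend under discussion.
    device->SetRenderTarget(0, hdrSurf);
    RenderScene(device);

    // 3. Post-process: tone-map the high dynamic range image down to the
    //    8-bit back buffer with a full-screen pass.
    device->SetRenderTarget(0, backBuffer);
    device->SetTexture(0, hdrTex);
    DrawFullScreenQuad(device, g_toneMapShader);

    hdrSurf->Release();
    hdrTex->Release();
}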

There's more material out there if you would like it, though. And SM3 is considerably more efficient, but I think you can tell that from reading the article. As for devs not really liking or supporting it, please note:
Far Cry's HDR requires SM3
Splinter Cell runs either SM3 or SM1.1
The new engines for both Unreal and Serious Sam are based on SM3.
Tim S 27th July 2005, 16:49 Quote
I think UE3 will have an SM2.0 path as well, but it is coded for SM3.0 first and then scaled back to SM2.0.
MistaPi 29th July 2005, 19:52 Quote
Quote:
Originally Posted by bigz
HDR with a floating point blend is not possible on a Radeon 9800, because the Radeon 9800 does not support a 16-bit Floating point frame buffer. You can do bloom and use less effective methods of HDR without an FP16 frame buffer though.
But you did say that HDR was SM3.0-exclusive. Furthermore, as far as I know, FP blending and filtering are not part of SM3.0. The GeForce 6200 does not support them, but it is still an SM3.0-compliant part.
Quote:
Shader Model 3.0 is not essential, but it makes little sense buying a card that doesn't play every available game with all details (including HDR) turned on IMHO. Show me a game that is shipping and supports HDR on cards without the ability to do an FP16 blend... Half-Life 2 will support HDR without FP16 frame buffers, but the technology is not shipping at the moment.
FP blending and filtering are great, but unfortunately they come with a huge performance hit and they break AA. I for one would take AA over HDR and its performance hit.
As for SM3.0, it does not bring much in today's games. There's only one game where SM3.0 gives better quality than non-SM3.0 hardware, and that's because it lacks an SM2.0 fallback. For the GeForce 6-series I don't think this will ever be a big deal (from a consumer standpoint), because I believe (and as game developers have commented) the shaders must become a lot more complex before we see any real benefits, and I strongly doubt the 6-series will be up to these games.
The way I see it, pure fill rate, geometry rate and general performance are more important this generation, for today's games and the future, and ATi is the stronger suit here at the high end (leaving out the "next gen"/7800 GTX).
Tim S 29th July 2005, 20:44 Quote
Quote:
Originally Posted by MistaPi
But you did say that HDR was SM3.0-exclusive. Furthermore, as far as I know, FP blending and filtering are not part of SM3.0. The GeForce 6200 does not support them, but it is still an SM3.0-compliant part.

I have corrected that oversight in the article - I think Wil mixed up Shader Model 3.0 with Far Cry patch 1.3, which enabled consumers with GeForce 6-series video cards to make use of the OpenEXR HDR format.
Quote:
FP blending and filtering are great, but unfortunately they come with a huge performance hit and they break AA. I for one would take AA over HDR and its performance hit.

May I ask what video card(s) you have?

HDR at 1600x1200 looks considerably better than 1600x1200 4xAA in Far Cry on a pair of 7800 GTXs, for example. The improvements in FP16 blending on the 7800 GTX make FP-blended HDR a reality.
Quote:
As for SM3.0, it does not bring much in today's games. There's only one game where SM3.0 gives better quality than non-SM3.0 hardware, and that's because it lacks an SM2.0 fallback. For the GeForce 6-series I don't think this will ever be a big deal (from a consumer standpoint), because I believe (and as game developers have commented) the shaders must become a lot more complex before we see any real benefits, and I strongly doubt the 6-series will be up to these games.

GeForce 6-series is more of a developer's card than a consumer card - it happens to be a very good first implementation of Shader Model 3.0 and as such developers like developing games with it. Did I say (in this thread - bear in mind that I did not write the article) that the 6-series was going to be able to fully utilise its featureset?

I don't think I did, but more to the point it creates a good vibe because consumers can see what is coming to games in the future - it breeds enthusiasm for future games and drives developers to make use of these features. That is what we are starting to see.

R300, R350, R360, R420 are all the same kettle of fish - there's nothing that really stands out in R420. You could say 3Dc, but that was the only thing that was being discussed at R420 launch. No games use it yet.
Quote:
The way I see it pure fillrate, geometry rate and general performance is more important this generation for todays games and the future and ATi is the stronger suite here for the high-end (leaving out the "next gen"/7800GTX).
ATI have the fastest 'last generation' card, I don't dispute that. But the fact of the matter is that the performance difference between the two is not massive, and many would take features over performance when the gap is very small - much like the 40,000+ Steam users who have a GeForce 6800-series part.

Many would argue that NVIDIA won the last round of the graphics card wars: they won the bleeding edge with SLI, ATI had the fastest single-card solution, and NVIDIA won the high end against the X800 Pro. The X800 XL was six months late to market if we're comparing it to the 6800 GT, so I would be very surprised if it did not beat the GeForce 6800 GT. It is, however, one hell of a card for the price it ships at, and I still believe that NVIDIA need to lower the price of the 6800 GT to something a little more in line with the X800 XL.

NVIDIA also won the mid-range with the GeForce 6600 GT. The bottom end is debatable: the X300 and X600 were faster than the GeForce 6200, but the X600 Pro was slower than the standard GeForce 6600. X300 SE HyperMemory vs. GeForce 6200 TurboCache - flip a coin and you'll still have a card that sucks at the whole performance thang. :)
MistaPi 29th July 2005, 22:29 Quote
Quote:
Originally Posted by bigz

May I ask what video card(s) you have?

HDR at 1600x1200 looks considerably better than 1600x1200 4xAA in Far Cry on a pair of 7800 GTXs, for example. The improvements in FP16 blending on the 7800 GTX make FP-blended HDR a reality.
I have an X800 Pro VIVO with an XT PE BIOS mod, but I have tried Far Cry 1.3 on a 6800 GT and in my opinion the HDR effect is a bit overused in this game, though I don't dispute the fact that HDR does make a big impact. But aliasing is still a very big issue for me, too big for me to make that sacrifice. In fact, I am more excited about the TAA feature in the 7800 GTX than SM3.0 and FP blending put together. I am planning to buy one, perhaps the 7800 GT. :)
Quote:
R300, R350, R360, R420 are all the same kettle of fish - there's nothing that really stands out in R420. You could say 3Dc, but that was the only thing that was being discussed at R420 launch. No games use it yet.
I believe Far Cry 1.3, Tribes: Vengeance and Sid Meier's Pirates! support 3Dc. And according to a game developer, 3Dc is expected to become the standard for normal map compression.
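For context on what 3Dc actually does: it stores only the X and Y components of a tangent-space normal in two compressed channels, and the shader rebuilds Z from the fact that the normal has unit length. The reconstruction, written out as plain C++ (illustrative only):

Code:
// 3Dc keeps only the X and Y of a unit normal; Z is rebuilt from
// x^2 + y^2 + z^2 = 1. Illustrative sketch of the reconstruction maths.
#include <algorithm>
#include <cmath>

struct Normal { float x, y, z; };

// x and y are the two stored channels, already mapped back to [-1, 1].
Normal Reconstruct3DcNormal(float x, float y)
{
    Normal n;
    n.x = x;
    n.y = y;
    n.z = std::sqrt(std::max(0.0f, 1.0f - x * x - y * y));
    return n;
}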
Quote:
ATI have the fastest 'last generation' card, I don't dispute that. But the fact of the matter is that the performance difference between the two is not massive, and many would take features over performance when the gap is very small - much like the 40,000+ Steam users who have a GeForce 6800-series part.
This is true for the most part in today's games, but the X8x0 XT (PE) often takes the lead at higher resolutions and/or in games that are quite fill rate/shader intensive. I think the fill rate and geometry rate advantage the X8x0 XT (PE) has will become more and more apparent in the future, and in my opinion that is more important than SM3.0 and FP blending for these cards. The X850 XT is ~40% faster than the 6800 Ultra in Battlefield 2 with 4xAA at 1920x1440 (ref. Anandtech). At any rate, I don't agree with the article's comment that there is little sense in buying an ATi card today.

Please excuse my English, I am sure there are quite a few grammar and spelling errors in there. :)
Tim S 29th July 2005, 23:42 Quote
Quote:
Originally Posted by MistaPi
I believe Far Cry 1.3, Tribes: Vengeance and Sid Meier's Pirates! support 3Dc. And according to a game developer, 3Dc is expected to become the standard for normal map compression.
My mistake.
Quote:
This is true for the most part in today's games, but the X8x0 XT (PE) often takes the lead at higher resolutions and/or in games that are quite fill rate/shader intensive. I think the fill rate and geometry rate advantage the X8x0 XT (PE) has will become more and more apparent in the future, and in my opinion that is more important than SM3.0 and FP blending for these cards. The X850 XT is ~40% faster than the 6800 Ultra in Battlefield 2 with 4xAA at 1920x1440 (ref. Anandtech). At any rate, I don't agree with the article's comment that there is little sense in buying an ATi card today.
But can you play BF2 at 1920x1440 with 4xAA? I don't think you can, from my experience, considering that a 7800 GTX SLI setup is only just about playable at 1600x1200 4xAA with most details turned to 'High'. Some of the maps are less intensive, but in order to play the whole game without issue I don't think that 1920x1440 is a realistic resolution to be playing at.

I've just checked the Anandtech article, and quoting 40% faster at that resolution doesn't tell the whole story - I mean, would you play BF2 at a 35 FPS average, taking into account the fact that the frame rate drops considerably when you get involved in the slowdown sections (a little bit like bullet time)? I know I certainly wouldn't. Also, from playing with the BF2 timedemo utility, I don't believe that it represents real-world gameplay by any stretch of the imagination.

I also totally disagree with Anandtech's suggestion that the 6600 GT is playable at 1024x768 4xAA 'High'.
MistaPi 30th July 2005, 00:12 Quote
Quote:
Originally Posted by bigz

But can you play BF2 at 1920x1440 with 4xAA? I don't think you can, from my experience, considering that a 7800 GTX SLI setup is only just about playable at 1600x1200 4xAA with most details turned to 'High'. Some of the maps are less intensive, but in order to play the whole game without issue I don't think that 1920x1440 is a realistic resolution to be playing at.
I am not saying it's playable, I was just trying to make a point. We may see a similar performance difference in future games at lower, more playable resolutions/settings.
Tim S 30th July 2005, 00:22 Quote
Quote:
Originally Posted by MistaPi
I am not saying it's playable, I was just trying to make a point. We may see a similar performance difference in future games at lower, more playable resolutions/settings.
I understand, but that is like saying that 3DMark gives an idea of performance for future games... :|
MistaPi 30th July 2005, 10:23 Quote
I am not solely basing my argument on Battlefield 2. Besides, it's not a synthetic benchmark.
Rich_13 30th July 2005, 12:10 Quote
Brilliant thread guys, making for a very good read. I do agree with bigz that you need a lot of features on a card to drive forward the development of the next generation of games.

Also, you have to give credit to the developers for trying to keep up with such fast-moving updates in the graphics hardware wars. It takes a lot more time and effort to bring out those games which utilise all these cool new features.

Plus, developers have to work on the hardware you and I are playing our games on at the moment, so it's not like they'd have next-gen cards to develop on and bring all the new features out when the new cards come out. (OK, maybe a few big-name developers, but not most.)