bit-tech.net

NVIDIA's GeForce 7800 GTX

Comments 26 to 43 of 43

Da Dego 23rd June 2005, 20:01 Quote
Thanks, Shepps, that really is quite useless. ;) Anywho...
Quote:
Originally Posted by [USRF]Obiwan

If you cannot explain why LCDs can use more than 16-bit colour, then maybe I can explain why I came to this question in the first place:

Why do all the hardware sites like Tom's Hardware, Anand, PCReview etc. say otherwise?

From Viewsonic (monitor manufacturer)

pcreview: LCD vs CRT

Tomshardware comparison

Maybe I'm wrong, but then correct me with some real feedback instead of nothing but bull.

:D

To give you a hand in understanding this: most LCDs, even in 2002, used 24-bit color. 16 million colors (as in the THG review you posted) is not 16-bit. 16-bit means 2^16, or approximately 65k colors; 2^24 is 16,777,216 colors. There are still a lot of 24-bit LCDs out there, but some newer NEC/Sony/Philips and Samsung panels (and older, more expensive ones) are 32-bit. Even a 32-bit panel still does not give true color, though, because with LCD technology there can be some very slight variation from pixel to pixel in what each color value means. That's why they say the color isn't as good as a CRT's, where (255,0,0) will be a true red across the entire display.
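
If you want to sanity-check that arithmetic, here's a minimal Python sketch (purely illustrative, nothing more than the 2^bits relationship above):

Code:
# Distinct colors for a given bit depth is simply 2**bits.
def color_count(bits_per_pixel):
    return 2 ** bits_per_pixel

for bits in (16, 24):
    print(f"{bits}-bit: {color_count(bits):,} colors")

# Output:
# 16-bit: 65,536 colors
# 24-bit: 16,777,216 colors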

Hope that helps!
[USRF]Obiwan 23rd June 2005, 22:14 Quote
Thanx for the explanation Da Dego, The lightbulb flashed above my head now ;)
Blademrk 23rd June 2005, 23:52 Quote
Quote:
Originally Posted by Darkedge
Well, the graphs don't show twice the performance at all as far as I can tell, and I've seen other reviews (Tom's Hardware, for example) that show an incremental increase and sometimes a decrease with SLI, ffs. Not that impressive a card, and certainly not twice the power.

I read a review today on HardOCP (only 'cos I can't get on the bit-tech site from work), which puts the card in a rather good light, saying that in some games one 7800 is pulling better rates than two 6800s running in SLI.

edit: Also decent widescreen support in the new drivers :D
lepre 24th June 2005, 00:02 Quote
I want more performance at low res too, with no AA, no aniso, low quality etc. -_-
Da Dego 24th June 2005, 15:18 Quote
Quote:
Originally Posted by [USRF]Obiwan
Thanx for the explanation Da Dego, The lightbulb flashed above my head now ;)
Glad I could help! :)
Tim S 24th June 2005, 15:19 Quote
Why low quality? Surely that defeats the object of spending $599 on a new video card.

darkedge: you obviously didn't read the text. The frame rates may be similar, but look at the higher details and resolutions: the higher the res, the faster the card. I've said this several times to you in the past and you still moan about the same thing time and time again. :(
lepre 24th June 2005, 15:56 Quote
Quote:
Originally Posted by bigz
Why low quality? Surely that defeats the object of spending $599 on a new video card.

Simply because if I want to play a particular game hard, I don't care about graphics; for me they're only a handicap, not an advantage. And I want a minimum fps of 125, not 30, please!
And with HQ, after 10 minutes of playing my eyes hurt...
Da Dego 24th June 2005, 16:13 Quote
Quote:
Originally Posted by lepre
Simply because if I want to play a particular game hard, I don't care about graphics; for me they're only a handicap, not an advantage. And I want a minimum fps of 125, not 30, please!
And with HQ, after 10 minutes of playing my eyes hurt...

Umm, why would you want a minimum fps of 125? Your eyes can only see between 28.4 and 29 anyways...?
lepre 24th June 2005, 17:15 Quote
Quote:
Originally Posted by Da Dego
Umm, why would you want a minimum fps of 125? Your eyes can only see between 28.4 and 29 anyways...?

That's when the eyes start to perceive the frames as an animation, but there's a big difference between that and totally fluid animation.

And it is scientifically confirmed that the eyes can pick up non-fluid animation up to 120fps and beyond.

If you don't believe that, keep your 30fps minimum, but don't say it's enough for everyone.
Da Dego 24th June 2005, 17:26 Quote
Ahuh. Do you know what speed a movie runs at, lepre? :) 29-30fps. So next time you go to the theatre, tell them they really should lengthen their reels by 4x because you're seeing non-fluid motion.

Do you happen to have a link to any of this "scientific confirmation"? I know Italian, so your native language is just fine if that's what you can find.
lepre 24th June 2005, 19:37 Quote
Quote:
Originally Posted by Da Dego
Ahuh. Do you know what speed a movie runs at, lepre? :) 29-30fps. So next time you go to the theatre, tell them they really should lengthen their reels by 4x because you're seeing non-fluid motion.

Do you happen to have a link to any of this "scientific confirmation"? I know Italian, so your native language is just fine if that's what you can find.

In fact, since I've been gaming I hate going to the cinema. I only go for those great films I want to see there for the screen size and the audio, but the video frame rate is really bad and I can't even watch some sequences. I repeat: if you don't believe it's possible, keep that to yourself. That's what I really see.


P.S.: I'm asking a friend for the link.
Hamish 24th June 2005, 19:58 Quote
~30fps is the lower limit for jerkiness, which is why TV/movies run at that.
The 125fps thing doesn't really apply any more; in Q3 (and Q3-based games) at certain frame rates you could move faster and jump higher, marginally.
If you're using a monitor running at 85Hz you cannot see any more fps than 85 anyway :p

Some people can see 'faster' though, and need a higher fps to avoid jerkiness.
Not me though :D
Da Dego 24th June 2005, 20:49 Quote
Quote:
Originally Posted by lepre
In fact, since I've been gaming I hate going to the cinema. I only go for those great films I want to see there for the screen size and the audio, but the video frame rate is really bad and I can't even watch some sequences. I repeat: if you don't believe it's possible, keep that to yourself. That's what I really see.
P.S.: I'm asking a friend for the link.

Did a little research:
The human eye can refresh rod and cone cells approximately 25 times a second (i.e. frames). However, part of what creates the fluidity of motion during that time is the appearance of blur.

Whatever object we focus on activates the majority of the cells in the eye, but not ALL of them. The remaining cells may be activated while the primary cells are refreshing by the re-emergence of background light and the motion of our own focal point when what we're focusing on moves. This sensation creates "blur." Essentially the brain uses this to fill in what happened in the last .04s. It gets built into movies and such automatically by each frame on a film being exposed for exactly 1/24 of a second, similar to the human eye.

In the case of a monitor and software, the blur may not always be there, which creates a "stuttering" effect. This happens because we are focused on the monitor as a whole (the full screen is the same distance from our eyes, despite the illusion of depth in the game), so no new background light appears to trigger the unused cells. Since frames in a video game aren't "exposed" per se over a fraction of a second (instead they show an instantaneous slice of time), they lack blur unless it is programmed in (which is why racing games look so much more real than many other games). The extra illusion of motion can sometimes be minimally compensated for by showing frames in between the ones we can actually see, creating a minimal transition effect by hopefully at least hitting a few cells.

However, monitors can only show frames as fast as their refresh rate, so even running at 75Hz (high for an LCD), the maximum FPS you can actually show is 75, regardless of what speed the game runs at. Then your eye only takes ~25 of those frames as real imagery and uses the other 50 of them for transitional motion at best, though most of them are simply lost. If you run a particularly unusual ratio of frames to Hz to eye, you can end up with quite a headache, because your eyes may expect to see different frames than the ones the monitor sampled.

So in a way, you are right... you can have the perception of something higher than 30fps, but only in a videogame (NOT at the movies, as you are claiming) that does not account for motion blur (which, I grant, some do not, but lots now do). Regardless, any frame rate higher than your monitor's refresh rate is POINTLESS, and as long as you average somewhere around there, any increased performance will not produce visible results. And anything above a 25-30fps rate is a grey area of science, based on how many cells get activated and how much "intermittent data" needs to be, or even really can be, processed by the brain to produce a further illusion of motion.

Oh, and as an addendum, movies run at 24fps, TV runs at 29-30 (NTSC/PAL crap)...sorry bout that. The extra frames on the TV are duplicate frames inserted into the recording and not new frames.

Phew, that was long. :)

Srcs:
"A Human's Eye View: Motion Blur and Frameless Rendering" - Ellen J. Scher Zagier, Department of Computer Science, University of North Carolina
"30fps vs. 60fps" - study by PenStar Systems
"The Limits of Human Vision" - Michael F. Deering, Sun Microsystems
And a whole lot more...


/*EDIT: I bet the best way to fix this problem would be a program that actually ran the display about 1-2 frames delayed, then produced roughly a 30fps display by creating blur out of the lagged frames, continuing this like a moving average. We wouldn't notice a 2-frame difference once it started displaying, but we'd have the blur that our eyes need to best approximate reality.
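
Something like this minimal Python sketch is what I have in mind (purely illustrative; the frame format and the render_next_frame()/present() calls are made up):

Code:
# Sketch of a lagged moving-average blur: buffer the last few rendered
# frames and display their average, trading a couple of frames of
# latency for synthetic motion blur.
from collections import deque
from typing import Optional

import numpy as np


class FrameBlur:
    def __init__(self, window: int = 3):
        self.window = window
        self.frames = deque(maxlen=window)  # ring buffer of recent frames

    def push(self, frame: np.ndarray) -> Optional[np.ndarray]:
        """Add the newest rendered frame; return a blurred frame to display,
        or None until the buffer has filled (the 1-2 frame delay)."""
        self.frames.append(frame.astype(np.float32))
        if len(self.frames) < self.window:
            return None
        # Equal-weight average of the buffered frames stands in for the
        # "shutter exposure" a film camera would capture.
        return (sum(self.frames) / self.window).astype(np.uint8)


# Hypothetical usage with made-up render_next_frame() / present() calls:
# blur = FrameBlur(window=3)
# while True:
#     out = blur.push(render_next_frame())   # H x W x 3 uint8 array
#     if out is not None:
#         present(out)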
lepre 25th June 2005, 00:45 Quote
Nice read.

Anyway, I was told about the scientific side by a friend who studied it at university, and I found that it matched "my reality".

Sure, I know I can't see more fps than my monitor's Hz... that's why I use low res: I don't have an ultra-uber 150Hz-refresh monitor, and lately I'm getting headaches with this monitor and I hope to change it soon (I don't have to wear glasses, my eyesight is 10/10).
Anyway, at 640 my monitor can do 100Hz, and that's quite a bit better :)
Then, to get more fluidity (?) with v-sync you must have much more fps than your monitor's refresh rate, otherwise you'll see images broken into two pieces by a line.

I bought my grandfather a 21" Grundig CRT TV at 50Hz, and to me it looks very slow and I can't watch it for long.

I'm sure I'm forgetting to mention some things... anyway, I'm not Superman, and there are many other guys out there saying the same... maybe it can be true :)
nedkiller 27th June 2005, 21:57 Quote
Quote:
Originally Posted by Da Dego
Umm, why would you want a minimum fps of 125? Your eyes can only see between 28.4 and 29 anyways...?

Your eyes only distinguish between single frames up to that frame rate, although at that rate your brain won't produce motion blur, which requires north of about 73fps. And as you may know, motion blur is how your brain generates a smooth, coherent image, something that would definitely help if you want to play competitively.
Da Dego 27th June 2005, 22:23 Quote
Quote:
Originally Posted by nedkiller
Your eyes only distinguish between single frames up to that frame rate, although at that rate your brain won't produce motion blur, which requires north of about 73fps. And as you may know, motion blur is how your brain generates a smooth, coherent image, something that would definitely help if you want to play competitively.
As mentioned... the only problem is, the articles I cited don't give north of 73fps for blur... they target much less than that. So perhaps you could point to where that 73fps figure comes from?
r00t69 28th June 2005, 13:41 Quote
PAL is 25fps btw :)
Stephen Brooks 3rd July 2005, 04:12 Quote
Not scientific or anything, but I can certainly tell the difference between 25 and 50fps, and maybe even between 50 and ~80. I think the 24 (or 30) used in films is the _minimum_ needed for your eyes to not perceive the damn thing as a string of still images. However, when your rendering engine does not have motion blur (and film shutters _do_ have motion blur due to long exposures - take any still frame out of an action film sequence for proof), you can see lots of undesirable effects from the finite frame rate even above that. In the most extreme case I can think of, imagine you're watching a bullet travel across in front of some bright background: the game will plot it in one position, then advance it by 1/50 of a second (at 50Hz). Now if the muzzle velocity was 1000 ft/sec, it would have travelled 20 feet just in that frame! You'd probably just see it once at the left-hand side of the screen and once at the right-hand side; it would look like two bullets had momentarily appeared and then vanished! Even if you increased the frame rate to 1000Hz you'd still see a string of bullets separated by a foot each. You'd need a 20,000Hz refresh rate to get the thing without those artifacts, so it just looks like a streak, believe it or not. This is really the fault of the graphics engine rendering moving objects at one _instant_ in time and not having a proper "shutter exposure". Of course at lower frame rates it manifests itself with larger and less-rapidly-moving objects.
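
Just to put numbers on it, a quick back-of-the-envelope Python sketch (purely illustrative) of those per-frame displacements:

Code:
# Per-frame displacement of the hypothetical bullet = speed / refresh rate.
bullet_speed_ft_per_s = 1000.0

for hz in (50, 1000, 20000):
    step_ft = bullet_speed_ft_per_s / hz
    print(f"{hz:>6} Hz -> {step_ft:.3f} ft ({step_ft * 12:.1f} in) between frames")

# Output:
#     50 Hz -> 20.000 ft (240.0 in) between frames
#   1000 Hz -> 1.000 ft (12.0 in) between frames
#  20000 Hz -> 0.050 ft (0.6 in) between frames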

Even discounting that effect, I still reckon I could tell the difference between 25 and 50fps with shutter exposure included, but possibly not between 50 and ~80 any more.