bit-tech.net

Inside the GeForce 7800 GTX

Comments 1 to 15 of 15

kenco_uk 22nd June 2005, 17:07 Quote
The AA Performance - 1600 x 1200 graph is messed up a little.
Tim S 22nd June 2005, 17:13 Quote
In what way is it messed up? NV4x does not support Transparency or Gamma-Adjusted Anti-Aliasing.
kenco_uk 22nd June 2005, 17:16 Quote
Well, the blue bar, it says, represents the G70. The green bar, which has higher fps (and a complete run of them), apparently represents the NV45. And then looking at the table of data, it's the other way around.

It's only a small thing and tbh I've raked through for any spelling errors and there aren't any. So, well done!

Oh and good article btw :)
The_Pope 22nd June 2005, 17:35 Quote
Fixed - thanks
ZERO <ibis> 22nd June 2005, 18:19 Quote
Good article. I can't wait for the rest of the GeForce 7 family to come out; I wonder how long it will be before the Ultra-level GeForce 7s arrive.
The_Pope 22nd June 2005, 18:25 Quote
It is widely believed that the Ultra will appear in response to ATI's next GPU, the R520, on the assumption that ATi will beat the "mid-range" 7800GTX. Whether that is 26th July, or August, or even October, who knows.

I'm more interested in seeing a game that makes this card look "slow" ... :D
mclean007 22nd June 2005, 20:51 Quote
a thorough, informative and well written article, as usual BigZ!

Question - you mention that AA and HDR are incompatible at present. Now I don't profess to understand all the finer points here, but is that an absolute impossibility due to the way the two techniques work, or is it merely a limitation of G70 and something we are likely to see fixed with G80(?) and/or R520 and its descendants?
mclean007 22nd June 2005, 20:52 Quote
Quote:
Originally Posted by The_Pope
I'm more interested in seeing a game that makes this card look "slow" ... :D
And you can rest assured that Valve etc. are working to realise that dream even as we speak.
Da Dego 22nd June 2005, 22:56 Quote
Quote:
Originally Posted by mclean007
a thorough, informative and well written article, as usual BigZ!

Question - you mention that AA and HDR are incompatible at present. Now I don't profess to understand all the finer points here, but is that an absolute impossibility due to the way the two techniques work, or is it merely a limitation of G70 and something we are likely to see fixed with G80(?) and/or R520 and its descendants?

As it stands, they are both post-processing effects that happen at the same time at the end of the pipeline for each frame, which creates a "chicken and egg" scenario: each would need the other to finish first before it could process the image, but right now they'd both be doing it at the same time in the same place.

Personally, I see a couple of ways around this. One involves screwing with the chip, putting a second post-process transformation region on it (Bigz can handle the details as to how this is probably infeasible with current technology). Then you could do HDR first and have AA go next. The second idea would be abusing the very notion of SLI: using the entire second card to do nothing but apply HDR or AA to an already rendered image from the first card. The cross-talk here would be amazingly complicated to implement, though I bet it could be done. People would be mighty pizznizzled at the performance hit, though, as they would have just spent $400+ to include AA and reduce their framerate to that of only having one card. :) Better to have the elevated resolution (effectively functioning as AA) and the HDR.
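Da Dego's last point, using a higher rendering resolution as a stand-in for AA alongside HDR, can be sketched in a few lines. This is a purely illustrative toy (the function names and the simple Reinhard-style operator are my own assumptions, not anything from NVIDIA's pipeline): render the HDR frame at double resolution, tone-map each sample down to displayable range, then box-filter to display resolution, which behaves like a 2x2 supersample.

```python
# Hypothetical sketch: supersampling as a stand-in for MSAA alongside HDR.
# Render at 2x width/height in high dynamic range, tone-map each sample,
# then box-filter down to display resolution. Names are illustrative only.

def tone_map(hdr_value, exposure=1.0):
    """Simple Reinhard-style operator: maps [0, inf) HDR to [0, 1) LDR."""
    v = hdr_value * exposure
    return v / (1.0 + v)

def downsample_2x(image):
    """Average each 2x2 block of a row-major 2D list -> half-size image."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# 4x4 HDR render target (values above 1.0 are legal in HDR)
hdr_frame = [[0.5, 0.5, 4.0, 4.0],
             [0.5, 0.5, 4.0, 4.0],
             [0.1, 0.1, 8.0, 8.0],
             [0.1, 0.1, 8.0, 8.0]]

ldr = [[tone_map(v) for v in row] for row in hdr_frame]
final = downsample_2x(ldr)  # 2x2 image ready for scan-out
```

Because the averaging happens after tone-mapping, nothing in this path asks the output stage to blend FP16 values and anti-alias at the same time, which is the conflict being discussed. The cost, of course, is rendering four times as many pixels.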

Biggles will, of course, strike down any of my incorrect limited knowledge once he gets online access faster than 56k again. :)
RotoSequence 23rd June 2005, 05:17 Quote
Great article as always, BigZ; how does the G70 do at super high resolutions, say, 30" Cinema Display levels? I'm working on a machine for a potential client, and I want to know if it will game at resolutions of 2048x1280 ;)
Tim S 23rd June 2005, 10:40 Quote
WRT HDR and Anti-Aliasing working together: somewhere in one of the press documents, NVIDIA mentioned that you could possibly supersample with HDR. However, I tried this in Far Cry and was just given corrupt textures, so I would say that I read the press document wrong; that's why I didn't mention it as working in the article.

Da Dego is close to right - basically the ROP, or pixel output engine, needs to either Anti-Alias or do an FP16 blend (for HDR) as the last instruction/operation before it sends the scene to TMDS/RAMDAC ready for displaying on screen.
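Tim's ROP description can be pictured as a one-slot final stage: the output engine performs exactly one terminal operation per pixel before scan-out, so HDR's FP16 blend and the AA resolve compete for the same slot. A toy model of that constraint (entirely my own illustration, not real hardware behaviour or any driver API):

```python
# Toy model of the single-slot ROP constraint described above.
# The output engine runs exactly one terminal operation before scan-out:
# either an MSAA resolve or an FP16 HDR blend, never both.

class ROP:
    def __init__(self):
        self.final_op = None  # the one terminal slot

    def request(self, op):
        """op is 'msaa_resolve' or 'fp16_blend'; only one may claim the slot."""
        if self.final_op is not None and self.final_op != op:
            raise RuntimeError(
                f"ROP busy: {self.final_op} already claims the final slot")
        self.final_op = op
        return True

rop = ROP()
rop.request("fp16_blend")           # HDR claims the output stage
try:
    rop.request("msaa_resolve")     # AA cannot also be the last operation
    conflict = False
except RuntimeError:
    conflict = True                 # the incompatibility Tim describes
```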

Roto: I think it is possible that this will game at 2048x1280, but you will need to lower the details a little in some games that are already close to using up all of the frame buffer. :)
Mister_Tad 23rd June 2005, 13:27 Quote
Heard anything regarding dual-link support?

Would be nice to see it cropping up on more gaming cards, as opposed to workstation-only ones.
Rekarp 23rd June 2005, 13:43 Quote
Can you pull out the tape measure for me? I want to know if it will fit inside my case. :D
TheAnimus 23rd June 2005, 13:54 Quote
Well, they've certainly made a move to parallelism (or n-dimensional problem solving; I prefer that term, as it's more "visual" and easier to think around). So in 18 months we should be seeing dual-core cards. Well, maybe not: the PII hit this 'wall', but graphics problems are generally easier to split across SMP (just as SLI has done, though you could do it purely with anti-aliasing if you wished).

Oh well, someone buy me one. I CBA to work this summer.
Tim S 24th June 2005, 00:07 Quote
Quote:
Originally Posted by Rekarp
can you pull out the tape measure for me? I want to know if it will fi inside my case. :D
I made note of this in one of the articles; I think it was the second one, on page 2. I don't have enough bandwidth on my PDA to check, though :)