bit-tech.net

NVIDIA GeForce 6800 GS

Comments 1 to 25 of 38

hitman012 7th November 2005, 18:35 Quote
Looking good... it seems like nVidia are starting to pay attention to the sections of the market where they are really losing to ATi. This might be a good card for a system I am going to be building for someone soon.

P.S: There's a typo on the DoD page, it says "of real world game play on the dod_anizo map" when the map name is "dod_anzio". Too much writing about anisotropic filtering, I guess :)
Tim S 7th November 2005, 18:39 Quote
cheers for the typo - I've always read it as anizo :o

Too much AF, as you say. :D
Tim S 7th November 2005, 18:56 Quote
updated with some places where it is in stock in the USA. ;)
-EVRE- 7th November 2005, 20:20 Quote
still forgetting those of us with LCDs bigger than 15"?
performance at 1280x1024 anybody?

case in point:
my sister has this hardware and is looking to upgrade.
19"lcd 1280x1024
2600+@2.3ghz
9800pro
and stuff just looks UGLY at 1024x768

so in FEAR... what will be the best settings at 1280x1024? What is the performance difference from going with a 6800gt or faster?

I think having both APPLES TO APPLES and Best Playable for all games at more resolutions is most useful.
Da Dego 7th November 2005, 20:31 Quote
Quote:
Originally Posted by -EVRE-
still forgetting those of us with LCDs bigger than 15"?
performance at 1280x1024 anybody?

case in point:
my sister has this hardware and is looking to upgrade.
19"lcd 1280x1024
2600+@2.3ghz
9800pro
and stuff just looks UGLY at 1024x768

so in FEAR... what will be the best settings at 1280x1024? What is the performance difference from going with a 6800gt or faster?

I think having both APPLES TO APPLES and Best Playable for all games at more resolutions is most useful.
A2A is a hard fight, though, Evre. Look at FEAR for example. The AI in the game makes it almost impossible to have the same fight twice, regardless of how carefully you attempt to go through the map.

You can clearly see by the BPS graphs that 1280x1024 ain't on the list. That means that this card probably does not fit that market well with much turned on, and you need to look at going a size up. Bigz can't test every resolution with every bit of eye candy turned on or off...it would take forever. After all, one could sacrifice textures, lighting, shadows, etc. and go to minimum detail on everything and get the game to run at 1600x1200 on a 6600gt. But it will look like crap, and who would play that way?

The beauty of BPS is that 1024x768 is the highest he could get the game to run with most of the eye candy turned on. If you have a 1280x1024 and you don't want to run at 1024x768, these graphs clearly illustrate that you may need to choose from a higher bracket of graphics cards to run FEAR.

The performance difference between the two cards (6800gt and 6800gs) is negligible in real world play, unless somehow your eyes can detect a difference of 1 or 2 fps... that's the point of saying Best Playable... if something is 37fps or 39fps, it doesn't matter much... they'll still both be 1024x768 with X amount of eye candy on... those 2 frames don't make a better card.

So essentially, your answers are all there, you just have to look at what the graphs are saying. And there are even a couple of A2A comparisons in the back, based on the only games that have a sufficiently repeatable section - just in case you didn't pull the information from the other graphs for whatever reason.
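If it helps to see the logic spelled out, here's a rough sketch (hypothetical Python - not bit-tech's actual harness, and the FPS numbers are invented) of what a Best Playable search boils down to: walk down a ladder of settings from most to least demanding, and keep the first rung whose worst-case frame rate clears a playability floor.

# Best Playable Settings as an algorithm. measure_fps stands in for
# actually playing through a level and logging frame rates; the
# numbers here are made up purely for illustration.
def measure_fps(config):
    fake_results = {  # (average fps, minimum fps) per settings rung
        "1280x1024 High":  (34, 19),   # average OK, but dips too low
        "1024x768 High":   (48, 31),   # worst case stays playable
        "1024x768 Medium": (70, 52),
    }
    return fake_results[config]

def best_playable(configs, fps_floor=30):
    # configs must be ordered from most to least demanding
    for config in configs:
        average, minimum = measure_fps(config)
        if minimum >= fps_floor:
            return config  # first rung that stays playable wins
    return None  # nothing playable - step up a card bracket

print(best_playable(["1280x1024 High", "1024x768 High", "1024x768 Medium"]))
# -> 1024x768 High

That's also why a 37fps-vs-39fps gap doesn't change the verdict: both cards land on the same rung of the ladder.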
Tim S 7th November 2005, 20:36 Quote
Quote:
Originally Posted by -EVRE-
still forgetting those of us with LCDs bigger than 15"?
performance at 1280x1024 anybody?

case in point:
my sister has this hardware and is looking to upgrade.
19"lcd 1280x1024
2600+@2.3ghz
9800pro
and stuff just looks UGLY at 1024x768

so in FEAR... what will be the best settings at 1280x1024? What is the performance difference from going with a 6800gt or faster?

I think having both APPLES TO APPLES and Best Playable for all games at more resolutions is most useful.
You're looking at Low to Very Low detail in FEAR on either a 6800 GS or a 6800 GT. I don't think FEAR looks particularly ugly at 1024x768, in all honesty... I think it'll actually look WORSE at 1280x1024 when you turn off things like shadows and lighting and ruin a lot of the creepiness in the game.

If you want to play everything at 1280x1024, you're going to need a 7800 GT - anything less won't play FEAR and look good at the same time.

If you want apples to apples in every game, along with best playable, I'd be happy if you could find me another 12 hours in the day - I could really do with 36 hour days.

Just to give you an idea of how long these reviews take - I spend anywhere between 6 and 10 hours with each video card (depending on how much tweaking is required). I got this card at 3PM Friday afternoon and I was working til 4am Saturday morning, up again at 9am, working til 3am Sunday morning, up again at 10am and then went to bed at 3am last night to start off again at 9:30am this morning and finished the review by 5PM. I would say there was around 30 hours of testing for this review and then another 10-15 hours of writing, creating tables, doing photography, etc.
hitman012 7th November 2005, 21:01 Quote
I agree - FEAR does look better on high @ 1024 than at low on 1280 on my Samsung 17" LCD. I tried both of these configurations before nVidia fixed the performance bug with the game in their latest drivers, allowing me to have the best of both worlds.

bigz: Everyone here appreciates the work that you and the Bit team put in; I find that the "best playable" style of review is extremely useful - the information that I need to know as a gamer is presented much more succinctly than with a large list of resolution/FPS/settings. Keep up the good work :)
silent_project 7th November 2005, 21:45 Quote
Looks like this is the card for me! Good blend of value, and more power than the plain 6800 I was going to have to settle for. I just hope it clears what it has to in my x-qpack...
-EVRE- 7th November 2005, 21:51 Quote
Sorry Bigz. After reading that again, I can see that it comes across in a very negative tone.
I do appreciate all the work that goes into doing a review.
and when I said 1024x768 looks bad, I meant on an LCD when it's not in its native resolution.

I originally thought that when BT did a review of a card, they were compiling a database of figures on that card. But with new drivers every month or so, I guess BT needs to rebenchmark every card to get current, accurate results for a review.
Tim S 7th November 2005, 22:07 Quote
Quote:
Originally Posted by -EVRE-
Sorry Bigz. After reading that again, I can see that it comes across in a very negative tone.
It's no problem. I did actually delete some of the harsher comments I made - I've got a very short temper at the moment with not being 100%.
Quote:
I originally thought that when BT did a review of a card, they were compiling a database of figures on that card. But with new drivers every month or so, I guess BT needs to rebenchmark every card to get current, accurate results for a review.
It's really, really hard to compile a database of 'best settings' for a video card in any particular game, because they can change quite dramatically with a driver update (or even a game patch). I would love to be able to do that, and I'd love to be able to put even more into these reviews to make them even more useful than they already seem to be, based on the feedback that we get.

Often there just isn't enough time to get done what I want to do anyway, which means I end up cutting things short. I hate doing it, I really do, because I'd like them to contain everything that everyone would ever want. I'd love to include gameplay evals on 'balanced' systems and a cost-conscious system too, but that's doubling the time taken to do a review, without even factoring in that a slower system is going to require more tweaking than a faster one.

It sucks, but I try and get as much done as I can; what I will try and do in future is at least make reference to what is possible at 1280x1024 in the text. I am really pleased that EA/DICE have added 1280x1024 into BF2 - I know it isn't the correct aspect for my CRT, but it is a resolution that a hell of a lot of people use. I'd rather be relevant to those TFT users as much as possible, but some games would look so poor at 1280x1024 that I'd argue that finding a sweet spot at a slightly lower resolution is the best option.
Payne280 7th November 2005, 22:54 Quote
I thought it was a well-written and thought-out review, so nice job on that.

I just don't like the new 6800GS because I was planning on getting another 6800GT for now, and it seems unless I get one fairly soon I will be out of luck.
Tim S 7th November 2005, 23:17 Quote
Quote:
Originally Posted by Payne280
I thought it was a well-written and thought-out review, so nice job on that.

I just don't like the new 6800GS because I was planning on getting another 6800GT for now, and it seems unless I get one fairly soon I will be out of luck.
Thanks for your kind comments.

I think the best place for you to look would be either the for sale forums or eBay. SLI can be used with two different vendors and clock speeds now. In fact, I must test whether a GS will work with a GT... I'll test that tomorrow if I remember!
silent_project 8th November 2005, 01:32 Quote
the articles always rock! I enjoy the technical content and the fact that the benchmarks are actual games and utilities, rather than obscure numbers and homebrew statistics.
Adnuo 8th November 2005, 07:35 Quote
Half-Life 2 has been taken out of the test loop :(. Still a good test IMO, even if it's not as up to date.

Don't see what the point of releasing more midrange cards is, besides building hype. The 6200 seems to have the OEM range covered, 6600GT/6800LE has the budget enthusiast covered, 6800GT has the midrange enthusiast, and 7 series has the high end. Regardless, good review :)
The_Pope 8th November 2005, 09:58 Quote
Quote:
Originally Posted by silent_project
the articles always rock! I enjoy the technical content and the fact that the benchmarks are actual games and utilities, rather than obscure numbers and homebrew statistics.

I tell you what gave me a laugh: some other sites have used 3DMark as part of their testing for this card, yet the score is virtually identical to the ATI competition, which the in-game testing proves is far slower.

Shows how pointless 3DMark is as a guide - you'd think the GS was the same as an XL, GTO or 1600XT based on the 3DMark scores, yet it's 25-50% faster in many games using Real World testing...
Tim S 8th November 2005, 11:05 Quote
Quote:
Originally Posted by Adnuo
Half-Life 2 has been taken out of the test loop :(. Still a good test IMO, even if it's not as up to date.
Half-Life 2 has been replaced by Day of Defeat: Source - a 6800 GT can run the game at 1600x1200, as can an XL, and I'm sure a GS can too. I don't find it a great deal of use these days, in all honesty.
Quote:
Don't see what the point of releasing more midrange cards is, besides building hype. The 6200 seems to have the OEM range covered, 6600GT/6800LE has the budget enthusiast covered, 6800GT has the midrange enthusiast, and 7 series has the high end. Regardless, good review :)
It is to reduce production costs and offer something at a new price point. It's already below $200, and that is a ruck load of performance for $200.
silent_project 8th November 2005, 13:55 Quote
I think what we are seeing more and more, with manufacturers rolling out these mid-range cards, is companies filling out their product portfolios - Nvidia in particular showing how they can take a viable platform like the 6800 (and, I am sure, the 7800 soon) and in short order create a card to fill a void, or sneak into territory where their existing range or price points don't cover effectively.

ATI has been hurting recently. Their cards are good, yes, but they are only now implementing support for technologies that Nvidia has had the market cornered on for months. What concerns me even more is that their offerings, while still "good", are not earth-shattering, and the delays and supply shortages show they are scrambling not only to get a viable product to market but to actually have product available when it is supposed to be. That second part is interesting because ATI is an OEM as well as just a chip supplier.
Tim S 8th November 2005, 14:11 Quote
This card is already $50 under MSRP in the states: http://www.clubit.com/product_detail.cfm?itemno=A9602793

That is a ruck load of performance for $200.

Prices still the same in the UK - nobody has really made a move on it yet.
LoneArchon 8th November 2005, 21:28 Quote
Great review on the card. I wish I would have waited a bit instead of buying a GTO card, but if I did that I would need a new motherboard. I will wait and see what I can get after Christmas - maybe an Opteron 165 (939 dual-core) with a 6800GS. The 165 can be had for about $290 right now and is reported to overclock very well, with people reaching 2.6GHz on stock air cooling. Paired with a 6800GS, or maybe two of them, it would be sweet.
Firehed 8th November 2005, 21:37 Quote
Quote:
Originally Posted by Adnuo
Half-Life 2 has been taken out of the test loop :(. Still a good test IMO, even if it's not as up to date.

Don't see what the point of releasing more midrange cards is, besides building hype. The 6200 seems to have the OEM range covered, 6600GT/6800LE has the budget enthusiast covered, 6800GT has the midrange enthusiast, and 7 series has the high end. Regardless, good review :)
No top-end card will have trouble with HL2 at any setting - even midrange cards like this can max it out, no problem.

I'm totally with you, bigz - benchmarking isn't as easy as it's cracked up to be. Granted, a BPS for several resolutions (start with the highest, obviously - if it can play HL2 maxed at 1600x1200, it can clearly do it below that as well) and a more subjective "here's the best mix of res and quality" is nice to have, but a lot less relevant in my eyes, especially with the prominence of TFTs. Of course, in theory, bad TFT scaling could almost grant free AA. I'd almost not bother with 1024 anymore - I'd think just 1280 and 1600 will cover most bases. I figure my 1680x1050 is more or less the same as 1600x1200 in the games that support it (a few pixels less, but a wider draw space; in any case it seems to play about the same, as that's where I came from).

The "common sense approach" would seem to be the best mix of effeciency and quality results. If it's unplayable at 1280x1024 (I wouldn't bother with the x960 for 4:3 except in FEAR and other games that refuse to do any other aspect), obviously it won't do 1600x1200, and vice-versa. Before I always liked to play at 1600 because I didn't like for my monitor to switch modes and drop to 60hz and need to be readjusted for squareness, now I prefer running native because it's on a flat panel.

Guess I don't stand a whole lot of chance at not losing a fortune on my NIB 6800GT again. Maybe I should just go for SLI again (but gah, I'd need another new waterblock... and I'd rather gain back what I can than shell out another $120 or so), seeing as I'd actually get something out of it with FEAR and Q4 and whatnot. Especially considering driver support has increased tenfold since my RMA (and when I got my LCD after the RMA I stopped running dual-display, but if I can customize a keyboard shortcut to enable/disable SLI like for my display rotation, that wouldn't matter much).
Petor 8th November 2005, 22:25 Quote
umm...
I play FEAR at 1024x768 with the graphics on high... on a 6600GT overclocked out of the box... with no trouble at all... runs smooth as
Tim S 8th November 2005, 22:37 Quote
it depends what you class as 'smooth' - these are 'worst case' scenario settings. I could easily make them much, much higher and have the frame rate drop much, much lower, but I believe the majority of frames should be rendered above 30 frames per second.

I've also had this argument elsewhere on these forums.
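To make that concrete, here's a minimal sketch (hypothetical Python; it assumes you've got per-frame render times logged in milliseconds, and the numbers below are invented) of the 'majority of frames above 30fps' test:

# Fraction of frames rendered above a given FPS floor.
def frames_above(frame_times_ms, fps_floor=30.0):
    cutoff_ms = 1000.0 / fps_floor  # 33.3ms per frame at 30fps
    fast = sum(1 for t in frame_times_ms if t <= cutoff_ms)
    return fast / len(frame_times_ms)

# Invented frame times (ms): mostly quick, with two firefight spikes
times = [16.7, 18.2, 20.1, 45.0, 17.5, 33.0, 25.0, 60.2, 19.9, 21.4]
print(f"{frames_above(times):.0%} of frames above 30fps")  # prints 80%

If most of a run clears the floor like that, the settings stay; if the dips drag that fraction down, the settings get stepped back.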
leviathan18 9th November 2005, 00:59 Quote
nice review, I like the best playable settings and I hope you guys keep up the nice work...


the card looks good for $200 - not too long ago that was the price of the 6600GT
Payne280 9th November 2005, 03:55 Quote
Bigz, good luck with trying to make a 6800GT and a 6800GS run in SLI - I just don't see how that would work out.
pillow 9th November 2005, 04:08 Quote
Newegg has two of the 6800GS cards in stock, an eVGA and the XXX - just wanted to let you guys know, if you didn't already. What would be a better purchase, a 7800GT or two of the 6800GS in SLI?

F.B.