
NVIDIA GeForce 7950 GX2


Almightyrastus 6th June 2006, 19:34 Quote
Having seen the performance figures, I have to say I am very impressed with this card - definitely one for my shortlist when I come to build my next system.

Can't wait to see what they do once decent Quad SLI drivers arrive, though.
Ramble 6th June 2006, 19:38 Quote
Ahh, but will it make me tea?
Kipman725 6th June 2006, 20:13 Quote
Good price... considering you're getting two cards.
tank_rider 6th June 2006, 20:41 Quote
Ah, but here comes the awkward question: what happens if I run a 3D app, such as a CAD or CAE program, in dual-monitor mode? Does each card take one monitor each, or will one of the cards basically be sitting there twiddling its thumbs?
RotoSequence 6th June 2006, 21:26 Quote
Nice, but I'm still wondering - why don't they split the DVI interconnects between the two PCBs and let the cooling solution blast air out from the back of both?
DXR_13KE 6th June 2006, 21:27 Quote
Very interesting... I was looking forward to a F.E.A.R. benchmark.

edit:
Any day now we will see an X1950, then a 7960, then an X1960... and one day we will even see an X1999XTXXXXXXX version Z and a 7999GTXXXX version omega. When will this insanity stop?!
scottsch 6th June 2006, 23:45 Quote
Vista and DirectX 10 with Shader Model 4.0 are coming in 6 months. This is still a DX9 card. It will be obsolete for future gaming in 9 months. Why buy any high-end card at this point?
Nature 7th June 2006, 00:12 Quote
Quote:
Originally Posted by scottsch
Vista and DirectX 10 with Shader Model 4.0 are coming in 6 months. This is still a DX9 card. It will be obsolete for future gaming in 9 months. Why buy any high-end card at this point?

Fantastic GPU(s)... I'll never own it.

I agree with scottsch's sentiment. When is the 9800 Pro going to come out again? A card that can play all games fluidly and won't be out of date in a year.

I'm glad that the technology is advancing this fast, but make it practical for us commoners. I guess that's what the GT series and consoles are for, but :'( I don't want to sacrifice my AA and HDR just so I can eat more than ramen noodles.
Tim S 7th June 2006, 00:50 Quote
Quote:
Originally Posted by scottsch
Vista and DirectX 10 with Shader Model 4.0 are coming in 6 months. This is still a DX9 card. It will be obsolete for future gaming in 9 months. Why buy any high-end card at this point?
DirectX 9 isn't going to just 'disappear'. Everything that we've seen of Crysis so far was done with DirectX 9 shaders, for example.
Tim S 7th June 2006, 01:08 Quote
Quote:
Originally Posted by DXR_13KE
Very interesting... I was looking forward to a F.E.A.R. benchmark.

edit:
Any day now we will see an X1950, then a 7960, then an X1960... and one day we will even see an X1999XTXXXXXXX version Z and a 7999GTXXXX version omega. When will this insanity stop?!
I think we will be doing some more thorough benchmarking for our retail roundup, checking performance across a few more games.
Tim S 7th June 2006, 01:08 Quote
Quote:
Originally Posted by tank_rider
Ah, but here comes the awkward question: what happens if I run a 3D app, such as a CAD or CAE program, in dual-monitor mode? Does each card take one monitor each, or will one of the cards basically be sitting there twiddling its thumbs?
I don't think that will work, because both signals are output from the 'top' GPU.
-EVRE- 7th June 2006, 01:27 Quote
Quote:
Motherboard compatibility:
First off, it should be reasonably obvious that a pair of GeForce 7950 GX2s in Quad SLI will only work on NVIDIA SLI-certified motherboards. However, NVIDIA has tested GeForce 7950 GX2 on a number of motherboard solutions, including boards with only a single PCI-Express x16 interconnect.

GeForce 7950 GX2 functions as any normal single card solution would, in that it will work in virtually any motherboard with a PCI-Express x16 interconnect, regardless of the chipset. Compatible motherboards will need a BIOS update to ensure that they can see the second GPU that is hidden behind the internal PCI-Express switch. For an up-to-date list of motherboards that support GeForce 7950 GX2, you can check over on NVIDIA's homepage.


I have an Asus SLI board that gives x8 to each slot in SLI mode. Will these cards work in Quad SLI on it, i.e. effectively PCI-E x4 to each GPU?
Ab$olut 7th June 2006, 01:30 Quote
Quote:
Originally Posted by scottsch
Vista and DirectX 10 with Shader Model 4.0 are coming in 6 months. This is still a DX9 card. It will be obsolete for future gaming in 9 months. Why buy any high-end card at this point?
That's like saying Blu-ray will be the norm when the PS3 releases :) DX9 has some time left in it, and seeing how new games are going there's hardly anything worth buying IMO, so CS:S and DoD:S will do fine for me until Valve release the next CS :D
Tim S 7th June 2006, 01:33 Quote
Quote:
Originally Posted by -EVRE-
I have an Asus SLI board that gives x8 to each slot in SLI mode. Will these cards work in Quad SLI on it, i.e. effectively PCI-E x4 to each GPU?
If it's listed on nvidia.com/GX2, you'll be able to run Quad SLI in it when the drivers are launched. NVIDIA recommends using two PCI-Express x16 slots though, as you may encounter a lack of bandwidth (resulting in less than maximum performance).
neocleous 7th June 2006, 01:43 Quote
I have a pair of 7900 GTs that I'm running in SLI, but they have failed, so I'm going to return them to Scan. Should I get one of these instead of another two 7900 GTs?
bahgger 7th June 2006, 02:25 Quote
Brilliant article, one that has made me decide to get the 7950 GX2 instead of two 7900 GTs. I'm glad that bit-tech.net has the same mindset I followed throughout the choices for my new computer, so this review is perfectly tailor-made to help me out :) Thanks guys!
yahooadam 7th June 2006, 11:12 Quote
Impressive - I have to ask though, how did you get the cards to play BF2 in widescreen? The command-line force thing just stretches the image, so how do you get proper widescreen on your GPUs?
ozstrike 7th June 2006, 11:52 Quote
OK, I've been convinced to get one of these... one question though. Do you think it will ever be possible to watercool one of these?
bahgger 7th June 2006, 13:14 Quote
I'm sure one of the renowned companies will introduce a watercooling solution since, as you can tell, the top PCB can be separated from the bottom one, so the cooling kit can be modified. I'm also pretty sure, however, that the watercooling setup would be very expensive! I guess if you can afford 7950 GX2s, you can afford that little extra anyway ;)
eddtox 7th June 2006, 14:31 Quote
Quote:
Originally Posted by ozstrike
OK, I've been convinced to get one of these... one question though. Do you think it will ever be possible to watercool one of these?
Yay! Custom waterblock time!!!

-ed out
yahooadam 7th June 2006, 15:45 Quote
I can think of a way to do it - fairly simple: you have a block of copper on the bottom card, heatpipes connect that to a block of copper on the top card, and the top card is watercooled. Or you have your water loop go through each card (but that means a lot of connectors). All this really needs is a slim waterblock.
Rimmer 7th June 2006, 23:21 Quote
Quote:
Originally Posted by yahooadam
Impressive - I have to ask though, how did you get the cards to play BF2 in widescreen? The command-line force thing just stretches the image, so how do you get proper widescreen on your GPUs?


I'm also interested in finding this out :)

Can you let us in on the info, Bigz? ;)
Tim S 8th June 2006, 02:19 Quote
All of the 'widescreen' hacks we use are publicly available from www.widescreengamingforum.com.

BF2 uses a command-line modification and, AFAIK, it only stretches the GUI; the actual game is running in true widescreen. It looks a damn sight better than running at 1600x1200 and scaling (or having black bars down the side).
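For reference, the BF2 tweak (as documented on widescreengamingforum.com) is just a couple of switches appended to the game shortcut's target - from memory, szx and szy set the horizontal and vertical resolution. For a 1680x1050 panel, and assuming the default install path, the target would look something like this:

"C:\Program Files\EA GAMES\Battlefield 2\BF2.exe" +szx 1680 +szy 1050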
yahooadam 8th June 2006, 08:59 Quote
Hmmm, OK. When I tried BF2 it all just seemed to be stretched (with the command-line hacks), but anyway, I'm alright with the black bars on the side. OK, it's annoying, but the stretched GUI looks really bad IMO.
Starbuck3733T 8th June 2006, 21:17 Quote
Bigz, can you do me a HUGE favor and measure just the power draw from the PCI-e connector during a bench run of something?