NVIDIA recalls GeForce 8800 GTX batch

NVIDIA has recalled a batch of GeForce 8800 GTX cards, but still expects to hard launch on the 8th November.

As many of you know, NVIDIA is launching its next-generation GeForce 8-series products later this week. However, there has been a manufacturing problem with a batch of the GeForce 8800 GTX cards.

NVIDIA has issued us with an official statement regarding the problems:

"Some GeForce 8800 GTX boards that were built through our contract manufacturer had a simple BOM error - wrong resistor value. GeForce 8800 GTS boards are not effected by this.

These GeForce 8800 GTX boards were shipped to our Add-In-Card partners.

We have been working with them to pull these back and change the resistor to the correct value.

We believe we will still be able to hit our hard launch this week (Nov 8th) with the new GeForce 8800 GTX boards.

This is a testament to our execution as well as the execution of our Add-In-Card partners.

Adam Foat, Product PR Manager, Northern Europe."


Clearly someone has made a mistake during the manufacturing process, and NVIDIA is going to have to pull out all the stops to meet its launch date. However, we are glad to see that NVIDIA is addressing these problems before the product gets into the hands of consumers.

Look out for our review of the products later this week - we'll be keeping a close eye on stock levels when the product launches.

11 Comments

Lazarus Dark 6th November 2006, 11:03
G80 has felt rushed to me, and I'm thinking this may be the result of that. I understand the desire to beat ATI to DX10 and next-gen grafx, and I even applaud a tech company pushing the envelope, but quality should come first. Mostly, though, I think they should have let G80 bake in the labs a bit more to work on power consumption, or at least some low-power mode for normal desktop usage. Perhaps that will come with drivers, but really it should never have two PCIe power connectors; that's ridiculous. Hopefully, with the next die shrink and further improvements, they will get the 8900 series to reasonable levels. I think I'll have to skip these, though; I live in a very old apartment complex and the wiring isn't that great. Add in a phase change to cool the freakin' card and I think I might blow a fuse.
Bindibadgi 6th November 2006, 11:11
Rushed? No. It only feels that way because you haven't had details strewn out about it for, like, six months. It's only recently that the Inq and a few other sites have had actual details on it; most people have kept to the NDA. The only other option for power consumption on something with that many trannies is a whole new process (65nm), which would require an entire overhaul of the chip fab and put them six months behind. That's not an option considering DX10 is right around the corner. You don't need phase change, and two power connectors is no different from running SLI.
zr_ox 6th November 2006, 12:13
Just as long as this does not delay Tim's review on Wed ;)
Tim S 6th November 2006, 12:55
Hopefully it won't, but I've got a few long nights ahead of me. :)
perplekks45 6th November 2006, 13:57
Poor, poor Tim. :p
Want to swap? You go to my university and I'll do the benchies for you. :D
LoneArchon 6th November 2006, 14:45
Quote:
Originally Posted by Lazarus Dark
I think I'll have to skip these, though; I live in a very old apartment complex and the wiring isn't that great. Add in a phase change to cool the freakin' card and I think I might blow a fuse.
According to this site, power consumption under load is only 4% higher than the current king of the hill, the Radeon X1950 XTX: http://www.dailytech.com/article.aspx?newsid=4812
I'd really like to know about the dual SLI connectors it has.
mclean007 6th November 2006, 15:22
I'm seriously salivating at the thought of a whole new generation of GPUs (by all accounts G80 is far more revolutionary than the incremental steps taken with the 6xxx and 7xxx GPUs, impressive though they were). Roll on Wednesday!!!
JADS 6th November 2006, 15:27
It'll be cool to see how it performs, but somewhat depressing to know that every single card using the chip will be identical. I mean what is the point of having different manufacturers if they all produce identical products?
mclean007 6th November 2006, 15:35
Quote:
Originally Posted by JADS
I mean what is the point of having different manufacturers if they all produce identical products?
It was ever thus with graphics cards. Manufacturers will differentiate on cooling solutions (though the first batch will likely mostly use reference coolers, especially if NVIDIA has produced a decent reference design), clock speeds, software bundles, warranties and price.
M4RTIN 6th November 2006, 16:03
I'd look to the R600 launch for different designs on the 8800. What better way to combat the X2000 than a high-power cooler and uber-high clock speeds?
meserlian 6th November 2006, 22:45
The only reason I'm waiting for this card to come out is that I'm hoping it will make the other cards cheaper; I'm looking forward to buying a high-end Dell XPS M1710 this X-mas :-).