bit-tech.net

Nvidia ends support for DirectX 10 GPUs

Nvidia's DirectX 10 graphics card family has officially reached end-of-life, with the company announcing the GPUs will no longer be supported in future driver releases.

Nvidia has officially announced plans to end support for its DirectX 10 graphics card families, instead concentrating its future driver efforts on optimising the performance and stability of its DirectX 11-compatible Fermi, Kepler and Maxwell ranges.

In a support update posted this week, Nvidia warned that all driver package releases following Release 340, starting with Release 343, will drop support for the company's DirectX 10-exclusive GPU families in both the consumer and professional product lines. As a result, owners of said cards will be stuck on an outdated driver branch until such time as they see fit to splash out on a hardware upgrade.

Cards affected by the move include the GeForce 8 and 9 desktop GPU families, the GeForce 100, 200, 300 and 400 desktop GPU families, the GeForce 7, 8 and 9 laptop families and the GeForce 100, 200 and 300 laptop families. Professional users will also find a range of Quadro FX, Quadro CX, Quadro Plex and a single Tesla board on the end-of-life list.

Nvidia isn't forcing users into an immediate upgrade, however: the Release 340 branch of the company's driver bundle will continue to be updated for reported issues until the 1st of April 2016, after which it will be formally abandoned. Non-bugfix changes found in the Release 343 branch and newer - including new features and performance optimisations - won't be backported to the older driver branch.
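
For owners wondering which branch they're currently running, the installed driver version gives it away: Release 340 branch drivers report version numbers in the 340.xx range, while anything from Release 343 onwards reports 343.xx or higher. Here's a minimal sketch in Python that checks this via the nvidia-smi tool bundled with the driver - assuming nvidia-smi is on the PATH and supports the query interface, which may not hold for very old driver packages.

import subprocess

def nvidia_driver_major():
    # Ask nvidia-smi for the installed driver version string, e.g. "340.52",
    # and return the major number that identifies the release branch.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        universal_newlines=True,
    )
    return int(out.strip().splitlines()[0].split(".")[0])

if __name__ == "__main__":
    major = nvidia_driver_major()
    if major < 343:
        print("Release %d branch: DX10-era cards still supported" % major)
    else:
        print("Release %d branch or newer: DX10-era cards dropped" % major)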

Nvidia's support page offers a full list of the affected graphics card products for both desktop and laptop users.

26 Comments

AlienwareAndy 14th March 2014, 10:53 Quote
Can't say I blame them. They release cards like chit through a goose so supporting all of them with one driver must be a right old Peter.

AMD ended support for the DX10 cards they made a while back too, removing any Crossfire support in later drivers.

I still think GPU producers should have separate teams of devs writing drivers for one specific core format (i.e. Kepler, Fermi, etc.).
Baz 14th March 2014, 13:16 Quote
The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye old soldiers.
badders 14th March 2014, 13:44 Quote
Quote:
Originally Posted by Baz
The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye old soldiers.

I'm still running a 250GTS!
AlienwareAndy 14th March 2014, 13:52 Quote
Quote:
Originally Posted by Baz
The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye old soldiers.

They were utter crap.
Gundam God 14th March 2014, 14:22 Quote
Still running an 8800 GT GS. Still don't have any real reason to upgrade yet, though I might update the drivers soon - can't actually remember the last time I did that.
bawjaws 14th March 2014, 14:46 Quote
My 8800GT died of old age about a year ago. That was an awesome card, would still happily be running it now had it not expired :(
schmidtbag 14th March 2014, 15:14 Quote
If any of you are using Linux, Nvidia simply moved the DX10 cards to their "legacy" drivers. They still update those drivers, just not as frequently. So, if you'd like to keep using those GPUs in the years to come, this would be a good opportunity to give Linux a shot. You're obviously going to get a better experience with newer GPUs, but Linux works very nicely with Nvidia. In fact, they still support the GeForce 6 and 7 series.
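
(If you want to see exactly which Nvidia GPU a Linux box has before deciding, a rough Python sketch like this - assuming lspci is installed - prints the adapter names so you can compare them against Nvidia's legacy list:)

import subprocess

# Print the Nvidia display adapters lspci reports, e.g.
# "01:00.0 VGA compatible controller: NVIDIA Corporation G92 [GeForce 8800 GT]".
out = subprocess.check_output(["lspci"], universal_newlines=True)
for line in out.splitlines():
    if "nvidia" in line.lower() and ("VGA" in line or "3D controller" in line):
        print(line.strip())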

I personally still own a 7900GTO, but it's currently in an anti-static bag sitting on a shelf. I wish it was CUDA compatible, because it'd make a great GPGPU card - the core is still pretty good even by today's standards, but 256MB of VRAM really isn't enough to play games, though it's sufficient for non-gaming purposes.
Umbra 14th March 2014, 16:22 Quote
Still use my BFG 8800 in a backup/emergency PC. I bought it new for £140 and fitted a Zalman heatsink/fan to it, and played over 1,000 hours of Oblivion with it. Brilliant card, completely useless for today's AAA games of course but, ah, the memories.

http://www.fazerfetish.eclipse.co.uk/BFG%208800%20cooler%20small.jpg
maverik-sg1 14th March 2014, 18:49 Quote
The only surprise to me is that they haven't discontinued them sooner - probably console ports were propping up the driver support?

I always perceived DX10 as a bit of a misfire - not a massive jump over DX9, so it wasn't invested in. DX11 adding tessellation allowed console ports to be easier on the eyes. We'll finally say goodbye to DX9 games over the next 12-18 months, though, in favour of DX11 console ports... which should keep us going for the next 4 years lol.
schmidtbag 14th March 2014, 18:53 Quote
Quote:
Originally Posted by maverik-sg1
The only surprise to me is that they haven't discontinued them sooner - probably console ports were propping up the driver support?

I always perceived DX10 as a bit of a misfire - not a massive jump over DX9, so it wasn't invested in. DX11 adding tessellation allowed console ports to be easier on the eyes. We'll finally say goodbye to DX9 games over the next 12-18 months, though, in favour of DX11 console ports... which should keep us going for the next 4 years lol.

I too am surprised they haven't done this sooner. The reason DX10 was a misfire is that MS didn't release DX10 for Windows XP, and ATI took a relatively long time to add support for it. Also, consoles didn't support the technology that DX10 offered, so that didn't really help much either.

DX11 worked out because people were actually willing to switch to Windows 7, and the hardware performance was a big enough jump over the DX10-generation products. But even DX11 is still relatively uncommon.
SMIFFYDUDE 14th March 2014, 19:21 Quote
I hate Nvidia drivers; I have to go way back to 314.22 for a driver that doesn't try to murder my GTX 560 Tis, but doing that means I can't use SLI in some games.
AlienwareAndy 14th March 2014, 19:31 Quote
Quote:
Originally Posted by maverik-sg1
The only surprise to me is that they haven't discontinued them sooner - probably console ports were propping up the driver support?

They had done it sooner, just not officially. When BF3 launched I was running a triple-screen Quad SLI PC (two GTX 295s, single PCB). BF3 did a lot of this....

http://s72.photobucket.com/user/timmahtiburon/media/bf32011-11-2915-23-08-86.jpg.html

http://s72.photobucket.com/user/timmahtiburon/media/bf32011-11-2915-23-30-62.jpg.html

I mean the shadows of course. So I waited for a fix. And waited, and waited. Nvidia just said that they pretty much couldn't be assed with older cards.

Not nice if I'd paid over five hundred notes per card and this wasn't even two years into their lifespan. Thankfully each card cost me peanuts but I was still rather cross over it.
LordPyrinc 14th March 2014, 22:16 Quote
I've got a perfectly functional GTX 275 laying about. It's got the oddball 768 MB of RAM. Last time I hooked it up was when I was first playing around with SLI with my 550s. Used the 275 as a dedicated PhysX card. Took it out when I dropped the 660s in.
schmidtbag 14th March 2014, 23:03 Quote
Quote:
Originally Posted by LordPyrinc
I've got a perfectly functional GTX 275 laying about. It's got the oddball 768 MB of RAM. Last time I hooked it up was when I was first playing around with SLI with my 550s. Used the 275 as a dedicated PhysX card. Took it out when I dropped the 660s in.

If you have another PCIe slot you can still use it for physx. You don't need matching GPUs to accomplish this.
Cthippo 14th March 2014, 23:15 Quote
Still running a GT320 and a 7800GT here.

Of course, the only game I play is World of Tanks, and while I could use an upgrade, it's not a priority.
SimoomiZ 14th March 2014, 23:34 Quote
The joys of proprietary DX + driver model.

Having a locked down proprietary model allows a company to degrade performance in the latest drivers artificially... or end support altogether - despite the fully unified shader GPU microarchitectures being similar, there's no reason for G80 cards to become totally obsolete.
Guinevere 15th March 2014, 01:30 Quote
Quote:
Originally Posted by schmidtbag
If you have another PCIe slot you can still use it for physx. You don't need matching GPUs to accomplish this.

Maybe, but many a benchmark says you'll get the best performance by letting the PhysX (when you're actually playing a game that has some) run on the main card(s).

/JustSayin
Pookie 15th March 2014, 09:09 Quote
I thought they would have done this ages ago! AMD stopped updating its DX10 cards at least 18 months back.
AlienwareAndy 15th March 2014, 10:42 Quote
Guys, the PhysX PPU seems to react to clock speed. So, putting a GTX 275 in as a PhysX card when you're running an 1100MHz Kepler is a waste of time. It'll simply slow you down.

It seems Nvidia have piped the PPU onto the die itself, making it better with a better core (if that makes sense). I tried this theory a couple of years back and you were literally just adding a spare furnace to your rig for no gains whatsoever.

If you run SLI then setting it to the second GPU did seem to help matters, rather than letting the master GPU do all the work.
.//TuNdRa 15th March 2014, 20:03 Quote
Quote:
Originally Posted by SimoomiZ
The joys of proprietary DX + driver model.

Having a locked down proprietary model allows a company to degrade performance in the latest drivers artificially... or end support altogether - despite the fully unified shader GPU microarchitectures being similar, there's no reason for G80 cards to become totally obsolete.

There is, within reason. Currently anything further back than the 600 series basically doesn't get any targeted support anyway, and unified shaders only go so far: all of the extra hardware added to support DX11, and further internal backend optimisations, have probably led to "unified shaders" being a lot less unified across generations than most Nvidia engineers would like. Ensuring that all current driver releases worked on 8000, 9000 and 200 series cards was probably getting to be more hassle than it was worth, especially considering even the 200 series is five years old now.
Bokonist 16th March 2014, 11:18 Quote
The article says DX10 cards will be discontinued, and goes on to include the desktop 400 series, which were the first DX11 cards. As a proud owner of 460s I am concerned about no more driver support, as SLI support can be shaky even when you do get the latest drivers.
r3loaded 16th March 2014, 11:47 Quote
Quote:
Originally Posted by Bokonist
The article says DX10 cards will be discontinued, and goes on to include the desktop 400 series, which were the first DX11 cards. As a proud owner of 460s I am concerned about no more driver support, as SLI support can be shaky even when you do get the latest drivers.

Ah, that was referring to the OEM-only GeForce 405 which was actually a rebadged GT218 Tesla chip and not a DX11 Fermi chip.
true_gamer 16th March 2014, 16:15 Quote
Quote:
Originally Posted by AlienwareAndy
Quote:
Originally Posted by Baz
The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye old soldiers.

They were utter crap.

The GTX 280 was the first card I had that could max out Crysis at 1200p! And it lasted me well over a year (which is probably the longest time I have held on to my hardware before getting the upgrade bug!). So it was far from crap and a big improvement over the 9800GTX, which sucked ballz, let alone the 9800GX2, which also ran like crap due to the limited number of games that supported SLI. :)
true_gamer 16th March 2014, 16:26 Quote
Quote:
Originally Posted by SMIFFYDUDE
I hate Nvidia drivers; I have to go way back to 314.22 for a driver that doesn't try to murder my GTX 560 Tis, but doing that means I can't use SLI in some games.

Sounds like you need to do a fresh install of Windows or something... I rarely get a driver problem unless it's down to a game being rushed out, like Tomb Raider, which kept crashing...

Anyway, whenever I update my drivers, I always uninstall the old drivers first, reboot and then run Driver Sweeper, reboot, then go into AppData/ProgramData/Program Files etc. and delete any Nvidia folders, then use CCleaner, and then reboot and install the new ones.

Saves any driver conflicts! ;)
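
(For what it's worth, the folder-deleting step can be scripted. A rough Python sketch, assuming a standard Windows install - the folder locations here are my guesses at the usual leftovers, so check their contents first, and only run something like this after the official uninstaller has finished:)

import os
import shutil
from pathlib import Path

# Typical spots where old Nvidia driver files tend to linger after an uninstall.
candidates = [
    Path(os.environ["APPDATA"]) / "NVIDIA",
    Path(os.environ["LOCALAPPDATA"]) / "NVIDIA",
    Path(os.environ["PROGRAMDATA"]) / "NVIDIA Corporation",
]

for folder in candidates:
    if folder.exists():
        print("Removing leftovers in", folder)
        shutil.rmtree(folder, ignore_errors=True)
    else:
        print("Nothing found at", folder)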
Bokonist 16th March 2014, 18:17 Quote
Quote:
Originally Posted by r3loaded
Quote:
Originally Posted by Bokonist
The article says DX10 cards will be discontinued, and goes on to include the desktop 400 series, which were the first DX11 cards. As a proud owner of 460s I am concerned about no more driver support, as SLI support can be shaky even when you do get the latest drivers.

Ah, that was referring to the OEM-only GeForce 405 which was actually a rebadged GT218 Tesla chip and not a DX11 Fermi chip.

Thanks for the clarification. I do want to upgrade really, but I don't want to be forced into it. Hopefully I'll be in a position to do that by the time the serious 800 series GPUs get released. I'm guessing they'll be around by Christmas.
Combatus 17th March 2014, 23:58 Quote
Quote:
Originally Posted by AlienwareAndy
Quote:
Originally Posted by Baz
The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye old soldiers.

They were utter crap.

Compared to what?