
Nvidia releases PhysX for 8, 9 and GTX 2xx series


Nvidia has released public PhysX drivers - time to throw stuff about!

Nvidia has made a big deal in recent months about its purchase of Ageia and the potential of PhysX integration on the GPU. With a vast installed user base of Nvidia GeForce 8, 9 and GTX 200 series graphics card owners, Nvidia certainly has the potential to mount a challenge against Intel's Havok API, used by hit games like Half-Life 2, Soul Calibur 4 and Company of Heroes to name but a few, but until now we had yet to see a publicly released PhysX driver.

That changed today, when Nvidia dropped a monster download including not just the PhysX-enabled Forceware 177.83 driver, but a full version of the PhysX-enabled game Warmonger, PhysX technology demos and the Unreal Tournament 3 PhysX mod. And, in true daytime shopping channel style: "Wait, there's more!"

Nvidia has also included two more CUDA-based applications to take advantage of the untapped processing power of your GPU: a beta version of the Badaboom video encoder/media converter (although only a thirty-day trial, as this is an application Nvidia wishes to sell separately) and a Folding@home client too.

The entire pack can be downloaded from Nvidia's website, although be warned it does weigh in at a meaty 2.7GB - hope you're not on too restrictive a broadband service.

However, the driver is still missing the originally touted ability to manually dedicate a certain portion of your GPU to PhysX processing, although you can now use your old 8 or 9 series graphics card as a dedicated PPU by popping it into a second PCI-E slot, with no SLI support required!

While it's great that Nvidia has finally released PhysX support for older cards, we still can't help but wonder if it's worth it right now. PhysX-accelerated titles are few and far between on the PC, and from our previous experience the PhysX effects only add a visual aspect to most games, as developers still have to cater the core gameplay to those without PhysX acceleration and do most of the physics on the CPU. In fact, the only place where you'll see a real advantage at the moment (outside the very limited PC stable of games) is in 3DMark Vantage and its specific PhysX benchmark.
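
To see how that split works in practice, here's a minimal sketch (not code from any shipping title; the detectPhysXAcceleration() helper is hypothetical, standing in for whatever hardware query the PhysX SDK of the day provides) of how a game might gate its eye candy:

#include <iostream>

// Hypothetical stand-in for the PhysX runtime's hardware query; a real
// title would ask the SDK whether a GPU or Ageia PPU is available.
static bool detectPhysXAcceleration() {
    return false;  // hard-coded for illustration
}

struct EffectsBudget {
    int debrisParticles;  // cosmetic debris/cloth/fluid count
    bool softBodies;      // extra eye-candy simulations
};

int main() {
    // Core gameplay physics (crates, ragdolls, vehicles) always runs on
    // the CPU, so the game plays identically with or without PhysX.
    const bool accelerated = detectPhysXAcceleration();

    // Only the cosmetic extras scale with hardware acceleration.
    const EffectsBudget fx = accelerated ? EffectsBudget{10000, true}
                                         : EffectsBudget{500, false};

    std::cout << "Hardware PhysX: " << (accelerated ? "yes" : "no")
              << ", debris budget: " << fx.debrisParticles << '\n';
    return 0;
}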

It's also a disappointment to see Badaboom released as a separate product rather than a welcome freebie. The prospect of having to pay for the software (which, we've been told, will soon be bundled with new Nvidia graphics cards) despite having already purchased the Nvidia GPU required to run it smacks of exploitation. Let's hope Nvidia sees the error of its ways and releases it as a free application for all GeForce owners.

What's your take on the place of PhysX in modern gaming? How do you feel about having to pay for Badaboom? What do you think is the best use of physics in a game? (Soul Calibur 4 - ed) Let us know in the forums.

30 Comments

Hugo 13th August 2008, 13:10 Quote
Quote:
Originally Posted by article
What do you think is the best use of physics in a game? (Soul Calibur 4 - ed)
Soul Calibur 4, DoA 4, <insert fighting game with special boob physics here> :D
Omnituens 13th August 2008, 13:35 Quote
WTB Boob Physics Card.
Redbeaver 13th August 2008, 13:43 Quote
2.7GB???

i hope there's an option to grab just the driver lol

but uh, what im really personally waiting for is the ability to use a 2nd graphics card as a dedicated processor tho............ bleh.........
Narishma 13th August 2008, 13:46 Quote
Even if it's a little old now, I think HL2 still has the best use of physics in a game.
BlackMage23 13th August 2008, 14:01 Quote
Will the UT3 mod make all the female characters' boobs bouncy?
WhiskeyAlpha 13th August 2008, 14:16 Quote
Quote:
Originally Posted by Article
However, the driver is still missing some of the originally touted features, including the ability to manually dedicate a certain portion of, or even an entire GPU to physics processing, as well as the ability to use a different GPU in a second PCI-E slot as a dedicated PhysX processor, so there's still some way for the technology of GPU PhysX acceleration to go yet.

Are you 100% certain on this? I'm sure when I was reading up on this last night I found a guy on a forum who was using an 8800GT as a dedicated physics card, which boosted his Vantage benchmark considerably. I'll try and see if I can dig it up (blast regularly cleaning my browsing history!).

EDIT:

Found another one anyway here. Looks like it's working okay there?
Timmy_the_tortoise 13th August 2008, 14:18 Quote
Quote:
Originally Posted by Narishma
Even if it's a little old now, I think HL2 still has the best use of physics in a game.

I think most people would agree with you there. Hopefully HL3 (and whatever engine it uses) will push it even further.
wuyanxu 13th August 2008, 14:25 Quote
tested most of the demos, meh. UT3 maps: meh. and Crazy Machines 2 still won't let me play PhysX levels :(

anyone know how good the transcoder is? because it's a 30 day trial, i want to have a large amount of video before installing it.
(at the moment, don't need to transcode anything as a single night have transcoded all stuff i want into iPhone format on quad core)
Timmy_the_tortoise 13th August 2008, 14:28 Quote
Quote:
Originally Posted by wuyanxu
tested most of the demos, meh. UT3 maps: meh. and Crazy Machines 2 still won't let me play PhysX levels :(

You've tried it? Frame-rate hits?
wuyanxu 13th August 2008, 14:35 Quote
Quote:
Originally Posted by Timmy_the_tortoise
You've tried it? Frame-rate hits?
for which game?

UT3 normal maps: no. UT3 PhysX maps: couldn't play them before, so can't comment.
Timmy_the_tortoise 13th August 2008, 14:38 Quote
Quote:
Originally Posted by wuyanxu
for which game?

UT3 normal maps: no. UT3 PhysX maps: couldn't play them before, so can't comment.

I just meant in general.. Since some of the GPU is used for PhysX now, I guessed there may be frame-rate drops..
Baz 13th August 2008, 15:00 Quote
Quote:
Originally Posted by WhiskeyAlpha
Quote:
Originally Posted by Article
However, the driver is still missing some of the originally touted features, including the ability to manually dedicate a certain portion of, or even an entire GPU to physics processing, as well as the ability to use a different GPU in a second PCI-E slot as a dedicated PhysX processor, so there's still some way for the technology of GPU PhysX acceleration to go yet.

Are you 100% certain on this? I'm sure when I was reading up on this last night I found a guy on a forum who was using an 8800GT as a dedicated physics card, which boosted his Vantage benchmark considerably. I'll try and see if I can dig it up (blast regularly cleaning my browsing history!).

EDIT:

Found another one anyway here. Looks like it's working okay there?

Thanks for the heads up, we'll test this out and check back when we've confirmed.
Hugo 13th August 2008, 15:16 Quote
You've had 15 mins Harry, I need answers!

Can I use, say, a 9600 GT as a PhysX card alongside my 4870 or not eh?
Jordan Wise 13th August 2008, 15:16 Quote
Quote:
Originally Posted by Narishma
Even if it's a little old now, I think HL2 still has the best use of physics in a game.

Agreed. Crysis could have gone so much further with the power behind their engine; I was really disappointed by that.
Ninja_182 13th August 2008, 16:19 Quote
I never really saw much use for additional physics acceleration in games. HL2 made adequate use of physics. No game requires the simultaneous exploding of 3000 barrels. Will be interesting to see HL3 or maybe just Ep3's use of physics but I doubt it will require additional acceleration, especially when CPUs themselves are enough.

It would make more sense if AMD and Intel got together and did the same thing for Havok. That way we'd have it standardized across platforms, not just a stupid scenario where you have to have one manufacturer's hardware or you can't have it.
wharrad 13th August 2008, 17:32 Quote
I'm going to guess that these will work alongside the older dedicated PhysX cards?

I say that, as I'm with Hugo, more an ATI man (multi-monitor you see). Those dedicated cards are pretty cheap now if Overclockers still stock them, and I'm going to hazard a guess that Vista's WDDM drivers are not going to like Nvidia and ATI at the same time... And I actually sort of like Aero (Should I admit that?).
WhiskeyAlpha 13th August 2008, 18:06 Quote
Quote:
Originally Posted by Baz
Thanks for the heads up, we'll test this out and check back when we've confirmed

No probs ;).

It leaves an interesting question though. Can the PhysX driver be installed "stand-alone" from the Nvidia package? If so, it would be interesting to whack an Nvidia 8+ series card in a rig with an ATi card and CCC, install the PhysX driver and see how you get on.

I'm reasonably sure that it won't work and that the PhysX driver won't allow the Nvidia card to perform solely physics calcs without the appropriate Nvidia drivers in place but hey, it's got to be worth a try for the hell of it no?
Irvine 13th August 2008, 19:30 Quote
I'm glad they're adding the physics processing now. Maybe they'll be able to standardize it soon, if ATI follows suit in some sort of fashion.

Plus, a Folding@home client for nVidia GPUs?! That's huge!
johnnyboy700 13th August 2008, 21:56 Quote
Can't help but feel sorry for the few folks who bravely decided to support a new hardware direction by splashing out £150ish on a dedicated physics card, only to have their cash effectively incinerated in front of them with the nVidia/Ageia merger/takeover. Mind you, can anyone name more than five games that used the Ageia cards?

It's the kind of bad experience that puts people off investing in new ideas and directions in case they end up with a hole in their wallet and a pointless bit of hardware.
karolis 13th August 2008, 23:51 Quote
johnnyboy700, with respect, you don't buy a dedicated ppu because you need it. you buy it because you _want_ it, and that fact alone makes that piece of hardware not pointless.

on a side note, i would still want to see dedicated ppu's in future. people above are talking about getting a cheap graphics card just for physics. some even consider mixing ati and nvidia (doubt it's gonna work, but fingers crossed). ageia dedicated ppu's didn't have this issue. you get it and you get whatever graphics card you want, be it radeon or geforce. you don't even need to occupy/waste pci-e for it (they connect via pci, which is plenty on almost all motherboards). you could even run multiple gpu's in crossfire/sli and still have a dedicated physics processor.

lack of titles? yep, that's a problem, and the first reason that comes to mind for this is the requirement of specific hardware to run it (not any more). and i think that now, since a major graphics card manufacturer has decided to support it, i would expect the range of games to expand.

if the new drivers are compatible with the old dedicated hardware (i hope they are), owners of ageia ppu's will only benefit.
leexgx 14th August 2008, 02:52 Quote
Quote:
Originally Posted by WhiskeyAlpha
Are you 100% certain on this? I'm sure when I was reading up on this last night I found a guy on a forum who was using an 8800GT as a dedicated physics card, which boosted his Vantage benchmark considerably. I'll try and see if I can dig it up (blast regularly cleaning my browsing history!).

EDIT:

Found another one anyway here. Looks like it's working okay there?

The problem with Vantage is that the PhysX API is used as part of the CPU test to max the CPU out (and at low res so that it's CPU-bound), so when you have a PhysX card or the Nvidia PhysX driver installed it inflates the CPU score by a stupid amount (300%). The CPU mark is in no way a 3DMark, it's purely a CPU stresser, and it also inflates the total 3DMark score by about 10% (when the PhysX driver is installed).

3DMark05 was the last 3DMark that did not take the CPU into account as part of the final score. They'll most likely bring a patch out to disable hardware PhysX use in Vantage (think you can turn it off).

--------
The video card itself as a PPU is far more powerful than a PCI PPU card and is not likely to affect a system with 2 video cards in SLI. The Vantage CPU test is very exaggerated as it has basically no GPU use.
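
(A back-of-the-envelope sketch of that effect: if the overall result is modelled, purely for illustration, as a GPU-weighted geometric mean of the sub-scores, even a 300% CPU-score boost only nudges the total. The weights and scores below are invented for the example, not Futuremark's actual formula.)

#include <cmath>
#include <iostream>

int main() {
    // Illustrative weights only: not Futuremark's published formula.
    const double wGpu = 0.85, wCpu = 0.15;

    const double gpu       = 10000.0;  // GPU score, unchanged either way
    const double cpuBefore = 10000.0;  // CPU score with software PhysX
    const double cpuAfter  = 40000.0;  // CPU score inflated by ~300%

    // Overall score modelled as a weighted geometric mean of sub-scores.
    auto overall = [&](double cpu) {
        return std::pow(gpu, wGpu) * std::pow(cpu, wCpu);
    };

    const double change = overall(cpuAfter) / overall(cpuBefore) - 1.0;
    std::cout << "Overall score change: " << change * 100.0 << "%\n";
    // Prints ~23% with these toy numbers; the heavier the GPU weighting,
    // the closer the shift gets to the ~10% reported above.
    return 0;
}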
klutch4891 14th August 2008, 03:55 Quote
I downloaded this yesterday and although it is in a separate program from the rest of the nVidia control panel, you can set cards to be solely for PhysX: I have my 8800GTS running normally and an 8600GT for PhysX. But I downloaded the drivers through the "GeForce Power Pack" link, so it may be different.
r4tch3t 14th August 2008, 11:02 Quote
Unfortunately I cannot get drivers for mine as Dell's drivers are from last year! Does anyone know when drivers for the 8600M GT will be released somewhere? I want the extra levels in Crazy Machines.
WhiskeyAlpha 14th August 2008, 12:42 Quote
Quote:
Originally Posted by leexgx
The problem with Vantage is that the PhysX API is used as part of the CPU test to max the CPU out (and at low res so that it's CPU-bound), so when you have a PhysX card or the Nvidia PhysX driver installed it inflates the CPU score by a stupid amount (300%). The CPU mark is in no way a 3DMark, it's purely a CPU stresser, and it also inflates the total 3DMark score by about 10% (when the PhysX driver is installed).

3DMark05 was the last 3DMark that did not take the CPU into account as part of the final score. They'll most likely bring a patch out to disable hardware PhysX use in Vantage (think you can turn it off).

--------
The video card itself as a PPU is far more powerful than a PCI PPU card and is not likely to affect a system with 2 video cards in SLI. The Vantage CPU test is very exaggerated as it has basically no GPU use.

Yeah, I was already aware of how Vantage "thinks" its PhysX processing is being performed on the CPU and so attributes the increased score to your CPU rather than GPU.

However, that's beside the point. I was simply stating that others have shown it is possible to run an additional Nvidia 8+ series card as a dedicated PPU, refuting what it states in the article. If you'd looked at the link I posted, it was actually for a benchmark of the Nurien tech demo which showed some impressive gains.
badders 14th August 2008, 14:08 Quote
Quote:
Originally Posted by Ninja_182
No game requires the simultaneous exploding of 3000 barrels.

WRONG.

All games need simultaneous exploding of 3000 barrels.
Smegwarrior 14th August 2008, 18:35 Quote
It is only a 2.7GB download if you go for all the demos; if you download just the driver it is about 127MB, the Warmonger game is about 400 - 500MB, and the Folding@home GPU client is a couple of MB. I didn't download any more of it.
Timmy_the_tortoise 15th August 2008, 14:33 Quote
Quote:
Originally Posted by badders
WRONG.

All games need simultaneous exploding of 3000 barrels.

Then you've always gotta have that smart-ass game which pushes it with 4000.
kenco_uk 15th August 2008, 14:43 Quote
Aye, something like.. Crysis 2: Moar Barrels.
r4tch3t 15th August 2008, 23:35 Quote
Have just installed this on my desktop, yay for folding! It wouldn't work before when I tried it on the GPU. Now instead of 6 days 11 hours for one WU it's 2 and a half hours.
Saivert 17th August 2008, 17:02 Quote
The Badaboom GPU-assisted encoding software crashes when I launch it.

Motherboard: Gigabyte GA-P35-DS4 rev. 1.0 (BIOS version F12)
CPU: Core 2 Duo E6750
RAM: 4GB Corsair TWIN2x (pretty run of the mill stuff)
GPU: NVIDIA GeForce 8800GTS 320MB
OS: Windows Vista Ultimate x64

Oh, and that Java-based download manager is really CPU-hungry.. the "firefox.exe" process topped out at 48% CPU utilization during the downloading.