bit-tech.net

Nvidia PhysX pack due shortly

If you own an Nvidia card, the new drivers are a free update.

Hot off the phone to Nvidia, we've found out that the GPU-PhysX pack will see a public release as soon as next week. Current game releases like Ghost Recon Advanced Warfighter 1 and 2, Unreal Tournament 3 and some other games no one really cares about (well, it's true!) will get GPU-PhysX functions.

PhysX driver updates will be a regular (quoted as monthly) feature, and future games that include PhysX are touted to include the much-anticipated Mirror's Edge.

We can't really complain though: the drivers are a huge free update for all owners of Nvidia 8-series and 9-series cards, unlike other companies *cough*Apple*cough* that would charge for it.

The current driver will work with only one GPU, although that includes SLI setups since the driver sees multiple cards as one. In this respect it will automatically balance graphics and physics work depending on what the game requires - this will hinder your frame rate a little, but it should make other elements look and feel a little more realistic (NB: "should").

Future driver releases will allow you to assign PhysX to a separate GPU - so if you upgrade from a GeForce 8800 GT to a GTX 260, for example, the GT can work on PhysX data only. We double-checked and you do not need an SLI board to do this - just one with multiple PCI-Express slots - which is great news for those with X38, P45 or even P35 boards. We can't imagine PhysX needs that much bandwidth if the previous hardware used a PCI interface!
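For anyone curious what this looks like from the game side, below is a minimal, illustrative sketch based on the Ageia-era PhysX 2.x SDK: the title asks the SDK for a hardware-simulated scene and drops back to software if the driver reports no accelerator (PPU or GPU). Treat the exact calls as our assumption of how a game would wire this up, not as Nvidia's documented procedure for the new driver.

// Minimal PhysX 2.x initialisation sketch (illustrative only).
// Requests a hardware-simulated scene, falling back to software
// if the driver reports no accelerator.
#include "NxPhysics.h"

NxPhysicsSDK* gPhysicsSDK = 0;
NxScene*      gScene      = 0;

bool InitPhysics()
{
    gPhysicsSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gPhysicsSDK)
        return false;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    // Ask the SDK whether any accelerator (PPU or GPU) is present.
    bool hwAvailable = (gPhysicsSDK->getHWVersion() != NX_HW_VERSION_NONE);
    sceneDesc.simType = hwAvailable ? NX_SIMULATION_HW : NX_SIMULATION_SW;

    gScene = gPhysicsSDK->createScene(sceneDesc);
    if (!gScene && hwAvailable)
    {
        // Hardware scene creation can still fail; drop back to software.
        sceneDesc.simType = NX_SIMULATION_SW;
        gScene = gPhysicsSDK->createScene(sceneDesc);
    }
    return gScene != 0;
}

void StepPhysics(float dt)
{
    // Typical asynchronous step: kick the simulation, then collect results.
    gScene->simulate(dt);
    gScene->flushStream();
    gScene->fetchResults(NX_RIGID_BODY_FINISHED, true);
}

Whether the simulation ends up on the CPU, a PPU or the GPU should be transparent at this level - the new drivers simply give that hardware path somewhere to go on GeForce 8-series and later cards.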

Looking forward to PhysX on GPU adding extra value to your Nvidia (or ATI?) card? Or is it just another useless feature until there's a killer title? Let us know in the forums.

31 Comments

Timmy_the_tortoise 29th July 2008, 13:57 Quote
Let me know when I can get PhysX on my 4850... Then I'll be interested.
[USRF]Obiwan 29th July 2008, 14:18 Quote
That's an interesting feature: let your old card do the physics without needing SLI. So when I upgrade to a new card, my old 8800GT can act as a physics processor. That's cool.
Andune 29th July 2008, 14:18 Quote
"unlike other companies *cough*Apple*cough* that would charge for it."
I have never paid for a Leopard update...

So far though I'm not terribly excited by PhysX....
Bindibadgi 29th July 2008, 14:20 Quote
Quote:
Originally Posted by Andune
"unlike other companies *cough*Apple*cough* that would charge for it."
I have never paid for a Leopard update...

You paid for Leopard though :p Last time I checked, I hadn't paid for a Windows service pack ;)
Romirez 29th July 2008, 14:28 Quote
So.... My 8800 will do my physics and my graphics in some games? You guys doing any benchmarks for the new nvidia physx by any chance? Might break out advanced warfare (shouldn't that be warfighter btw? :p) again to see if I can spot a difference
zabe 29th July 2008, 14:29 Quote
I think it's great as an advancement for the overall industry. Ageia had their chance and they didn't execute their strategy in the most brilliant way, thus they went down. Nvidia saw the opportunity of their parallel processors (aka GPUs) and bought them.

Now, the interesting bit is that, with nvidia so concentrated on getting Physx to work in all their new GPUs with stream processing, they weren't able to deliver as much performance in the new hardware as ATI (I refuse to call them AMD, they'll always be ATI to me), who had plenty of time to refine their new hardware (although after a very long struggle) and now they bring better performance at a more attractive price.

So now nvidia has (or will have shortly) great (hopefully) physics and good GPUs, while ATI has great GPUs with no physics. When nvidia overcomes the problems that will surely arise from the new drivers mixing graphics and physics functionality, they'll re-focus on graphics performance. Meanwhile, ATI will have to focus on introducing physics to their hardware, either via Havok or some other way, before Intel comes onto the scene with Larrabee.

So yup, the market is at an interesting point where, more than in battle, it seems we're gathering new ammo for the upcoming graphics/physics war!! In the meanwhile, I'm just excited to see what my humble GeForce 8600M GT can do in my laptop... I'm not expecting anything big as if I had a 9800 GX2, but nevertheless, it'll be interesting to see what my Core 2 Duo can process once the physics calculations are offloaded to my 8600...

In any case, it's an exciting moment for all of us gamers!! Graphics + physics = BETTER graphics!! woohoo!!
Timmy_the_tortoise 29th July 2008, 14:31 Quote
Whatever happened to Cell Factor?
Mentai 29th July 2008, 14:36 Quote
So I could use my 9600GT as a physics card if I bought a 4870? Hmmm
zabe 29th July 2008, 14:40 Quote
Quote:
Originally Posted by Timmy_the_tortoise
Whatever happened to Cell Factor?

You can still download the full game here if you want to try it when the new NV drivers are out. Another cool game utilizing Ageia Physx is Warmonger which is also a free game with some levels like Cellfactor, and you can also download the full game here.

They both run like crap (25 fps for CF and 5-7fps for Warmonger cos it makes heavy use of physx and without the appropriate accelerator the cpu just can't keep up...) in my 2.2 ghz core 2 duo, but i'm hoping to see an improvement next week with the new NV drivers for my 8600... we'll see!!
Timmy_the_tortoise 29th July 2008, 14:45 Quote
Quote:
Originally Posted by zabe
Quote:
Originally Posted by Timmy_the_tortoise
Whatever happened to Cell Factor?

You can still download the full game here if you want to try it when the new NV drivers are out. Another cool game utilizing Ageia Physx is Warmonger which is also a free game with some levels like Cellfactor, and you can also download the full game here.

They both run like crap (~25 fps? c'mon...) in my 2.2 ghz core 2 duo, but i'm hoping to see an improvement next week with the new NV drivers... we'll see!!

I'll have to wait for ATI drivers supporting PhysX... Unless I can buy a nice cheap and cheerful 8600GT and stick it in my spare 4x slot.. Would that work? Could I use an Nvidia card for PhysX with an ATI card for graphics?.. Nah.. That wouldn't work, would it..
Bindibadgi 29th July 2008, 14:56 Quote
We can try - it's a good question. I foresee that it'll be a graphical driver conflict though.
zabe 29th July 2008, 14:57 Quote
Quote:
Originally Posted by Timmy_the_tortoise
Would that work? Could I use an Nvidia card for PhysX with an ATI card for graphics?.. Nah.. That wouldn't work, would it..

Actually, that should be totally doable. If you buy an Nvidia card, you can use it for a mix of graphics and physics work, just graphics with physics calculations deactivated, or just physics. Now, I'm just assuming, but if you can do that with multiple NV cards (even though, as the bit-tech news piece says, that functionality will only be possible in a future release of the driver, not this first one), it would be absolutely logical to assume that you'd be able to configure your Nvidia card for physics while you use your ATI card for graphics.

In the end you would have bought nvidia hardware anyway, so I doubt they care what you use it for... then again, exactly as Bindibadgi said, they could be a real pain and force that functionality only when 2 NV cards are available, one for graphics, the other for physics, while if there's only one NV gpu they could force you to use it only for graphics... even though the cards would be perfectly capable of a NV/physics-ATI/graphics combination, it wouldn't be the first time for nvidia to limit functionality of some cards/configurations through their drivers...
ComputerKing 29th July 2008, 15:16 Quote
This is nice but ATI doesn't support this. So I was asking too: can I buy an 8800GT and make it act as a PhysX GPU while I use my ATI card normally?

I mean (4870 main + 8800GT for PhysX) = will this work or not?
DriftCarl 29th July 2008, 15:22 Quote
I'm not sold on these physics processors yet. I have seen a few comparison vids and they didn't really get me more into the game. With vanilla UT3 I still get hooked into the gameplay and start to sweat a lot while running round shooting stuff.
sotu1 29th July 2008, 15:29 Quote
Great news and a great development, a good stepping stone too! Now let's get some games actually worth using this tech in :)
Timmy_the_tortoise 29th July 2008, 15:30 Quote
Quote:
Originally Posted by Bindibadgi
We can try - it's a good question. I foresee that it'll be a graphical driver conflict though.

I propose that, as usual, Bit-tech do all the hard work, so we don't have to!

Looking forward to the results of your tests, Bindi.. Well volunteered.
zabe 29th July 2008, 15:33 Quote
Quote:
Originally Posted by ComputerKing
This is nice but ATI doesn't support this. So I was asking too: can I buy an 8800GT and make it act as a PhysX GPU while I use my ATI card normally?

I mean (4870 main + 8800GT for PhysX) = will this work or not?


I don't think you need ATI to support it. You wouldn't be configuring anything on the ATI side. You'd use your ATI card for graphics, which is what it was made for. Then you buy the Nvidia card, and you configure that for physics instead of graphics (if they let you, but the card as such is perfectly capable). It's two different devices: one for graphics, as ATI intended, the other as a physics processor, as Nvidia now gives you the option. In fact, if you select the ATI as the video adapter and configure the NV for PhysX, there should be no problem with each card doing its own thing, cos it's just two separate pieces of hardware running tasks independent of each other.

Then again, I remind you of my fear: controlling what users do via DRIVERS. Nvidia has done it before, and could very well do it again... hopefully not, fingers crossed!!
Quote:
Originally Posted by DriftCarl
I'm not sold on these physics processors yet. I have seen a few comparison vids and they didn't really get me more into the game. With vanilla UT3 I still get hooked into the gameplay and start to sweat a lot while running round shooting stuff.

Think about the following idea: you're playing UT3 against me (also love it!!) and we're in a team deathmatch or something. I just shot the hell outta ya and your health's down to 10, but you're good at avoiding my missiles and you're running for your life. You find this wall and you bomb it so that the wall breaks and you can get in, where it's gonna be way more difficult for me to shoot you. Or once inside, if you have time, you could bomb the rest of the walls and leave all the debris so that I can't catch you, or at least slow down my chase and end up being saved after picking up some health. Wouldn't that be cool?

That's the kind of new gameplay tactics I expect from physics, which is why I'm so excited about it. Apart from the more realistic feel of the whole thing, I think it'll change the way we play!!
Omnituens 29th July 2008, 15:48 Quote
for those with tri-sli boards, could have an sli setup, then a third card to do physics?
zabe 29th July 2008, 15:56 Quote
Quote:
Originally Posted by Omnituens
for those with tri-sli boards, could have an sli setup, then a third card to do physics?

The bit-tech article said that for now the driver will treat all X number of cards as ONE, so with the first release you'll get a 50-50 split of graphics/physics processing. With later releases you'll be able to specify which card does graphics and which does physics.
Bindibadgi 29th July 2008, 16:05 Quote
Quote:
Originally Posted by Omnituens
for those with tri-sli boards, could have an sli setup, then a third card to do physics?

That was the idea of boards with 3-slots in the first place ;)
Gunsmith 29th July 2008, 16:42 Quote
Quote:
Originally Posted by Bindibadgi
Quote:
Originally Posted by Omnituens
for those with tri-sli boards, could have an sli setup, then a third card to do physics?

That was the idea of boards with 3-slots in the first place ;)

**** that, i'm sticking to 3 GPU's then allocate one purely to physics.
Smegwarrior 29th July 2008, 16:58 Quote
One problem you might face mixing ATI and Nvidia cards is that when you install them, Windows will try to install the GPU drivers for the Nvidia card on boot-up and cause a conflict with the ATI drivers before you get to the point of installing the PPU (Physics Processing Unit) drivers for the Nvidia card.

Unless, that is, Nvidia and Microsoft work together to have the Windows hardware installation manager give you the option of choosing generic Windows drivers for PPU use, which would mean you could then install the official Nvidia PPU drivers without conflict.
samkiller42 29th July 2008, 17:29 Quote
Quote:
Originally Posted by Gunsmith
Quote:
Originally Posted by Bindibadgi
Quote:
Originally Posted by Omnituens
for those with tri-sli boards, could have an sli setup, then a third card to do physics?

That was the idea of boards with 3-slots in the first place ;)

**** that, i'm sticking to 3 GPU's then allocate one purely to physics.

I just need the power of 3 to power high end games at full settings at 2560*1600 *grins*

Sam
Project_Nightmare 29th July 2008, 17:43 Quote
How well do the PhysX games run on the Nvidia beta drivers already? Because I'm not sure if people are running the games on the Ageia physics card or on an Nvidia GeForce card.
Gunsmith 29th July 2008, 18:43 Quote
Quote:
Originally Posted by samkiller42
Quote:
Originally Posted by Gunsmith
Quote:
Originally Posted by Bindibadgi
Quote:
Originally Posted by Omnituens
for those with tri-sli boards, could have an sli setup, then a third card to do physics?

That was the idea of boards with 3-slots in the first place ;)

**** that, i'm sticking to 3 GPU's then allocate one purely to physics.

I just need the power of 3 to power high end games at full settings at 2560*1600 *grins*

Sam

Absolutely, I love my carbon footprint as well as Crysis :)
zabe 29th July 2008, 19:35 Quote
Quote:
Originally Posted by Project_Nightmare
How well do the PhysX games run on the Nvidia beta drivers already? Because I'm not sure if people are running the games on the Ageia physics card or on an Nvidia GeForce card.

That we will see next week with the new drivers. Right now the only people that can run PhysX code on the GPU are those with Nvidia GTX 200-series cards and Forceware 177.39 or later drivers. I guess there MUST be somebody doing benchmarks at home already, but I haven't found any yet :(
zabe 29th July 2008, 19:49 Quote
EDIT: ok, so I've found a few things. You can find a few benchmarks here, a link that I found in one of the hardforum physics pages. In this link many people are complaining about big slowdowns after a few minutes, then again this surely is because of the unfinished nature of these physx drivers. Still, after some changes somebody seems to have fixed the slowdown. Finally, this link seems like fun to try, it reminds me of the good old days of Doom II, but this is completely destructible, so it'll be fun to have a look!!

EDIT: DO NOT, I repeat, DO NOT try the last link. It's utter crap.
Redbeaver 30th July 2008, 00:00 Quote
oooooh snap! nice!

I was contemplating getting the (now dirt cheap) GTX 260 but feel sorry to just toss away my 8800 GTS 640...

now i can make use of it!!!!!!! :D
klutch4891 30th July 2008, 01:01 Quote
I like the sound of being able to run PhysX on one card and graphics on another without SLI. Now I'll have a use for my barely used 8600GT.
r4tch3t 30th July 2008, 01:36 Quote
Oh yay I can get the bonus PhysX levels in Crazy machines ;)