bit-tech.net

Nvidia responds to AMD’s PhysX criticisms

Nvidia and AMD: still crazy after all these years.

We recently had the chance to interview Richard Huddy, AMD’s Worldwide Developer Relations manager. Our chat ranged over general topics such as DirectX 11, PCs vs consoles, and working with various game developers.

Huddy also addressed, at length, AMD’s rival, Nvidia. He had sharp words for Nvidia over its conduct concerning GeForce-only anti-aliasing in Batman: Arkham Asylum, saying, “I totally hold in contempt... the appalling way they added MSAA support that uses standard DirectX calls... and locked it to their [GeForce] hardware knowing it would run just fine on our [Radeon] hardware.”

While Huddy had some limited praise for Nvidia’s PhysX, he also criticised the way Nvidia has developed the tech since buying its creator, Ageia. Huddy said that “when they [Nvidia] bought Ageia, they had a fairly respectable multicore implementation of PhysX. If you look at it now it basically runs predominantly on one, or at most, two cores. That's pretty shabby! I wonder why Nvidia has done that... because the company doesn’t care about the consumer experience it just cares about selling you more graphics cards by coding it so the GPU appears faster than the CPU.”

Nvidia posted a response on its company blog yesterday, written by Nadeem Mohammad. He rejected the accusation that Nvidia has deliberately made PhysX less multi-core friendly: “I have been a member of the PhysX team, first with AEGIA [sic], and then with NVIDIA, and I can honestly say that since the merger with NVIDIA there have been no changes to the SDK code which purposely reduces the software performance of PhysX or its use of CPU multi-cores.”

He goes on to say:

This is yet another completely unsubstantiated accusation made by an employee of one of our competitors. I am writing here to address it directly and call it for what it is, completely false. Nvidia PhysX fully supports multi-core CPUs and multithreaded applications, period. Our developer tools allow developers to design their use of PhysX in PC games to take full advantage of multi-core CPUs and to fully use the multithreaded capabilities.

While Nvidia is known for its combative stance towards rivals, and isn’t afraid of a bit of smack-talk, Mohammad’s post is crisply written and calmly rebuts the allegations, pointing out that 3DMark Vantage “can use 12 threads while running in software-only PhysX.”
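
For a concrete sense of what that developer-side control looks like, below is a minimal sketch of how a game might ask CPU PhysX for several worker threads. The names come from the later, publicly documented PhysX 3.x SDK (PxFoundation, PxDefaultCpuDispatcherCreate and so on); the 2.8-era SDK at issue here exposed similar per-scene threading controls, so treat this as an illustration rather than the exact API Mohammad’s team shipped in 2010.

    // Minimal sketch (PhysX 3.x-style names): create a scene whose CPU
    // dispatcher owns four worker threads, then step the simulation.
    #include <PxPhysicsAPI.h>
    using namespace physx;

    static PxDefaultAllocator     gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        PxFoundation* foundation =
            PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics =
            PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        PxSceneDesc desc(physics->getTolerancesScale());
        desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);

        // Asking for four worker threads here is how a game opts its
        // rigid-body solve into four cores.
        desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4);
        desc.filterShader  = PxDefaultSimulationFilterShader;

        PxScene* scene = physics->createScene(desc);

        // Step one second of game time at 60 Hz.
        for (int i = 0; i < 60; ++i)
        {
            scene->simulate(1.0f / 60.0f);
            scene->fetchResults(true);
        }

        scene->release();
        physics->release();
        foundation->release();
        return 0;
    }

The disagreement, of course, is not over whether such hooks exist but over how much of the stock SDK’s simulation work actually spreads across those threads when a developer leaves the defaults alone.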

Convinced by Nvidia’s response? Or are you still inclined to believe AMD? Let us know your thoughts in the forums.

37 Comments

MitchBomcanhao 21st January 2010, 14:05 Quote
Is it me or is everyone misspelling AGEIA? it's not AEGIA. not even the nvidia guys know that? or am I missing some name change? :D
woodss 21st January 2010, 14:13 Quote
PhysX does what it says on the tin, regardless
Sifter3000 21st January 2010, 14:13 Quote
Ha, good spot!
Pete J 21st January 2010, 14:30 Quote
To be honest, after the fiasco with Nvidia disabling PhysX whenever an ATI GPU is present, I'm more inclined to believe what AMD/ATI have to say on the matter.
mi1ez 21st January 2010, 14:32 Quote
Exactly my thoughts Pete
fingerbob69 21st January 2010, 14:34 Quote
His answer fails to address the Batman A. lock out issue. There'll be more in the future for sure.
erratum1 21st January 2010, 15:12 Quote
Never mind all this squabbling, bring on fermi.
knutjb 21st January 2010, 15:16 Quote
Maybe this is a good subject for an unbiased test?
I don't think AMD or Nvidia would be willing to show how they "know" but it should be possible to dig into it. Seeing that Nvidia has turned off standard direct x features from running on ATI cards, I forget the game, maybe there is something to this.
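
For anyone wanting to attempt that kind of test, a rough starting point is simply to log per-core CPU load while a game runs with software PhysX enabled and see how many cores ever get busy. This hedged sketch uses the standard Windows PDH performance-counter API (error handling trimmed); it measures utilisation only, so it can suggest, not prove, how the SDK schedules its work.

    // Sample per-logical-processor load once a second for a minute.
    #include <windows.h>
    #include <pdh.h>
    #include <vector>
    #include <cstdio>
    #pragma comment(lib, "pdh.lib")

    int main()
    {
        PDH_HQUERY   query;
        PDH_HCOUNTER counter;
        PdhOpenQuery(nullptr, 0, &query);
        // Wildcard instance: one value per logical processor (plus _Total).
        PdhAddEnglishCounterW(query, L"\\Processor(*)\\% Processor Time", 0, &counter);
        PdhCollectQueryData(query);

        for (int sample = 0; sample < 60; ++sample)
        {
            Sleep(1000);
            PdhCollectQueryData(query);

            DWORD bufSize = 0, itemCount = 0;
            PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE,
                                         &bufSize, &itemCount, nullptr);
            std::vector<BYTE> buf(bufSize);
            PDH_FMT_COUNTERVALUE_ITEM_W* items =
                reinterpret_cast<PDH_FMT_COUNTERVALUE_ITEM_W*>(buf.data());
            PdhGetFormattedCounterArrayW(counter, PDH_FMT_DOUBLE,
                                         &bufSize, &itemCount, items);

            for (DWORD i = 0; i < itemCount; ++i)
                printf("%ls: %5.1f%%  ", items[i].szName, items[i].FmtValue.doubleValue);
            printf("\n");
        }
        PdhCloseQuery(query);
        return 0;
    }

Run it alongside something like the 3DMark Vantage CPU test or a PhysX title forced onto the CPU, and the per-core columns show whether one core or many are doing the work.
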
Skiddywinks 21st January 2010, 15:20 Quote
It might "support" multicore, but there is no doubt they are not giving PhysX-on-CPU users as much optimising and accessibility as those who have bought their own brand of card.
alpaca 21st January 2010, 16:19 Quote
Quote:
Originally Posted by Skiddywinks
It might "support" multicore, but there is no doubt they are not giving PhysX-on-CPU users as much optimising and accessibility as those who have bought their own brand of card.

which is, as it is their brand and software, their right. but if something is your right, it does not mean that if you do it, people are going to like you.
chizow 21st January 2010, 19:09 Quote
Quote:
Originally Posted by fingerbob69
His answer fails to address the Batman A. lock out issue. There'll be more in the future for sure.

The only realistic solution was already provided in an answer by Huddy in the original interview:
Quote:
bit-tech: Given Nvidia licensed its own MSAA technology for Unreal Engine 3, why don't you just do the same thing? Put your code in as well and when the game detects your vendor ID it uses this code instead.

We're currently working with Eidos and we want that to be in there in a future update. That's not a commitment to it, but we are working with Eidos to make it happen because I believe it's in every consumer’s interest.

Of course, non-committal on their obligation to support features for their own products.
chizow 21st January 2010, 19:18 Quote
Quote:
Originally Posted by Pete J
I'm more inclined to believe what AMD/ATI have to say on the matter.

In that case, you don't have anything to worry about, according to Chris Hook (AMD something something of lies, BS, misinformation etc) your ATI cards have supported physics since at least the 4890....

http://www.youtube.com/watch?v=NFSUh8OjO-0#t=7m23s

Because physics are important to them....really.
thehippoz 21st January 2010, 19:18 Quote
lol *as he inserts the multi core code back into the driver- look there it is!
chizow 21st January 2010, 19:39 Quote
Quote:
Originally Posted by thehippoz
lol *as he inserts the multi core code back into the driver- look there it is!

It's been in the SDK since day 1, if you run Vantage it's the part that y'know, pegs all available physical AND logical cores to 100%....y'know the portion that runs at 6-15 FPS because modern CPUs simply aren't fast enough to adequately accelerate physics simulations.

If it were possible to accelerate advanced physics effects on the CPU, don't you think Intel would've rolled it out with Havok by now? ;)
frontline 21st January 2010, 20:16 Quote
Personally, i think Valve had the right idea using Havok to introduce some basic CPU powered Physics into the Source engine games, which enhance the gameplay, rather than dominate it.
confusis 21st January 2010, 20:30 Quote
all we need is a copy of the source code and we will definitely find out :)

(i think chizow is an nvidia fanboy!)
chizow 21st January 2010, 20:44 Quote
Quote:
Originally Posted by confusis
all we need is a copy of the source code and we will definitely find out :)
http://developer.nvidia.com/object/physx_downloads.html

Knock yourself out, it may only cost ya $50K, but even if you got it, would you know what to do with it to "definitely find out"? ;)
Quote:
(i think chizow is an nvidia fanboy!)
Perhaps, but at least I wouldn't be an ignorant ATI fanboy! ;)
thehippoz 21st January 2010, 20:46 Quote
yeah was being sarcastic chiz.. and the code for 3dmark was always a bunch of marketing- I mean it artificially inflated scores for nvidia by rewriting code.. it's like hax- what's the point in a bench then.. like messing with the lod bias

the cpu bench used the gpu as well.. so it doesn't really matter if they claim it uses all cores- physx has always been a marketing tool for them- pretty much nothing that can't be done with havok on todays cpu.. think that's what the guy at ati was getting at
confusis 21st January 2010, 20:47 Quote
Quote:
Originally Posted by chizow
http://developer.nvidia.com/object/physx_downloads.html

Knock yourself out, it may only cost ya $50K, but even if you got it, would you know what to do with it to "definitely find out"? ;)


Perhaps, but at least I wouldn't be an ignorant ATI fanboy! ;)

I'd rather be an ignorant ATI fanboy than an ignorant nVidia fanboy.. Had my share of geforces, happy with my radeon :)
mrbens 21st January 2010, 20:50 Quote
NVIDIA sure are getting a bad reputation these days! And it's about time they got some new cards out to give ATi some healthy competition.
Baron1234 21st January 2010, 20:53 Quote
Well after the Intel compiler scandal, I really don’t know, I have to believe what AMD says. I feel sorry for AMD, it is the smaller company and everyone tries to kill it with dirty tricks.
chizow 21st January 2010, 21:20 Quote
Quote:
Originally Posted by thehippoz
yeah was being sarcastic chiz.. and the code for 3dmark was always a bunch of marketing- I mean it artificially inflated scores for nvidia by rewriting code.. it's like hax- what's the point in a bench then.. like messing with the lod bias

the cpu bench used the gpu as well.. so it doesn't really matter if they claim it uses all cores- physx has always been a marketing tool for them- pretty much nothing that can't be done with havok on todays cpu.. think that's what the guy at ati was getting at

Yes the CPU bench uses the GPU as well as a complement and when it does, you can see it accelerates performance but overall performance is still not acceptable, it goes from single digits to double digits.

There is no artificial inflation because again, you can run the advanced PhysX effects meant for the GPU on the CPU by simply disabling PhysX in the NVCP or by running an ATI card. You can do this in Mirror's Edge, Cryostasis, even Batman:AA and when you do, you can see very clearly that the CPU is not capable of adequately accelerating these effects even with all 4 cores utilized.

It simply comes down to computational power and the great irony here of course is that AMD can't tell you fast enough about the amazing computational power of their GPUs (we've been hearing 1 TFLOP for how long now?) when it's convenient for them, but when it's not, somehow a CPU with less than 1/10th the floating point capabilities is suddenly good enough?

Also, as for why game physics don't make better use of the CPU, it should be obvious: the game engine running on the CPU handles more than just PhysX, you have the main rendering thread along with AI, control input, sound, etc., so obviously you can't put those aside just to dedicate all cycles to PhysX, or the game would run even worse. Given most gaming PCs are still dual core, developers have to budget for weaker CPUs, but as you can see with console ports that have more baseline CPU power at their disposal, this is beginning to change.
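
To put that budgeting point in code terms, here is a purely hypothetical frame loop (none of these names come from a real engine): physics gets a single background worker while the main thread keeps the rendering, AI and audio work it already owns, which is roughly why a game can't throw every core at PhysX alone.

    #include <future>
    #include <functional>
    #include <cstdio>

    // Hypothetical per-frame state, split so the physics job and the main
    // thread never touch the same data while the frame is in flight.
    struct PhysicsState { float simulatedTime = 0.0f; };
    struct GameState    { int   aiTicks = 0; };
    struct Frame        { PhysicsState physics; GameState game; };

    void stepPhysics(PhysicsState& p, float dt) { p.simulatedTime += dt; } // stand-in for simulate()/fetchResults()
    void updateAI(GameState& g)                 { ++g.aiTicks; }           // stand-in for AI, input, audio, etc.
    void render(const Frame& f)
    {
        std::printf("t=%.3f ai=%d\n", f.physics.simulatedTime, f.game.aiTicks);
    }

    void runFrame(Frame& frame, float dt)
    {
        // Physics gets one background worker; everything else stays on the main thread.
        auto physicsJob = std::async(std::launch::async,
                                     stepPhysics, std::ref(frame.physics), dt);

        updateAI(frame.game);   // AI, input and audio share the main thread's budget

        physicsJob.wait();      // synchronise before drawing the simulated state
        render(frame);
    }

    int main()
    {
        Frame frame;
        for (int i = 0; i < 3; ++i)
            runFrame(frame, 1.0f / 60.0f);
        return 0;
    }
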
chizow 21st January 2010, 21:22 Quote
Quote:
Originally Posted by confusis


I'd rather be an ignorant ATI fanboy than an ignorant nVidia fanboy.. Had my share of geforces, happy with my radeon :)

Which is why I strive to be neither, as you can see I try and inform myself on the topic so as to avoid commenting ignorantly.....enjoy your radeon!
thehippoz 21st January 2010, 21:36 Quote
http://www.youtube.com/watch?v=AUOr4cFWY-s
Quote:
Game running at 1920x1200 with 4x Adaptive AA (set in ATi Control Panel) and all in-game settings maxed. Used trick to force PhysX to run on the CPU.. never dips bellow 30FPS.

Seems like nVIDIA are purposely misleading the public on how "bad" CPU PhysX is compared to their GPU PhysX.

This is how to get it working: http://forum.beyond3d.com/showthread.php?p=1332461....

System Specs:

Intel Core i7 920 D0 @ 4.4Ghz (HT on) | eVGA X58 Classified Hydro | 6x 2GB Corsair XMS3 Dominator GT PC3-2000 DDR3 RAM | ATi Radeon HD 4870X2 2GB Graphics Card | ATi Radeon HD 4870 1GB Graphics Card for Tri-Fire | eVGA 9800GT 512MB PhysX Card | Intel Pro/1000 CT NIC | Auzentech Forte 7.1 Audio | 2x Intel X25-M 80GB SSD RAID0 | 1TB Seagate 7200.11 7.2K RPM HD | LG GBW-H20L BLU-RAY | Antec Quattro 1000W PSU | Watercooled Silverstone TJ09-BW Case
Pete J 21st January 2010, 21:36 Quote
Quote:
Originally Posted by chizow
In that case, you don't have anything to worry about, according to Chris Hook (AMD something something of lies, BS, misinformation etc) your ATI cards have supported physics since at least the 4890....

http://www.youtube.com/watch?v=NFSUh8OjO-0#t=7m23s

Because physics are important to them....really.

Hmm, that video confused me. So ATI can run Havok physics on their cards? Why don't they implement it then to counter Nvidia's PhysX?

By the way, I don't have ATI cards, they're GTX 260s :) .
Initialised 21st January 2010, 22:33 Quote
If you are lucky enough to have an 8800 GTX that hasn't had to be re-cured in the oven or RMA'd, nVidia won't let you use it alongside your shiny new 5970.

ATi graphics with Ageia/nVidia PhysX can be done, but it won't work using nVidia drivers after 191.07. Since the 'Cake' crack surfaced and AMD's first attack on the issue, nVidia have taken further action to prevent their customer base from getting a year or two extra out of the last-gen cards they paid hundreds of pounds for. Presumably the next step is for new TWIMTBP games to require drivers later than this to run PhysX.

Basically they are telling their existing customer base that they don't support them any more, in the hope that they will wait for Fermi rather than getting a Radeon.
impar 22nd January 2010, 00:00 Quote
Greetings!
Quote:
Originally Posted by Pete J
To be honest, after the fiasco with Nvidia disabling PhysX whenever an ATI GPU is present, I'm more inclined to believe what AMD/ATI have to say on the matter.
Yep.
chizow 22nd January 2010, 00:46 Quote
Quote:
Originally Posted by thehippoz
http://www.youtube.com/watch?v=AUOr4cFWY-s
Quote:
Game running at 1920x1200 with 4x Adaptive AA (set in ATi Control Panel) and all in-game settings maxed. Used trick to force PhysX to run on the CPU.. never dips bellow 30FPS.

Seems like nVIDIA are purposely misleading the public on how "bad" CPU PhysX is compared to their GPU PhysX.

This is how to get it working: http://forum.beyond3d.com/showthread.php?p=1332461....

System Specs:

Intel Core i7 920 D0 @ 4.4Ghz (HT on) | eVGA X58 Classified Hydro | 6x 2GB Corsair XMS3 Dominator GT PC3-2000 DDR3 RAM | ATi Radeon HD 4870X2 2GB Graphics Card | ATi Radeon HD 4870 1GB Graphics Card for Tri-Fire | eVGA 9800GT 512MB PhysX Card | Intel Pro/1000 CT NIC | Auzentech Forte 7.1 Audio | 2x Intel X25-M 80GB SSD RAID0 | 1TB Seagate 7200.11 7.2K RPM HD | LG GBW-H20L BLU-RAY | Antec Quattro 1000W PSU | Watercooled Silverstone TJ09-BW Case

And here's the end result of such hacks/workarounds:

http://www.youtube.com/watch?v=AUOr4cFWY-s#t=1m10s

It should also be noted that the workaround requires you to turn down the physics fidelity/calculations in order to run even remotely adequately on a CPU which results in the very distinct collision detection and simulation problems with papers/fog.

You can also run the GPU effects on the CPU in some other titles like Cryostasis and Mirror's Edge where it becomes abundantly clear even today's fastest Quad core CPUs simply cannot adequately accelerate more advanced physics effects. If they could've, they would've.
chizow 22nd January 2010, 00:49 Quote
Quote:
Originally Posted by Pete J
Quote:
Originally Posted by chizow
In that case, you don't have anything to worry about, according to Chris Hook (AMD something something of lies, BS, misinformation etc) your ATI cards have supported physics since at least the 4890....

http://www.youtube.com/watch?v=NFSUh8OjO-0#t=7m23s

Because physics are important to them....really.

Hmm, that video confused me. So ATI can run Havok physics on their cards? Why don't they implement it then to counter Nvidia's PhysX?

By the way, I don't have ATI cards, they're GTX 260s :) .

And that's a great question, maybe one for ATI? Maybe ATI should worry less about what their competition is doing to add value to their own hardware and instead, look to implement the features they promised and claimed support nearly a year ago for their own customers?

Since you already own GTX 260s you won't have to worry, Nvidia has already stated they're fully open to supporting any and all physics middleware and API on your Nvidia hardware. ;)

http://www.bit-tech.net/news/hardware/2009/10/14/nvidia-doesn-t-care-what-physics-library-de/1
l3v1ck 22nd January 2010, 01:01 Quote
Quote:
Originally Posted by Baron1234
Well after the Intel compiler scandal, I really don’t know, I have to believe what AMD says.
+1 until Nvidia explain themselves. If it wasn't true I'd have expected them to deny it by now.
I imagine the EU will be on them like a tonne of bricks if it's true.
Krayzie_B.o.n.e. 22nd January 2010, 05:43 Quote
Physx has been around for like 10 years and this is all we got?

I believe ATi/AMD because basically Nvidia hasn't really pushed the technology and if it is technology worth pushing make it accessible to everyone in the best way.

I only see PHYSX as a way for NVIDIA to sell cards.

Locking out ATI cards proves the point that NVIDIA is just looking to make a buck
Elton 22nd January 2010, 06:15 Quote
You know what I find quite funny?

Even when Ageia first came out with their PhysX cards, I distinctly remember someone posting up ini modifications for making the CPU run it.

If I remember correctly, the CPU ran it much better than the card did..
Saivert 22nd January 2010, 08:28 Quote
People believe what they want to believe. Facts are irrelevant.
Rezident 22nd January 2010, 14:58 Quote
Quote:
Originally Posted by Pete J
To be honest, after the fiasco with Nvidia disabling PhysX whenever an ATI GPU is present, I'm more inclined to believe what AMD/ATI have to say on the matter.

Same as. Despite being an nvidia customer for years because of their usually excellent hardware, their nasty business practices over the last few years have put me right off them. I honestly couldn't support (or even believe them) anymore. I'm afraid their dominant market power has corrupted them.
Snuffles 22nd January 2010, 21:15 Quote
The attitudes and business practices of both nVidia and Intel have pretty much sealed the deal on my next computer being AMD/ATI. Ill take the hit on performance over having to deal with the Electronics Mafioso
hrtz_Junkie 23rd March 2010, 15:10 Quote
Omg lets all jump on the nvidea bashing bandwagon?????

I am really starting to hate all this "nvidybashing" Nvidea is a company who (just like all the other's) is trying to servive in a harsh and cut-throught business.

Also nvidea does an awfull lot that we as gamer's take forgranted!! take for example epic's lack off aa support for the unreal engine (obviosly a payoff from intel to make there xbox 360 graphics still seem relevant). So what do njvidea do? they code there own aa solution and implement it for free to anyone who owns an nvidea card????

Obviosly there not going to do it for ati's hardwhere and why the F$$$ should they??? they did the work??

Its the same with phys x they developed/paid for the technology. why in gods name would they then port it to there competition? thats finantial suicide?????

Again what about gpgpu? nvidea does all the hard work developing it then ati suddenly realises its missing and bolt's on stream wich is inferior in every way to cuda, but no one even seems to care???????

If you want conspirecy's, try this one on for size,,,,

Intel Is worried about nvidea atm, they do not have anything to compete with fermi/gt200 in terms off it's tesla "blade" sytems. Nvidea are offerring to build supercomputer's at 10 times the performance at 100th off the cost (of say using I7)

Seeing as Buisnes use compromises the majority of intels customers there are right to by worried. and knowing intel as we all do I woudn't put it passed them pre emptively attaking nvidea by sideing with ati and pumping out a load of anti nvidea propaganda!!

They allready support ati by using the there gpu's in the 360, they also know that amd cant touch them in the cpu department so there no big threat at the moment,

Maybe the've said to ati keep to vga and dont go stepping on our toes in the high performance parralell architecture and whe'll help you commbat nvidea???

I know this is quite a leep of faith but it's only an idea ???? will be interesting to hear what you'll think!!!

and to all you fermi basher's Just wait till it's relleased or risk looking foolish!! dont say i didn't warn you !!
barndoor101 23rd March 2010, 15:47 Quote
you should get that fanboyism checked out you know, it might get you in trouble (its already fooked up your spelling in any case).
Quote:
Originally Posted by hrtz_Junkie
Omg lets all jump on the nvidea bashing bandwagon?????

I am really starting to hate all this "nvidybashing" Nvidea is a company who (just like all the other's) is trying to servive in a harsh and cut-throught business.

Also nvidea does an awfull lot that we as gamer's take forgranted!! take for example epic's lack off aa support for the unreal engine (obviosly a payoff from intel to make there xbox 360 graphics still seem relevant). So what do njvidea do? they code there own aa solution and implement it for free to anyone who owns an nvidea card????

Obviosly there not going to do it for ati's hardwhere and why the F$$$ should they??? they did the work??

OK firstly, why in gods name would intel make a payoff for the xbox 360? they dont have ANY part of the xbox 360. IBM make the CPU, ATI the gfx. at least get your facts right.
Quote:
Originally Posted by hrtz_Junkie
Its the same with phys x they developed/paid for the technology. why in gods name would they then port it to there competition? thats finantial suicide?????

Again what about gpgpu? nvidea does all the hard work developing it then ati suddenly realises its missing and bolt's on stream wich is inferior in every way to cuda, but no one even seems to care???????

nvidia have locked physx and cuda to their own hardware, which isnt reprehensible normally, but when they actively dont allow you to run (for example) an ATI card as the renderer with an nvidia card as a physx accelerator then that is anti-competitive.
Quote:
Originally Posted by hrtz_Junkie
If you want conspirecy's, try this one on for size,,,,

Intel Is worried about nvidea atm, they do not have anything to compete with fermi/gt200 in terms off it's tesla "blade" sytems. Nvidea are offerring to build supercomputer's at 10 times the performance at 100th off the cost (of say using I7)

Seeing as Buisnes use compromises the majority of intels customers there are right to by worried. and knowing intel as we all do I woudn't put it passed them pre emptively attaking nvidea by sideing with ati and pumping out a load of anti nvidea propaganda!!

They allready support ati by using the there gpu's in the 360, they also know that amd cant touch them in the cpu department so there no big threat at the moment,

Maybe the've said to ati keep to vga and dont go stepping on our toes in the high performance parralell architecture and whe'll help you commbat nvidea???

I know this is quite a leep of faith but it's only an idea ???? will be interesting to hear what you'll think!!!

and to all you fermi basher's Just wait till it's relleased or risk looking foolish!! dont say i didn't warn you !!

intel dont make the CPU in the xbox 360, and why the hell would intel side with ATI - ATI are owned by AMD who are their rivals in the CPU space.

you talk about tesla being a supercomputer - not really. they are very good at certain types of calculation, but you still need CPUs to do the general purpose stuff.

come back when you have filled the holes in your theory, and also learn how to spell.