bit-tech.net

Ageia outlines plans for PhysX


Ageia's CEO claims that PhysX can accelerate calculations in scientific, engineering and financial applications.

Ageia CEO Manju Hegde has revealed some of the company's plans for its PhysX physics processing unit in a recent interview published on Chile Hardware.

Hegde explained that the company has been working with a number of game engines, and with triple-A titles based on them, and said that several titles with PPU support are coming this year. The two titles he referred to specifically were Cell Factor: Revolution and Warmonger: Operation Downtown Destruction.

The first is probably a familiar name, since we looked at a demo version of the game in our PhysX coverage last year. The latter is an apocalyptic MMO FPS based on Unreal Engine 3 - an engine that is said to make heavy use of Ageia's PhysX technology.

He claims that current games use only a small portion of the PhysX card's power, and that titles like Cell Factor: Revolution and Warmonger: Operation Downtown Destruction will put it to better use.

Hegde also revealed that PhysX is able to accelerate other calculations in scientific, engineering and financial applications. He added that Ageia is working with partners so that PhysX can be used to accelerate their applications, and that the company is also working on an SDK to make this process easier for its partners. You can read more here.
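The interview doesn't name specific applications, but the common trait of the scientific, engineering and financial workloads mentioned is that they are embarrassingly parallel. As a rough illustration (plain Python; this function is our own sketch, not anything from Ageia's SDK), a Monte Carlo option pricer shows the shape of such a workload: every simulated path is independent of every other, so thousands of them could in principle run side by side on wide parallel hardware.

```python
import math
import random

def monte_carlo_call_price(spot, strike, rate, vol, years, n_paths, seed=42):
    """Price a European call by averaging the payoff over many simulated
    price paths. Each path is independent of all the others, which is
    what makes this kind of job a natural fit for wide parallel hardware
    rather than one or two CPU cores."""
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * years
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # one random draw per independent path
        final_price = spot * math.exp(drift + vol * math.sqrt(years) * z)
        payoff_sum += max(final_price - strike, 0.0)
    # Discount the average payoff back to today.
    return math.exp(-rate * years) * payoff_sum / n_paths

price = monte_carlo_call_price(100.0, 100.0, 0.05, 0.2, 1.0, 50_000)
# With these inputs the estimate lands near the Black-Scholes value of ~10.45
```

The inner loop here runs sequentially, of course; the point is only that nothing in one iteration depends on any other, which is exactly the property a parallel accelerator exploits.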

Discuss in the forums

30 Comments

DougEdey 22nd February 2007, 15:38 Quote
Still no use for them IMHO.

Taken from Ageia's hidden plans:
Quote:

Plan for PhysX:
Sell lots of units
Make lots of money

That is all
r4tch3t 22nd February 2007, 15:44 Quote
I think that this is a better option than using a GFX card to do it. Just think how much power 8800GTX SLI uses, not to mention R600 (which is rumored to be even more power hungry). And they want us to use a third card to do physics? I would prefer to go the PhysX or CPU route.
TheSaladMan 22nd February 2007, 15:52 Quote
I think the entire concept of a PPU came too late. With multi-core processors becoming more common these days, there's just gonna be no point in having a dedicated PPU to do your physics when you'll be able to dedicate one or two of your CPU cores to it sooner or later.
yakyb 22nd February 2007, 15:57 Quote
The only saving grace for these will be Unreal Tournament 3. If that does indeed use the PPU then I could see this working; if not, it will fail.
Omnituens 22nd February 2007, 16:33 Quote
waste of a mobo slot tbh. already hard to come by.
randosome 22nd February 2007, 16:42 Quote
Now that they're opening it up a bit, an SDK may finally allow this to take off.

ATM there just is not enough support for PPUs, and that is what will kill it. Hopefully they will succeed eventually; if they increase support and drop prices I can see this working.
Tyinsar 22nd February 2007, 18:08 Quote
I thought this was dead already. Plus as others said, yes this is better than using the GPU but do multi-core systems really see any benefit?
Nikumba 22nd February 2007, 19:12 Quote
Great product but no real DX support. If they had MS include their API in DirectX they would have a much larger customer base IMHO.

Kimbie
atanum141 22nd February 2007, 19:28 Quote
Quote:
Originally Posted by r4tch3t
I think that this is a better option than using a GFX card to do it. Just think how much power 8800GTX SLI uses, not to mention R600 (which is rumored to be even more power hungry). And they want us to use a third card to do physics? I would prefer to go the PhysX or CPU route.
Na mate, the GFX card option is better as there would be a lot less code and drivers added into the equation; also, similar hardware always works best together most of the time.

I prefer the idea of having two 8800GTXs in SLI and using an old GeForce4 to do the physics calculations; same goes for the ATi camp. Just using an old card would be enough power for any of the physics calculations. That means more money for the firms, but not at the rate Ageia is asking for its POS.

Funny thing is that there's a thread I made last year about the Ageia demo, and even without the card it would run the demo. I was on SktA last year and I had minimal overheads when running in software mode. I believe Bit-Tech or someone did a test on games that used the card, like GRAW, and showed no improvement or extra "wow" factor other than the extra cost to the end user.
Baz 22nd February 2007, 21:03 Quote
The rise of dual and quad-core CPUs has killed PhysX where it stood.

Why spend £100+ on a PhysX card when you have a 2nd/3rd/4th CPU core sat idle?

Company of Heroes, for instance, uses the 2nd core to handle its physics on multi-core machines.
Kipman725 22nd February 2007, 21:38 Quote
Erm, seen the physics in Alan Wake? That's using a quad-core CPU, not a physics card or special GPU drivers. The whole physics card idea, and using graphics cards for physics (at least in games), is dead with the rise of multi-core CPUs.
EQC 22nd February 2007, 22:20 Quote
For those of you who think an extra core on a CPU can do the physics that a PPU can do: you're very likely wrong. You can successfully argue that the physics YOU are looking for in games could be done on the CPU... but the PPU from Ageia is (according to the article I link below) capable of more than 10x the physics of a dual-core CPU.

An extra CPU core is still a general-purpose piece of hardware that has tons of extra silicon that does very little to help physics calculations. A PPU is designed for physics the way a GPU is designed for graphics...think about that...how much graphics can your computer do if you take out the graphics card and force it to use the CPU? Even if your CPU has 4 cores and runs at 3GHz (almost 6 times faster than a high-end GPU), it still can't compete with a simple integrated GPU when doing graphics.

For more information, I've quoted one of my own previous posts (which referenced a previous post I made) here:
Quote:
Originally Posted by EQC

for a little more info on what the physx chip can do relative to a high end cpu, you might check out this post I made a little while ago:

http://forums.bit-tech.net/showthread.php?p=1322716#post1322716

I have no 1st hand knowledge...but my post mentions a toms-hardware article:

http://www.tomshardware.com/2006/07/19/is_ageias_physx_failing/

...toms-hardware seems to show that using the physx chip for certain very intensive physics calculations (fabric and fluids) is many times better than using a dual core AMD Athlon FX-60. So, in that sense, even with a quad-core CPU, there might be some advantages to using a PPU because it's designed specifically for physics, while the CPU is designed more generally.

In a way, I think the PPU is good for physics the way a GPU is good for graphics -- it's custom built hardware to do 1 thing very well. You still can't run high-end games without a decent GPU, no matter how good your CPU is...I suspect that if games start including more physics, it could end up the same way for a PPU.

But, with multi-core cpus, there may eventually be the option of actually making one of those cores be a PPU/GPU (ie: not a general core, but a physics/graphics-only core)...in that case, you'd still actually have a PPU in there and get all its advantages, but it would be more like Intel/AMD using Ageia's structure as a component inside one of their CPU packages.

Additionally, as this bit-tech article mentions, there could be applications for a PPU in scientific physics-type computations -- for those of us interested in the physics, and not the gaming, a PPU is a better choice than a GPU for certain. And, if Tom's hardware is correct, then a PPU is much much better than a run-of-the-mill multi-core CPU by itself.
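To make the point about cloth concrete, here's a toy sketch (plain Python, illustrative only; not Ageia's API or any shipping physics engine) of why cloth-style simulation suits wide parallel hardware: within one time step, every spring force and every particle update is independent of all the others, so a chip with many simple parallel units can process them simultaneously, while a CPU core must walk through them one by one.

```python
# Toy 1-D "cloth": a chain of point masses joined by springs, with
# the first mass pinned in place.

def step_chain(positions, velocities, rest_len=1.0, k=50.0, dt=0.01, damping=0.98):
    n = len(positions)
    forces = [0.0] * n
    # Spring forces between neighbours: each spring is an independent unit of work.
    for i in range(n - 1):
        stretch = (positions[i + 1] - positions[i]) - rest_len
        f = k * stretch
        forces[i] += f
        forces[i + 1] -= f
    # Semi-implicit Euler integration: each particle is independent work too.
    for i in range(1, n):  # particle 0 stays pinned at the origin
        velocities[i] = (velocities[i] + forces[i] * dt) * damping
        positions[i] += velocities[i] * dt
    return positions, velocities

# Start the chain stretched to 1.5x its rest spacing and let it relax.
pos = [1.5 * i for i in range(5)]
vel = [0.0] * 5
for _ in range(2000):
    pos, vel = step_chain(pos, vel)
# The chain settles toward even 1.0 spacing: pos ≈ [0, 1, 2, 3, 4]
```

A real cloth sheet is a 2-D grid with tens of thousands of springs per frame, but the structure is the same: lots of identical, independent little calculations, which is the workload shape a PPU (or GPU) is built around.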
randosome 22nd February 2007, 23:08 Quote
EQC hit the nail on the head

ATM games could use another core for physics, because they don't have the level of physics that a PPU can handle. Once you start moving into the PPU realm the physics in the game rises sharply, and a CPU just can't handle it.
Whether we'll ever get games that make real use of a PPU, though - I have no idea.
Tyinsar 23rd February 2007, 00:07 Quote
The problem is that superiority does not automatically mean success <insert VHS / Beta or such example>. Without a decent market-share and game / app support <insert chicken / egg quip> this will die / is dying. I think it would be great if this took off and games got way more realistic and this became a common and reasonably inexpensive part of every game computer - but until then I'm putting my money back in my pocket.
DXR_13KE 23rd February 2007, 00:27 Quote
wasn't this already dead?
Aankhen 23rd February 2007, 04:11 Quote
Quote:
Originally Posted by atanum141
Funny thing is that there's a thread I made last year about the Ageia demo, and even without the card it would run the demo. I was on SktA last year and I had minimal overheads when running in software mode. I believe Bit-Tech or someone did a test on games that used the card, like GRAW, and showed no improvement or extra "wow" factor other than the extra cost to the end user.
Funny thing is that Tom's Hardware did an article (perhaps other sites did too, TH is the one I remember offhand) where even they had to grudgingly admit that a CPU can't handle the same sort of physics that a PhysX can. For example, in software mode, liquids are completely ignored; they aren't even rendered. In addition, as soon as you see a hint of proper cloth physics, the game comes to a grinding halt.

PhysX is overpriced, and there aren't many reasons to buy it yet. But I don't believe a GPU or CPU can suffice in its place.
DougEdey 23rd February 2007, 06:55 Quote
Sorry, but I've seen physics models produced that run at a few hundred FPS on a single core machine. These are engines produced by students and liquids and cloths render correctly.
Cthippo 23rd February 2007, 07:29 Quote
I still believe that the PPU is the correct answer, but conversely I think Ageia arrived too early with a product the market is not yet ready for. I've long been a believer in discrete cards for separate tasks, on the principle that something designed to do just one thing does that thing better than something designed to do many things. Perhaps if they had done a better job on the demo and produced something that truly stressed the PPU and would bring a conventional computer to its knees, they might have a better chance.
DougEdey 23rd February 2007, 07:33 Quote
But at £140 you're reaching 8800GTS territory (only £35 more) and I'm sure that the GTS could do the job better.
r4tch3t 23rd February 2007, 07:36 Quote
Quote:
Originally Posted by DougEdey
But at £140 you're reaching 8800GTS territory (only £35 more) and I'm sure that the GTS could do the job better.
It might be able to get close to the PPU, but at what TCO? I am sure that the PPU uses less power than an 8800GTS.
It really is a good idea, just a bit too pricey and not enough support yet.
randosome 23rd February 2007, 13:35 Quote
Quote:
Originally Posted by DougEdey
But at £140 you're reaching 8800GTS territory (only £35 more) and I'm sure that the GTS could do the job better.
The 8800GTS still probably can't handle the same level of physics as a physics card.
GFX cards just aren't designed to do physics calculations. Because they are quite parallel they do a better job than a CPU, but they still can't compare to a PPU.
DougEdey 23rd February 2007, 13:40 Quote
randosome: We won't know that for sure until the drivers come out to enable it. At the moment, if you have 2x 8800GTX you'll be unlikely to use even the full power of one, so why not use the second to do some PPU-style work?

F@H has a program designed to run on the GPU, and I have a friend who's trying to benchmark its performance relative to the CPU.
Tim S 23rd February 2007, 13:50 Quote
Quote:
Originally Posted by randosome
GFX cards just aren't designed to do physics calculations. Because they are quite parallel they do a better job than a CPU, but they still can't compare to a PPU.
G80 is actually incredibly good at doing any massively parallel calculation providing it's optimised for the architecture, as is the case with a PhysX card.
randosome 23rd February 2007, 14:09 Quote
Quote:
Originally Posted by Tim S
G80 is actually incredibly good at doing any massively parallel calculation providing it's optimised for the architecture, as is the case with a PhysX card.
Well, if we ever see good adoption of physics calculations we may one day be able to judge the performance.
Although the hardware is likely to be only as good as the software, and I would like to believe that programming for a card designed to do physics is easier than for one designed for graphics.

BTW, has nVidia actually released any GPGPU software yet, or are they still developing it?
sinizterguy 23rd February 2007, 15:07 Quote
Plans should include
- Stop ripping people off with unwanted crap
- Shutdown company before bankruptcy
Paradigm Shifter 23rd February 2007, 19:16 Quote
Quote:
Originally Posted by DXR_13KE
wasn't this already dead?
That was what I was thinking.

As far as I was aware, the only two things that actually seem to use the PhysX are the Cell Factor tech demo (comes with the card) and Ghost Recon: Advanced Warfighter (also tends to come with the card...)

Of course, PhysX might actually become useful when Unreal Tournament 3 (2007... whatever...) comes out. But I doubt it...
EQC 23rd February 2007, 19:46 Quote
Are you guys seriously comparing the PhysX PPU to nVidia's 8800's? Man...you may or may not be right about the 8800 being able to do more physics than the PPU, but please acknowledge some of the major problems with the comparison:

First of all, look at the heat sinks and think about them. With a quick google search, the PPU eats roughly 20-30 Watts. 8800? Neighborhood of 150-200 Watts. You could run 5 PPU's for the same power budget...or overclock the hell out of the existing one. The 8800 is running at the bleeding limits of what a giant heatsink/fan/heatpipe system can deal with.

Second: I'm not sure how VAT works, but here in the US, Newegg now sells the PPU card for $159. The cheapest 8800, with 320MB, sells for $279 after rebate. PhysX is selling for a good chunk cheaper... if taxes make the prices closer, I don't think Ageia needs to be blamed for overpricing their hardware.

Third: PhysX is new...so it can't be mass-produced at the same level (or with the same company resources) as an nVidia/ATI GPU. Just comparing the look of the card, I'm guessing the current PPU would be $40 if researched/produced by a bigger company that was using the weight of past research dollars (instead of all of their research so far being invested in one product). The card really looks like an nVidia 5000 series card. It'll be cheap one day, if it gets off the ground....

Fourth: if you're using the graphics card to do physics, you're wasting silicon. A GPU has silicon that is capable of physics, yes...but it's got other silicon and circuits that are suited for graphics exclusively. Why pay for that if you're not going to use it?

Fifth: A GPU is not designed to talk back to the CPU. 90% of the information flow is toward the GPU (not back to the CPU after processing by the GPU). As things stand now, a GPU is good for "effects physics" (yes, that is changing in the future...). This means "eye candy" that doesn't affect gameplay. A PPU is designed to do useful physics...

Sixth: this is the first product put out by Ageia...how many generations has nVidia been working on graphics cards? Do you really think a first-generation company like Ageia could get away with selling a card that costs $500 and eats 200 Watts? They're starting at 20 Watts to show you exactly what such a "simple" level of PPU design can do. If they're successful in getting people to adopt that, think about what they'd be capable of in the 100-Watt power budget.

At the very least, you people should be arguing that Ageia should license their ideas to another company (like Nvidia) that has better mass-production capabilities. Maybe you should argue that Ageia chips should be added to an 8800 board. Maybe a PPU should actually be one of the cores on an intel processor (an actual PPU, not just a regular core dedicated to doing physics tasks). But man, please don't suggest that it's a good idea to make a high end graphics card, with all that graphics-exclusive silicon, do only physics.
DXR_13KE 24th February 2007, 02:25 Quote
if only they had made it more open......
metarinka 24th February 2007, 08:08 Quote
It seems to me that, at that power usage and such, it wouldn't be that far of a stretch to say that future graphics cards will allot some extra silicone specifically FOR the task of physics rendering, if it is that specialized and/or different from general computing or graphics processing. I agree that a specialized standalone chip will always deliver a significant advantage over general processors or GPUs, but I don't think it's that wild for an already specialized processor like a GPU to incorporate some chip space for physics processing.
I think that's the clincher right there: no one wants to have to shell out for yet another add-on card, and unfortunately in physics the results right now aren't as tangible - you won't see more pixels.
Aankhen 24th February 2007, 10:19 Quote
Quote:
Originally Posted by metarinka
it seems to me at that power usage and such, would it be that far of a stretch to say in the future graphics cards would alot some extra silicone specifically FOR the task of physics rendering if it was that specialized and or different then general computing or graphics processing
Yes, it would be a stretch. Graphics cards don't need larger chests. :p