bit-tech.net

BFG Tech AGEIA PhysX PPU

saeghwin 8th May 2006, 23:54 Quote
They just released the demo of Cellfactor; can anybody figure out how to play it without the card? I'm downloading it right now.
ozstrike 9th May 2006, 00:06 Quote
AFAIK it's not possible to play it without the card. No point anyway.

It looks promising, but it's disappointing that things aren't quite as realistic as they could have been - for example, the stone chips disappearing. Also disappointing is the lack of supported games as of yet.
MiNiMaL_FuSS 9th May 2006, 00:16 Quote
I actually prefer the shots on the left without PhysX!! I don't like the look of the lil nasty blocks of cr@p coming out of everything. Early days yet - maybe it'll look better in future games.
The_Pope 9th May 2006, 00:33 Quote
It definitely adds to the atmosphere somewhat - it feels more immersive to have crap flying everywhere when you're being shot at. But overall, that's not worth paying £200 for right now.

Let's hope they get some more games!
Boon 9th May 2006, 00:38 Quote
By all accounts the implementation of PhysX in GRAW is pretty poor, and the general impression left is... meh. As mentioned in the article, perhaps future games will show some further benefit, but as it is, for the price, it's destined to fail. The nVidia (and ATI) Havok-based onboard solution seems to be the best of both worlds and is quite simply the future for dedicated physics processing. A separate add-in card, unless <£50, is never going to gain mass appeal.

The fact that it actually runs noticeably slower with the current unimpressive effects doesn't bode well for future titles, where the effects may be more spectacular but then have a greater performance hit. I suppose the idea is sound, but I wonder about the processing power going free with the dual-core and quad-core chips of the future.
simosaurus 9th May 2006, 00:38 Quote
It looks crap so far, but I'm sure Unreal Engine 3.0 will make great use of it and, as the reviewer says, could make the card something special. I think the Unreal Engine is also using it for actual physics (maybe per-poly collision?), as opposed to fancy bits of crud flying everywhere.
Fozzy 9th May 2006, 01:05 Quote
Cool... the whole point of the PhysX card, to me, was to keep from having to upgrade to a new video card every 6 months. This sucks. It had better get a hell of a lot better before I consider it. And the fact that it requires SLI is stupid too. It doesn't make sense to me at all. The eye candy isn't good enough to merit the cost, and it just drives up the price of gaming rigs. WTF, I'm very mad.
FIBRE+ 9th May 2006, 01:17 Quote
Quote:
Originally Posted by MiNiMaL_FuSS
I actually prefer the shots on the left without PhysX!! I don't like the look of the lil nasty blocks of cr@p coming out of everything. Early days yet - maybe it'll look better in future games.
2nd that, UE3 will decide it for me.

US$249 = UK£134.34 - screwing us Brits over as usual :(
Zekey 9th May 2006, 02:08 Quote
No way will I spring for one of these until the price goes down, more games support it, and it doesn't require an insane graphics setup.
valium 9th May 2006, 04:16 Quote
I was kinda wondering if this is even worth $200, since SLI is supposed to have PhysX enabled.
zoot2boot 9th May 2006, 06:00 Quote
It kinda gives the impression it's not really doing anything. Given the frame rate decrease, what says it's not using that performance hit to compute more bits of crud on the CPU? Given that no one would have invested in it if that really were the case, it seems like an awfully inefficient way of going about things... why not just do away with the PhysX card and have an in-game option to turn on more physics and take the performance hit? I don't believe displaying those few extra bits of crud would make that big a difference to frame rate without the PhysX.

The AnandTech video demo clearly showed the frame rate dropping to near zero, presumably whilst the CPU was talking to the PPU, before a big explosion. That's not adding to the experience in my world.
Xiachunyi 9th May 2006, 07:55 Quote
Although I am not a gamer at all, I enjoy reading these reviews.

The only time I will likely consider purchasing one is when they become cheap and more "universal". For example, I bought a laptop with an nVidia 6600 not to play games, but rather to program on.

On this page, below the cloth simulation, I believe there is a typo: "In that context, it suceeds admirably."
LVMike 9th May 2006, 09:02 Quote
ATI and nVidia have announced Havok acceleration, but only in SLI/Crossfire, and you have to sacrifice a card so that it can run the physics while the other does the video. So it's kind of a wash.

PhysX was half-assed in GRAW because it's not the physics engine... it's just kind of layered in to provide a launch game. UT will be the decider, as will the PS3 and the move by firms like Ubi to support the technology...

I think you will not see physics cards adopted as mainstream until Microsoft steps in with their own physics API, just like DirectX did during the whole Direct3D vs Glide battle of the '90s.
Deathrow 9th May 2006, 11:57 Quote
Quote:
...however the fact doesn't throttle down...

First page - shouldn't that be "...however the fan doesn't throttle down..."?
Hepath 9th May 2006, 13:46 Quote
I'm all for advancement, but this doesn't yet cut the mustard for me. Not because it doesn't deliver on effects, but because, as pointed out, I'd be restricting my mobo choice...

Until games offer a realistic environment (such as spray and debris remaining around after the explosion) I won't be going anywhere near it. What's the point in spending effectively £200 on the PPU, then having to 'compromise' on a mobo when you're trying to spec a base system? Especially when it's just 'prettier' bang for buck, adding nothing to gameplay.
eddtox 9th May 2006, 16:08 Quote
O...k... umm... I'm not convinced. Why would I spend an extra £200+ for a performance hit? OK, so it looks (arguably) better, but as I have said before, it is getting ridiculous. After all, not everybody has £3000+ to spend on a gaming machine. I wouldn't mind if it was provided as an option (for people with more money than sense), but to make new games require it is likely to alienate many gamers. As it stands, consoles are looking more appealing by the day, as they don't require a £400+ upgrade twice a year and they don't cost £3000+. Personally, I have been a PC gamer from the very beginning - I have never owned a PS, Xbox or GC because I firmly believed that computers are much more useful. However, true as that statement may be, it seems that I may have to satisfy my gaming needs in some other way until I win the lottery (or rob a bank). IMHO

-ed out
specofdust 9th May 2006, 16:14 Quote
Well, that's what the article was telling us. It's not for everyone: there's only one decent game out, and you'd really want SLI. This isn't for people who don't have £3,000 to throw away on a PC; it's for those who do. At least for the time being.
eek 9th May 2006, 16:56 Quote
I don't get what all the surprise is about the performance hit... it was obvious this was going to happen!

When running with a PPU in GRAW there is extra debris... the PPU calculates the physics of this debris, i.e. determining how it is affected by the world and what happens to it. Once this has been calculated, the position/orientation of the debris has to be sent to the GFX card for rendering; hence the card has more stuff to render, so it's going to take a hit. This is one of the reasons, I guess, why SLI/Crossfire is a prerequisite for getting a PPU - the extra grunt is needed to keep up a decent frame rate.
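To picture the hand-off eek describes, here's a minimal C++ sketch - purely illustrative, nothing to do with AGEIA's actual SDK, and all the names are made up. The point is that even if the 'PPU' step were free, every transform it produces lands back on the graphics card as one more thing to draw.

Code:
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Debris { Vec3 pos; Vec3 vel; };

// Stand-in for the PPU: integrate every debris chunk one timestep.
void simulate(std::vector<Debris>& debris, float dt) {
    for (Debris& d : debris) {
        d.vel.y -= 9.81f * dt;                 // gravity
        d.pos.x += d.vel.x * dt;
        d.pos.y += d.vel.y * dt;
        d.pos.z += d.vel.z * dt;
        if (d.pos.y < 0.0f) {                  // bounce off the ground plane
            d.pos.y = 0.0f;
            d.vel.y *= -0.5f;                  // lose energy on each bounce
        }
    }
}

// Stand-in for the GPU: every transform handed over is one more mesh to draw.
void render(const std::vector<Debris>& debris) {
    std::printf("rendering %zu debris meshes this frame\n", debris.size());
}

int main() {
    std::vector<Debris> debris(5000, Debris{{0, 10, 0}, {1, 5, 0}});
    for (int frame = 0; frame < 3; ++frame) {
        simulate(debris, 1.0f / 60.0f);        // physics step (the PPU's job)
        render(debris);                        // draw step (the GPU's job), which grows with debris count
    }
}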
Lazarus Dark 9th May 2006, 17:20 Quote
The article forgot one option for cramped mobos, though to some it may not be pretty or ideal: PCI extenders are readily available and not too expensive. These will relocate your X-Fi, PPU or both so they can fit alongside your SLI/Crossfire. But personally, I don't see the benefit of SLI over just getting the next-gen card, aside from perhaps driving a Dell 30" at native res at 60fps - but that's another thread. I do see some benefit to a PPU if frame rates can be brought up to acceptable levels, but not as it currently stands.

And this may all be for nothing if Microsoft comes out with a DirectX for physics that isn't compatible with PhysX hardware.
Barkotron 9th May 2006, 17:51 Quote
"Compare it with the PPU-enabled screenshot on the right: dozens of realistic-looking sparks fly about, bouncing on the road and the car body itself. They fade away after a few seconds, but unlike the rock / stone debris, this is at least in line with reality."

Um, you mean, "even less in line with reality"?

http://www.intuitor.com/moviephysics/mpmain.html#flashingbullets
Vonpo 9th May 2006, 20:47 Quote
I'm sorry, but I don't see the point of AGEIA... It feels as if it's technology that's coming out too late. I would have understood if it was two years ago. Maybe I'm naive, I don't know, but it's just like Geoff said:

"CPU manufacturers: both AMD and Intel already have dual-core processors and quad-core is on their radar screens. What else would you do with four-processors-in-one than use one or more for physics calculations? After all, PlayStation 3 has seven such SPEs..."

So when these quad-cores come out for the desktop, what then? What can AGEIA give us that four cores can't? In my eyes they are facing some serious issues.

Anyways just my opinion
specofdust 9th May 2006, 21:06 Quote
Well, for a start, the more cores CPUs get, the more cores games will use. But just as importantly, a CPU core and a PPU core are entirely dissimilar. You wouldn't expect your GPU and your CPU to be the same: it would take a lot more raw power in CPU form to do work that far less GPU power could handle, because a GPU is designed for graphics and a CPU for fairly basic, super-fast maths. In just the same way, a PPU is designed for physics, so as things get more and more complicated, a physics processor is going to become more and more essential.
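The specialisation argument is easy to see in code: rigid-body updates are thousands of independent little calculations with no dependencies between bodies, which is exactly the shape of work a wide parallel chip eats. A hedged C++17 sketch - illustrative only, assuming a compiler with parallel-algorithm support (e.g. TBB on GCC), no PPU involved:

Code:
#include <algorithm>
#include <execution>
#include <vector>

struct Body { float pos[3]; float vel[3]; };

// Each body's update touches only its own data, so the loop can be spread
// across as many hardware lanes as are available - a few on a CPU, hundreds
// on a chip built specifically for this kind of maths.
void step(std::vector<Body>& bodies, float dt) {
    std::for_each(std::execution::par_unseq, bodies.begin(), bodies.end(),
                  [dt](Body& b) {
                      b.vel[1] -= 9.81f * dt;      // gravity
                      for (int i = 0; i < 3; ++i)
                          b.pos[i] += b.vel[i] * dt;
                  });
}

int main() {
    std::vector<Body> bodies(100000);              // zero-initialised bodies
    step(bodies, 1.0f / 60.0f);
}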
The_Pope 9th May 2006, 23:45 Quote
The PPU core is specially designed for accelerating physics. A CPU is by design very generalised, as specofdust says. There's no doubting the PPU will do a better job. My comment about quad-core was simply to suggest that if you throw enough raw GHz at a problem, it will go away.

So just as Havok FX on a GPU *may* produce a similar result to AGEIA PhysX on a PPU, perhaps the rumoured Microsoft physics API will do the same on a dual- or quad-core CPU. At the end of the day, it ALWAYS comes down to developer support: the greatest number of worth-playing games will determine the dominant solution.
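For what it's worth, the "spare core for physics" idea sketches out simply enough - a hypothetical C++ outline, with the engine call stubbed out since no real API is being assumed:

Code:
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

// Hypothetical physics loop, parked on its own thread (and hence, with
// luck, its own core) while the main thread gets on with rendering.
void physics_loop() {
    while (running) {
        // stepSimulation(1.0f / 60.0f);  // stand-in for whatever engine call applies
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main() {
    std::thread physics(physics_loop);     // one core simulates...
    for (int frame = 0; frame < 60; ++frame) {
        // ...while this one 'renders' 60 frames.
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    running = false;
    physics.join();
}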
m435tr0 10th May 2006, 06:44 Quote
I'm not sure if anyone else was under this impression, but I always thought of the PPU not as a way to gain extra eye candy, as has been done in GRAW. That seems like a bad move, as obviously adding more geometry to an already taxing scene will drop the frame rate.

I thought the purpose of the PPU was to enable high-quality physics - like the objects in HL2 and the ragdolls in Oblivion - on much larger scales without affecting CPU load, or hopefully while reducing it.

I feel the goal of the PPU should be to take load off the CPU, not put more on the GPU, and I hope that's how future implementations are done.
The_Pope 10th May 2006, 08:49 Quote
Quote:
Originally Posted by Barkotron
"Compare it with the PPU-enabled screenshot on the right: dozens of realistic-looking sparks fly about, bouncing on the road and the car body itself. They fade away after a few seconds, but unlike the rock / stone debris, this is at least in line with reality."

Um, you mean, "even less in line with reality"?

http://www.intuitor.com/moviephysics/mpmain.html#flashingbullets

LOL, OK, so the flashing bullet thing is NOT a reflection of reality - I will admit that much. What I meant is that IF one had a load of sparks flying everywhere (maybe from an angle grinder), then thanks to the PPU they would fly and bounce around the screen realistically, and the fact that they fade away shortly after is also what happens in real life.

Contrast that with small rocks etc. flying out of a stone wall - they look cool and bounce around, but then disappear almost instantly.

People should not pre-judge the entire AGEIA PhysX PPU project based on GRAW. The game itself has wonderfully large open levels without loading screens every 5 yards, but the clear trade-off for this tremendous *level geometry* draw distance is that regular non-physics eye candy has a bloody AWFUL draw distance.

We will show this in our game review, but it is akin to the pop-up in Oblivion - the streets & buildings in GRAW are all there, but a Coke can on the ground is only drawn two seconds before you trip over it. It is THIS engine "limitation" that may well be the cause of the PhysX eye candy fading away so quickly, not the PPU being unable to calculate the ongoing physics.

Rise of Nations will be out in a couple of weeks - I'm looking forward to seeing what difference the PPU makes in a 3D RTS. I expect loads of extra explosion debris etc., but a 3D RTS is a different animal to an FPS.