bit-gamer.net

Nvidia: Without TWIMTBP, PC gaming would be dead

Ashu Rege, director of Nvidia's DevTech team, said that PC gaming would quickly peter out if it wasn't for the work of his team.

Ashu Rege, the Director of Nvidia's developer tech team, said that without the work of his team, PC gaming would continue to decline until it eventually dies.

Rege's team supports developers as part of the company's The Way It's Meant To Be Played program and has come under continued criticism from AMD.

Clearly angered by this, he explained that there are over 50 engineers in his team who work "damn hard" around the clock with developers to make PC games better.

"If we hadn't done anything with Batman, for example, what we would have had is the continuing end of PC gaming," he said. "Let's be honest, developers love PCs and they love developing on PCs, but the reality is that the consoles are where the bread is buttered. It's where the money is made.

"If we didn't do all of this work with developers to help them improve their games on the PC side, most games would be straight ports from Xbox 360 or PS3 and with no differentiation whatsoever. Add that to the fact that PCs are, let's be honest, a little more challenging to get to grips with for the casual gamer, more and more people are just going to migrate to consoles and kill PC gaming."

He later pointed out that his team makes no money and the group's whole mission is to make PC games better. "It's an enormous investment for Nvidia, there's no doubt about that," added Tony Tamasi, head of content and technology at Nvidia. When asked, neither Tamasi nor Rege would confirm how much money was spent on DevTech each year, but we've heard that it's between 50 and 100 million dollars a year. That's a lot of money.

Discuss in the forums.

79 Comments

Elton 3rd October 2009, 04:55 Quote
Yeah...

NO.
feedayeen 3rd October 2009, 05:21 Quote
It seems like you guys are trying to alternate bad news with good news for Nvidia.

Good news - Nvidia: Without TWIMTBP, PC gaming would be dead
Bad news - Nvidia dismisses AMD's Batman accusations
Good news - High-end Fermi will ship this year
Bad news - Fermi card on stage wasn't real
Goty 3rd October 2009, 05:33 Quote
I think they mean they have over 50 engineers that work "damn hard" to make sure their games work better on their hardware than their competitor's while making it look like they're doing no such thing.
erratum1 3rd October 2009, 06:13 Quote
I do appreciate the investment Nvidia are making in PC gaming, and I hope the PC platform becomes stronger with Win 7 and DX11. You can slam Nvidia but they are investing huge amounts of money and doing things that Ati are just not bothering with. Like they said, there's nothing stopping Ati from doing the same but they can't be bothered. It makes me want to buy an Nvidia card just because they are investing in PC gaming and Ati are not. Nvidia don't have to do this, they're doing it off their own backs so the games can be better. For the developers to get this service for free it must be invaluable. I am not a fanboy of either side but those are the facts.
Evildead666 3rd October 2009, 07:30 Quote
Proprietary standards kill gaming.
2bdetermine 3rd October 2009, 08:11 Quote
For the most part blame Intel for their crappy IGP.
reflux 3rd October 2009, 08:35 Quote
Oh yeah, lots of improvements in Batman. Like the fact that even the lowest PhysX setting kills the frame rate on my 3.8ghz i7 GTX 260 GS system. Nice one Nvidia, that was really worth the effort.
bogie170 3rd October 2009, 08:40 Quote
Quote:
Originally Posted by feedayeen
It seems like you guys are trying to alternate bad news with good news for Nvidia.

Good news - Nvidia: Without TWIMTBP, PC gaming would be dead
Bad news - Nvidia dismisses AMD's Batman accusations
Good news - High-end Fermi will ship this year
Bad news - Fermi card on stage wasn't real

Good news - You can have Phys-X
Bad news - Not if you have an ATi card in your system.

Screw Nvidia, I will never buy one of their products as they are so anti-competitive.
impar 3rd October 2009, 08:55 Quote
Greetings!
Quote:
Originally Posted by Evildead666
Proprietary standards kill gaming.
Especially when they also kill performance.
andrew8200m 3rd October 2009, 08:55 Quote
I have no problems with physx, I have a gtx285 dedicated to it, however when I removed it and used just my gtx295 cards the frames dropped by about 30%. I think to a certain extent nvidia are forcing us to use a "separate physx unit" as using the gpu that is providing the graphics has horrendous effects. At least nvidia are working with developers though...


Andy
wuyanxu 3rd October 2009, 10:01 Quote
don't see any reason to accuse nVidia over the TWIMTBP program, if ATI offered the same level of service to the game developers, everyone would benefit.
rollo 3rd October 2009, 10:51 Quote
That's a big if, doubt ATI have a spare 100 million these days. To the dude with a 260, buy another dedicated to physics and it's fine
Dreaming 3rd October 2009, 11:05 Quote
I can appreciate Nvidia's sentiment, but at the same time obviously if it comes down to the wire and they have the choice between an nvidia optimisation and an ati optimisation, they will do nvidia. This could be as little as, well nvidia supports physx but ati supports opencl, and the TWIMTBP team say 'we've got 2 weeks left, we've fixed the AA, we've got FPS rates working, let's work on hammering physx in'.

I'm not saying it's wrong of them to do that - if I was a shareholder of nvidia so it was my money going to paying their wages - I'd be angry if they weren't making sure our product was better supported. However, I think it's wrong of them to say 'bawww we are a charity we only do this out of love and kindness'. Obviously it's nvidia's bottom line that matters to them.

Also agree that AMD/ATI should ramp up the competition on the service front.
Dreaming 3rd October 2009, 11:07 Quote
Quote:
Originally Posted by 2bdetermine
For the most part blame Intel for their crappy IGP.

I know, imagine where we would be if all 'games for windows' games worked out of the box with people's onboard stuff whether it's a craptop or desktop (probably not netbooks though).
Psytek 3rd October 2009, 11:29 Quote
As anti-competitive as it is, NVIDIA are probably mostly responsible for keeping PC gaming where it is. Remember when they did that deal with IW so that PC gamers could get the COD4 map pack for free.

Clearly they do want to keep PC gaming going. I'm not saying that isn't in their financial interest, but w/e the reason we shouldn't turn our noses up quite so fast.
V3ctor 3rd October 2009, 11:33 Quote
I just bought an HD5870... screw PhysX...
Why should a company (nvidia/ati) pay the developer to make better games? I think that it's in the interest of the developer to make a good game.
dreamhunk 3rd October 2009, 12:00 Quote
It's more like without PC gaming the PC industry would be dead!
Tokukachi 3rd October 2009, 12:32 Quote
Lol, this makes me laugh so hard, Nvidia are not doing this out of the kindness of their hearts, in fact TWIMTBP is an amazingly cheap marketing ploy. Having an unskippable logo at the beginning of games and on the box is worth waaaay more than $100 million.

The fact is, it's dirt cheap marketing that also has the bonus of screwing the competition, and that they can spin so that people believe they are the saints of PC gaming.
Paradigm Shifter 3rd October 2009, 13:04 Quote
Quote:
Originally Posted by dreamhunk
It's more like without PC gaming the PC industry would be dead!

This.

100%.

If it wasn't for gaming, the PC industry wouldn't be dead, but we'd all be using Intel Integrated, and anywhere that needed large amounts of crunching power would be running Silicon Graphics workstations or supercomputers, and everything would be done on CPU. Intel and AMD would be OK. nVidia, ATi, Matrox... they'd be dead. Or they'd have never taken off the way they did.

...

Sorry, I don't buy the idea that PC gaming would be dead without TWIMTBP. We might not have some of these 'OMG graphics!' games, but might I suggest that that might not be such a bad thing, if story and gameplay took the lead in games again rather than graphics-above-all...

However, the fragmentation of the market with locking out PhysX (for example) might do it unless DX11's physics implementation becomes dominant. (Which it might do, as it should work on everything, rather than just what one company wants it to...)
Rkiver 3rd October 2009, 13:25 Quote
nVidia responsible for PC Gaming. Sure........
flibblesan 3rd October 2009, 13:46 Quote
Quote:
Originally Posted by feedayeen
It seems like you guys are trying to alternate bad news with good news for Nvidia.
Good news - You can have Phys-X
Bad news - Not if you have an ATi card in your system.
<cough>

The block has already been patched so you can use a primary ATI card and a secondary NVIDIA card for Physx. Works with 190.62 and 191.03 drivers, 32-bit and 64-bit. XP, Vista, Win7.
link
Veles 3rd October 2009, 14:33 Quote
I thought they just put obnoxious logos at the start of every game.
Akkatha 3rd October 2009, 14:36 Quote
I really don't understand the hatred.... I mean why shouldn't Nvidia push the marketing on games with Physx? They offer developer support, ATI don't. Nvidia have physx tech in their cards, ATI don't. Seems to me like all they're doing is pushing the features of their cards to get more sold, which tbh is exactly what I'd do in their position, it is a business after all and that's their USP.
mi1ez 3rd October 2009, 15:03 Quote
So fed up of this kind of news. Go find a hole nvidia. a big hole.
infi 3rd October 2009, 16:32 Quote
Quote:
Originally Posted by Akkatha
I mean why shouldn't Nvidia push the marketing on games with Physx? They offer developer support, ATI don't. Nvidia have physx tech in their cards, ATI don't.

oh AMD offers developer support for sure as well
http://developer.amd.com
they just don't brawl all over the net with it.

the main problem is that nvidia is pushing proprietary standards, it's a little bit like directX, which basically locks out non-windows gaming just on a hardware level, you just don't notice it with directX because most gamers are using windows anyways, but it's basically the same situation.

if they really wanted to support developers and pc gaming in general, they should push open standards so everyone can benefit from the developments and not just the companies themselves.
Evildead666 3rd October 2009, 17:40 Quote
Quote:
Originally Posted by flibblesan
Quote:
Originally Posted by feedayeen
It seems like you guys are trying to alternate bad news with good news for Nvidia.
Good news - You can have Phys-X
Bad news - Not if you have an ATi card in your system.
<cough>

The block has already been patched so you can use a primary ATI card and a secondary NVIDIA card for Physx. Works with 190.62 and 191.03 drivers, 32-bit and 64-bit. XP, Vista, Win7.
link

Godsend.

Will plop the 8800GTS-320 back in tomorrow ;)
gavomatic57 3rd October 2009, 17:52 Quote
Quote:
Originally Posted by infi
oh AMD offers developer support for sure as well
http://developer.amd.com
they just don't brawl all over the net with it.

the main problem is that nvidia is pushing proprietary standards, it's a little bit like directX, which basically locks out non-windows gaming just on a hardware level, you just don't notice it with directX because most gamers are using windows anyways, but it's basically the same situation.

if they really wanted to support developers and pc gaming in general, they should push open standards so everyone can benefit from the developments and not just the companies themselves.

The TWIMTBP programme doesn't push any standards, it just enables developers to get the best out of Nvidia hardware and allows Nvidia to release optimised drivers for the big releases - it's all based on the same standards and APIs - with the exception of physx, which was offered, but snubbed by ATI, leaving ATI users with a bad taste in their mouth because they can't use Physx and ATI don't have their own physics middleware. Meanwhile, Physx's nearest rival Havok is owned by Intel - AMD's other big rival.

This proved to be interesting reading, despite being an Inquirer article
Clicky

Let's also not forget how much work Nvidia have put into OpenCL, oh and this Clicky
Tim S 3rd October 2009, 18:15 Quote
Quote:
Originally Posted by Dreaming
I can appreciate Nvidia's sentiment, but at the same time obviously if it comes down to the wire and they have the choice between an nvidia optimisation and an ati optimisation, they will do nvidia. This could be as little as, well nvidia supports physx but ati supports opencl, and the TWIMTBP team say 'we've got 2 weeks left, we've fixed the AA, we've got FPS rates working, let's work on hammering physx in'.

I'm not saying it's wrong of them to do that - if I was a shareholder of nvidia so it was my money going to paying their wages - I'd be angry if they weren't making sure our product was better supported. However, I think it's wrong of them to say 'bawww we are a charity we only do this out of love and kindness'. Obviously it's nvidia's bottom line that matters to them.

Also agree that AMD/ATI should ramp up the competition on the service front.

AMD doesn't have an OpenCL driver for its graphics cards yet, but we're told by AMD that it is coming (see here: http://www.bit-tech.net/news/hardware/2009/08/10/ati-stream-sdk-v2b2-gpu-support/1). Nvidia supports OpenCL as well (and has a driver on their website), but it's not the only thing they support when it comes to general purpose GPU computing.

An example of Nvidia's commitment to OpenCL is the fact that Nvidia's Neil Trevett is actually the chairman of the OpenCL working group. During the 4770/4890 launch event, one of the presenters effectively said to me that "Nvidia doesn't support OpenCL" which simply isn't true... OpenCL is AMD's only strategy for GPU computing, whereas OpenCL is only a part of Nvidia's GPU computing work.

They've got compilers for C, C++, Java, Python, Fortran, Matlab (which have extensions for parallelism in CUDA, so while they're not the same, it's a minimal amount of effort to parallelise compared to porting to OpenCL/DirectCompute) as well as the open APIs supported by both companies. The reason for the additional programming languages is that some developers (think about the HPC market here, not the consumer market necessarily) don't want to write their applications in what is effectively a graphics API.
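
To give a rough idea of what those parallel extensions look like in practice, here's a minimal CUDA C sketch (a toy example of my own, not anything from Nvidia's SDK): the kernel is ordinary C with a __global__ qualifier, and the <<<blocks, threads>>> launch syntax is the extension that spreads it across threads - the part you don't write when targeting a graphics-style API.

Code:
// Toy CUDA C example (illustrative only): scale every element of an array on the GPU.
#include <cstdio>

__global__ void scale(float *data, float factor, int n)
{
    // one thread per array element
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1024;
    float *d = 0;
    cudaMalloc((void **)&d, n * sizeof(float));   // allocate device memory
    cudaMemset(d, 0, n * sizeof(float));
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);  // launch extension: blocks x threads
    cudaDeviceSynchronize();                      // wait for the kernel to finish
    cudaFree(d);
    printf("kernel finished\n");
    return 0;
}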
thehippoz 3rd October 2009, 18:18 Quote
well making a racket about things is how they divert buyers.. they have nothing to show against this new dx11 card.. the more they can get out there, the less you're thinking about buying the 5850/5870

and claiming end of the year when your card is a bunch of wires or worse on paper still.. and 3 bill transistors, you're expecting to have good yields? just gotta remember there's a lot of people in this world- for everyone who knows what's up, there's a guy looking at a mac

I like that they are working with devs, but physx is such a gimmick.. this latest batman stunt shows what you can expect from nvidia on the physx front.. the aa issues could be game dev related, but the physx was sticking it to paying nvidia customers- I don't know how they expect to keep good faith, put out some hardware already you milking coasters :D
gavomatic57 3rd October 2009, 18:41 Quote
Quote:
Originally Posted by thehippoz
well making a racket about things is how they divert buyers.. they have nothing to show against this new dx11 card.. the more they can get out there, the less you're thinking about buying the 5850/5870

Well, the 5870 is a good card and it is powerful, but its major selling point is compatibility with games that aren't even out yet and only have tacked-on support anyway. For everyone else a GTX 285 is enough until there is some point in getting a DX11 part. Frankly Dirt 2 isn't enough of a reason to fork out another £300.

If however, you want OpenCL support, Physx, AA in Batman and knowledge that the TWIMTBP team is working with developers to make sure the games that are out or are coming soon run great on the hardware you have, then it makes sense to wait a bit longer for Nvidia's DX11 part because on paper it sounds like a better option.

Physx sticking it to Nvidia customers?? Hardly - giving me added value more than anything...besides, those who bought their card recently would have had Batman for free.
thehippoz 3rd October 2009, 19:05 Quote
added value? really.. I'm talking about people who have an ati card and bought nvidia just for physx- they couldn't even use it
gavomatic57 3rd October 2009, 19:29 Quote
Quote:
Originally Posted by thehippoz
added value? really.. I'm talking about people who have an ati card and bought nvidia just for physx- they couldn't even use it

Why should Nvidia support people whose primary card is an ATI and possibly jumped ship when their old G80 or G92 got a bit old? They are a company who want to make money. They offered Physx to ATI, hell they may have even done most of the porting to ATI's Stream themselves but ATI snubbed them. Nvidia quite rightly could have turned around and said "f*** 'em"...so they did.
Star*Dagger 3rd October 2009, 19:32 Quote
People have been crying about the end of PC gaming for well over 15 years.
PC Gaming will be around until Holodecks, and you might run the first holodecks from your PC.

Also the comment about "we spend 50 to 100 million dollars..." ANYtime someone quotes you a stat that has a range of 100% error, it is bullsh|t.

I have enjoyed nvidia cards and ati cards, I buy whichever is the best, but this statement by nvidia is moronic and disingenuous.

Yours in 5870x2 Plasma,
Star*Dagger
thehippoz 3rd October 2009, 19:48 Quote
Quote:
Originally Posted by gavomatic57
Why should Nvidia support people whose primary card is an ATI and possibly jumped ship when their old G80 or G92 got a bit old? They are a company who want to make money. They offered Physx to ATI, hell they may have even done most of the porting to ATI's Stream themselves but ATI snubbed them. Nvidia quite rightly could have turned around and said "f*** 'em"...so they did.

bah.. can't reason like that- you've paid them, it's your card

and buying a 285 over a 5850 or 5870 is silly.. ati's cards right now are faster and future proof with dx11.. all the points nvidia is trying to make about why not to buy would be valid.. if they had a card that could even compete- what gets me about them is they are all about milking and marketing

they've had all this time sitting on the 200 series, and they are just now getting the gt300 on paper after ati has a dx11 card on the market? they didn't even bother with dx10.1- and if you remember twimtbp was responsible for getting it taken out of assassins creed

without competition, nvidia is happy to keep things exactly the same.. they would love for everyone to still be back on the g80 I bet.. I'd be happy to see that 3 billion transistor part.. not some glued together front they made in the back real quick.. I mean we're not 10 years old here lol it's just all so weird, you gotta call shens
Tim S 3rd October 2009, 19:49 Quote
Quote:
Originally Posted by Star*Dagger
Also the comment about "we spend 50 to 100 million dollars..." ANYtime someone quotes you a stat that has a range of 100% error, it is bullsh|t.

I've heard several numbers from several different people and the number changes depending on who I speak to - it's not just a figure from one person - but nobody would put an actual figure on it officially. The number I've quoted is the range of figures I've been told over the past 12 (or so) months, but the numbers have been getting bigger over time. Late last year, I heard the $50 million figure, at Computex I heard "close to 100 million", this week I heard "60 to 70 million"... The best I've had on official terms is "tens of millions of dollars".

Who knows... but if it's 50 million or 100 million, it kinda doesn't matter - it's still a hell of a lot of money, whichever end of the scale we're at.
gavomatic57 3rd October 2009, 20:06 Quote
Quote:
Originally Posted by thehippoz
bah.. can't reason like that- you've paid them, it's your card

and buying a 285 over a 5850 or 5870 is silly.. ati's cards right now are faster and future proof with dx11.. all the points nvidia is trying to make about why not to buy would be valid.. if they had a card that could even compete- what gets me about them is they are all about milking and marketing

You are right, it is faster, but the 285 has been out since January...and it is still fast enough to play ANYTHING available today. Buying the 5870 works if willy-waving is your thing, but for gaming and without any DX11 games to play, it's a bit of a waste to upgrade now with GTX 380 around the corner. Hell, by the time the 380 arrives, there may even be two DX11 games to play - assuming it arrives next year.
Tim S 3rd October 2009, 20:21 Quote
Quote:
Originally Posted by gavomatic57
You are right, it is faster, but the 285 has been out since January...and it is still fast enough to play ANYTHING available today. Buying the 5870 works if willy-waving is your thing, but for gaming and without any DX11 games to play, it's a bit of a waste to upgrade now with GTX 380 around the corner. Hell, by the time the 380 arrives, there may even be two DX11 games to play - assuming it arrives next year.

If I was buying a graphics card today, it would be a 5870. The 285 is redundant...

Quote:
It generally runs rings around Nvidia's fastest single-GPU card, the GeForce GTX 285 in the scenarios we've tested here, and makes it an effectively irrelevant graphics card at its current price of around £250.


The question is a little more difficult if I was buying a graphics card around Christmas because GF100/Fermi is an unknown quantity at the moment. Nvidia says "it'll be faster" than the 5870, but how much faster and at what price point? Those are pretty important questions to answer when it comes to making a buying decision - the 280 was faster than the 4870, but it was almost twice the price at launch and the 4870 was only 15-20 per cent slower (off the top of my head). I know which card I would have bought at that time...
Phil Rhodes 3rd October 2009, 22:10 Quote
Touching assumption of the day: anyone knows what TWIMTBP stands for. I mean, fer chrissake, I had to copy and paste said abbreviation out of the article just to put it in this post.

And another thing: if they're basing their claim to greatness on the fact that they're stopping PC gaming from declining, or stopping it from becoming a series of poorly-ported console games, they're on extremely shaky ground.
theevilelephant 3rd October 2009, 23:44 Quote
Quote:
Originally Posted by Phil Rhodes
Touching assumption of the day: anyone knows what TWIMTBP stands for. I mean, fer chrissake, I had to copy and paste said abbreviation out of the article just to put it in this post..

The Way It's Meant To Be Played

It's plastered across loads of games...
Saivert 3rd October 2009, 23:48 Quote
well. cos it's so much more fun to Type out the way it's meant to be played. 'cos you know. gamers have never seen that before. everybody uses hacks to remove those splash screens from their games and they put a sticker over it on the DVD case. </sarcasm>

Anyways... people will always buy the card that is the most value for money (except blatant fanboys of course, but those are morons).
This changes from being ATI to being NVIDIA back to being ATI again all the time. So who really cares? Just be happy you get to witness graphics evolution.

You can be pissed all you want, it doesn't change a thing.
SimoomiZ 3rd October 2009, 23:52 Quote
Nvidia clearly believe this movement to consoles is now unstoppable... the battle with studios, effectively lost. Who can really blame them with PCGA members like Epic? Hence the repositioning towards compute-intensive revenue avenues. If their strategy succeeds, what are the odds on their expensive TWIMTBP support surviving, and with it PC gaming itself? Whatever the result of Nvidia's strategy, clearly the next gen consoles can't come soon enough, as PC HW is now running way ahead of PC gaming titles, which seem for the most part to be simply console ports.

I mean, the very idea of sli'ed Fermi cards seems quite ludicrous at this point in time, based on the meager PC game offerings around right now. Most of which run quite happily on (now old) sli'ed 8800gtxs. Speaking of which, an 8800gtx was around £450 at launch and seemed like a good, albeit expensive deal at the time. Looking now at the current state of PC gaming a £450+ fermi card looks like a very poor investment unless new (free) Nvidia applications and next gen games appear fast, so all that Fermi computing power can actually be used in desktop computing
HourBeforeDawn 4th October 2009, 00:08 Quote
honestly I think nVidia if anything is just hurting the market in general with how much bribing, I mean funding they do to corner markets and make it biased towards their cards, not to mention the tech hold up they cause, case in point 10.1 ~_~ that whole "the way it's meant to be played" thing needs to end, the gaming industry would be better off as they could unify the standard to work on all cards, like for example DX11 and OpenCL instead of the crap PhysX and so on.
SimoomiZ 4th October 2009, 00:19 Quote
Quote:
Originally Posted by HourBeforeDawn
honestly I think nVidia if anything is just hurting the market in general with how much bribing, I mean funding they do to corner markets and make it biased towards their cards, not to mention the tech hold up they cause, case in point 10.1 ~_~ that whole "the way it's meant to be played" thing needs to end, the gaming industry would be better off as they could unify the standard to work on all cards, like for example DX11 and OpenCL instead of the crap PhysX and so on.

Tend to agree, in its current predicament the last thing PC gaming needs is the PC gaming equivalent of the recent HD standard format war. It's really in their interests to row together or their boat is sunk.
general22 4th October 2009, 01:34 Quote
Quote:
Originally Posted by HourBeforeDawn
honestly I think nVidia if anything is just hurting the market in general with how much bribing, I mean funding they do to corner markets and make it biased towards their cards, not to mention the tech hold up they cause, case in point 10.1 ~_~ that whole "the way it's meant to be played" thing needs to end, the gaming industry would be better off as they could unify the standard to work on all cards, like for example DX11 and OpenCL instead of the crap PhysX and so on.

Disagree since ATI could set up a similar system where developers are able to test with a variety of ATI configurations and send in an engineer to work with the developers. But they don't have such a system or do not advertise its existence.
cadaveca 4th October 2009, 05:03 Quote
Oh, Ashu Rege, the Director of Nvidia's developer tech team,

I got news for you, little man. You VERY little man. How about you stop spitting lies and start telling the truth?

TWIMTBP is KILLING PC gaming, not saving it.

Proof:

Since nVidia released their first Phys-X driver, stand-alone PCI Phys-X cards stopped working...only when installed with ATI cards.

How is that "saving pc gaming"?

Perfect example of how this statement is nothing but lies:
Quote:
Tamasi later went onto say that "no game developer on the planet is going to let us do anything to a game which prevents it from running on ATI, or having a good experience. Whenever we go to do something, the first principle we apply is 'do no harm' - you never make it worse than before you went in. Ever."

Oh really, Mr Tamasi? What about PCI phys-X cards and ATI vgas, and the games released prior to the purchase of AGEIA?

You broke THOSE GAMES.

Shall I start the list?

Ghost Recon: Advanced Warfighter.

Oh yes, every single game since the first now does not even run, period, with a PCI Phys-X card and ATI cards, even though it worked up until the 7.13 Phys-X driver. If hardware Phys-X is selected, the game refuses to start. But there's a catch...

Because newer games that use Phys-X update the API, forget about trying to smooth this one over...you're busted.

I posted about the problem over a year ago...no one listens. Let's step back to August 13th, 2008, when I found the first driver you nVidia guys deliberately moved to ruin gaming:

Quote:
Originally Posted by cadaveca
Doesn't work with ATI cards and ageia Phys-X card. Seems like nV decided to forget about the pci cards....again.

This means I have issues with any game that supports Phys_X using any driver released by nV since they bought Ageia.

This app in particular gives error of nvcuda DLL, and requests Phys-X driver 80718, which I have installed, and the app confirms.


Useless.

http://www.xtremesystems.org/forums/showpost.php?p=3214282&postcount=2


Hard proof that nVidia purposely coded software to ruin the gaming experience...undeniable evidence that nVidia does the exact opposite of what they represented in this public statement.

I submitted a ticket to nVidia on the issue. Response?

lol. Guess what. It's still broken. For over a year, nvidia has known about the problem, and has purposely failed to address it. Since then, every game that features Phys-X, when you install it, breaks other games. ON PURPOSE. Only way to fix it is to re-install windows, and install no driver or game with the "TWIMTBP" logo since the first released nV phys-x driver.


Using your influence in the software market to affect sales in the hardware market is what got Microsoft fined. Guess who has lawyers prepping a case? Guess who's gonna go out of business?

You know, the only reason Intel is making graphics cards is to get rid of you guys @ nV? As soon as Larrabee is released, and AMD is not left with a monopoly when you are gone...bye, bye nVidia!


I can't wait...I've got tons of evidence here to back up AMD's lawsuit. And believe me, it's coming.

AMD just wasn't complaining publicly...that was a smoking gun.




Quote:
said that without the work of his team, PC gaming would continue to decline until it eventually dies.

lol. Liar.
ssj12 4th October 2009, 05:27 Quote
I'll stick with Nvidia GPUs. I've never had an issue with them. ATI, I always have. So truthfully as long as Nvidia makes GPUs, I'll play PC games. If they stop, well PC gaming can rot in hell.

If they feel like improving my overall experience by helping developers, I'm all for it.
cadaveca 4th October 2009, 08:36 Quote
Let me just say, buy what you want...it doesn't matter. nVidia is going to disappear, and very soon.

As much as nVidia wants to spout baloney about helping developers, why don't we bring up Eyefinity?

You do realize, that for any game to support those high resolutions, nevermind how many are working now...just by how many have the ability to select them...had everything to do with AMD working with developers? It's nice for nV to say AMD just isn't doing what they are...unfortunately for them, they don't realize just what AMD DOES, nor do they have the ability to actually appeal to the programmers who are actually doing the typing at the keyboards, making the engines.

Their strategy is with the suits, not the guys actually programming. Fact of the matter is that no one on the team for nV is actually trained for programming the future, and they are running scared.

Next-gen is in development...we are beginning to see it now...Fermi...isn't centralized on graphics, but gpgpu. Do you wonder why?

It's far too late for nV to hire the people even, because they are in short supply. Most are working on Larrabee.

Just how is nVidia helping the next-gen, anyway? Where's the future-thinking? Have you watched the Fermi presentation? Biggest joke on the market. Who's buying? If pc gaming would die without them, and the largest market is consoles, where's the next console gpu?

PS3 and 360 are stale...360 is how many years old? PS3? Both have just hit refreshes...their time is limited. nVidia has already missed the boat in that market...all they have left is PCs. Running Fermi, the gpgpu chip.

lol. 80 people just wasn't enough. Not that those nV people aren't working hard...they'll get jobs at Intel and AMD when nV goes under. Other people though...better start looking now.
s3v3n 4th October 2009, 09:05 Quote
Without PC gaming, Nvidia would be dead. Not the other way around.
Now let's assume TWIMTBP has a big effect on the PC gaming industry:

- Kill TWIMTBP
- Less new graphics pushing games
- Less people out to buy new videocards
- Less high profit margin videocards for Nvidia/Ati
- People still making PC games, just ones that better use current tech

TWIMTBP is meant to push highend gaming hardware requirement adoption, and give consumers a reason to buy new hardware.
gavomatic57 4th October 2009, 10:12 Quote
Quote:
Originally Posted by cadaveca


Their strategy is with the suits, not the guys actually programming. Fact of the matter is that no one on the team for nV is actually trained for programming the future, and they are running scared.

You do realise that Nvidia are the only one of the two GPU manufacturers that has an established GPGPU infrastructure - armies of people developing apps that work with CUDA and eventually OpenCL, meanwhile ATI/AMD are busy racing after the next thing that may come along. They did try releasing Stream but that didn't really get anywhere did it.

Nvidia were the first to bring working OpenCL drivers and development tools to market. Clicky

You also need to realise that Apple marketshare is growing and every mac on the apple store comes with a Nvidia GPU.

You also need to realise that AMD are also up against Intel and losing quite catastrophically and carrying millions of dollars of long-term debt. AMD are apparently on the brink of bankruptcy Clicky

You also need to realise that 65% of all GPU's used by Steam users are Nvidia, but only 27% of GPU's are ATI. 68% of CPU's are Intel, 31% are AMD and falling.

Eyefinity is a joke surely? There are only so many people who are going to spend £300 on a graphics card to play a dwindling number of games, but even fewer who are going to buy 6 identical monitors and stick them all together. Just buy one bigger monitor! If you miss the black lines breaking up the image you can use duct tape!

As for Larrabee...nvidia are already developing ray-tracing applications Clicky. They don't need to hire anyone because they already have the staff they need. They're far from running scared, they're actually ahead of the curve. Larrabee is Intel's pipe dream and they have yet to show anything that looks like a product. Meanwhile Nvidia can achieve the same things that Larrabee is aiming for using CUDA...with any GPU from the 8800GTS onwards.

If PC gaming stopped tomorrow, they would still have Tesla and their Quadro line.
DOA Draven 4th October 2009, 11:04 Quote
It's in Nvidia's interests to support PC games, if PC gaming dies, then so does the need for these high end gaming graphics cards, which no doubt generate considerable profit. The same is true for ATI of course. Otherwise all they are left with is the 3D professional market, and integrated GPUs which will do most desktop work for the majority of people.
Phil Rhodes 4th October 2009, 13:52 Quote
I wonder if nvidia make more money out of selling graphics cards, or more out of selling GPUs to manufacturers of other devices. The former, I would expect, on the basis that they can charge us an arm and a leg for a GPU on a circuit board that they'd have to discount heavily in any B2B transaction.
dreamhunk 4th October 2009, 14:03 Quote
after Crysis Warhead all hardware stocks went crashing, and so too did the pc market.


let's see here, AMD has 80% of its hardware in consoles and they are still going bankrupt! In fact AMD had to steal money from intel just to stay alive. Yeah, them consoles are helping hardware companies a lot, what a joke!


Both the gaming industry and hardware companies will learn the hard way who is at the top of the food chain around here!

we made you we can break you, you don't own us we own you!
ssj12 4th October 2009, 15:36 Quote
I wonder if nV can survive now they are making processors... lol
thehippoz 4th October 2009, 15:50 Quote
bet huang is still living back in the g80 days.. like I heard he was surrounded by japanese chicks 24/7 after the g80 release and he used to pop perrier-jouet and take a bath in it.. but nowadays (I heard lolol) he roid raged on the 5870 release- beat those chickens off with a bath towel

smithers came in and said you're gonna have to do something huang- he went into the rich man's stare and said- 3 billion! you heard the ntune guy say WTF out loud
tejas 4th October 2009, 16:33 Quote
Nvidia are right. PC gaming would be dead as a dodo without them.

AMD still have not made a profit since they acquired ATI. AMD is hardly a company that I trust for PC gaming's future. The future is GPGPU, as is made clear with Windows 7, Linux and Mac OS X Snow Leopard. With the Tegra, Tesla and Fermi lines, I don't think Nvidia will need the Geforce line in the coming years for revenue.

Frankly AMD can go to hell. Intel make the best CPUs and when Larrabee/Nvidia Fermi come next year Intel will have a superior platform to AMD as well.

Having said that I know that British people are jealous and don't like it when other people like Nvidia do well. Always like to support the underdog asshole with his crap products....
cadaveca 4th October 2009, 16:46 Quote
Quote:
Originally Posted by gavomatic57
You do realise that Nvidia are the only one of the two GPU manufacturers that has an established GPGPU infrastructure - armies of people developing apps that work with CUDA and eventually OpenCL, meanwhile ATI/AMD are busy racing after the next thing that may come along. They did try releasing Stream but that didn't really get anywhere did it.
Why would AMD quash their cpu lines with gpus? Are you silly?
Quote:
Originally Posted by gavomatic57
Nvidia were the first to bring working OpenCL drivers and development tools to market. Clicky
Really? Then how come Stanford was folding on ATI gpus first? nVidia is YEARS behind AMD, hardware-wise....but they got some solid programming...
Quote:
Originally Posted by gavomatic57
You also need to realise that Apple marketshare is growing and every mac on the apple store comes with a Nvidia GPU.
Larrabee will be replacing nV in Apple products. Apple would rather have a complete platform, and deal with just one hardware developer...it's so much easier, and cost effective. Intel is not far off from having exactly what apple is asking for. In fact, everything they are doing now is directly related to meeting Apple's needs. Or didn't you know that?:D
Quote:
Originally Posted by gavomatic57
You also need to realise that AMD are also up against Intel and losing quite catastrophically and carrying millions of dollars of long-term debt. AMD are apparently on the brink of bankruptcy Clicky

You don't know who is financially backing AMD then. You're not aware of what "asset-light" means...nor how companies are run. AMD is far from bankruptcy...sure, they are a bit short on cash, but guess who is taking over when nV dies? And they've been developing hardware for Microsoft, and have Microsoft's support...while nV's nose is in the air when it comes to DX advancements. Guess who provided the DX11 gpus for M$ to develop DX11 on? Who is working in the consoles? Who is busy developing for those console titles that work on the competitors hardware? LoL. You're pretty comical, you know?
Quote:
Originally Posted by gavomatic57
You also need to realise that 65% of all GPU's used by Steam users are Nvidia, but only 27% of GPU's are ATI. 68% of CPU's are Intel, 31% are AMD and falling.
So? Even Gabe himself will tell you that STEAM does not accurately depict the market...just their little segment of it...and guess who built up STEAM with buying Valve titles, and putting coupons in the box of every new DX-compliant hardware update? Why don't you ask Gabe why he thinks STEAM shows those metrics, and what those numbers really mean? Not good with statistics, are ya?
Quote:
Originally Posted by gavomatic57
Eyefinity is a joke surely? There are only so many people who are going to spend £300 on a graphics card to play a dwindling number of games, but even fewer who are going to buy 6 identical monitors and stick them all together. Just buy one bigger monitor! If you miss the black lines breaking up the image you can use duct tape!
Sure, with many monitors, and bezels, and TODAY's market, it seems foolish...however...here's AMD running the resolutions of the future...today! Working with the silicon panel makers today, to ensure that things will work in the future! Patents are filed for next-gen panels, and nV holds how many of them? again...lol..
Quote:
Originally Posted by gavomatic57
As for Larrabee...nvidia are already developing ray-tracing applications Clicky. They don't need to hire anyone because they already have the staff they need. They're far from running scared, they're actually ahead of the curve. Larrabee is Intel's pipe dream and they have yet to show anything that looks like a product. Meanwhile Nvidia can achieve the same things that Larrabee is aiming for using CUDA...with any GPU from the 8800GTS onwards.
BUT...nV is NOT getting developers working on Raytracing...how can they be ahead of the curve, when they're heading in a straight line, down the same path they've always been headed?
Quote:
Originally Posted by gavomatic57
If PC gaming stopped tomorrow, they would still have Tesla and their Quadro line.

First it was console gpus. Then motherboard chipsets. Seems to me, Tesla and Quadro don't sell enough yet to be truly viable. What's left for nVidia? Ion? Geforce? Tesla? Quadro? Too bad they are losing licensing agreements left and right...the more they lose, the less important they are. Tesla will replace Quadro...or didn't you know that? That statement alone shows that you're not even aware of what's going on...you are too busy looking at today, when the day's half over! Looks like they got one product...and nobody is really interested! And what are they working on to succeed Fermi? They barely have enough Fermi cards as it is, and had to use mock-ups this week.

If Fermi was good...ready...we would have seen a real one this week. Huang is visiting TSMC later this week, isn't he? Do you know why? What's he taking over there, that he doesn't want anyone else to see?

Seems you're a victim of their marketing. At least they've got that part right! :D
Silver51 4th October 2009, 21:40 Quote
Quote:
Originally Posted by cadaveca
Let me just say, ...

Dude, did you just raagggee quit from a Left 4 Dead VS match just to post here?
cadaveca 5th October 2009, 00:05 Quote
Quote:
Originally Posted by Silver51
Dude, did you just raagggee quit from a Left 4 Dead VS match just to post here?

Does my passion for honesty make you uncomfortable? :( I just hate liars.




I don't play pc games much any more..nV killed that fun.

:D
chizow 5th October 2009, 00:41 Quote
Quote:
Originally Posted by Tim S

They've got compilers for C, C++, Java, Python, Fortran, Matlab (which have extensions for parallelism in CUDA, so while they're not the same, it's a minimal amount of effort to parallelise compared to porting to OpenCL/DirectCompute) as well as the open APIs supported by both companies. The reason for the additional programming languages is that some developers (think about the HPC market here, not the consumer market necessarily) don't want to write their applications in what is effectively a graphics API.
Tim you might be interested in some of the recent news about Nexus shown at GTC. It's basically Nvidia's GPGPU plug-in for Visual Studio that provides an all-in-one debugger and compiler for Nvidia hardware for all of the relevant gaming APIs: CUDA C, OpenCL, DirectCompute, Direct3D, and OpenGL. It's pretty clear Nvidia's efforts and support for all things GPGPU far surpass AMD's, and Nexus is just another huge step in that direction.

http://developer.nvidia.com/object/nexus.html
http://developer.nvidia.com/object/nexus_features.html

It's amazing that AMD and their supporters somehow feel a value-add feature implemented by Nvidia for their own hardware somehow detracts from or cripples AMD hardware. It's equally laughable to think AMD should automatically benefit from others' efforts without any of their own. Is computer hardware the only industry with such a misguided sense of entitlement?

Perhaps AMD should do a better job of allocating resources and "Get In the Game" as their own program title suggests. They do have similar efforts, the only problem is they're busy implementing features no one cares about in relatively obscure titles. DX10.1 and DX11 in Battleforge, HAWX, Stormrise, STALKER..... Dirt 2 is a good title but still relatively obscure.
Tim S 5th October 2009, 02:57 Quote
chizow, yep, I'm very, very aware of Nexus.. I just haven't written a great deal about it yet. I was saving that for my Fermi piece ;)

oh, the joys of being jetlagged to buggery thanks to delayed flights!
chumbucket843 5th October 2009, 03:29 Quote
Quote:
Originally Posted by cadaveca

You know, the only reason Intel is making graphics cards is to get rid of you guys @ nV? As soon as Larrabee is released, and AMD is not left with a monopoly when you are gone...bye, bye nVidia!


I can't wait...I've got tons of evidence here to back up AMD's lawsuit. And believe me, it's coming.

AMD just wasn't complaining publicly...that was a smoking gun.



lol. Liar.
your posts are full of damn foolishness and useless nvidia hate. How did you even come up with something this ridiculous?
DarkLord7854 5th October 2009, 06:11 Quote
I love all the bashing people do to nVidia.. Lol.


I don't agree that they saved PC gaming, but you can't argue that they help games look best on nVidia hardware. Whether they do it deliberately to make it look like **** on AMD hardware or not can't be determined accurately from a code standpoint, and as everyone demonstrates clearly, it's only going to be viewed in a negative way, regardless of the code reasons why it looks better on nVidia and not AMD/ATI.

W/e.. no-one will be pleased with w/e nVidia say/do anyways when it comes to the TWIMTBP program
azrael- 5th October 2009, 07:30 Quote
Shouldn't it be TWIMTBF ("The Way It's Meant To Be Faked") these days...?
Silver51 5th October 2009, 09:02 Quote
Quote:
Originally Posted by cadaveca
Does my passion for honesty make you uncomfortable? :( ...

Not really. People have been taking sides and arguing their corner for decades.

Spectrum/Amstrad
RISC/CISC
Intel/AMD
3dfx/Matrox
Nvidia/ATI
Apple/Windows/Linux
Orange/Lemon-Lime


If you really have a burning issue with Nvidia, go ahead and contact them. Seriously. Last time I contacted them, they were (admittedly busy,) but happy to answer my questions.


On topic, it's good to see that there are some companies backing PC gaming.
Kúsař 5th October 2009, 09:26 Quote
Quote:
Add that to the fact that PCs are, let's be honest, a little more challenging to get to grips with for the casual gamer

That's something I can agree with! I believe this is in fact a very important reason why PC gaming is in decline. Big publishers are making games for casual gamers because they think it's the largest community on both - consoles and PC. But the PC is the land of hardcore gamers...


I think that nVidia is actually doing a good job helping smaller developers with their games but they're definitely not the reason PC gaming is alive. However they're helpful as long as they don't enforce PhysX. A vendor-specific API (which is on par with others) is a BAD thing and should be forgotten. Does anyone remember Glide???
[USRF]Obiwan 5th October 2009, 10:53 Quote
Quote:
Originally Posted by Silver51
Quote:
Originally Posted by cadaveca
Does my passion for honesty make you uncomfortable? :( ...

Not really. People have been taking sides and arguing their corner for decades.

Spectrum/Amstrad
RISC/CISC
Intel/AMD
3dfx/Matrox
Nvidia/ATI
Apple/Windows/Linux
Orange/Lemon-Lime

Hey do not forget Atari vs Commodore :)
tad2008 5th October 2009, 11:59 Quote
I always used to stick by ATI for the quality of their cards, performance and overall price point, then almost 2 years ago, nvidia's cards finally won me over and at the moment ATI's 58xx cards have my eye (shame about the price atm).

Yes Nvidia have made a valid and noteworthy contribution to gaming, as have any number of other graphic card manufacturers, both past and present, except Intel who should stick to what they know, cos if they really knew anything about graphics hardware, they'd have given us something better years ago.

For once, manufacturers should stop trying to fob us off with poor revamps and "whining" about the competition and instead give us true value and performance, with stable drivers and cool running hardware that doesn't require our own personal power station to run.

Personally, I would be happier seeing fewer cards released with greater jumps in performance and sensible pricing than rushing another card out the door.
cadaveca 5th October 2009, 13:01 Quote
Quote:
Originally Posted by chizow
It's amazing that AMD and their supporters somehow feel a value-add feature implemented by Nvidia for their own hardware somehow detracts from or cripples AMD hardware. It's equally laughable to think AMD should automatically benefit from others' efforts without any of their own. Is computer hardware the only industry with such a misguided sense of entitlement?
Quote:
Originally Posted by chumbucket843
your posts are full of damnfoolishness and useless nvidia hate.how did you even come up with something this ridiculous?


Personally, I'm an ATI fan. Have been for years. I have never claimed anything else.

However, an ideal situation for me would be seeing AMD's hardware meeting nVidia's programming prowess, in a way that would benefit the industry as a whole.

Why are they fighting each other for cash, screaming "It's mine!" when they could both benefit from working together? nVidia sounds like a bully, the way they present themselves.

Huang:

"Look at this bit of non-functional stuff I'm gonna make. Give me a few months, and see how my programming team will have hardware. Until then...just wait, why don't you?".

Windows 7 is coming, and are they ready with hardware for the launch? Does that sound like someone working with the rest of the industry, furthering the idea of bringing people digital entertainment?

How can they possibly claim they are supporting the pc gaming industry, when new stuff is here in a couple of weeks, and their hardware isn't ready?
chizow 5th October 2009, 17:10 Quote
Quote:
Originally Posted by Tim S
chizow, yep, I'm very, very aware of Nexus.. I just haven't written a great deal about it yet. I was saving that for my Fermi piece ;)

oh, the joys of being jetlagged to buggery thanks to delayed flights!
Heheh aye, that's quite a trek from GTC across the Atlantic, I'm guessing your footnote means you attended in person. Look forward to the write-up on Nexus. They had a page or two about it in their Fermi whitepaper but wasn't all too interesting.

I'd also be interested in your thoughts on the whole "AMD says PhysX Will Die" bit from our earlier discussion on that piece, given what we know and have seen now. I think some of the recent developments we've seen clearly show Nvidia innovating when it comes to GPU physics and in doing so, advancing PC gaming overall. I'd also say AMD's recent shift from pushing Havok as their solution to the relatively obscure Bullet Physics also shows they were being disingenuous at the time of the article and that they really have no solid answers, solution or plan for GPU physics on their hardware.

It seems to me AMD has gone fully viral with their anti-Nvidia campaign with some of the recent news bits, which of course stem from AMD blogs and interviews and not "official channels". AMD's excuses all focus on trying to deflect blame onto Nvidia for deficiencies in their own hardware and driver support. If you get another chance to speak to AMD about it, ask them why they just don't write a CUDA driver for their hardware and support PhysX natively instead of applying their free rider approach to technology. If they simply took ownership of their own hardware all of this nonsense about workarounds then goes away.....
chizow 5th October 2009, 17:15 Quote
Quote:
Originally Posted by cadaveca


Personally, I'm an ATI fan. Have been for years. I have never claimed anything else.

However, an ideal situation for me would be seeing AMD's hardware meeting nVidia's programming prowess, in a way that would benefit the industry as a whole.

Why are they fighting each other for cash, screaming "It's mine!" when they could both benefit from working together? nVidia sounds like a bully, the way they present themselves.
Hi, given some of the comments and replies you made to gavomatic57, I really have no interest in going into detailed replies with you. I'll just say free rider economics don't apply well in any capitalist industry. If this were some not-for-profit or educational research concern you might have a point but that's clearly not the case here. I'm not sure what makes you think there should be an open IP exchange in the GPU business when it doesn't apply anywhere else in business or society.
cadaveca 5th October 2009, 18:10 Quote
Quote:
Originally Posted by chizow
Hi, given some of the comments and replies you made to gavomatic57, I really have no interest in going into detailed replies with you. I'll just say free rider economics don't apply well in any capitalist industry. If this were some not-for-profit or educational research concern you might have a point but that's clearly not the case here. I'm not sure what makes you think there should be an open IP exchange in the GPU business when it doesn't apply anywhere else in business or society.

I completely understand wanting ROI. And to me DirectX IS an open IP exchange platform. I don't think nVidia should be sticking their fingers in software development, when they are a hardware company, but I guess, in the end, that's a moot point when clearly nvidia has become so focused on both.

I'm angered over having my PCI Phys-X card broken by software developed by nVidia, and having them claim they don't break things. As a business, the re-application of the nv-4x series was a beautiful masterpiece, as the ROI was very large there, but their purchase of Ageia has heavy-handedly ruined the gaming experience on one of my machines.

And had nV supported DX10 in hardware when it released, maybe it would have been more of a success. I'm very concerned their actions may result in the same for DX11. DX10 wasn't supposed to have "cap bits", and nVidia has effectively replaced that functionality on the software side, when it comes to the competitor's hardware. Are we going to see the same in DX11...? It sure looks like it, capitalism or not.


At the same time, I also think AMD should pony up, pay a licensing fee, and get nV software running on their chips. I never said AMD was perfect, but nV is over-aggressive.
gavomatic57 5th October 2009, 20:30 Quote
Quote:
Originally Posted by cadaveca


And had nV supported DX10 in hardware when it released, maybe it would have been more of a success. I'm very concerned their actions may result in the same for DX11. DX10 wasn't supposed to have "cap bits", and nVidia has effectively replaced that functionality on the software side, when it comes to the competitor's hardware. Are we going to see the same in DX11...? It sure looks like it, capitalism or not.

Funny, I distinctly remember my 8800GTS arriving before Vista was released. I remember having to wait for the drivers to be posted to their website on launch day thanks to the time difference. It arrived way before AMD's first DX10 offering, which turned out to be slower than Nvidia's 8800GTX and was more power-hungry under load.

Clicky

It's all academic anyway, AMD will be gone soon. Maybe Nvidia will buy their x86 license when they fold??
cadaveca 6th October 2009, 00:31 Quote
Quote:
Originally Posted by gavomatic57
It's all academic anyway, AMD will be gone soon. Maybe Nvidia will buy their x86 license when they fold??

That would make me happy too. nVidia programming, AMD hardware...Intel would then just pay to make them work together...though Lucid may fix that regardless. Either way...I'd like to see the guys @ AMD stay in charge. Remember that they are still recovering from Hector's foolishness.


As it is now, AMD holds nothing other than licensing and rights anyway. Well, maybe some real estate and office furnishings too...heh.


But Intel still needs Larrabee on the market first. And if Huang sold nV...he'd have an easy retirement...or does he just really trust no one?
chizow 6th October 2009, 03:41 Quote
Quote:
Originally Posted by cadaveca

I completely understand wanting ROI, and to me DirectX IS an open IP exchange platform. I don't think nVidia should be sticking their fingers into software development when they are a hardware company, but I guess, in the end, that's a moot point when clearly nVidia has become so focused on both.
I don't think you understand: if you don't buy Nvidia products, they don't owe you ANYTHING. If you buy AMD, you get what you pay for. It seems the majority of the comments from angered AMD fans are from people who can't come to grips with these basic realities. The whole situation is encapsulated perfectly by the common fallacy that Nvidia is somehow hurting consumers with their TWIMTBP and PhysX programs, when in reality these features benefit ALL of their customers, who are an overwhelming 2:1 majority by ANY metric.

Also, DirectX is not an open IP exchange platform; it's a proprietary standard stewarded by Microsoft with input from IHVs and ISVs like Nvidia, AMD, etc. They set the rules, and after that it's fair game for whoever makes the best use of the API within those rules.

Nvidia doesn't push PhysX because they wanted to get into software physics development; hell, they give it away for free. They simply saw PhysX as a great opportunity to help them sell their hardware and further their GPGPU efforts. It's obvious these potential advantages weren't lost on AMD, as Richard Huddy went through the same paces years ago; if they weren't so cash-poor you might see a big red PhysX by AMD instead:

http://www.bit-tech.net/custompc/news/601680/amd-considers-buying-ageia/page1.html
Quote:
I'm angered over having my PCI PhysX card broken by software developed by nVidia, and having them claim they don't break things. As a business, the re-application of the NV4x series was a beautiful masterpiece, as the ROI was very large there, but their purchase of Ageia has heavy-handedly ruined the gaming experience on one of my machines.
Have proof of this? Or are you just taking cues from random bits of internet misinformation? The last PPU driver release was August 2008, so I'm not quite sure how Nvidia would disable PhysX on your system if you didn't update that driver. The driver lock-out only went into effect with the 190 drivers, which aren't needed at all for the PPU to function. From the few reports I've seen, the Ageia PPU still works within the limits of its aging hardware.
Quote:
And had nV supported DX10 in hardware when it released, maybe it would have been more of a success. I'm very concerned their actions may result in the same for DX11. DX10 wasn't supposed to have "cap bits", and nVidia has effectively replaced that functionality on the software side when it comes to the competitor's hardware. Are we going to see the same in DX11...? It sure looks like it, capitalism or not.
Huh? Nvidia had the only functional DX10 hardware on the market when Vista launched, by a long shot; the R600 wasn't available in quantity until May 2007, some four months after Vista's retail general availability launch. If anything, Nvidia was punished for being first to market, as ATI's failed R600 launch allowed them to regroup and support DX10.1 almost a year later with SP1. DX10 was a failure due to Vista's poor adoption rate and market acceptance. Win 7 will be Microsoft's second shot at it, and all early indications show it won't repeat Vista's failures. As for cap bits, there's always going to be a need to query hardware, as standards and hardware capabilities are constantly evolving. These fundamentals are essential to progress. (A rough sketch of what such a capability query looks like in code follows at the end of this post.)
Quote:
At the same time, I also think AMD should pony up, pay a licensing fee, and get nV software running on their chips. I never said AMD was perfect, but nV is over-aggressive.
It's been offered to them; they not only declined, they've repeatedly downplayed and belittled PhysX to the point I'm not even sure that offer is still on the table. Instead they have these talking heads spouting nonsense about how important they consider physics and pushing different standards with no results to show for it on their hardware. At some point, AMD's customers are the ones who need to wake up, stop taking AMD's cues and realize Nvidia is not to blame; AMD just needs to step up their game with more action and less rhetoric.

http://www.bit-tech.net/news/hardware/2008/12/11/amd-exec-says-physx-will-die/1
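
For anyone unsure what the "cap bits" talk above refers to: under Direct3D 11 an application still has to ask the runtime which optional features the installed GPU supports, rather than assuming a fixed feature set. Below is a minimal, illustrative C++ sketch of such a query; the specific checks shown are assumptions picked for illustration, not anything taken from the article or the posts above.

#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Ask the runtime for a hardware device; it reports back the highest
    // feature level the installed GPU actually supports.
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL achieved = {};
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0,                      // use the default feature-level list
        D3D11_SDK_VERSION, &device, &achieved, nullptr);
    if (FAILED(hr)) return 1;

    std::printf("Achieved feature level: 0x%04x\n", achieved);

    // Optional features still have to be queried one by one -- the modern
    // successor to DX9-era "cap bits". Illustrative example: can DX10-class
    // hardware run compute shaders and raw/structured buffers under D3D11?
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &opts, sizeof(opts))))
    {
        std::printf("CS 4.x + raw/structured buffers: %s\n",
            opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x
                ? "supported" : "not supported");
    }

    device->Release();
    return 0;
}

The point is simply that capability queries like this are a normal, permanent part of graphics APIs; whether a vendor chooses to expose or withhold a given feature through its driver is a separate question from the mechanism itself.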
Elton 6th October 2009, 05:57 Quote
Holy hell, Batman. There is more partisanship here than in the Obama election. Honestly, Nvidia made a damned good investment with TWIMTBP. Nothing more, nothing less. As for PhysX, it was simply investment protection of sorts. Nothing really wrong with that anyway.

That said, I've been buying ATI for a while now, if only because they're so much cheaper.
Tim S 6th October 2009, 15:15 Quote
Elton: It's been fun to watch :)
dreamhunk 14th October 2009, 15:35 Quote
Yeah, too bad those consoles can't keep a big company like AMD alive and well. AMD was so close to going bankrupt that they needed to steal money from Intel! The way Microsoft has been treating PC gamers lately should give PC gamers even more reason to boycott AMD. The sooner AMD goes bankrupt, the faster those consoles are going to go bye-bye.
taloshz 28th October 2009, 01:25 Quote
I guess there are still people out there who blindly listen to and follow everything Intel's spin team spews out, but that is another story. What this boils down to has nothing to do with ATI not having the same so-called values Nvidia has. It is just that ATI is basically not doing something they really should not be doing, while Nvidia does exactly that and calls it "we are protecting the PC gaming industry." I have used both sides' products, so I am not a fanboy of either side. I have multiple gaming rigs at home here with both setups. Nvidia is basically spending the money not to save the PC gaming industry but as a sneaky, low way to get rid of their competition. Any way you slice it, I will use Batman as an example: that game was a TOTAL snow job. They paid the company, and then, lo and behold, when the game released it was disabling AA on ATI cards. That is BS right there. Then, when they are busted on it, they act like it was some sort of bug they never knew about. I am sorry, but that boat does not float. Both sides are well known for sneaky tactics, but Nvidia by far plays dirtier. I honestly wish ATI/AMD would start using the same tactics, but I bet Nvidia would cry so loud and so fast, and there would be a lawsuit right away, if you saw games designed on ATI start disabling AA and other things, or if Nvidia cards were suddenly cut out of physics now that Havok is going to be the norm for Windows and for quite a few games. While both have some questionable practices, it seems ATI is the more mature of the two children.
SNiiPE_DoGG 28th October 2009, 02:26 Quote
Quote:
Originally Posted by taloshz
I guess there are still people out there who blindly listen to and follow everything Intel's spin team spews out, but that is another story. What this boils down to has nothing to do with ATI not having the same so-called values Nvidia has. It is just that ATI is basically not doing something they really should not be doing, while Nvidia does exactly that and calls it "we are protecting the PC gaming industry." I have used both sides' products, so I am not a fanboy of either side. I have multiple gaming rigs at home here with both setups. Nvidia is basically spending the money not to save the PC gaming industry but as a sneaky, low way to get rid of their competition. Any way you slice it, I will use Batman as an example: that game was a TOTAL snow job. They paid the company, and then, lo and behold, when the game released it was disabling AA on ATI cards. That is BS right there. Then, when they are busted on it, they act like it was some sort of bug they never knew about. I am sorry, but that boat does not float. Both sides are well known for sneaky tactics, but Nvidia by far plays dirtier. I honestly wish ATI/AMD would start using the same tactics, but I bet Nvidia would cry so loud and so fast, and there would be a lawsuit right away, if you saw games designed on ATI start disabling AA and other things, or if Nvidia cards were suddenly cut out of physics now that Havok is going to be the norm for Windows and for quite a few games. While both have some questionable practices, it seems ATI is the more mature of the two children.

Right on mate!

If you think about it, if ATI started behaving like Nvidia does, then Nvidia would be more stringent about disabling AA and it would be a vicious, terrible cycle. Does anyone want a world where half the games have AA on ATI hardware and half have AA on Nvidia hardware? No, no one wants that, because we would all have to own two video cards or rely on some diligent hackers in the community.