bit-tech.net

Nvidia: Remember who got you where you are today

Posted on 12th Jan 2010 at 13:08 by Richard Swinburne with 43 comments

At its main press conference at CES this year, Nvidia set in stone where it's heading in the future: towards becoming a consumer technology design house, rather than a "geeky" PC component design company.

Literally one minute was devoted to mentioning that its upcoming Fermi products are "in production", but Nvidia didn't go so far as to confirm an actual arrival date. There was a working Fermi card shown off on its stand running the DirectX 11 Unigine engine, but Nvidia's biggest and most complex chip to date didn't even make the main stage.

With the entire presentation dedicated to its Tegra 2 products, this stands in stark contrast to the original Tegra launch at Computex 2008, where the announcement was shoved into the middle of a mid-week afternoon - hardly prime time.

After two quarters without volume production of anything above the GeForce GTS 250 - sorry, 9800 GTX+ - we can see exactly how loyal Nvidia-exclusive partners are coping in an already weak economy: either they aren't, or they're defecting to the competition and making Intel motherboards, EVGA and Zotac style.

Clearly the business message is "move with Nvidia's whims or you'll be left in the cold", and we don't think "loyal friend" even features in Nvidia's vocabulary. Other reports from Nvidia's press conference stated that it was a "strong showing from Nvidia and its partners". Its new, and profitable, partners, yes.

I don't doubt that I'd love an Audi equipped with Tegra 2, and I'm interested in the upcoming tablet PCs, but the closest thing we, its oldest fans and long-time supporters - you know, the little guys - got to a new PC product was the GT 240: a 40nm, DirectX 10.1, GDDR5-enabled 9600 GT that had a quieter release than a mouse with its butt sewn up farting.

For a company that's juggling ARM CPUs, workstation products, the odd lol-worthy chipset and 3D-everything, Nvidia seems incapable of doing two things at once within any one of those product markets. For example, we've given up waiting for the 40nm shrink and cost-down of the GT200 architecture for the mid-range.

I will certainly acknowledge that Nvidia has moved at the right time with its ARM investment and massively successful marketing machine. With the explosion of smartphones and the impending bomb drop of tablet PCs, it is well placed to bring us new products in many segments that previously may have been uninteresting.

So the future looks bright for Nvidia, but I doubt it will be long until we're not invited to most of its press events and launches, and you'll start to see the company favouring more mainstream tech publications and national media. Its once-dominant consumer graphics line has already become a sideline to mobile and workstation computing: will the company only do "enough" to keep up in the race with ATI, and how long until even that is no longer financially viable? How long until the other dollars overflow the investors' pockets, and the strain on long-standing techies who have devoted entire careers to gaming and performance graphics stretches the company internally? I hope those engineers remind the big wigs who paid for those Ferraris.

43 Comments

NikoBellic 12th January 2010, 13:33 Quote
Well... Good bye Nvidia :'-(

I do think that Tegra is gonna be massive of course, but Nvidia are forgetting the very successful graphics business, and that's not gonna do them any good. After all, they have many fanboys who only buy Nvidia GPUs, but if they're forced to buy ATI, then they'll slowly stop being such Nvidia fanboys, and that'll lose them a lot of their free marketing....
Tim S 12th January 2010, 13:36 Quote
Quote:
Originally Posted by NikoBellic
its a shame that Nvidia have slowly become so bad since jen sun huang took leadership.
Jen-Hsun has always been CEO.
cjoyce1980 12th January 2010, 13:40 Quote
Quote:
Originally Posted by Tim S
Quote:
Originally Posted by NikoBellic
its a shame that Nvidia have slowly become so bad since jen sun huang took leadership.
Jen-Hsun has always been CEO.

so that's a lot of bad then
salesman 12th January 2010, 14:09 Quote
I never thought the GPU war would come to something like this. It would almost be like Intel deciding to branch away from making CPUs. I would really be surprised if Nvidia really did let AMD run away with the performance crown for GPUs.
Skiddywinks 12th January 2010, 14:12 Quote
That is the longest sentence I have seen in a while.
NikoBellic 12th January 2010, 14:21 Quote
Quote:
Originally Posted by Tim S
Jen-Hsun has always been CEO.

Ahhh... my bad, I thought I read somewhere that Jen-Hsun became CEO in 2005 :?
GamingHobo 12th January 2010, 14:23 Quote
To be fair, CES is a consumer electronics show. Stands to reason it would want to talk about its consumer tech product...
Centy-face 12th January 2010, 14:26 Quote
Well, I have always bought Nvidia after a bad experience with an ATI card years ago, but really I'm no fanboy - I go where the power is, which is why I went to Intel when the Core 2s came out. If Nvidia go with the mass-market mainstream stuff then that will be good for them and anyone who uses Tegra stuff, but I for one would have no problem going ATI. However, a completely monopolistic GPU arena will mean ATI can make any old basic improvement and slap £400 on it, and what choice will we have?
kenco_uk 12th January 2010, 14:26 Quote
There's no way on earth the product can be dropped just like that. To do such a thing would be akin to dividing zero by fail.
andrew8200m 12th January 2010, 14:27 Quote
Quote:
Originally Posted by GamingHobo
To be fair, CES is a consumer electronics show. Stands to reason it would want to talk about its consumer tech product...


Exactly! And a point that possibly should have been addressed. There are other shows and plenty of other means to demonstrate new GPUs such as the mythical Fermi, so I wouldn't get too down about it. It took ATI a year and a bit to release anything that could tackle the 8800 GTS, never mind the 8800 GTX. It'll all arrive soon. With every company there is a yo-yo effect, and this time ATI happens to be at the top with their legs dangling.

Andy
andrew8200m 12th January 2010, 14:29 Quote
It won't come to the point of becoming a monopoly, as there are laws against such things. ATI will have to play fair if Nvidia did fall from the sky.

Andy
Rkiver 12th January 2010, 14:29 Quote
If nVidia does indeed pull out of PC components I for one will be sorry. While currently ATI is the superior hardware (and has been for a while), the fact there was competition was good for us, driving development forward. I hope they realise their mistake and perhaps get back into GPUs in a big way once their revenue picks up.
NikoBellic 12th January 2010, 14:30 Quote
Quote:
Originally Posted by Rkiver
If nVidia does indeed pull out of PC components I for one will be sorry. While currently ATI is the superior hardware (and has been for a while), the fact there was competition was good for us, driving development forward. I hope they realise their mistake and perhaps get back into GPUs in a big way once their revenue picks up.

+1
SNiiPE_DoGG 12th January 2010, 14:36 Quote
Nvidia has a lot of stockpiled rep with the average Joe computer builder/buyer who doesn't even approach the level of understanding we do on these tech forums, so they can milk that for a while.

Personally not being very favorable towards Nvidia, I must say my Zune HD with Tegra can really browse a web page fast. I don't doubt they have a great processor going for them there. But in all reality - when has Nvidia ever done anything that did not have the highest possible fiscal reward for them? They are a business and as such they are perfectly able to see that the performance-hungry games of former days are no longer existent on the PC. Heck, they clearly have known for a long time that their GPU from 2005 can still run pretty much every game out there at the average Joe 1280x1024 resolution.

So while we may want Nvidia to keep pushing the envelope and dumping millions into R&D in an epic battle with AMD, that's just not going to happen. Fiscally it makes no sense.

On the other hand, our culture is seeing the rise of the casual gamer, and they are a powerful force - people, and I say that classifying the vast majority, want to play Farmville (:(). Nvidia knows this; one needs only to look at the profits made by Zynga on Facebook apps this year to know where they can make their money.

Another huge indicator of where the market has shifted is the PSP and Wii. On these platforms (missing citation), 45% of the gamers who play more than 1 hour a week on these two consoles are female. That's huge: where you once had a vastly male population of gamers, you've now given yourself an outlet to the other half of the people in the world, the girls.


In conclusion: as much as we would like Nvidia to stick around and play the ponies in the high-end GPU market, they are going for the mutual fund and the emerging trends that will make them the money they so greatly lust for.
CardJoe 12th January 2010, 15:14 Quote
Quote:
Originally Posted by GamingHobo
To be fair, CES is a consumer electronics show. Stands to reason it would want to talk about its consumer tech product...
thehippoz 12th January 2010, 15:15 Quote
sooo true sniipe
MacWalka 12th January 2010, 15:44 Quote
Quote:
Originally Posted by SNiiPE_DoGG
snip

I was about to write something very similar but nowhere near as detailed.

I wouldn't put myself into the hardcore hardcore gamer envelope like a lot of people here. But I would say I'm a lot more clued up than the majority of people who play games these days.

I can see that it makes no sense for them to shove a load into R&D for new GPUs for PCs just now. There just isn't the market for it with very few PC games pushing the boundaries now and it looks like there won't be a new generation of consoles for a few years for them to try and get some product into either.

The casual gamer is where it's at for making money just now. Unfortunately I think this means that AMD/ATI won't be pushing as hard to get cutting-edge tech on the table either. However, it may mean the end of "The way it's meant to be played", with games and GPUs being more universal.
cybergenics 12th January 2010, 16:24 Quote
One thing I don't get about Nvidia: most Macs (not the Mac Pro and some iMacs) use Nvidia chipsets and GPUs, so if Apple are as popular as Jobs and the fanboys like to make out, why aren't they making a killing on this platform?
DarthBeavis 12th January 2010, 16:27 Quote
I have a little different view on this as I have been allowed on the inside. They are not turning away from the GPU market; they are diversifying and leveraging technology from their other market segments to differentiate themselves. Do some research into their other technologies like Tesla ;)
Next, do some research into GPU sales to see where most sales occur. The casual gamer is not even the demographic where the dollars go. What they are doing is trying to add value to the gaming market by adding depth and breadth to the gaming experience with robust features such as 3d Stereo and improved physics. I realize some people say 3d is a gimmick - and then they admit they have not even tried Nvidia's 3d when I question them deeper. I demo 3d Stereo to many people at many events (PAX, where I worked the Nvidia booth, and many LANs where I bring my own setup, including a 60" DLP that works with their 3d). People love it. Period. Heck, at Intel I had tons of Intel engineers with their mouths agape.
Bindibadgi 12th January 2010, 16:35 Quote
Quote:
Originally Posted by DarthBeavis
I have a little different view on this as I have been allowed on the inside. They are not turning away from the GPU market; they are diversifying and leveraging technology from their other market segments to differentiate themselves. Do some research into their other technologies like Tesla ;)

Where do you think every GT200b in the last few months has gone? Tesla cards. Or, so I hear from a few industry peeps.

Nvidia Stereoscopic 3D is a gimmick. It encourages people to use poor TN panels with poor quality and gets people onboard with the novelty effect. You can't game for hours in it; it gives everyone that has tried it in the office a headache. There is a Stereoscopic 3D kit sitting in the lab and no one ever uses it :p

I'm not saying 3D as a general rule of thumb is arse - people love Avatar, for example - it's just that I strongly don't believe Nvidia's locked-down, proprietary solution is right for our "open" PC market.

AMD's lazier approach is not the right answer either, but whatever happened to interoperability?

SNiiPE_DoGG: I hear you and I agree, but what I'm saying is that the change is unceremonious and feels somewhat underhanded, which is a shame. I feel sorry for the companies that have grown up around Nvidia to support them.
SNiiPE_DoGG 12th January 2010, 16:39 Quote
I don't agree with that at all (DB); the money is not in the high-end GPU market anymore. That's a fact.

I meant there are two clear paths here:

TEGRA: spend a lot on R&D --> manufacture for relatively small cost --> sell in unit volume of tens of millions for good profit.

Highend consumer GPU: Spend a lot on R&D --> Manufacture for high cost --> Sell in unit volume less than 5 million for slim profit margin.

It's pretty clear from this simple comparison that the money is not in high-end VGA at all anymore. Not to mention the points I listed in my previous post.

EDIT: Bindi - yes, I know, it really is a shame that they hang them out to dry like that :\ especially after so many years of passing all of the CS and RMA work off to them to handle the idiot masses
Bindibadgi 12th January 2010, 16:45 Quote
Quote:
Originally Posted by SNiiPE_DoGG
Highend consumer GPU: Spend a lot on R&D --> Manufacture for high cost --> Sell in unit volume less than 5 million for slim profit margin.

Oh I agree totally. But by this argument how long before performance PC gaming is in the *******?

In the end we will have a scenario where Nvidia could end up making an architecture for consoles, then ship the same derivative for the PC market and keep it until a new console arrives. Microsoft/Sony pay for the development, then the PC market is farmed off as a second thought that will always buy something?

Also, I do believe I successfully provoked a discussion today: Win me B)
DarthBeavis 12th January 2010, 17:22 Quote
Did I say money was in the high-end market? Read my post again. I said "The casual gamer is not even the demographic where the dollars go", meaning the dollars are not in the enthusiast or casual market. The threshold is even lower than casual. Most PCs sold have GPUs that really don't even meet OUR definition of casual gamer. That is not to say the high-end is dead, as Nvidia also has their powerhouse and cash cow Tegra and Tesla. The advancements in these two markets will transfer to the GeForce line, granted not immediately.
Bindibadgi: I probably have come across many more people than you in terms of using this technology. I will go with my experiences on this one. The factor holding back the technology has been a diversity of 3d displays. That is changing this year.
Bindibadgi 12th January 2010, 17:31 Quote
Quote:
Originally Posted by DarthBeavis
Bindibadgi: I probably have come across many more people than you in terms of using this technology. I will go with my experiences on this one. The factor holding back the technology has been a diversity of 3d displays. That is changing this year.

Fair enough mate, but I still don't agree that the fundamental 3D technology will change things for the general public for many more years. TVs for movies, maybe, but gaming, I strongly doubt it.

Personally I find it a fad and would invest in a quality, larger panel instead; however, I realise that more people like the "idea" that something is better, like a cheap HDTV, when it actually isn't. Marketing, not quality, sells products at the end of the day.

I've yet to really play a "3D" title that looks like it's anything more than layered 2D that has one trick: to throw things out the screen.

Also: will it be a case that we stop evaluating PC components on their own and in comparison to one another, given the push to proprietary standards? We already have an opening scenario where you must purchase one type of graphics card to play one game in full: Batman for PhysX, Avatar for Stereoscopic 3D and, in some respects, STALKER for DX11 - although that will change soon.

For all Intel's or AMD's wrongs, at least they got it right with things like PCI, AGP, PCI Express and HyperTransport.

Also: Tegra is taking products from the GeForce line, not the other way around ;)
DarthBeavis 12th January 2010, 17:42 Quote
Bindibadgi, I understand the Batman deal, and the genesis of that issue is debatable (I have heard different people describe what happened, with differing explanations as to who caused the issue). You are dead wrong about the Avatar game. I also have worked with Iz3d (in fact I worked with them while I worked with Nvidia on a project BEFORE Nvidia did their own 3d, and almost had Iz3d lined up as the 3d component - then Nvidia had me go another route for obvious reasons). Iz3d told me both they AND Nvidia have been included in the 3d development for Avatar. NO proprietary action there. Both Iz3d and Nvidia will be at PDXLAN 15 this weekend and will have 3d demos up and running.
I love competition and have done projects for Intel and AMD/ATI - I support all three companies.
I think it does not really matter which direction the technology flows in the vertical Nvidia chain so long as it does flow in whatever direction benefits the end user.

I agree 3D is not the be-all and end-all; it is a value-added feature. All the monitor vendors and even DirecTV back me up on that assessment ;)
Tim S 12th January 2010, 21:36 Quote
Quote:
Originally Posted by DarthBeavis
Bindibadgi: I probably have come across many more people than you in terms of using this technology. I will go with my experiences on this one. The factor holding back the technology has been a diversity of 3d displays. That is changing this year.

3D TVs using IPS displays still have the same issues as the TN panels. Namely, causing severe headaches and even nausea. I was speaking off the record to a number of execs at TV manufacturers during CES and the vast majority agreed with me that 3D is a fad, but it's here and so they've got to push it. They're just hoping that, with a united push, consumers will buy into it because it'll increase ASPs for both hardware and content (I'm mainly talking TVs/3D Blu-ray here rather than games).

Every seasoned journalist I speak to says the same and it's not as if I'm new to this game either.

3D works in the cinema because the big screen fills your peripheral vision. A 50in TV does not, quite simply, and if you combine that with the glasses that refresh at 60Hz, it's a recipe for unpleasantness. It's like using a 60Hz monitor all day, only worse because the sense of depth around the edge of the screen is completely fubared and you end up getting a headache, feeling disorientated and looking like a tw*t. ;)
SNiiPE_DoGG 12th January 2010, 21:41 Quote
Quote:
Originally Posted by Tim S
*snip*

You hit the nail on the headache.
Tim S 12th January 2010, 22:09 Quote
Quote:
I agree 3D is not the be-all and end-all; it is a value-added feature. All the monitor vendors and even DirecTV back me up on that assessment ;)
If it's a value-added feature, why are some television manufacturers only bundling one pair of active shutter glasses? Pay more for the TV (because it's got 3D) and then have to pay more again for an extra three pairs of glasses (Nvidia's are £115 a pop) because your wife and two kids want to sit and watch TV together with you. Sounds like a value-add to me.

DirecTV will say it's a value-add because they can (and probably will) charge more for their subscription service in 3D.
Bindibadgi 12th January 2010, 22:16 Quote
Quote:
Originally Posted by DarthBeavis
All the monitor vendors and even DirecTV back me up on that assessment ;)

Because they've got to sell something now that LCD prices are plunging and you get 24" monitors free in cereal packets these days.

I didn't see this the first time:
Quote:
I demo 3d Stereo to many people at many events

You are financially invested in it, so yes, I think you will try to push it as a benefit ;):)

When I can watch 3DTV without having to sit there with a headache or stupid, expensive glasses - I'll be the first in the queue.
retrogamer1990 13th January 2010, 01:54 Quote
Okay, so having read this article and discussion, it seems like perhaps nVidia have changed direction and aren't focusing on performance graphics anymore. Instead, they are focusing on their GPGPU systems and mass-market consumer products, while putting gaming on the back burner.

On one hand, this is quite possibly a genius move from the company's perspective; for several reasons it definitely makes sense. With the rise of GPGPU applications predicted (albeit by nVidia themselves), the users already demanding high-performance / server computing could switch to nVidia's Tegra platform if they can engineer a high-performing product. I believe this is a rather large market.
Also, if nVidia continue to develop GPGPU programming, a la CUDA, they could expand this market into other fields, possibly breaking into general, everyday areas of computing. This would create significant advantages for nVidia over competitors if they own the rights to the technology that becomes mainstream.
Obviously they are also targeting the more mainstream mass market (casual gamers) too - look at the high volume of GT250 GPUs as outlined in the article. Am I right in thinking these are the cheapest CUDA-enabled GPUs too? This could work well for nVidia if they take advantage of the above theory. Don't forget about consoles either; the next-gen consoles will be capable of far more than gaming, and my opinion is that they will turn into HTPCs with gaming as an add-on.

On the other hand...it creates a distortion in 'our' market. Enthusiasts.
If nVidia, quite rightly, go where the real money is and leave AMD/ATI to gaming graphics, where is the incentive for progress? We may never play Crysis at 60FPS, people! Competition is clearly an issue for companies, but for consumers it is highly beneficial. Innovation, progress and ultimately lower prices are driven by competition. If nVidia reduce their presence in the market, or even if ATI follow suit, where does this leave us? We may be left with a few sub-par, expensive gaming cards to play with.

well, that's my two cents anyway, I'm off to bed.
Elton 13th January 2010, 06:59 Quote
As long as people go back to the PC route (sadly most can't think that way) it's fine. That said, I miss the days of 2004-2007, when you had GPU battles... People actually could take sides, as every GPU broke the limit of the previous generation by quite a bit.
Horizon 13th January 2010, 08:14 Quote
Quote:
Originally Posted by Bindibadgi
In the end we will have a scenario where Nvidia could end up making an architecture for consoles, then ship the same derivative for the PC market and keep it until a new console arrives. Microsoft/Sony pay for the development, then the PC market is farmed off as a second thought that will always buy something?

Ughh :Shudders: I need to go lay down now.
ragman 13th January 2010, 13:59 Quote
Quote:
Originally Posted by Bindibadgi
Microsoft/Sony pay for the development, then the PC market is farmed off as a second thought that will always buy something?
Quote:
Originally Posted by Horizon
Ughh :Shudders: I need to go lay down now.


We are already there with software/games, hardware is just a matter of time.
SNiiPE_DoGG 13th January 2010, 14:15 Quote
While I love PC gaming as much as all of you, and I am not now, but once was, extremely hardcore about PC gaming, I'm going to play devil's advocate here.

What if the future of gaming is not high-res, moddable games that require a 3,000-dollar computer to play? What about a unified resolution between consoles and PC - 1920x1080 for now, though of course it could be raised in the future (2560x1440)?

I can see quite a few benefits to this model (for argument's sake I won't put in the drawbacks): it could make our machines draw less power, run cooler, and ultimately be cheaper as we reach that threshold of performance (we are already there for this gen); developers might put more games out on the PC; and PC gaming could become more accessible, not necessarily to idiots but to lower income brackets.

I see a few benefits that would be good. Not that I think it's the outcome I want, but IMO it's better than the death of gaming computers altogether.
Elton 14th January 2010, 01:28 Quote
The only problem is that console developers know that making a new console now is unprofitable.

If we could go with your middle ground, I wouldn't mind at all, seeing as everyone wins - well, except people who like 16:10.
PureSilver 14th January 2010, 02:21 Quote
Quote:
Originally Posted by Bindibadgi
There is a Stereoscopic 3D kit sitting in the lab and no one ever uses it :p

Aren't you based in London? Tell ya what, I'll swing by at Easter when I get back, you can give it to me, and I'll try it for a couple of months. I'll even indemnify you against claims for damages and write you a review at the end of it.

Whaddya say? :D
barndoor101 14th January 2010, 03:16 Quote
The whole 3D thing is a gimmick - I saw Avatar recently and thought 'my eyes hurt'; it didn't even seem as though 3D was vital to the film. I saw Coraline a while back and there were only TWO 3D effects in the whole film.

I did laugh when Nvidia pushed their proprietary 3D format as a standard. It made me think about the other times they have pushed 'standards' which only benefit themselves and hurt consumers, i.e. CUDA and PhysX. In the case of PhysX they even blocked using an Nvidia card with an ATI primary (which was probably the only time I was going to give money to Nvidia).

This also made me laugh:
Quote:
Originally Posted by DarthBeavis
That is not to say the high-end is dead, as Nvidia also has their powerhouse and cash cow Tegra

some cash cow it is, MS have sold millions of Zune HDs after all ;)

Nvidia are looking more and more likely to pull out of our beloved enthusiast market, and personally I don't trust ATI as a single horse after the HD 5000 pricing (although to be fair it wasn't all their fault).
SNiiPE_DoGG 14th January 2010, 05:56 Quote
Lol, you're calling ATI's 5k series pricing bad? Someone clearly doesn't remember the launch of the GTX 2XX series :laugh: $800+ for the watercooled GTX 280!
Elton 14th January 2010, 08:31 Quote
Quote:
Originally Posted by SNiiPE_DoGG
Lol, you're calling ATI's 5k series pricing bad? Someone clearly doesn't remember the launch of the GTX 2XX series :laugh: $800+ for the watercooled GTX 280!

Someone doesn't remember the X1800 series...or the 8800 Ultra...

Hell back in '05 the X1600XT was $300!! No such thing as a midrange back then.

Oh and the X850XT PE selling for about $900 wasn't ludicrous at all right? :D
thehippoz 14th January 2010, 08:36 Quote
Yeah, they were already caught price fixing (both ATI and Nvidia)... it looked like competition, but it was backroom shens.
barndoor101 14th January 2010, 09:20 Quote
I simply meant this:

If you look at the advantages of the 5000 series, one of the big ones is that it has a small die area, so you should be getting a high number of cores per wafer, i.e. prices should have been lower (btw I don't mean the RRP, I mean the inflated prices that suppliers charged when there was no supply). This wasn't due to ATI directly; it was TSMC having crap yields and driving the cost of each core up (supply and demand 101).

Any company in a single-horse situation will charge what they think they can get away with because their first loyalty is to their shareholders, then to us.
Star*Dagger 15th January 2010, 21:23 Quote
I can't hear anything nVidia says over the awesomeness of the 5000s series from ATI!

What was that? A little bird on me shoulder says "Nvidia hasn't had the top card since the 4870 X2 came out!" That's a looooooooooong time to be in second place.
I will give Nvidia credit for the 8800 GTX, which was a great card, but we are talking ancient history now.
They burnt any goodwill with me when they screwed around with games made only for their inferior equipment and silly ideas on how to implement physics.

Game over nvidia, try again next century!