bit-tech.net

Nvidia reveals plans to make x86 CPU

Nvidia already has an ARM CPU in its Tegra platform, but this is the first time the company has talked about an x86 CPU.

Nvidia clearly isn’t worried about upsetting Intel now that the rift between the two companies has spilled out into the open. Even so, we were surprised by Nvidia’s comments at the Morgan Stanley Technology Conference in San Francisco yesterday, in which the company revealed plans to enter the x86 processor market within the next two to three years.

In a Q&A session at the conference, Nvidia’s senior vice president of investor relations and communications, Michael Hara, was asked when Nvidia would want to get into the general purpose microprocessor business. Hara said that “the question is not so much I think if; I think the question is when.”

“I think some time down the road it makes sense to take the same level of integration that we’ve done with Tegra,” said Hara. “Tegra is by any definition a complete computer on a chip, and the requirements of that market are such that you have to be very low power, very small, but highly efficient. So in that particular state it made a lot of sense to take that approach, and someday it’s going to make sense to take the same approach in the x86 market as well.”

However, Hara also pointed out that Nvidia’s x86 CPU wouldn’t be appropriate for every segment of the market, and would be mainly targeted at smaller system-on-chip platforms. “If you look at the high-end of the PC market I think it’s going to stay fairly discrete, because that seems to be the best of all worlds,” said Hara, adding that “a highly integrated system-on-chip is going to make sense” in the MID (mobile intelligent device) and netbook markets.

Although Hara didn’t reveal a specific timeframe for the new CPU, he did point out that “it’s not necessary today,” because a combination of Intel’s Atom CPU with Nvidia’s Ion platform would suffice. However, he added that “two or three years down the road I think it’s going to make sense,” and said that “we won’t talk much more about what we think about that timeframe, but there’s no question it’s on our minds.”

Of course the big problem for Nvidia here is that it doesn’t have an x86 license from Intel to produce its own x86 CPUs. However, rumours appeared last year suggesting that Nvidia was considering an alliance with VIA to purchase its processor division, although these were later denied by VIA. Last week, Intel’s CEO Paul Otellini also took a swipe at Nvidia’s lack of x86 CPU technology, saying “If you don't have a microprocessor, what else do you have to sell?”

Should Nvidia enter the x86 CPU market? Let us know your thoughts in the forums.

31 Comments

kenco_uk 4th March 2009, 12:26 Quote
Last para, there's a hyperlink that needs fixing.

It would be a shame if they didn't at least consider the server market - it'd really set the relix amongst the cheesecakes.
tejas 4th March 2009, 12:29 Quote
About effing time!! I'll be in line for a couple of Nvidia x86 CPUs for myself and my server business!

Bring it on Nvidia! More competition for struggling AMD and bullying Intel is always a good thing! :)
Goty 4th March 2009, 12:38 Quote
This just in: NVIDIA plans to make eight revisions of the same processor all with different product names, but with identical features and performance.
genesisofthesith 4th March 2009, 13:07 Quote
1/Intel announce that they will work with TSMC to allow partners to integrate Atom cores into custom products - highly integrated x86 system-on-chips. It will likely be 2 years down the line before we see products.

2/Nvidia announce they want to make a highly integrated 'nvidia' x86 system-on-chip for lower performance sectors, with a rough start date of two to three years down the line.

It doesn't take a genius to figure out Nvidia are going to be integrating Atom cores into their own system-on-chips. There's no need for Nvidia to get an x86 licence, as they can now simply license the cores from Intel and integrate them into their products.
p3n 4th March 2009, 13:46 Quote
Quote:
Originally Posted by tejas
About effing time!! I'll be in line for a couple of Nvidia x86 CPUs for myself and my server business!

Bring it on Nvidia! More competition for struggling AMD and bullying Intel is always a good thing! :)

You're gonna use MID chips in your servers? Nice.
Elton 4th March 2009, 13:55 Quote
I wonder if AMD's doing anything? My guess is that they already have the ITX thing down.
tejas 4th March 2009, 13:57 Quote
hehe maybe I will!! roflmao! :)

Seriously though, I meant if Nvidia make proper x86 CPUs and not just netbook-level stuff. Of course, if that does not happen then I am still happy to wait for the six-core AMD Istanbul for my servers hehe ;)
UncertainGod 4th March 2009, 14:29 Quote
Don't they need to get their hands on a licence to manufacture x86-compatible CPUs?
DarkLord7854 4th March 2009, 14:41 Quote
Quote:
Originally Posted by Goty
This just in: NVIDIA plans to make eight revisions of the same processor all with different product names, but with identical features and performance.

I was thinking just that :)
Dr. Strangelove 4th March 2009, 15:03 Quote
Hmm two thoughts came to mind:
Nvidia: "The CPU is Dead.. Long live the CPU (made by NVIDIA)"

and

Nvidia: "The CPU is dead what you need is a GPU with a few CPU capabilities"
Intel: "The GPU is dead what you need is a CPU with GPU capabilities"

The question is.. would AMD not be perfectly positioned to make something in between that might actually work?
ParaHelix.org 4th March 2009, 15:45 Quote
More compotition, go for it.
Goty 4th March 2009, 16:03 Quote
Quote:
Originally Posted by Dr. Strangelove
Hmm two thoughts came to mind:
Nvidia: "The CPU is Dead.. Long live the CPU (made by NVIDIA)"

and

Nvidia: "The CPU is dead what you need is a GPU with a few CPU capabilities"
Intel: "The GPU is dead what you need is a CPU with GPU capabilities"

The question is.. would AMD not be perfectly positioned to make something in between that might actually work?

AMD's Fusion initiative is aimed at exactly this spot; it just hasn't been in the news much lately.
n3mo 4th March 2009, 18:12 Quote
I'd rather see Via in the high-end market than nVidia in the low-end/embedded market. While competition is a good thing, I'm not sure there will be any this way: nVidia will just buy chips from Intel and integrate them into their own designs (and continue selling the same thing under ten names ;) ). Anyway, it would still break my "no-Intel" policy, so it doesn't make me excited.
HourBeforeDawn 4th March 2009, 19:53 Quote
The whole reason they want to make a "CPU" comes down to a few factors. One is ray tracing, which is where AMD/ATI and Intel will have an advantage on the graphics side of the market, and why nVidia always bad-mouthed it: they didn't have the tech to add to their GPUs. The other is the hybrid market. There are probably other reasons, but those are the two main ones I can see. I don't really see them getting into the CPU market to make CPUs for desktops and servers, but rather to gain that side of the tech for their graphics cards and maybe, at most, the ITX market like their "ION".
azrael- 4th March 2009, 22:19 Quote
Quote:
Originally Posted by p3n
Quote:
Originally Posted by tejas
About effing time!! I'll be in line for a couple of Nvidia x86 CPUs for myself and my server business!

Bring it on Nvidia! More competition for struggling AMD and bullying Intel is always a good thing! :)

You're gonna use MID chips in your servers? Nice.
I'd rather phrase it this way: Would you put CPUs from a company with known chip designing issues (underfill issues, anyone?) into mission-critical servers? No, I thought not...
DeX 4th March 2009, 23:38 Quote
Quote:
Originally Posted by kenco_uk
It would be a shame if they didn't at least consider the server market - it'd really set the relix amongst the cheesecakes.

It made no sense, but Best Metaphor Ever. :)
wuyanxu 5th March 2009, 01:13 Quote
why do they need a license to build x86 chips?? Intel owns the x86 instruction set?
bridgesentry 5th March 2009, 02:24 Quote
It's time to combine the CPU with the GPU; one chip is always the cheapest solution, plus PhysX capability, right? Mobile gaming systems really need it.
HourBeforeDawn 5th March 2009, 02:35 Quote
Quote:
Originally Posted by bridgesentry
It's time to combine the CPU with the GPU; one chip is always the cheapest solution, plus PhysX capability, right? Mobile gaming systems really need it.

People can stop mentioning PhysX; come next year it will be dead anyway, lol. That's why you haven't seen it converted to true GPU processing and instead it's being simulated with CUDA. They just won't admit that it's on its way out.
Goty 5th March 2009, 03:51 Quote
CPU+GPU chips will be EXTREMELY low end for a long time after they're introduced. The die space requirements for either are just too high to make even one of them truly high performance.
HourBeforeDawn 5th March 2009, 04:20 Quote
Quote:
Originally Posted by Goty
CPU+GPU chips will be EXTREMELY low end for a long time after they're introduced. The die space requirements for either are just too high to make even one of them truly high performance.

Well, CPUs are at 45nm and GPUs are about to hit 40nm, so we are almost at the point where there is enough space for a decent configuration. It probably won't be until both the CPU and GPU are sub-40nm that they really start cranking out the high-end stuff, but I do see low and mid range in the near future. Also, it's not like they can't make the socket bigger to make room for it all.
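(To put rough numbers on that shrink, assuming ideal scaling with no layout overhead: die area scales with the square of the feature size, so moving a block of logic from 55nm to 40nm takes it down to about (40/55)^2 ≈ 53% of its former area, freeing up roughly half the silicon for extra cores.)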
Slyr7.62 5th March 2009, 06:37 Quote
Quote:
Originally Posted by Goty
CPU+GPU chips will be EXTREMELY low end for a long time after they're introduced. The die space requirements for either are just too high to make even one of them truly high performance.
Quote:
Originally Posted by HourBeforeDawn
Well, CPUs are at 45nm and GPUs are about to hit 40nm, so we are almost at the point where there is enough space for a decent configuration. It probably won't be until both the CPU and GPU are sub-40nm that they really start cranking out the high-end stuff, but I do see low and mid range in the near future. Also, it's not like they can't make the socket bigger to make room for it all.
Good statements, yes.

I'd prefer my GPU and CPU separated, because 1) I can get the highest bang for the buck (easier), and 2) for now it'll be easier to keep both components cool when they're separated, maybe.
HourBeforeDawn 5th March 2009, 06:44 Quote
Quote:
Originally Posted by Slyr7.62
Good statements, yes.

I'd prefer my GPU and CPU separated, because 1) I can get the highest bang for the buck (easier), and 2) for now it'll be easier to keep both components cool when they're separated, maybe.

Ya, I think one approach that I normally disapproved of, but which may work out better in this case, is Intel's superglue approach for the C2D, when they used two dies to make their quad cores. A similar idea would be to have one die be, say, a dual-core CPU and the other, given current sizes, a single GPU. That could potentially offer great performance if they're pulling from the same DDR3 banks but have their own traffic flow. I mean, this could yield better performance than current setups (one CPU and a graphics card), at least I hope. It may be ideal for laptops in particular and reduce cooling and power requirements; again, I hope, lol.
Natima 5th March 2009, 07:08 Quote
Imagine if in the future we have entire PCs on single chips, i.e. GPU, CPU, RAM and chipset.
Companies such as Intel could offer a choice of 2 chipsets, with a choice of 3 GPUs per chipset and 3 CPUs corresponding to each chipset.
You could have the equivalent of X48 and P45. Call these P100 and X100.
P100:
- MID performance GPU/CPU
- Netbook performance GPU/CPU
- Crappy Office PC performance GPU/CPU
X100:
- Mid-range PC (budget minded people)
- High-end PC (average gamer/user)
- Ultra High-end PC (gaming/servers/music & graphics)
Horizon 5th March 2009, 08:11 Quote
Quote:
Originally Posted by ParaHelix.org
More compotition, go for it.

Spell-check! Go for it.
Nikumba 5th March 2009, 09:47 Quote
Quote:
Originally Posted by wuyanxu
why do they need a license to build x86 chips?? Intel owns the x86 instruction set?

Pretty much, yes: if you make an x86 chip you need a licence from Intel. I guess if nVidia didn't have one and couldn't get one, they would partner up with someone who does, but I am not sure Intel would like that, and it could potentially revoke that company's licence.

Kimbie
UncertainGod 5th March 2009, 10:00 Quote
The only option nVidia would have for the x86 SoC space is a partnership or merger with Via, and at best all that would do is put them very firmly behind both Intel and AMD.
Cupboard 6th March 2009, 11:49 Quote
In concept, I don't like the idea of everything being integrated onto my CPU, at least for my main system. The big issue is that you can't just upgrade your graphics card but keep your CPU, like I have done. And what about all the people who may need powerful CPUs but no graphics card, like workstations, servers, etc.? That is the major problem with these.

However, in the MID/netbook/nettop segments then great: lower power, maybe higher performance, and it's not like you are going to be upgrading anyway. (btw, I thought it was a mobile internet device, not a mobile intelligent device - wouldn't that have to be the person carrying it?)
Jenny_Y8S 6th March 2009, 17:39 Quote
Why is everyone saying "I need to keep my CPU & GPU separate"?

The way things are going, PCs will one day have one (or more) uber-core processors; what they do will depend on what code is run on them. Pure number crunching, 3D rendering, post-processing, AI - all will be done on the same architecture.

It's not that separate GPUs won't exist, but I bet the mainstream won't need them, and when that happens the support will go.

My money is on the GPU (as we know it) having five years left, tops.
azrael- 6th March 2009, 17:59 Quote
The thing is that normal number crunching and graphics processing need to be handled differently to get the most performance from them.

Graphics processing mostly consists of massively parallel, yet quite simple, operations, while general-purpose mathematical operations, as done by the CPU (or its FPU), are much more complex and usually do not involve parallelism of any high order.
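To put that contrast in concrete terms, here is a rough illustrative sketch (CUDA-style, not from the original post; the function names are made up for the example): the GPU side is a trivial operation repeated independently across millions of elements, while the CPU side is a loop-carried calculation where each step depends on the previous one, so throwing thousands of threads at it doesn't help.

Code:
#include <cuda_runtime.h>

// GPU-style work: one trivial arithmetic operation per element,
// executed by thousands of threads at once (illustrative example).
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;      // simple, independent, massively parallel
}

// CPU-style work: each iteration depends on the previous result
// (a loop-carried dependency), so it cannot be spread across threads.
float compound(float value, float rate, int steps)
{
    for (int s = 0; s < steps; s++)
        value *= (1.0f + rate); // serial chain of dependent operations
    return value;
}

The first function maps cleanly onto a GPU's wide array of simple ALUs; the second is the kind of dependent, serial work a conventional CPU core (and its FPU) is built for.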

At some point CPUs and GPGPUs *will* converge, which is why it was only a matter of time before nVidia would start making CPUs (despite doing a Nietzsche on them). It's also the main reason AMD bought ATi, although it pushed AMD to near bankruptcy.

I'm not really seeing a combined CPU/GPGPU for some time, though. I believe the GPGPU will migrate to the motherboard/chipset in some way, being connected to the CPU and memory via a high-speed transport (enter HT and CSI).
pullmyfoot 14th October 2009, 07:08 Quote
Quote:
Originally Posted by n3mo
I'd rather see Via in the high-end market than nVidia in the low-end/embedded market. While competition is a good thing, I'm not sure there will be any this way: nVidia will just buy chips from Intel and integrate them into their own designs (and continue selling the same thing under ten names ;) ). Anyway, it would still break my "no-Intel" policy, so it doesn't make me excited.

No-Intel policy FTW. I buy AMD not so much because I am an AMD fanboy, but because I can't stand Intel. With that said, I also take common sense into consideration: I'll buy AMD as long as they have something that can roughly match up to Intel.

I do hope NVIDIA teams up with VIA and makes some x86 CPUs, though. That will be interesting. Then we will have three companies with complete system platforms going head to head, and I can imagine that it will pick up pretty fast if it's competitive, since there are so many NVIDIA fanboys out there.