bit-tech.net

Nvidia announces Project Denver ARM CPU

Nvidia's Project Denver marries an ARM CPU with a GeForce GPU, and takes aim at the desktop, server and supercomputer markets.

Nvidia has announced its intentions to release a combined CPU and GPU product, using its ARM licence to produce a system-on-chip design under the codename Project Denver.

Announced at CES, the project is clearly an extension of the company's current Tegra ARM-based CPUs, which will get an upgrade to ARM's Cortex-A15 processor design in the near future. This time, however, they won't be powering mobile devices. Instead, the company is looking to encroach on rivals AMD and Intel with chips for PCs, servers and even supercomputers.

Speaking to press at the Consumer Electronics Show, Nvidia chief Jen-Hsun Huang described ARM as 'the fastest-growing CPU architecture in history,' and explained that Nvidia is looking to ride the wave by 'designing a high-performing ARM CPU core in combination with our massively parallel GPU cores to create a new class of processor.'

Although firm details of Project Denver are still being kept under wraps, the SoC design will combine a multi-core ARM CPU with a GeForce GPU. Nvidia's mention of server and supercomputer uses for the Denver chips also suggests that the company will be producing boards capable of accepting multiple Denver chips, producing many-core server systems capable of CPU and GPGPU processing.

The move marks a sea change from Huang's stance in 2007, when he firmly denied that Nvidia would be launching a CPU, but it provides a much-needed response to criticism levelled in 2009 by Intel chief Paul Otellini, who claimed that Nvidia couldn't compete without a CPU of its own.

Currently, ARM is the most popular architecture in the world of smartphones and embedded systems, vastly outselling rival x86 designs from companies such as Intel, AMD and VIA. However, the architecture has been under-represented in the desktop and server markets, largely because of a lack of support in mainstream operating systems.

That's all set to change, though: Microsoft has also confirmed the rumours that the next release of Windows will be made available for both ARM and x86 platforms, along with an ARM-compatible release of Microsoft Office.

With Microsoft backing the architecture, 2011 could prove a good year for ARM, while also potentially signalling the end of the x86 monopoly on the desktop.

Are you pleased to see more companies investigating the possibilities for ARM in the desktop and server markets, or do you think too much has been invested in the development of x86 to switch architectures now? Share your thoughts over in the forums.

30 Comments

Tyrmot 6th January 2011, 14:51 Quote
Very interesting. Intel may come to regret the monopoly it's imposed on the market for so long
GoodBytes 6th January 2011, 14:53 Quote
Ah yes, this could finally mean real competition in the CPU market. If special builds of software need to be made for ARM processors on Windows 8 then it could take time, but if not, it will be very interesting.

Especially as we all know that Nvidia makes kick-ass motherboard chipsets, while Intel: not so much... it's going to be very tight and interesting competition.

My vote is that it's going to be called: Tegra CTX 280 :) C for CPU
Landy_Ed 6th January 2011, 15:22 Quote
This potentially ups the game considerably. I don't think it's cause to hang fire on a Sandy Bridge, but I will be interested to see how this affects mini-ITX offerings
roadie 6th January 2011, 15:23 Quote
Great if they can ship a competitive product.
Lizard 6th January 2011, 15:34 Quote
Quote:
Originally Posted by GoodBytes
Especially as we all know that Nvidia makes kick-ass motherboard chipsets, while Intel: not so much... it's going to be very tight and interesting competition.

At the risk of starting a flame war I have to ask 'what planet are you from?'

Nvidia motherboard chipsets were horrific - with the exception of nForce 4 SLI (for AMD chips only) they were slow, terrible for overclocking, ran ridiculously hot, killed DIMMs and USB sticks, and were unstable.

Nvidia pulling out of this market was one of the best things that ever happened to the motherboard industry - RMA rates dropped back to normal when they did.
phuzz 6th January 2011, 15:52 Quote
Hmm, for me the advantage of running Windows is the massive amount of software I can run on it (eg games). Unless M$ come up with some mighty fine virtualisation software, only software compiled for ARM will run on it.
In which case, you're left with all the downsides of Windows with very few of the advantages.
GoodBytes 6th January 2011, 15:58 Quote
Quote:
Originally Posted by Lizard
At the risk of starting a flame war I have to ask 'what planet are you from?'

Nvidia motherboard chipsets were horrific - with the exception of nForce 4 SLI (for AMD chips only) they were slow, terrible for overclocking, ran ridiculously hot, killed DIMMs and USB sticks, and were unstable.

Nvidia pulling out of this market was one of the best things that ever happened to the motherboard industry - RMA rates dropped back to normal when they did.

Hmm, no. The only failure from Nvidia was the nForce 6, where magma was coming out of it.
Nvidia's best chipsets were indeed the nForce 2 and 4.
The rest were not bad, or on par with any other offering available; you are just exaggerating. Now if you had problems with your motherboards, then I blame the manufacturer.
GoodBytes 6th January 2011, 16:11 Quote
Quote:
Originally Posted by phuzz
Hmm, for me the advantage of running Windows is the massive amount of software I can run on it (eg games). Unless M$ come up with some mighty fine virtualisation software, only software compiled for ARM will run on it.
In which case, you're left with all the downsides of Windows with very few of the advantages.

M$ doesn't support ARM. All they have is those replica Microsoft Windows versions, apparently. Microsoft should really sue them.

Anyway, MICROSOFT has many methods they can use for their coming OS to support ARM. My guess is that they will use a translation system, which will cost performance. But if so, it means (on paper):
1- ARM-compiled software runs at full speed.
2- x86-compiled software will be slower due to the translation.

#2's overhead could be reduced if Windows is able to dedicate one of the CPU's cores to the translation and the rest to the applications, but then we are entering extremely complex and difficult territory.

My guess is that nothing will be done... Win8 will run on it... and it will be the role of software companies to make an ARM processor version.

I don't expect ARM desktop CPUs to be released as soon as Win8 comes out, let alone the first ARM desktop CPUs to be interesting, and even less that Windows 8 will do a perfect job of supporting ARM processors at release. BUT it's an entry. My guess is that the ARM version of Win8 will be available on order only, like XP 64-bit, where you won't see it on store shelves.
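The translation-versus-native trade-off described above can be sketched with a toy example. This is purely illustrative - the miniature 'guest' instruction set and the `translate` function below are invented for the sketch, not anything Microsoft or ARM has announced - but it shows the basic idea of a dynamic binary translator: pay a one-off cost to convert guest instructions into host operations, then run the cached result.

```python
# Toy dynamic binary translator: 'guest' instructions for a fake
# register machine are translated once into host closures, then the
# translated block is re-executed without re-translating.
# This mirrors the idea that x86 software on an ARM system would pay
# a translation cost, while ARM-native software runs directly.

def translate(block):
    """Translate a list of guest instructions into host closures."""
    host_ops = []
    for op, dst, src in block:
        if op == "mov":
            # mov dst, immediate
            host_ops.append(lambda regs, d=dst, s=src: regs.__setitem__(d, s))
        elif op == "add":
            # add dst, src (register + register)
            host_ops.append(
                lambda regs, d=dst, s=src: regs.__setitem__(d, regs[d] + regs[s])
            )
        else:
            raise ValueError(f"unknown guest opcode: {op}")
    return host_ops

def run(host_ops, regs):
    """Execute an already-translated block against a register file."""
    for op in host_ops:
        op(regs)
    return regs

# Guest program: r0 = 2; r1 = 3; r0 = r0 + r1
guest_block = [("mov", "r0", 2), ("mov", "r1", 3), ("add", "r0", "r1")]

translated = translate(guest_block)   # one-off translation cost
regs = run(translated, {})            # cached code runs at host speed
print(regs["r0"])                     # -> 5
```

Real translators (QEMU's TCG, for instance) cache translated blocks exactly so the cost is paid once per block rather than once per execution, which is why hot code paths can approach native speed while cold code pays the full overhead.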
Ross1 6th January 2011, 16:28 Quote
You wonder if Intel could get pushed out of the small, low-power graphics market - AMD have Fusion, and Nvidia have this.
Nictgsf 6th January 2011, 16:30 Quote
I have to agree with Lizard, nForce chipsets were terrible. The first Asus 780i I had got returned after 6 months for a full refund, no questions asked (overheating, unstable, constant memory issues), and the second Asus 780i lasted 1 month before the northbridge committed suicide. The 790i lasted longer, but eventually became unstable after 6 months. Switched to an X48-T; runs cooler, overclocks easier and with a little software mod it can run SLI. Finally, after 2 years, my system is running how it should.

Nvidia will really need to raise their game if they re-enter the chipset market
Lizard 6th January 2011, 16:32 Quote
Quote:
Originally Posted by Nictgsf
Nvidia will really need to raise their game if they re-enter the chipset market

I suspect Nvidia would also have to give some pretty big incentives to motherboard manufacturers to go into business with them again as the RMA rates were so high on everything bar the AMD version of nForce 4.
Zinfandel 6th January 2011, 17:16 Quote
I can't cope with this anymore. Wake me up when everything has settled down.

Go ARM!
DbD 6th January 2011, 18:19 Quote
Quote:
Originally Posted by GoodBytes

M$ doesn't support ARM. All they have is those replica Microsoft Windows versions, apparently. Microsoft should really sue them.

Anyway, MICROSOFT has many methods they can use for their coming OS to support ARM. My guess is that they will use a translation system, which will cost performance. But if so, it means (on paper):
1- ARM-compiled software runs at full speed.
2- x86-compiled software will be slower due to the translation.

MS have said they will support ARM in Windows 8 natively. MS has a major competitor now in Google, whose OSes are working up towards being a full Windows competitor; if MS want to compete they have to support ARM.

In the medium term the world is changing; everything is getting smaller. In the same way that laptops are replacing desktops, it may well be that smartphones replace laptops. If people want a big screen + mouse + keyboard they'll just dock the phone to a shell and the phone is the PC - the peripherals just talk to it - and by then it'll be fast enough to do everything a laptop can do today.

Heavyweight processing is done in the cloud, which is also where your data is stored. Sure, you can still own a proper PC if you wish, but many won't bother. This is what MS realise, and this is where Windows 8+ is heading.
V3ctor 6th January 2011, 18:24 Quote
Quote:
Originally Posted by Lizard
Quote:
Originally Posted by GoodBytes
Especially as we all know that Nvidia makes kick-ass motherboard chipsets, while Intel: not so much... it's going to be very tight and interesting competition.

At the risk of starting a flame war I have to ask 'what planet are you from?'

Nvidia motherboard chipsets were horrific - with the exception of nForce 4 SLI (for AMD chips only) they were slow, terrible for overclocking, ran ridiculously hot, killed DIMMs and USB sticks, and were unstable.

Nvidia pulling out of this market was one of the best things that ever happened to the motherboard industry - RMA rates dropped back to normal when they did.


Sooo true... I've managed to have an NF4 chipset survive until now, but it's always at 70°C with active cooling on it... I've never seen anything like it.

PS: Had 2 A8N-SLI Deluxes and 1 A8N-SLI Premium... All died with that stupid chipset; memory started failing at boot, that was the beginning... Sold that crap (except the Opty 170) and bought the venerable Q6600, always stable with the P35/P45 chipsets :)
frontline 6th January 2011, 19:02 Quote
Quote:
Although firm details of Project Denver are still being kept under wraps, the SoC design will combine a multi-core ARM CPU with a GeForce GPU


Well, they do say that imitation is the sincerest form of flattery...
wuyanxu 6th January 2011, 19:30 Quote
it's not going to be something we, PC hardware enthusiasts, will get to play with. it's a SoC aimed at tablets, servers and supercomputers.

the point of nVidia releasing this piece of information is that they are not looking at entering the consumer CPU market, they are looking at entering the OEM device market, the market currently dominated by Samsung, Qualcomm and, by product popularity, Apple (Apple's A4 processor is a good example of what nVidia is aiming for)
Gareth Halfacree 6th January 2011, 19:44 Quote
Quote:
Originally Posted by wuyanxu
it's not going to be something we, PC hardware enthusiasts, will get to play with. it's a SoC aimed at tablets, servers and supercomputers.
And desktop PCs. At least, that's what Nvidia says.
Quote:
Originally Posted by wuyanxu
the point of nVidia releasing this piece of information is that they are not looking at entering the consumer CPU market, they are looking at entering the OEM device market
But Nvidia is already in the OEM device market: it has an ARM-based SoC design that is already appearing in tablets and smartphones - Tegra.

Denver is *totally* separate from Tegra, although it shares many of its design characteristics. Denver won't be appearing in tablets - that's what Tegra is for. Denver is for desktops, servers, and supercomputers - and while you're probably not likely to buy a Denver CPU as a retail item, you'll certainly be able to buy mini-ITX (and larger) motherboards with Denver CPUs embedded, much as you can with Intel's Atom and VIA's Nano now.
Quote:
Originally Posted by wuyanxu
the market currently dominated by Samsung, Qualcomm and, by product popularity, Apple (Apple's A4 processor is a good example of what nVidia is aiming for)
Yes, Nvidia is aiming for the equivalent to Apple's A4 - with Tegra, not with Denver. Bear in mind that Tegra is already on its second iteration, having beaten the A4 chip to market by quite some time...
memeroot 6th January 2011, 21:08 Quote
ms have to support arm as their competitors will.

who'da thunk it - apps make opensource chips worth while
Tyinsar 6th January 2011, 21:55 Quote
Quote:
Originally Posted by phuzz
Hmm, for me the advantage of running windows, is the massive amount of software I can run on it (eg games). Unless M$ come up with some mighty fine virtualiseation software then only software compiled for ARM will run on it.
In which case, you're left with all the downsides of windows, with very few of the advantages.
true
dicobalt 6th January 2011, 22:29 Quote
I want them to succeed with the desktop ARM CPU but I seriously doubt this thing will ever make it to market. Here's why:

1) ARM is specifically made for low-power processing. The cores of an ARM chip are simple and small. In order to make this work they are going to have to change ARM a whole lot, so they are going to be making up all sorts of new stuff for an instruction set that wasn't intended to do what it's doing. That's the same thing Intel had to do with x86. The difference is that Intel has had decades of experience and time in which to do it. Nvidia does not make CPUs, they make GPUs, and there is a big difference between them. Nvidia will end up with a chip that is much slower and cannot compete with Intel or AMD.

2) Say goodbye to all x86 software because virtualization doesn't work when you change instruction sets. You can only virtualize x86 instructions on x86 hardware - not on ARM hardware. Emulation is too slow to be practical. People who choose to use ARM hardware will be limiting themselves greatly in their software choices. Why change platforms when you have what you want on x86? The broad consumer market won't do it.

3) I believe that Nvidia is not serious about this project in the first place. Nvidia wants to force Intel to give it an x86 license. Nvidia is telling Intel that they are going to compete in the desktop CPU market and make them lose money, and possibly lose the entire desktop/server/HPC market. Well, that's what Nvidia wants them to think, anyway. If Intel licenses x86 to Nvidia then Intel at least makes a royalty on each x86 Nvidia CPU, instead of making nothing on each ARM CPU. All of the companies involved are in the boat with Nvidia and also want Nvidia to have an x86 license, and maybe some want licenses themselves too.
wuyanxu 6th January 2011, 23:24 Quote
Quote:
Originally Posted by Gareth Halfacree
And desktop PCs. At least, that's what Nvidia says.

But Nvidia is already in the OEM device market: it has an ARM-based SoC design that is already appearing in tablets and smartphones - Tegra.

Denver is *totally* separate to Tegra, although it shares much of the design characteristics. Denver won't be appearing in tablets - that's what Tegra is for. Denver is for desktops, servers, and supercomputers - and while you're probably not likely to buy a Denver CPU as a retail item, you'll certainly be able to buy mini-ITX (and larger) motherboards with Denver CPUs embedded, much as you can with Intel's Atom and VIA's Nano now.

Yes, Nvidia is aiming for the equivalent to Apple's A4 - with Tegra, not with Denver. Bear in mind that Tegra is already on its second iteration, having beaten the A4 chip to market by quite some time...

the OEM market has now pretty much evolved from Atom to tablets or (very soon) AMD's Fusion cores. I believe that's where nVidia's aiming at the first stage; they cannot compete with desktop-level processors. Well, not with just an ARM core - I suspect they may be looking at AMD's Fusion approach.

in terms of Tegra, am I right in saying it is a graphics processor IP for SoCs? If so, that makes it the same as Imagination Technologies' graphics IP, not an actual SoC product such as Hummingbird/A4. I got the impression that with Denver, nVidia is actually going to make a physical product themselves, rather than rely on other manufacturers to use their IP.
Krayzie_B.o.n.e. 7th January 2011, 02:13 Quote
ARM Tegra Denver Johnny Utah whatever!

But can an Nvidia CPU do pi to the umpteenth numeral while running Crysis?
Cthippo 7th January 2011, 03:52 Quote
Quote:
Originally Posted by Krayzie_B.o.n.e.
But can an Nvidia CPU do pi to the umpteenth numeral while running Crysis?

No, of course not, but that's the point. Most people, including many of us on Bit, don't have any use for the amount of processor grunt available in current high-end processors. These low-powered systems will do everything we need them to do with fewer watts in the whole system than modern CPUs use by themselves. Even high-end games don't come close to using all the resources of a modern mid-range processor.

It could be just that I'm getting to be a cranky old git, but I'm not ready for my computer to be based around my phone. My next computer will probably be a mini-ITX based system, but even now I don't feel like I have any need for a laptop and the idea of web browsing on that tiny little screen is not attractive at all.

As for the future of Intel, I think they may soon find they're the best at making things that nobody wants. We've passed the point of "good enough" and are well into overkill on CPU power. Spending more money on a higher-powered system doesn't give you a better experience, just more noise, heat, and a higher power bill. If a low-powered uATX or mini-ITX system could do everything that your current system can do, would you not be tempted to go that route?
fluxtatic 7th January 2011, 04:54 Quote
Quote:
Originally Posted by wuyanxu

in terms of Tegra, am I right in saying it is a graphics processor IP for SoCs? If so, that makes it the same as Imagination Technologies' graphics IP, not an actual SoC product such as Hummingbird/A4. I got the impression that with Denver, nVidia is actually going to make a physical product themselves, rather than rely on other manufacturers to use their IP.

Not quite. Tegra is an ARM-based SOC. The first Zune, at least, ran on a first-gen Tegra. Apparently there will be a port of Ubuntu for Tegra, as well, meaning many swollen nerd e-peens when they have Linux running on a Zune :P
GoodBytes 7th January 2011, 12:36 Quote
Quote:
Originally Posted by fluxtatic
Not quite. Tegra is an ARM-based SOC. The first Zune, at least, ran on a first-gen Tegra. Apparently there will be a port of Ubuntu for Tegra, as well, meaning many swollen nerd e-peens when they have Linux running on a Zune :P

Actually, only the Zune HD uses a Tegra chip
rickysio 7th January 2011, 15:58 Quote
Quote:
Originally Posted by GoodBytes
Quote:
Originally Posted by fluxtatic
Not quite. Tegra is an ARM-based SOC. The first Zune, at least, ran on a first-gen Tegra. Apparently there will be a port of Ubuntu for Tegra, as well, meaning many swollen nerd e-peens when they have Linux running on a Zune :P

Actually, only the Zune HD uses a Tegra chip

The Kin series are weeping in fury. They demand your blood.
GoodBytes 7th January 2011, 18:13 Quote
Quote:
Originally Posted by rickysio
The Kin series are weeping in fury. They demand your blood.

Kin is not a Zune.
b5k 8th January 2011, 09:52 Quote
On the subject of that flame war...





...My nForce 2 worked great...





...After I lapped the heat sink flat and put active cooling on it!
Jimbob76 19th February 2011, 00:04 Quote
I can see nVidia doing very well! And here is why.

1) The performance per watt of ARM is very, very impressive.

2) Many ARM chips in phones can do 2x 1080p decoding in real time. It was not long ago that you needed a quad-core x86 chip to decode one stream of 1080p without it jumping.

3) ARM Cortex-A9 chips are already running at 2.5GHz with no active cooling (hard macro version).

4) ARM chips are held back by slower low-power DDR2 RAM (200MHz - 333MHz); that will change.

5) ARM hardware is far more area-efficient than x86; after all, an x86 chip is a RISC chip with an x86 decoder. ARM does not need that decoder, so you get more cores per chip.

6) ARM was built from the ground up for multiprocessing. It's also designed to have new instructions added.

7) Intel graphics chips are, well, not very good, so it's going to be AMD that is the main problem for nVidia. I think Intel is going to be in big trouble from all the new ARM chips and AMD's new chips. Laptops, netbooks and servers are going to change soon. I think many people don't understand how many companies are backing ARM. ARM also took on all the other RISC chips and won. In the next ten years x86 will be history. Why? Because everyone is making ARM chips except ARM themselves. It's all about the business model!
Gazzy 21st July 2011, 07:31 Quote
software support, as others have mentioned, is key.

game development, as has been mentioned, is also key... HOWEVER, do not forget that ARM processors outsell Intel and AMD put together, and as far as games companies getting behind them for titles, well, haven't they already done this? I look at some of the games and the graphics on the iPhone 4 and HTC phones and tbh they are pretty good. It looks to me as if publishers are already well versed in the ARM platform and have been writing ARM-compatible versions of similar titles to those on the PC for a while. There may be entirely new groups of developers more than keen to produce high-quality software titles for an ARM-based PC system... and if they are good enough, we might see x86 platforms wanting 'ports' of game titles copied over from ARM. :) It would be nice to see a bit of platform competition again. It's not impossible to come from nowhere and become a household name. Look at the Sony PlayStation. I think people bought them because Sony was pretty cool. And I for one think I would buy an ARM CPU under the NVIDIA brand name, because NVIDIA is pretty cool. And I don't think I'm the only person who thinks so.

As for people still buying Intel over AMD, as someone said, personally I prefer AMD. And in the early days I never understood why people bought Intel when they cost more than AMD and were slower.