bit-tech.net

Nvidia showcases quad-core Tegra 3 silicon

Samples of Nvidia's tiny chip are already being tested by industry partners.

Nvidia surprised a few people last night by announcing the imminent arrival of its new quad-core mobile Tegra chip, based on the Kal-El core design.

The chip was detailed on an Nvidia roadmap just last month, but speculation has remained rife about whether the graphics giant could actually deliver what looked like an ambitious plan.

Clearly, though, Nvidia had already made a lot of progress with the design when the roadmap was released, as the company claims it's already sending silicon to prospective partners for testing.

Partners are likely to be impressed too, as Engadget claims to have seen a live demo of a device based on the chip scaling down a 2,560 x 1,440 video stream to the device's 1,366 x 768 native resolution, while simultaneously outputting the same stream at 2,560 x 1,600 on a 30in monitor. This is an impressive feat for such a dinky chip, and could herald the arrival of some seriously well-specified handheld devices.

The four processing cores are likely to be based on a beefed-up version of the ARM Cortex-A9 architecture found in Nvidia's current Tegra 2 chip, and capable of running at up to 1.5GHz. This represents a 50 per cent increase in clock speed over the 1GHz frequency of current Tegra chips.

The GPU section of the chip has also been boosted, with a full dozen GPU ‘cores’ now present, rather than the eight in the previous design. It's hardly likely to pose a threat to the discrete graphics card sat in your desktop PC, but it should be more than capable of handling gaming on a small tablet or mobile device.

Would you be interested in a quad-core Tegra 3-equipped tablet? Can you see any other possible applications for such a small and powerful chip? Let us know in the forums.

21 Comments

maximus09 16th February 2011, 15:34 Quote
uumm sounds tasty, but it all depends on the device and OS. It might just make some of those tablets a bit more desirable, as long as they're below £1,000.
hbeevers 16th February 2011, 15:34 Quote
Great, Nvidia's really come through with this I think. Now Microsoft just needs to make a revision of Windows 7 that's better for tablets/netbooks and increase the battery life to compete properly with the iPad and MacBook Air. Although if I got a tablet with this chip in it I'd likely just install Ubuntu.
wuyanxu 16th February 2011, 15:34 Quote
it's amazing how fast these ASICs are advancing......

soon, a tiny general-purpose SoC will be all that's needed to run a space mission. (think GPGPU in these nVidia chips for vector calculations)
aleph31 16th February 2011, 15:45 Quote
I would like to know how those figures compare with the just-announced quad-core Snapdragon Krait (APQ8064) from Qualcomm...

That's the problem with all these new chips: you cannot benchmark them in isolation, only smartphone vs. smartphone...
DarkFear 16th February 2011, 16:09 Quote
Anyone else having visions of Superman sitting somewhere in a lab designing nVidia chips? No? Must be the whisky then...
Fizzban 16th February 2011, 16:10 Quote
Quote:
Originally Posted by wuyanxu
it's amazing how fast these ASICs are advancing......

I was thinking the same thing.
coolius 16th February 2011, 16:32 Quote
Can't wait to see this in an Atrix-like device. Talk about an ultra-mobile PC.
Skiddywinks 16th February 2011, 16:33 Quote
There's an in-depth write up on Anandtech for all those interested. It does look like a very decent SoC. Tegra 2 was impressive, but the time frames involved with Tegra 3 are just astounding considering the gains. T3 will be in tablets by August, if all goes to nVidia's plans.
DbD 16th February 2011, 16:35 Quote
Quote:
Originally Posted by wuyanxu
it's amazing how fast these ASICs are advancing......

soon, a tiny general-purpose SoC will be all that's needed to run a space mission. (think GPGPU in these nVidia chips for vector calculations)

The computer that put man on the moon was less powerful than your average calculator. It had 64KB of memory, weighed 35kg and used 55W.
wuyanxu 16th February 2011, 16:53 Quote
Quote:
Originally Posted by DbD
The computer that put man on the moon was less powerful than your average calculator. It had 64KB of memory, weighed 35kg and used 55W.
Thanks for a well-known fact, but the reason it was so slow is that it was a bespoke system. Even today, if there were another mission to the moon, we would be using a slower, more power-hungry system because it requires heavy customisation.

Hence I suggested the use of a COTS SoC.

The same goes for military tech: if we can adopt commercial off-the-shelf stuff we should see a huge increase in effectiveness for chip speed/heat/power.
mecblade 16th February 2011, 17:47 Quote
Poor Qualcomm Snapdragon...
DbD 16th February 2011, 23:22 Quote
Quote:
Originally Posted by wuyanxu
Quote:
Originally Posted by DbD
The computer that put man on the moon was less powerful than your average calculator. It had 64KB of memory, weighed 35kg and used 55W.
Thanks for a well-known fact, but the reason it was so slow is that it was a bespoke system. Even today, if there were another mission to the moon, we would be using a slower, more power-hungry system because it requires heavy customisation.

Hence I suggested the use of a COTS SoC.

The same goes for military tech: if we can adopt commercial off-the-shelf stuff we should see a huge increase in effectiveness for chip speed/heat/power.

It's not that bespoke; last I heard the ISS and a number of nuclear subs were using 386s, the same as we had in our PCs many years ago. There was the famous story of the American military trying to buy them on eBay because they needed spares.
Guinevere 16th February 2011, 23:33 Quote
Quote:
Originally Posted by hbeevers
now Microsoft just needs to make a revision of Windows 7 making it better for tablets/netbooks and increase the battery life to compete properly

Little bit of a problem with that...

It's not just Windows, it's all the apps and software. Why bother going with a Windows machine that makes using all the legacy apps a pain in the...

That's why Android and iOS are really doing well in this space, and exactly the reason why Apple didn't just use OS X and Google didn't just go with a standard Linux (or just leave it alone).

Both Apple and Google saw the need/demand for a touch-based OS; Microsoft didn't, or more likely decided it was too big to have to worry about it. Unless they've been working on something pretty special (they haven't!) they are not going to compete in the handheld arena, not yet.

Apple, Google and RIM have taught a lot of people that it doesn't matter whether your mobile device runs Windows or not; it's how it works with whatever else you've got that matters. Traditionally Microsoft has been terrible at the whole sync/dual-use side of things. That's why people moved away and are very wary of ever going back - not when the competition works so well.
Guinevere 16th February 2011, 23:43 Quote
Quote:
Originally Posted by wuyanxu
if there were another mission to the moon, we would be using a slower, more power-hungry system because it requires heavy customisation.

There have been a lot of missions to the moon since Apollo (just not any that landed a human).

Also, companies are already working on using standard chips for space science. Surrey Satellites are world leaders in using non-bespoke hardware to make missions uber-cheap (and they're here in the UK!). I would assume it also makes export a lot easier, as a lot of ESA missions have issues when exporting their tech to countries like China: the "military grade" radiation-hardened chips they use (often sourced from the US) have to be swapped out for something less controversial. Switching to a system based on, say, a quad-core Android unit would be pretty cool.

( Trivia : I live with a Rocket Scientist / Spaceship Commander )

http://www.sstl.co.uk/news-and-events?story=1706

"Space researchers at the University of Surrey and Surrey Satellite Technology Limited (SSTL) have developed ‘STRaND-1’, a satellite containing a smartphone payload that will be launched into orbit around the Earth later this year. STRaND-1 (Surrey Training, Research and Nanosatellite Demonstrator) is being developed by the Surrey team to demonstrate the advanced capabilities of a satellite built quickly using advanced commercial off-the-shelf components."
memeroot 16th February 2011, 23:55 Quote
hubble used a 486
Tangster 17th February 2011, 01:47 Quote
Quote:
Originally Posted by Article
based on the Kal-El core design
It'd better be amazing to live up to that name.
D B 17th February 2011, 13:36 Quote
Any missions to the moon since Apollo have not needed to be made with "man-rated" components... apples/oranges.
Not sure which way I'm leaning. A good netbook PC would serve me at work very well; not so sure about having a tablet... It's for sure going to expand the usability of both.
SMIFFYDUDE 17th February 2011, 15:20 Quote
Waiting for a kryptonite chip to steal its thunder
somidiot 17th February 2011, 19:12 Quote
Quote:
Originally Posted by Tangster
Quote:
Originally Posted by Article
based on the Kal-El core design
It'd better be amazing to live up to that name.

+1
So, did Nvidia decide to start their DC Universe naming codes? I probably won't be getting the Darkseid chip....
Madness_3d 17th February 2011, 21:07 Quote
Mobile Hardware is moving SO SO FAST :D
Blue Shadow 20th February 2011, 15:25 Quote
Quote:
Originally Posted by wuyanxu
Thanks for a well-known fact, but the reason it was so slow is that it was a bespoke system. Even today, if there were another mission to the moon, we would be using a slower, more power-hungry system because it requires heavy customisation.

hence i suggested use of COTS SoC.

same for military tech, if we can adopt commercial off-the-shelf stuff we should see a huge increase in effectiveness for chip speed/heat/power.

Space is actually not that simple: radiation and other factors (high-energy protons, etc.) cause faults that you wouldn't normally get on Earth - random bit-flips in memory and even during processing. The smaller fabrication processes used in modern ICs are more susceptible to these kinds of problems, so instead they're likely to use larger fabrication processes, though even that's not perfect. A common approach, especially in deep-space probes, is to use an odd number of discrete computers (e.g. five) and take the majority answer from them.
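The odd-replica majority-vote scheme described above (N-modular redundancy) can be sketched in a few lines; the replica values and the bit-flip here are invented purely for illustration:

```python
# Sketch of N-modular redundancy: an odd number of independent replicas each
# compute the same answer, and the value reported by the majority wins.
# A radiation-induced bit-flip in one replica is then outvoted by the others.
from collections import Counter

def majority_vote(answers):
    """Return the most common answer from an odd number of replicas."""
    if len(answers) % 2 == 0:
        raise ValueError("use an odd number of replicas to avoid ties")
    return Counter(answers).most_common(1)[0][0]

# Five replicas, one of which has suffered a single-bit upset:
replicas = [42, 42, 42, 46, 42]   # 46 is 42 with bit 2 flipped
print(majority_vote(replicas))    # -> 42
```

Using an odd replica count guarantees a strict majority exists whenever fewer than half the replicas fault in the same way.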

There are also issues surrounding predictive caching, multilevel caching, paging, superscalar architecture and the like that can cause problems in hard real-time systems, so they'd probably want to rip everything apart anyway. And if not, you write software for the worst possible case, so the extra performance is largely pointless (though it might save power in the long run).

Hence the use of older hardware, built on larger, more reliable processes, which tends not to have the less predictable performance-enhancing technologies:
Quote:
Originally Posted by memeroot
hubble used a 486
Quote:
Originally Posted by DbD
It's not that bespoke, last I heard the ISS, and a number of nuclear subs were using 386's, same as we had in our pc's many years ago.