bit-tech.net

Gelsinger kicks off IDF Shanghai with Larrabee

Gelsinger says that Larrabee has generated more excitement from ISVs than any other project he's worked on in his 30 years in the industry.

Pat Gelsinger has kicked off IDF Shanghai by revealing some new details on Intel's Larrabee project.

Gelsinger claimed that “today’s graphics architectures are coming to an end – it’s no longer scalable for the needs of the future. The graphics pipeline is too inefficient for the next-generation workloads that will be created and used.”

He said that the industry needs a programmable, ubiquitous and unified architecture and he believes that Larrabee delivers on that front.

Gelsinger said that he’s seeing a trend emerging where ray tracing is moving from the high end into the mainstream, and he believes it will soon make an appearance in mainstream graphics applications.

Larrabee uses a very short pipeline that delivers “teraFLOPS of performance on a single die.” It features over 100 new instructions and massive amounts of bandwidth—both to on-die cache and to local memory.

Gelsinger said in the 30 years he’s been in this industry, he’s never seen so much excitement around a project that Intel has worked on. “[There is] stunning excitement from ISVs for Larrabee,” he said.

It’s an exciting product for enthusiasts and hardcore gamers as well, as it could help to move the industry one step closer to photo-realistic gaming. And we’re set to get closer to that mark with the release of FarCry 2 later this year – Gelsinger showed some footage from the title. Quite simply, it looks amazing – the benchmark at the moment is of course Crysis, and FarCry 2 has a good chance of breaking the boundaries set by Crytek.

Gelsinger also showed off Quake 4 RT again, and hailed ray tracing as a way for artists to focus on content creation again – once Larrabee arrives, he believes artists will no longer have to worry about limitations. Whether that holds true is something we’ll have to wait and find out, as ray tracing has had its sceptics in the industry. Of course, that’s not to say there aren’t uses for it – it’ll be an extra tool in the game developer’s armoury.

We’ll be looking to get more information on Larrabee over the course of the next couple of days and you’ll hear more as soon as we do. For now, you can discuss these developments in the forums.

7 Comments

r4tch3t 2nd April 2008, 04:52 Quote
I'm looking forward to a review of this, to see how it stacks up against the likes of nVidia and AMD.

p.s. Article doesn't link here yet. Does now, ignore me
naokaji 2nd April 2008, 09:10 Quote
Sounds good, but games will have to be made differently to take advantage of it, and how it performs with "regularly" coded games remains unknown – especially considering that Intel never really mentioned that scenario...
Fod 2nd April 2008, 09:35 Quote
is raytracing really the way to go?

AFAIK from my graphics courses, raytracing in its basic form (which is really the only one becoming vaguely feasible in realtime at the moment) is remarkably artificial-looking, due to its ability to only model specular-specular light interaction from point-sources (ie, no soft shadowing, no colour bleeding between diffuse surfaces, no BRDF sampling for glossy surfaces, and generally no awesome light effects in general). In order to do anything better, you either have to add some cheating based on current techniques, or increase your complexity by a couple orders of magnitude and throw a few hundred to a thousand rays per pixel down the line, adding path tracing to the mix (basically ray tracing on steroids, but diffuse-diffuse paths are accounted for, as well as BRDF sampling and area light sources for soft shadows).

I mean, this is what i know from about a year ago; can someone shed some light on this?
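For readers wondering what Fod means by ray tracing "in its basic form", here's a minimal sketch of a Whitted-style tracer in Python. The scene, names and numbers are invented purely for illustration (it has nothing to do with Intel's actual Larrabee software): every hit gets direct light from a point source via a binary shadow test, plus a single perfect mirror bounce, and nothing else – which is exactly where the missing soft shadows, colour bleeding and BRDF sampling come from.

import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def hit_sphere(origin, direction, sphere):
    """Return the distance to the nearest intersection with a sphere, or None."""
    centre, radius, _, _ = sphere
    oc = sub(origin, centre)
    b = 2.0 * dot(oc, direction)            # direction is assumed normalised
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None          # small threshold avoids self-hits

def trace(origin, direction, spheres, light_pos, depth=0):
    """Whitted-style recursion: direct light from a point source + one mirror bounce."""
    t, sphere = min(((hit_sphere(origin, direction, s), s) for s in spheres),
                    key=lambda hit: hit[0] if hit[0] is not None else float("inf"))
    if t is None:
        return (0.1, 0.1, 0.1)              # background colour
    centre, _, colour, reflectivity = sphere
    point = add(origin, scale(direction, t))
    normal = normalize(sub(point, centre))
    to_light = normalize(sub(light_pos, point))

    # Hard, binary shadow test against a point light: any blocker means black.
    # This is exactly why the basic form gives no soft shadows.
    in_shadow = any(hit_sphere(point, to_light, s) for s in spheres)
    direct = 0.0 if in_shadow else max(dot(normal, to_light), 0.0)
    shaded = scale(colour, direct)

    # Only specular-specular transport: a single perfect mirror reflection.
    # No diffuse interreflection (colour bleeding), no BRDF sampling of glossy surfaces.
    if reflectivity > 0.0 and depth < 3:
        refl = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
        shaded = add(shaded, scale(trace(point, refl, spheres, light_pos, depth + 1),
                                   reflectivity))
    return shaded

# Two spheres – (centre, radius, colour, reflectivity) – and one point light;
# print the colour seen through a single pixel looking straight down -z.
scene = [((0.0, 0.0, -3.0), 1.0, (0.8, 0.2, 0.2), 0.3),
         ((1.5, 0.0, -4.0), 1.0, (0.2, 0.8, 0.2), 0.0)]
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene, (5.0, 5.0, 0.0)))

Fod's point is that making this look natural means replacing that single shadow test and mirror bounce with hundreds of stochastic samples per pixel (path tracing), which is where the cost explodes.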
Tim S 2nd April 2008, 10:29 Quote
I don't believe it is the way to go... Intel would like it to be used exclusively, but it never will be – it'll become 'another tool' for developers.
boggsi 3rd April 2008, 02:21 Quote
I can't offer any insight into the subtleties of ray tracing... but I can vouch for the ever-increasing processing power of graphics cards, CPUs etc. If we dare to talk about 10-15 years in the future, surely ray tracing would be the holy grail all graphics would converge towards? Can raster keep it up?
completemadness 5th April 2008, 18:20 Quote
Quote:
Originally Posted by Fod
is raytracing really the way to go?

AFAIK from my graphics courses, raytracing in its basic form (which is really the only one becoming vaguely feasible in realtime at the moment) is remarkably artificial-looking, due to its ability to only model specular-specular light interaction from point-sources (ie, no soft shadowing, no colour bleeding between diffuse surfaces, no BRDF sampling for glossy surfaces, and generally no awesome light effects in general). In order to do anything better, you either have to add some cheating based on current techniques, or increase your complexity by a couple orders of magnitude and throw a few hundred to a thousand rays per pixel down the line, adding path tracing to the mix (basically ray tracing on steroids, but diffuse-diffuse paths are accounted for, as well as BRDF sampling and area light sources for soft shadows).

I mean, this is what i know from about a year ago; can someone shed some light on this?
I believe you're right, and I don't understand why ray tracing is so hyped. From what I've read, it makes good pictures, but they're far less realistic (for example, due to the fact that ray tracing doesn't work out shadows (I believe)).
genesisofthesith 5th April 2008, 18:38 Quote
Ray tracing works out 'perfect' shadows, but to get the same kind of 'realism' you get from modern raster-based graphics you have to calculate many 'bounces' of the light rays – which is very computationally expensive. One solution is to reduce the number of bounces to get a good enough approximation, but you still run into issues with soft bodies and particle effects such as fog or fire that are traditionally rendered by shaders rather than as physical entities within a scene. If you can throw enough horsepower at the task you can get more and more realistic-looking results by increasing the complexity of the ray-traced scene, but the demand for compute power grows exponentially with the results.

Given a fixed current level of compute power you would achieve better results using raster-based techniques, but as diminishing returns kick in for raster techniques, ray tracing would still have huge potential for improvement – and given enough compute power a ray tracing solution would garner better results than a raster-based one. We're a long way from the point where there's enough 'disposable' computing power to make ray tracing the better choice for gaming, however, even if the current developments are promising. Think of them as laying a path for the future, rather than an overnight replacement for traditional techniques, and you won't be disappointed.
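To put a rough number on the cost growth genesisofthesith describes, here's a back-of-the-envelope sketch – the branching factor, resolution and frame rate are illustrative assumptions, not measurements of any real renderer.

def rays_per_pixel(bounces, branching=4, primary=1):
    """Total rays traced for one pixel if every hit spawns `branching` secondary rays
    (e.g. a reflection, a refraction and a couple of shadow/diffuse samples)."""
    total, rays_at_depth = 0, primary
    for _ in range(bounces + 1):
        total += rays_at_depth
        rays_at_depth *= branching      # each surviving ray spawns `branching` more
    return total

for bounces in range(6):
    per_pixel = rays_per_pixel(bounces)
    # A 1680x1050 frame at 60 fps – the sort of target a 2008-era gamer expects.
    per_second = per_pixel * 1680 * 1050 * 60
    print(f"{bounces} bounces: {per_pixel:>5} rays/pixel, ~{per_second / 1e9:.1f} billion rays/second")

Even with that fairly forgiving branching factor, a handful of bounces lands in the hundreds of billions of rays per second at game resolutions, which helps explain why ray tracing is pitched as a job for teraFLOP-class hardware like Larrabee – and why the post above frames these developments as laying a path rather than an overnight replacement.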