bit-tech.net

Larrabee will feature rasterisation units

Tom Forsyth, a software engineer working on Intel's Larrabee project, has said that rasterisation is not dead.

An Intel software engineer working in the company's visual computing group has revealed that Larrabee will focus on rasterisation, and not ray tracing like some of the marketing bods would have you believe.

"I've been trying to keep quiet, but I need to get one thing very clear," said Tom Forsyth on his personal blog. "Larrabee is going to render DirectX and OpenGL games through rasterisation, not through raytracing."

"I'm not sure how the message got so muddled. I think in our quest to just keep our heads down and get on with it, we've possibly been a bit too quiet," he continued.

The confusion, Tom, stems from Pat Gelsinger's now infamous statement at IDF in Shanghai last month that current graphics architectures based on rasterisation are "no longer scalable and suitable for the demands of the future" – and it's good to know that it was essentially flat out wrong.

"Larrabee's tech enables many fascinating possibilities, and we're excited by all of them," said Forsyth. "But this current confusion has got a lot of developers worried about us breaking their games and forcing them to change the way they do things. That's not the case, and I apologise for any panic.

"There's only one way to render the huge range of DirectX and OpenGL games out there, and that's the way they were designed to run – the conventional rasterisation pipeline. That has been the goal for the Larrabee team from day one, and it continues to be the primary focus of the hardware and software teams. We take triangles, we rasterise them, we do Z tests, we do pixel shading, we write to a framebuffer. There's plenty of room within that pipeline for innovation to last us for many years to come. It's done very nicely for over a quarter of a century, and there's plenty of life in the old thing yet.

"There's no doubt Larrabee is going to be the world's most awesome raytracer. . . . but it is absolutely not the focus of Larrabee's primary rendering capabilities, and never has been – not even for a moment.

"We are totally focussed on making the existing (and future) DX and OGL pipelines go fast using far more conventional methods. . . . We would not and could not change the rendering behaviour of the existing APIs."

It's good to hear that Larrabee's engineers have their heads screwed on the right way, because breaking billions and billions of dollars of investment in rasterisation-based software would be a bad thing for the industry – or at least for Intel, a company trying to break into the discrete graphics market. Ray tracing is not the future of graphics; it's a part of that future. It's just one tool in the developer's toolbox and I know for sure that it's not a Swiss Army Knife or Leatherman.

Larrabee continues to interest me immensely and the picture is slowly coming together. One thing is clear from Tom Forsyth's comments, though: Larrabee will not render DirectX and OpenGL applications through an emulator.

Are you excited at the prospects of Intel entering the discrete graphics market? Share your thoughts in the forums.

9 Comments

zero0ne 24th April 2008, 13:57
who cares? I'm confident that Nvidia will crush their hardware!

They will then begin to produce GPUs that can also act as your computer's main CPU via an OS written in CUDA!!!!

haha
bowman 24th April 2008, 14:02
Haha. NvidiOS. I like that.

Anyways, more competition is good. Nvidia should buy VIA for the x86 license. That way we'll have three CPU/GPU manufacturers.
[USRF]Obiwan 24th April 2008, 14:03
If Intel manages to make a GPU that is 2x faster than the current top-ranked GPU, that would be incredible. Anything less than that and it's just another wannabe GPU maker...
chicorasia 24th April 2008, 14:12
They should start by making integrated graphics that don't suck as badly as the current offerings do. And they must do it NOW.

It's all about branding and developing a solid user/fan base.

I'm sorry, Tim - the promised tenfold increase is just not enough. Unless you mean to compete with Geforce 7-series cards, two years from now!
Tim S 24th April 2008, 14:54
10-fold was for integrated graphics :)

I think Larrabee will be quite a bit faster than the current generation hardware from Nvidia/ATI (at least theoretically - there's more to it than that though as you know).
Nikumba 24th April 2008, 15:09
To be honest, the Intel integrated gfx has mostly been for laptops to do office/internet etc, so why should they make it more powerful? If you want to play games, buy a more expensive laptop with a better gfx card.

Of course it would be really good if the gfx card from Intel had the ability to upgrade the GPU/memory like we do with a normal PC.
MrMonroe 24th April 2008, 16:25
I just can't see how this is going to pan out for Intel. They have no experience in this industry and they are going to try breaking into it by using a combination of untested tech and tech they aren't any good at. And what's to stop nVidia or ATI from pushing in a little ray-tracing for certain effects?
TreeDude 24th April 2008, 20:12
I have no doubt Intel will have awesome hardware specs wise. I think we all know what is going to make or break this new line. Drivers.

Personally I think both Nvidia and ATI have terrible driver quality. ATI has much more structure to their releases, but CCC is bloated and unnecessary. Nvidia is sporadic. One driver is great, then one gets released because of a new game and it kills the performance on your other games. It is hit and miss with their drivers.

If Intel puts out a decent set of drivers I will be very surprised. I say it takes them at least a year before their drivers come even close to ATI's/Nvidia's.
D3s3rt_F0x 25th April 2008, 15:29
It's all trash talk from Intel, and to those saying it'll be a great product: wait and see. It's the first time Intel has dipped its toes in this area and for all you know it could be a shambles, which tbh I could possibly see it being, but give them more time and you never know.

I'm not going to take anything they say with less than a pinch of salt until I see people with the chips running the programs people actually run, and benchmarking them for true figures.