bit-tech.net

Intel confirms: no discrete graphics plans

Intel's Bill Kircos has confirmed that Larrabee won't be a consumer chip - but its technology will power HPC products.

Intel has confirmed that it is not looking to bring a dedicated graphics card to market, finally putting paid to hopes of seeing Larrabee in the consumer space.

In a blog post - via The Wall Street Journal - Intel's Bill Kircos states categorically that "[Intel] will not bring a discrete graphics product to market, at least in the short-term" - meaning that there will be no Larrabee-powered dedicated graphics cards to offer consumers a third choice, away from the AMD and Nvidia duopoly.

Blaming the fact that the company "missed some key product milestones" in the run-up to a Larrabee consumer product, Kircos states that Intel is looking to concentrate on integrated graphics for the foreseeable future - in particular "media/HD video and mobile computing[, which] are the most important areas to focus on."

The news will come as a blow to those who had been looking forward to Intel's entry into the discrete graphics market - especially as it's been three years since Intel originally announced the product line. Following poor performance in public demonstrations, Intel decided to cancel its planned retail products and turn Larrabee into a software development platform - but there were still hopes that hardware would see the light of day, hopes which now appear to have been finally put to rest.

Intel isn't completely abandoning its investment in the Larrabee line, however: Kircos claims that Intel is "executing on a business opportunity derived from the Larrabee program and Intel research in many-core chips," which will see a "server product line [...] optimised for a broader range of highly parallel workloads in segments such as high performance computing." Sadly, Kircos is keeping silent on precisely what this entails - leaving it to Intel's vice president Kirk Skaugen to unveil at ISC 2010 in Germany at the end of this month.

Are you disappointed to see Intel giving up on its discrete graphics dreams, or was Larrabee never going to be able to take on the might of AMD and Nvidia? Share your thoughts over in the forums.

5 Comments

iwod 26th May 2010, 11:29 Quote
In a perfect world, the GPU would only use a few watts when I'm browsing or idle, and scale up to 100+W when I'm gaming. But of course, unless there is some major transistor tech breakthrough, that isn't going to happen in the next 5+ years.

So back to a realistic ideal: the IGP should be VERY low power, with superior 2D rendering performance, a programmable DSP that lets many if not all of the FFmpeg codecs be hardware accelerated, and 3D performance focused on UI effects, browser canvas and other vector acceleration - with anything gaming-related as a final consideration.
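To make the codec-offload point concrete, here is a minimal sketch of how an application could ask FFmpeg which hardware acceleration back-ends its build supports. It uses libavutil's hwdevice API, which was added to FFmpeg well after this thread, so treat it as an illustration of the idea rather than something a 2010-era IGP driver exposed.

    /* probe_hw.c - list the hardware device types (VDPAU, VA-API, DXVA2, ...)
     * this FFmpeg build was compiled with.
     * Build: gcc probe_hw.c -o probe_hw $(pkg-config --cflags --libs libavutil)
     */
    #include <stdio.h>
    #include <libavutil/hwcontext.h>

    int main(void)
    {
        enum AVHWDeviceType type = AV_HWDEVICE_TYPE_NONE;

        /* av_hwdevice_iterate_types() walks the compiled-in device types
         * and returns AV_HWDEVICE_TYPE_NONE once the list is exhausted. */
        while ((type = av_hwdevice_iterate_types(type)) != AV_HWDEVICE_TYPE_NONE)
            printf("hw device type: %s\n", av_hwdevice_get_type_name(type));

        return 0;
    }

A decoder opened against one of these device types hands the heavy lifting to the fixed-function or DSP block the comment describes, leaving the CPU cores mostly idle.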

As it currently stands, the IGP is fairly large yet provides performance we don't need 90% of the time - and when we do need performance, it isn't capable anyway. So why waste transistors on it? Intel could give us an extra core or more L2 cache for CPU performance instead.

I see Optimus as the future, at least in the near term.
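For a concrete view of what Optimus-style switching looks like from the application side: Nvidia later documented a symbol that a Windows executable can export to request the discrete GPU rather than the IGP. The mechanism postdates this thread, so take this sketch as an illustration of the hand-off, not a 2010-era API.

    /* Exporting this symbol from the .exe (not a DLL) asks the Nvidia
     * Optimus driver to run the process on the discrete GPU; omit it
     * (or set it to 0) and rendering stays on the low-power IGP. */
    #ifdef _WIN32
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    #endif

    int main(void)
    {
        /* Normal application code; on an Optimus laptop the driver now
         * routes this process's rendering to the discrete GPU. */
        return 0;
    }

The design matches the wish above: the IGP handles the desktop at a few watts, and the discrete part only powers up when a process explicitly asks for it.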

And a final note: the worst thing about Intel HD is not the hardware itself. It's that the drivers for Intel HD are poor, slow to update, and the main cause of poor gaming performance. Nvidia has more software engineers than hardware engineers working on its drivers. It just shows that GPUs and CPUs are completely different beasts.
rickysio 26th May 2010, 12:42 Quote
Intel doesn't want to give you anything.

They want you to buy.
greigaitken 26th May 2010, 15:59 Quote
They did make a few last year, though (albeit slow ones). The ones they did manage to make will surely be super collectors' items now.
HourBeforeDawn 26th May 2010, 20:29 Quote
I'm 50/50 about this. More competition would always be good, but I think if Intel did do this, Nvidia would be more on the losing side than AMD: AMD has their CPUs plus ATI for GPUs, and Intel would have their CPUs plus their own GPUs, so Nvidia could be left out in the cold unless they follow through with making their own CPU. Could be interesting then.
enciem 29th May 2010, 09:40 Quote
Would have been awesome for the competition. Just having two companies making all the graphics cards isn't all that great for prices. While we're all waiting for price wars between the big two they're still laughing all the way to the bank. A little extra competition could drive innovation as well.