The latest happenings in the war of words and PowerPointery between Intel and Nvidia have revealed Intel’s claim that the GPU gives the user no benefit in a wide range of applications. This is sure to stoke the fire in some big green bellies.
Not surprisingly, Intel says that upgrading your CPU from a Core 2 Duo E6550 to a Core 2 Duo E8400 will deliver performance increases in 3D rendering, music, photo editing and video processing, while upgrading your GPU from a G33 integrated graphics chip to an Nvidia GeForce 8800 GTX will deliver no benefits in these tasks.
Of course that’s true, but Intel goes one step further to say that upgrading your CPU – as well as your GPU – will deliver higher frame rates in games too. “For mainstream gaming, the CPU and GPU work together to deliver a great experience. For high-end gaming, a CPU and GPU upgrade puts you in the winner’s seat,” claim the slides.
Without a doubt, this is something that Nvidia will disagree with – it says that upgrading the GPU first will give you more profound benefits. Many of its recent PowerPoint presentations have claimed that upgrading from a mid-range CPU to a high-end Core 2 Extreme processor delivers little to no performance improvement in games. And if you go back through any of our recent CPU and graphics reviews, you’ll see there are definite benefits to using a faster GPU in games, while the benefits of a faster CPU aren’t quite so profound.
Right now, there are no real benefits to a fast GPU outside of gaming and video decoding – two scenarios that Intel conveniently decided to leave off its PowerPointery – but Nvidia is talking up GPU acceleration in tasks like video encoding and photo editing, and the GPU is already a strong performer in 3D modelling applications like AutoCAD. One has to wonder what benchmark Intel is using for its 3D rendering tests – I’d imagine it’s something like Cinebench, although if that’s the case, Intel has only run the CPU benchmark to make the numbers fit its argument.
The other thing to consider is that Intel has admitted that it will be bringing a discrete GPU to the market with Larrabee, which is believed to be targeted at 3D graphics and other highly parallel computing tasks. Now, if that’s not evidence enough that the discrete GPU has a market for tasks beyond just gaming, I don’t know what is.
Is it just me, or is this war of words getting a little petty? Share your thoughts in the forums.