bit-tech.net

Core i7 a waste of money for gamers, says Nvidia

Nvidia dismisses Intel's claims about Core i7's gaming performance as "disingenuous", as they're based solely on the 3DMark Vantage CPU test.

Intel’s Nehalem architecture might have generated a lot of excitement and rave reviews, but it turns out that Nvidia isn’t that impressed with its impact on gaming performance. In fact, yesterday the company described Intel’s claims about Core i7's gaming performance as “disingenuous” in a presentation to introduce Nvidia’s concept of an optimised gaming PC.

In the presentation, Nvidia’s technical marketing director Tom Petersen said “I have a copy of Intel’s latest deck that they share with press and customers, and on there they have a slide that is called The Intel Core i7 920 Processor, where they claim that gaming performance goes up by 80 percent when you use a Core i7. Now, I was impressed by that claim, and I was trying to figure out how they could possibly say such a thing, and it turns out that Intel is basing that claim on only 3DMark Vantage’s CPU test.”

As Petersen points out, this test “is designed to show CPU difference, it doesn’t actually measure gameplay, it doesn’t actually measure anything about game performance. Sure enough, if you do that test you will see Core i7 running faster, but I think it’s a little disingenuous to call that game performance.”

To prove his point, Petersen outlined two types of PC, which he likened to cars - the Hummer and the Beamer. Petersen described the Hummer, saying that it “has got to be big, and it’s got to be expensive and of course it’s infused with Hafnium, which is kind of a dig at Intel. It has a Core i7, which is Intel’s latest, greatest CPU that they claim is the best for gaming.” The Hummer features a Core i7, 4GB of RAM, an X58 motherboard and a single GeForce GTS 250. Meanwhile, the Beamer swaps out the Core i7 CPU for a basic Core 2 Duo E8400, an nForce 750i motherboard and a pair of GeForce GTS 250 cards in SLI.

The cost difference between the two is massive, with a Core i7 965-based Hummer costing $1,501 US based on pricing from US etailer Newegg, and the Beamer costing just $715 US. Petersen also noted that even a Core i7 920 setup with a single GeForce GTS 250 would still cost more than the Beamer SLI rig at around $790 US. The prices were based on the core components only, and didn’t include features such as the case or PSU.
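
Petersen's cost arithmetic is easy to sanity-check. Below is a quick sketch using the component-only Newegg prices quoted above; the $50-per-game figure is an assumption, since the article doesn't say what Petersen priced his games at:

```python
# Component-only prices quoted in the article (Newegg, April 2009)
hummer = 1501      # "Hummer": Core i7 965, X58 board, 4GB RAM, one GTS 250
beamer = 715       # "Beamer": Core 2 Duo E8400, nForce 750i, GTS 250 SLI
i7_920_rig = 790   # cheaper i7 920 build with a single GTS 250

leftover = hummer - beamer
print(leftover)             # 786 -- the roughly "$800" Petersen cites later
print(i7_920_rig - beamer)  # 75  -- even the budget i7 rig costs more
print(leftover // 50)       # 15  -- games at an assumed $50 apiece
```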

“You’re paying a pretty dear price to follow the Intel story of how to build the fastest PC for gaming”, said Petersen, as he showed a graph of how gaming performance scales with CPU upgrades. Petersen got his test results by adding together the frame rates from Crysis Warhead, Fallout 3, Call of Duty: World at War and Far Cry 2 at 1,920 x 1,200 (no AA or AF) and taking an average. With a Core 2 Duo E8400 and a GeForce GTS 250, the average was 41.6fps.

He then showed how this increased as you upgraded the CPU (the blue line in the graph above), and compared it to how the frame rate increased when you added another graphics card in SLI. The frame rate only increased to 42.4fps after upgrading to a Core i7 965, but jumped all the way up to 59.4fps after upgrading to a GeForce GTX 260 (216 stream processors) SLI setup.
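
The scale of the gap is clearer as percentages. The sketch below reproduces the averaging method described above and the relative gain from each upgrade path; the per-game figures are hypothetical (the article only quotes the combined averages) and are chosen so they reproduce the 41.6fps baseline:

```python
def average_fps(per_game_fps):
    """The method as described: average the per-game frame rates."""
    return sum(per_game_fps) / len(per_game_fps)

# Hypothetical per-game results for the E8400 + single GTS 250 baseline
# (Crysis Warhead, Fallout 3, CoD: World at War, Far Cry 2 at 1,920 x 1,200)
baseline = average_fps([30.4, 48.0, 50.0, 38.0])
print(round(baseline, 1))  # 41.6

cpu_upgrade = 42.4  # quoted average after swapping in a Core i7 965
sli_upgrade = 59.4  # quoted average after GTX 260 (216SP) SLI instead

print(round((cpu_upgrade / baseline - 1) * 100, 1))  # 1.9  -- % gain from the CPU
print(round((sli_upgrade / baseline - 1) * 100, 1))  # 42.8 -- % gain from the GPUs
```

A roughly 2 percent gain from the CPU upgrade versus roughly 43 percent from the graphics upgrade is Petersen's whole argument in two numbers.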

This might seem obvious to those of us who know about how 3D acceleration works, but Petersen claims that the result is still “surprising to most people”. Petersen says that “it is a fact, that when you’re gaming and you’re running at resolutions of 1,920 x 1,200 or better, the Core 2 Duo is perfect for running all of today’s games. In real gaming, there’s no difference between a Core i7 and a Core 2 Duo.”

Petersen accepts that some gamers want the very best of everything, and likens the combination of a Core i7 and SLI graphics to a Ferrari. “If you’ve got money to burn, and you want to get the latest Core i7, and you want to get great graphics cards, then sure you can get the best of everything. There is some small benefit to having a Core i7 965 over a Core 2 Duo when you’re buying the best graphics cards and running at the highest resolutions, so a Core i7 has a place and it does have a benefit in what I’m going to call the Ferrari configuration. But the truth is that when you’re trading off money, there’s nothing like the Beamer configuration.”

“Particularly in today’s economic climate, people are concerned about getting the most value for their money,” says Petersen. As such, Petersen advises PC gamers to ignore Core i7 and instead set up a Core 2 Duo system using an nForce SLI motherboard. “With the leftover $800 I can go out and buy 16 games,” says Petersen, “it’s not even close.”

Is Nvidia just stating the obvious here, or do you think the general PC gamer thinks that they’ll get a big boost in gaming performance from a Core i7 CPU? Would you rather have a Core i7 system with one GPU, or a Core 2 Duo system with an SLI setup? Let us know your thoughts in the forums.

50 Comments

Mankz 23rd April 2009, 15:10 Quote
Wow... After all of these GTS250's and other stupid name changes, it appears a miracle has taken place..

nVidia have said something that makes SENSE! :D

(Might also be because nvidia don't have an i7 chipset, but, meh, they can 'win' this time :p )
Zoon 23rd April 2009, 15:15 Quote
Yeah but ultimately its all about having teh megahurtz for most enthusiasts.
Tyrmot 23rd April 2009, 15:23 Quote
Fair enough. I was thinking about Core i7 till I saw the same numbers... realised that my E8500 is absolutely fine and the money better spent elsewhere. To base 'gaming performance' on the Vantage CPU test is worse than disingenuous; why not call it what it is - a barefaced lie to try and sell more hardware...
Aterius Gmork 23rd April 2009, 15:23 Quote
Quote:
As such, Petersen advises PC gamers to ignore Core i7 and instead set up a Core 2 Duo system using an nForce SLI motherboard. “With the leftover $800 I can go out and buy 16 games,” says Petersen, “it’s not even close.”

I'll just grab an Intel mainboard and a single powerful GPU instead of a pair of crappy ones, a slower CPU (since the Intel mobo can actually overclock), and a smaller PSU, and get even more games. Take that, Nvidia.
Turbotab 23rd April 2009, 15:25 Quote
Well if Intel was as lazy as Nvidia, the I7 would have been a rebadged Pentium D!
Tyrmot 23rd April 2009, 15:26 Quote
Quote:
Originally Posted by Aterius Gmork
I'll just take an Intel mainboard and a single powerful GPU instead of a pair of crappy ones and a smaller PSU, and get even more games. Take that, Nvidia.

Ha yes, exactly :) - new GPU bought and Intel mainboard on the way. Funny, isn't it, that this Mr Petersen slags off Intel for this, then himself suggests going nVidia mobo + SLI! All as bad as each other....
Tyrmot 23rd April 2009, 15:28 Quote
Quote:
Originally Posted by Turbotab
Well if Intel was as lazy as Nvidia, the I7 would have been a rebadged Pentium D!

haha 'Pentium D-1000', and twice the cost
Nictron 23rd April 2009, 15:28 Quote
Intel can claim impressive performance increases when it comes to SLI and CrossFire scaling; if you head over to Guru3D.com they have an impressive review of SLI and CrossFire scaling on the i7 architecture.

But it is quite lame from Intel to claim gaming improvements when they are not even using a game but a synthetic benchmark.

If I built a 3x SLI or 4x CrossFire rig I would only use an i7 and nothing else!
GFC 23rd April 2009, 15:32 Quote
I'd actually go for Core i5 and the best single-gpu VGA i can get xD. But that's cheating i guess, obviously most of us [that are gaming a lot] would go for more GPU juice.
JyX 23rd April 2009, 15:33 Quote
"Yeah but ultimately its all about having teh megahurtz for most enthusiasts."
Errr... wrong!
"enthusiasts"? What are those?... Imaginary rich people?
[/sarcasm]

Don't get me wrong... being an enthusiast is great but it's not ultimately about them.
If you remember, there was a shortage of the GTX 295... and probably still is... and on AMD's side, they don't plan a reference 4890 X2... that'll be up to Sapphire or someone bold enough to do it. Why? There's no interest.

It's all nice and well... but it's not something they can actually make money off.

And Dick here has a point... the i7 is currently too overpriced for all of us... and we, all of us, are what it's all about!

We like midrange... and if possible... overclock-monster midrange is a plus! Some... and do notice... it's SOME... will go for high end... and if possible... overclock-monster highend is a plus... but stop here.
naokaji 23rd April 2009, 15:36 Quote
If you have a big screen (talking 24"+) and play games at the higher end of the quality settings, the i7 is indeed a waste of money, as games will simply not be CPU limited.
However, outside of high-res gaming the i7 rules the world of computing.
phuzz 23rd April 2009, 15:38 Quote
It's nice to see them saying something straightforward, but I'm sure there are some equally suspect figures in some of the nVidia PowerPoints...
perplekks45 23rd April 2009, 15:55 Quote
That's why nVidia's own Dick-head [sorry, I know it's weak but they're getting on my nerves by now with all their "Bash me, bash you" crap] talked about a GAMING rig. :(

Anyways, of course it's overkill for gaming, and we all know the absolute max you need for gaming at the moment is a nicely overclocked Core 2 Quad. And let's face it, even that is overkill. As long as there are no games really using multiple cores and lots of CPU power - for an AI that actually deserves the name, or some unheard-of physics effects - no gamer needs an i7.
I am more than happy with the gaming performance of my C2D E6600 @ 3.6 GHz. ;)

And I agree with not going the SLI way... yet. Buy a single card, go SLI when that isn't enough anymore.
aggies11 23rd April 2009, 15:56 Quote
While Nvidia's reasoning for taking potshots at Intel is questionable, their logic is sound.

For all the talk of "CPU Bottlenecks", unless you are running a seriously underpowered chip, the same money spent on a CPU upgrade almost always results in a better performance boost if you instead put it into a better GPU.
HourBeforeDawn 23rd April 2009, 15:57 Quote
I agree in terms of gaming the i7 is a waste of money, considering what you pay for, you might as well stick with a high-end ddr2 setup.
Zurechial 23rd April 2009, 16:10 Quote
I've had a fairly significant increase in performance since replacing my first-gen C2D E6600 with an i7 920.
I had the E6600 clocked at 3GHz, matched with a 680i and 4GB of DDR2.
The i7 is clocked at 4GHz and matched with an X58 and 3GB of DDR3.
I'm using the same GPU with the i7 as I was with the C2D, an 8800GTS512.

It's probably quite safe to assume that the difference in performance in my case is attributable to the 1GHz difference in clock speed, and maybe also somewhat to the increase in memory bandwidth, but the important thing for me is that the difference is there and noticeable.
The thing is, I also use my PC for music/audio production and experimental audio programming, not just for gaming.
If I were using the system solely for gaming, that upgrade probably wouldn't have been justified.
antaresIII 23rd April 2009, 16:11 Quote
Thanks for infusing a dose of reason Nvidia; especially regarding the new "rip off" project from Intel, named Core i5.
Turbotab 23rd April 2009, 16:15 Quote
Quote:
Originally Posted by antaresIII
Thanks for infusing a dose of reason Nvidia; especially regarding the new "rip off" project from Intel, named Core i5.

They didn't mention the i5 - the i5 hasn't even been released. If you are not careful, you will end up screaming about i5s in a padded cell :)
[USRF]Obiwan 23rd April 2009, 16:24 Quote
I'd rather buy a Phenom II 720BE with an AM3 Gigabyte motherboard and 4GB of RAM for 300 euro, and get some more juice out of my GT8800 until it's acting like a "slide-projector". With the money saved I can then buy the latest and greatest midrange video card at the end of the year...
wuyanxu 23rd April 2009, 16:30 Quote
last year, they said the E5200 is enough; this year, the E8400. so by that logic, next year it would be one of the i5s.

therefore, by their own logic you'd need to upgrade the CPU every year - so getting an i7 now and then only upgrading the GPU makes more sense money-wise.
wgy 23rd April 2009, 16:55 Quote
before i even read this article,

could i make a request to bittech staff?

could you make the big pictures in your articles clickable? the graph is pretty small... and hard to make out.

*goes to read article*
FeRaL 23rd April 2009, 17:06 Quote
How do the two compare when you do other things while gaming like rip and encode MP3s from CDs and the like?
perplekks45 23rd April 2009, 17:16 Quote
Why would nVidia care about that? There's no CUDA app to do that anyways. ;)
Ending Credits 23rd April 2009, 18:02 Quote
AMD/ATI anyone?
mrplow 23rd April 2009, 18:10 Quote
Quote:
Originally Posted by Ending Credits
AMD/ATI anyone?

ME! FOREVER! EVEN IF IT'S CRAP! YAY!
perplekks45 23rd April 2009, 18:11 Quote
Fanboys... gotta love them. :D
s3v3n 23rd April 2009, 18:46 Quote
$800 difference? Are they comparing retail i7 prices to the cheapest Core 2 prices they can find? From what I've seen, the difference between a bargain-hunted i7 920 and a bargain-hunted E8400 would be about $120 on the CPU, $100 on the motherboard, and RAM is actually about the same now (unless you want super low timings). Even if I throw another $80 in there, that's still only $300. If I didn't already have a Core 2 system and was building an upgrade from scratch then I would definitely go i7.
ForumNameHere 23rd April 2009, 19:39 Quote
yeah, I think that s3v3n is right. This, like most things, is relative. I just upgraded to an i7 920 system from an AMD AM2 platform over the holiday season. I considered an LGA775 platform, but figured that would be out of date sooner, so I put up a bit more cash guessing I'd save some later. If you've got a Core 2 system, going out and buying all new kit might not get you the performance boost you spent money hoping for, but I think there ARE scenarios where buying an i7 system not only makes sense, but could help save some money if you upgrade with any regularity.
Marsolin 23rd April 2009, 19:45 Quote
Of course the 3D Mark CPU score shows the rosiest picture, but Nvidia is looking for what fits their conclusion too. If you look at a recent game benchmark like Far Cry 2 there is a big difference. According to a recent Anandtech review (http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3551&p=14) the Core i7 920 is 60% faster than the Core 2 Duo E8200.

Granted that is a specific benchmark, but what it shows is that as new game engines get released with more multi-threading there is a big gaming benefit to be had. It all depends upon whether you are looking at what is out now vs. what is coming.

Chad
http://linuxappfinder.com
http://feedsanywhere.com
Redbeaver 23rd April 2009, 19:55 Quote
my Q6600 G0 at 3.6 is still running solid for almost 3 years now on my P35... unfortunately it's not an SLI board, but my 8800GTS 640 is still rocking just fine. when I need an upgrade, I'll throw $250 at swapping it for a GeForce 275. problem solved.

i7 is still a waste of money IMHO. I'll wait till they drop the mobo price....
cheeriokilla 23rd April 2009, 20:13 Quote
intel vs. nvidia round 305. FIGHT!
thehippoz 23rd April 2009, 21:44 Quote
this is old news in the ocing community.. after seeing the numbers on the oc'd 920 vs. an oc'd c2d, even the old e6600 XD - there wasn't really a jump to justify the cost.. the memory bandwidth on the i7 is insane though - and if you didn't already have a c2d gaming rig.. it might be worth going for the 920
Jenny_Y8S 23rd April 2009, 22:32 Quote
What about when gaming and multitasking? I always leave at least one VM running when I game on my i7/285, with no impact.

I bought my i7 to last a few years, and if I have spare CPU cycles now, that is what I wanted :)
thehippoz 23rd April 2009, 22:55 Quote
the old q6600 will handle your multitasking.. vista balances across cores also- running a quad on xp you're better off with a cox.. gaming, the c2d dual cores oc'd are still up there
D-Cyph3r 23rd April 2009, 23:18 Quote
Gotta love nVidia's smack talk, they really are the rowdy little Jack Russell of the PC component industry.
Psytek 24th April 2009, 00:17 Quote
He's technically right, but until NVIDIA decides it gives a **** about the people who buy two of their graphics cards and start making the SLI drivers as good as the regular ones, I'll stick with a single card thanks.
Marc5002 24th April 2009, 00:18 Quote
http://www.legionhardware.com/document.php?id=807&p=7
Costs a load, but it gives you more power :) especially at 1920x1200 / 2560x1600
disturbed13 24th April 2009, 01:46 Quote
okay, I've read the article, but here's an issue that I see: there isn't a game on the market that uses all of the cores, so how can we expect to see a jump in performance while we are gaming when all of the cores aren't being used?
Tulatin 24th April 2009, 02:12 Quote
Well it's sort of strange, but it seems that at the moment, Intel's the only game in i7 town, and the option there for nVidia's involvement comes through NF200 chips. The one thing I really wonder about their system configurations, though, is why they gave the i7 machine just 4GB of RAM. It's TRIPLE channel, guys.
damikez 24th April 2009, 03:31 Quote
Quote:
Originally Posted by Marsolin
Of course the 3D Mark CPU score shows the rosiest picture, but Nvidia is looking for what fits their conclusion too. If you look at a recent game benchmark like Far Cry 2 there is a big difference. According to a recent Anandtech review (http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3551&p=14) the Core i7 920 is 60% faster than the Core 2 Duo E8200.

Granted that is a specific benchmark, but what it shows is that as new game engines get released with more multi-threading there is a big gaming benefit to be had. It all depends upon whether you are looking at what is out now vs. what is coming.

Chad
http://linuxappfinder.com
http://feedsanywhere.com

Have you noticed that the quality settings are set to "medium" or "mainstream"? At those quality settings the GPU isn't maxed out, which is why you can see the increased performance of the i7 920.

I have a C2D E6550 (2.33GHz) and recently got a GTX 275 (replacing my 9800 GTX). Playing Crysis at 2.33GHz my FPS is around 33; at 3.57GHz it's just around 35FPS, all quality settings maxed out. Clearly, the GPU is the bottleneck.

At this point in time, or until the 2nd half of next year, CPU performance won't be that critical when it comes to gaming.
metarinka 24th April 2009, 03:45 Quote
Quote:
Originally Posted by Zurechial
I've had a fairly significant increase in performance since replacing my first-gen C2D E6600 with an i7 920.
I had the E6600 clocked at 3GHz, matched with a 680i and 4GB of DDR2.
The i7 is clocked at 4GHz and matched with an X58 and 3GB of DDR3.
I'm using the same GPU with the i7 as I was with the C2D, an 8800GTS512.

It's probably quite safe to assume that the difference in performance in my case is attributable to the 1GHz difference in clock speed, and maybe also somewhat to the increase in memory bandwidth, but the important thing for me is that the difference is there and noticeable.
The thing is, I also use my PC for music/audio production and experimental audio programming, not just for gaming.
If I were using the system solely for gaming, that upgrade probably wouldn't have been justified.


I want to call out right now and say that audio/music production generally isn't that CPU intensive, especially when any good audio interface will be handling the sound processing.

I do quite a bit of music production myself, 100% soft synths, using a C2D E8400 with 4 gigs of RAM and an E-MU 1820M. Never once has my CPU usage crept past 50% in Ableton Live, and that's going full out with 20+ tracks and a heavy amount of chaining and rewiring into Reason 4.

I've heard of a few physical modelling synths that take up some CPU power, but please tell me what you are doing audio-wise that justifies an i7?

Compared to video or image editing, audio editing is relatively low powered when using a proper audio interface.
dyzophoria 24th April 2009, 07:36 Quote
what's with NVIDIA nowadays? Instead of bashing everybody else, why not just spend their time researching a new GPU? From the way I see it, NVIDIA has only two things in mind these days: 1) bash Intel, 2) rebadge every product they have.
V3ctor 24th April 2009, 08:39 Quote
Q6600 FTW... best Intel CPU :D We have quad-cores... and those *******s aren't even used at the max of their potential... i7 is just a waste of money (unless you have a socket 939 AMD, or a P4 3.4HT). Maybe the new "Sandy Bridge" architecture in 2011 will make me switch my Q6600 and my 4870.
xaser04 24th April 2009, 08:55 Quote
Quote:
Originally Posted by perplekks45
Why would nVidia care about that? There's no CUDA app to do that anyways. ;)

Probably because they sell chipsets / MB that can take a Core 2 Duo/quad yet don't have anything to run with the i7.

After doing a little testing myself when I had an HD 4870X2, I found there was a noticeable difference in minimum framerates between running the card on my old Core 2 Duo (3GHz) and my new i7 920 (2.66GHz). Of course, this difference would be minimal with the card I have now (downgraded to an HD 4850 as I realised I didn't need the GPU horsepower).

Personally, given how prices of DDR3 have dropped along with 'affordable' X58 motherboards, if I wanted a decent quad core system I wouldn't bother with Core 2s anymore and would move straight to the i7. Of course, if I wanted a pure gaming PC with a fast single GPU I would go for a mid-range Core 2 Duo and clock it.
lewchenko 24th April 2009, 09:33 Quote
I normally upgrade to the latest and greatest, but when Core i7 came out I was not impressed. Despite the rave reviews that magazines and websites were giving it, for a gamer... the improvements could not be cost-justified.

So I upgraded my Core 2 E6750 chip to a Core 2 Quad Q9550 and overclocked it to 3.8GHz... I then had money to spare to upgrade my old 8800GTX as well, to a GTX 260 216 XFX Black Edition. Now it eats games for breakfast.

Those upgrades cost me a fraction of the cost of having to build a new Core i7 machine... mainly due to the cost of new MB, new memory and a new CPU/Cooler (plus then a new GPU on top).

In this recession, people with Core2's could get much better bang for buck by upgrading.. not replacing.
perplekks45 24th April 2009, 12:08 Quote
I think we all agree by now that it's a good [read: future-proof] idea to upgrade to i7 only when you're not on any recent C2D, C2Q or Phenom CPU.

If I had anything to say at either of the big companies I'd just fire half my marketing staff for being nothing but stupid kids.
Shielder 24th April 2009, 14:23 Quote
The new helicopter sim Black Shark (very, very good, gonna install it when I can afford it!) has a little utility that can utilise up to 8 cores (i7 with HT) for the game. I have played it on my father-in-law's rig and it is awesome! My 6 year old loves it too...

Andy
jimmymcjimmy1 26th April 2009, 10:41 Quote
I have got to say that until I read the latest Custom PC CPU Guide I was under the impression that the i7 CPUs had taken PC performance to a whole new level, and I was seriously contemplating my next upgrade from my Asus P5K Premium mobo / Q6600 CPU based rig. But what the CPC CPU guide has highlighted to me is that with my Q6600 clocked at 3.4GHz, the substantial cost of an i7 upgrade for a relatively small gain in overall PC performance is simply not worth it at the present time. I would also agree with the CPC analysis that a more cost-effective upgrade path from a Q6600 would be to install a Q9650. So yes, in my opinion the man from nVidia is bang on the nail!
thehippoz 26th April 2009, 16:44 Quote
Quote:
Originally Posted by jimmymcjimmy1
I have got to say that until I read the latest Custom PC CPU Guide I was under the impression that the i7 CPUs had taken PC performance to a whole new level, and I was seriously contemplating my next upgrade from my Asus P5K Premium mobo / Q6600 CPU based rig. But what the CPC CPU guide has highlighted to me is that with my Q6600 clocked at 3.4GHz, the substantial cost of an i7 upgrade for a relatively small gain in overall PC performance is simply not worth it at the present time. I would also agree with the CPC analysis that a more cost-effective upgrade path from a Q6600 would be to install a Q9650. So yes, in my opinion the man from nVidia is bang on the nail!

well, I wouldn't say anywhere near bang on.. he's trying to sell nVidia SLI chipsets over Intel - that's the only reason he's even talking.. and nVidia chipsets are crap compared to Intel's imo - not to mention SLI is a big pile of marketing.. the single-slot x2 cards are a better way to go if you really need the push for a 30" monitor

reason I say this is, you're drawing way too much power to justify what you get back with SLI.. they would love everyone to fall for SLI (and they do.. common sense tells you 2 is better than one).. twice the money - oh yeah, let's throw in a 3rd card to run PhysX! :D and they laugh all the way to the bank
adam_bagpuss 27th April 2009, 12:05 Quote
if i7 is a waste of money, then so is SLI.

2x GPU does not equal 2x performance, and it's completely varied across games. Some games like SLI and get a decent frame boost; others hate it and can actually run worse.

Nvidia should not be commenting on a waste of money, since they would also need to say don't buy SLI because it's also a waste of money.

at least an i7 does other things besides gaming, at which it's amazingly fast.

2x GPU is good for only one thing, gaming, and that's it. Well, maybe a few other 3D apps, but that's it.

i7 920 + single high end GPU = fast performance at everything.

slow/average CPU + 2XGPU = fast gaming, mediocre performance at everything else.

Now which one would you pick !!!!!!!!!!!!