bit-tech.net

Intel HD 4000 Investigation

Comments 1 to 25 of 39

Gusseteer 28th May 2012, 08:05 Quote
There are a couple of references to a 2570K in the article.

Sorry for being pedantic.
Harlequin 28th May 2012, 08:12 Quote
I would like to see the gaming tests run again with good high-speed RAM - it's well known that the A8 chips simply get better with faster system RAM - so chuck in some 2133 and see better results!
xaser04 28th May 2012, 08:50 Quote
Quote:
Originally Posted by Article
Arguably though this is a little bit of an unfair comparison - yes, Intel has launched the HD 4000 GPU in desktop chips, but its real home is in mobile processors. Here it will excel, thanks to its good performance and tidy TDP. This is something that can’t be said for the A8-3870K, as its high TDP means that it’s an exclusively desktop-bound processor.

More importantly, it should be noted that the mobile versions of Llano (and even Trinity) are somewhat crippled compared to their desktop counterparts due to much lower clock speeds (more so with Llano). In addition, as you drop down the processor hierarchy the iGPU is even more crippled.

The same thing cannot be said for the HD 4000, which only suffers a minor clock speed reduction and remains fully intact right down to the lowest i3 entrant.

Factor in the CPU advantage Intel have (which is not even close when it comes to mobile parts) and things could prove relatively difficult for AMD in the mobile sector. The rumours of Haswell being an iGPU monster (all things relative, of course) won't help either.
V3ctor 28th May 2012, 08:59 Quote
Could someone put up images for an image quality comparison? There are lots of people saying that Intel "cheats" in its drivers with a lack of shadows, colour, depth...
It would be great to compare image quality too, if there is such a difference.

Thanks
Harlequin 28th May 2012, 09:02 Quote
Llano is also only 45W TDP - that cannot be said for the Intel chip, which is nearly twice as much!

as for the differences between the HD 6550D and the HD 6620G, please.

the difference is 166MHz; they are both 400:20:8 parts (as used in the A8-3550MX)

disappointed at the obvious pro-Intel bias; go and get an A8-3550MX, bench that and see the results.
Deders 28th May 2012, 09:08 Quote
Quote:
Originally Posted by V3ctor
Could someone put up images for an image quality comparison? There are lots of people saying that Intel "cheats" in its drivers with a lack of shadows, colour, depth...
It would be great to compare image quality too, if there is such a difference.

Thanks

This, and do Intel optimise their drivers as new games come out?
MrJay 28th May 2012, 09:12 Quote
Quote:
Originally Posted by Harlequin
Llano is also only 45W TDP - that cannot be said for the Intel chip, which is nearly twice as much!

as for the differences between the HD 6550D and the HD 6620G, please.

the difference is 166MHz; they are both 400:20:8 parts (as used in the A8-3550MX)

disappointed at the obvious pro-Intel bias; go and get an A8-3550MX, bench that and see the results.

/agreed

The FM1 socket system is fantastic for the price. Let AMD have this one, they kinda deserve it : )
xaser04 28th May 2012, 09:17 Quote
Quote:
Originally Posted by Harlequin
Llano is also only 45W TDP - that cannot be said for the Intel chip, which is nearly twice as much!

as for the differences between the HD 6550D and the HD 6620G, please.

the difference is 166MHz; they are both 400:20:8 parts (as used in the A8-3550MX)

disappointed at the obvious pro-Intel bias; go and get an A8-3550MX, bench that and see the results.

Ah, you mean like the i7 3610QM, which is also a 45W part.....

All of the previous SB mobile parts barring the 2920/2940QXM also had a TDP of 45W or less.

The mobile version of Llano has a lower core clock (or, to put it another way, the desktop variant is clocked 50% faster) and a lower officially supported memory clock (1600MT/s vs 1866MT/s). Couple this with the much lower CPU core clock on the mobile Llano variants and you end up with a part that is much slower than the desktop variant.

With the Intel mobile parts, the only thing that suffers is a slight shaving of the iGPU clock. Drop below an IVB i3 and things go a little pear-shaped (HD 2500 instead of the HD 4000), but an i3-based IVB laptop will be nearly as potent, gaming-wise, as an i7-based desktop part.
Harlequin 28th May 2012, 09:37 Quote
Quote:
Originally Posted by xaser04
Ah, you mean like the i7 3610QM, which is also a 45W part.....

All of the previous SB mobile parts barring the 2920/2940QXM also had a TDP of 45W or less.


as do all previous AMD mobile chips
Quote:
The mobile version of Llano has a lower core clock (or, to put it another way, the desktop variant is clocked 50% faster) and a lower officially supported memory clock (1600MT/s vs 1866MT/s). Couple this with the much lower CPU core clock on the mobile Llano variants and you end up with a part that is much slower than the desktop variant.

the Intel chip you linked to (which, btw, isn't in the channel in large quantities yet, and also isn't an i5 either) is also a `crippled` version of the desktop chip - as it's only clocked at 2.3GHz;
a moot point on the RAM, since the Intel part you quoted `only` supports 1600MHz RAM as well - a better comparison would be to use the best possible RAM for the AMD kit - but that would show the GPU is even stronger than the latest Intel offering.
Quote:
With the Intel mobile parts, the only thing that suffers is a slight shaving of the iGPU clock. Drop below an IVB i3 and things go a little pear-shaped (HD 2500 instead of the HD 4000), but an i3-based IVB laptop will be nearly as potent, gaming-wise, as an i7-based desktop part.

they are nearly as good because the GPU part is the bottleneck - and stuffing an i7 behind it doesn't always help... and that's the one area (APU) where AMD are romping away - with the next-gen AMD mobile kit due soon, Intel are still playing catch-up.
Deders 28th May 2012, 09:40 Quote
Would also be interesting to see exactly how much more powerful a few discrete GPU cards would be at the same settings.
HourBeforeDawn 28th May 2012, 09:42 Quote
The low to upper mid-range is where AMD really shines, but you have to give it to Intel once you throw in dedicated graphics, whether it be Radeon or nVidia, and go for more of the mid-to-high range of things - Intel then takes it.
Harlequin 28th May 2012, 10:00 Quote
Quote:
Originally Posted by Deders
Would also be interesting to see exactly how much more powerful a few discrete GPU cards would be at the same settings.

Tom's have a relevant Diablo 3 review for gfx cards:

http://www.tomshardware.com/reviews/diablo-iii-performance-benchmark,3195-5.html

a bit of reading around the numbers is sadly needed, as BT didn't use AA but Tom's used low settings (still finding out the GPU load for low settings) - IMO it would place the tested AMD APU amongst the GT 440/HD 6570 (funny that)

edit:

of interest is RAM pricing - today, on Scan, 2x4GB Corsair 1600 (9-9-9-24) is £39.46, whereas 2x4GB Corsair 1866 (9-10-9-27) is £47.76

an £8.30 difference, and the +1 tRCD and +3 tRAS are more than made up for by the 266MHz increase ;)
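For anyone wanting to check that claim, here's a minimal sketch (not from the thread; it assumes the standard DDR3 arithmetic where one clock cycle lasts 2000 / transfer-rate nanoseconds, since DDR transfers twice per clock) converting both kits' timings into absolute latency:

```python
# Convert DDR3 timings (CL, tRCD, tRP, tRAS in clock cycles) to nanoseconds.
# DDR transfers twice per clock, so one cycle lasts 2000 / MT/s nanoseconds.
def timings_ns(mts, timings):
    cycle = 2000.0 / mts
    return [round(t * cycle, 2) for t in timings]

ddr3_1600 = timings_ns(1600, [9, 9, 9, 24])
ddr3_1866 = timings_ns(1866, [9, 10, 9, 27])
print(ddr3_1600)  # [11.25, 11.25, 11.25, 30.0]
print(ddr3_1866)  # [9.65, 10.72, 9.65, 28.94]
```

Despite the looser cycle counts, every timing on the 1866 kit is lower in absolute nanoseconds, which is the point being made above.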
bowman 28th May 2012, 10:32 Quote
Well, they're still ****. Nvidia has a long time ahead of themselves making money still.

The suggestions of the GPU going the way of the FPU a few years ago might be the most premature prediction in modern computing history.
rollo 28th May 2012, 10:33 Quote
For AMD hardware, sure.

For Intel hardware you just wasted £8.
sandys 28th May 2012, 11:04 Quote
AMD's mobile side is bolstered by the recently launched A10s, which are all low power (all less than 35W) and quite handy.

http://techreport.com/articles.x/22932/1

A nice ultrabook fitted with one of these will do me. I have been well impressed with what my Brazos E-450 can do, even with rubbish CPU backup :) something with a bit more CPU backup will be nice, and I wouldn't want to have to rely on Intel for driver support in games.
maverik-sg1 28th May 2012, 11:11 Quote
Intel's driver support for its GPUs is legendary for all the wrong reasons (don't hold your breath for a D3 patch :) - there's no doubt that the HD 4000 is a step in the right direction though.

Trinity will be a more suitable point of comparison for the latest tech, though.

More of a request than anything else - you guys know that World of Warcraft remains a fave of mine; it now has a raid finder function that would allow you to run in a 25-man raid, maybe kill the 1st boss and report back the peak and avg FPS? Could help shape a future laptop purchase for some :)
p3n 28th May 2012, 11:24 Quote
Is anyone using Virtu MVP with the HD 4000 and a discrete card? I've not been able to get any non-corrupt results from it. In StarCraft 2 I could actually see my own mineral line, DotA 2 had some glitches and Diablo 3 just didn't run smoothly; what the eff, does anyone use this for anything other than benchmarks?
Harlequin 28th May 2012, 11:31 Quote
I did like the way D3 was added into the mix - although some battles in Act 3 would be a heavier workload than Act 1 :D

Anand has a pretty good review of the new A10 versus the new IVB i7 - and it's a win for AMD - it actually looks reasonable for Piledriver
lancer778544 28th May 2012, 11:57 Quote
Quote:
Originally Posted by Article
...it’s only the £170 i5-3570K and the £250 i7-3770K that come with the improved HD 4000 graphics core, all the other SKUs in the line come with cut down HD 2500 GPU cores.

Wikipedia says that the unreleased i5 3475S and the i3 3225 will also have HD 4000 graphics.
misterd77 28th May 2012, 13:26 Quote
the A8 K chip is currently around £98 (due for a massive price drop when Trinity hits), and the Intel i5-3570K is £170. The AMD GPU portion is about 15% faster than the Intel chip, and has a few built-in technologies that Intel can't match for legal reasons. Also, I think AMD will wipe the floor with them when it comes to driver support. So, to sum up: the Intel chip is 70% more expensive, and 15% slower (GPU).......
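A quick sanity check of that price premium, using the figures quoted in the comment above (a throwaway calculation, not anything from the article):

```python
# Relative price premium of the i5-3570K (£170) over the A8 K chip (£98).
amd_price, intel_price = 98.0, 170.0
premium_pct = (intel_price - amd_price) / amd_price * 100
print(f"Intel premium: {premium_pct:.1f}%")  # Intel premium: 73.5%
```

So "70% more expensive" is, if anything, slightly understated at these prices.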
Fingers66 28th May 2012, 16:18 Quote
Quote:
...It’s entirely possible that this result is because Diablo III is a very new game, so Intel could potentially be yet to optimise its driver for it...

Don't hold your breath, Intel's iGPU driver development and release management is pants.
Merglet 28th May 2012, 16:49 Quote
Um. How were the "roles reversed" from the first two tests? You said all throughout the article that the AMD chipset was better. Then for Diablo 3 that the roles were reversed and the AMD chipset was far better? Lost your train of thought? Good article except for that blunder tho.
KayinBlack 28th May 2012, 16:55 Quote
I've got one of these running now, and I call shenanigans. Loaded up Guild Wars, a seven-year-old game, and it stuttered and lagged so badly that I couldn't play. Put in a simple HD 5770, just fine. Ping was OK (before I get a question about that), but 1920x1200 completely crippled it in an old MMO designed to run on almost anything. It honestly looked like the old Intel integrated graphics that we had to disable DX9 on for it to run.

Only problem with the 5770 is it's not mine. I'll be getting a new GPU in a week or so, but I was just trying to hold out till then. But gaming capable this ain't.
2bdetermine 29th May 2012, 07:44 Quote
There isn't anything to investigate about the Intel HD 4000. It's rubbish! Like every other IGP that came before, it exists only to torment and kill PC gaming. On the other hand, AMD's solution offers some hope and isn't trying to kill off PC gaming in the process.
nuc13ar 29th May 2012, 07:50 Quote
In title. I wanted to know if the increased IPC of Ivy Bridge made enough of a difference over the Stars architecture that it could catch the HD 4000 up to the 6550D @ 1024x768/1280x1024. I will have to wait for other websites to review this chip, I guess :(