bit-tech.net

Intel HD Graphics 3000 Performance Review

wuyanxu 27th January 2011, 17:26 Quote
Is Z68 a well-known upcoming product, or an educated guess?

None of the leaked slides mention it :(

What I'd be interested to see is nVidia Optimus technology on Z68: imagine a CPU overclocked to 4.6GHz, but the system only drawing 50W at idle.
Mankz 27th January 2011, 17:31 Quote
Can it play Crysis?
Fordy 27th January 2011, 17:31 Quote
I completely understand not using your normal suite of test games for a GMA test, however:

I am currently in an upgrade dilemma, strapped for cash, desperately needing more frames - but a new mobo/CPU too.

It would be fantastically helpful to my "which first" choice if you also included benchmarks against lower-end cards on, say, BC2.
Salty Wagyu 27th January 2011, 17:40 Quote
Did you test whether movie playback is correctly played at 23.976fps instead of 24.000fps? If it isn't, the mismatch causes a frame pause roughly every 40 seconds, which looks like a stutter.
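To see where that "every 40 seconds" figure comes from, here's a quick back-of-envelope check in Python (an editorial illustration, not something from the article):

```python
# Film content runs at 24000/1001 fps; if the GPU outputs exactly 24.000Hz,
# playback accumulates a whole frame of drift at a predictable interval,
# which the player has to absorb as a repeated (or dropped) frame.
content_fps = 24000 / 1001   # ~23.976fps, NTSC-style film rate
display_fps = 24.000         # a "24Hz" mode that isn't quite 24000/1001

drift_per_second = display_fps - content_fps       # ~0.024 frames per second
seconds_per_visible_skip = 1 / drift_per_second

print(f"One repeated frame every {seconds_per_visible_skip:.1f}s")  # ~41.7s
```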
Timmy_the_tortoise 27th January 2011, 17:41 Quote
Given the OpenCL support, could a game theoretically use this just for Havok Physics whilst a dedicated GPU handled all the rendering and the CPU handled the rest?

That'd be interesting to see.
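That kind of split is at least expressible in OpenCL's device model. Here's a hypothetical Python sketch using the pyopencl bindings (and assuming Intel's driver exposes the HD 3000 as an OpenCL GPU device, which the article doesn't confirm) of how an engine might reserve the integrated part for compute while a discrete card keeps rendering:

```python
import pyopencl as cl  # assumes the pyopencl package is installed

# Enumerate every OpenCL device; on a machine with both an integrated and a
# discrete GPU, each appears under its vendor's platform.
integrated = None
for platform in cl.get_platforms():
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"{platform.name}: {device.name} ({kind})")
        if device.type & cl.device_type.GPU and "Intel" in platform.name:
            integrated = device  # candidate for the physics workload

# A physics engine could then build its compute context on the integrated
# GPU alone, leaving the discrete card free to render:
if integrated is not None:
    ctx = cl.Context(devices=[integrated])
    queue = cl.CommandQueue(ctx)  # physics kernels would be enqueued here
```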
mucgoo 27th January 2011, 17:49 Quote
LOL at Minecraft being used in benchmarking.
barrkel 27th January 2011, 18:00 Quote
There's a non-linear return to marginal FPS; at low FPS, every extra FPS is noticeable, but it's worth less and less the higher up you go. An FPS per GBP calculation that doesn't take this into account is problematic.

Mind you, figuring out the marginal return curve isn't that easy.
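One way to make that concrete: treat perceived value as a saturating function of FPS, so each extra frame is worth less than the one before. The curve below is purely illustrative (not barrkel's formula, and the 30fps "floor" is an assumption):

```python
import math

# Illustrative diminishing-returns curve: value rises quickly up to a
# playability floor and flattens out beyond it.
def perceived_value(fps: float, floor: float = 30.0) -> float:
    return math.log1p(fps / floor)

# Marginal value of one extra frame at different starting points:
for fps in (15, 30, 60, 120):
    marginal = perceived_value(fps + 1) - perceived_value(fps)
    print(f"{fps:3d} -> {fps + 1}fps: +{marginal:.4f}")
```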
r3loaded 27th January 2011, 18:31 Quote
Quote:
Originally Posted by barrkel
There's a non-linear return to marginal FPS; at low FPS, every extra FPS is noticeable, but it's worth less and less the higher up you go. An FPS per GBP calculation that doesn't take this into account is problematic.

Mind you, figuring out the marginal return curve isn't that easy.
I think Hexus use a system in their calculations where marginal FPS below 30 is awarded double points, and marginal FPS above 60 is awarded half points.
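That piecewise scheme is easy to sketch in code. The thresholds and weights below are one reading of the comment above, not Hexus's published method:

```python
# Weighted FPS: frames below 30 count double, frames between 30 and 60 count
# at face value, and frames above 60 count half.
def weighted_fps(fps: float) -> float:
    low = min(fps, 30.0)                      # the first 30fps, doubled
    mid = min(max(fps - 30.0, 0.0), 30.0)     # 30-60fps, full value
    high = max(fps - 60.0, 0.0)               # beyond 60fps, halved
    return 2.0 * low + 1.0 * mid + 0.5 * high

def value_score(fps: float, price_gbp: float) -> float:
    """Weighted FPS per pound, so gains near the playability floor dominate."""
    return weighted_fps(fps) / price_gbp

# Going from 25 to 35fps is worth more than going from 60 to 70fps:
print(weighted_fps(35) - weighted_fps(25))   # 15.0
print(weighted_fps(70) - weighted_fps(60))   # 5.0
```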
Fingers66 27th January 2011, 18:32 Quote
On page 8, under the image, you refer to Z68. Yet in the following paragraph you call it Z67...

Typo?
John_T 27th January 2011, 19:02 Quote
Nice review, although I am somewhat baffled as to how and why you tested as you did. I hate to be one of those picky people who incessantly moan over the smallest thing, but:

Left 4 Dead 2
- Tested at 1280x800.
- Cost of performance measured at 1280x800.

Starcraft II
- Tested at 1280x720.
- Cost of performance measured at 1680x1050.

Minecraft
- Tested at 1920x1080.
- Cost of performance measured at 1920x1080.

Three games (six tests) compared at four different resolutions?

Why? What on earth was the benefit of doing it like that? Couldn't you just have picked one resolution and stuck with it for easy, consistent comparison across all tests? I just can't for the life of me understand the logic of it.

Anyway, it certainly looks powerful for integrated graphics. Looking forward to seeing these Z67/8 boards...
frontline 27th January 2011, 19:05 Quote
Interesting article, although I'm still not convinced by Intel's driver support in the long term. AMD and Nvidia will have competing products out soon, so it will be good to compare the options available in a few months' time.

Another option would be to spend the £77 price difference between the Sandy Bridge system and the Phenom II system on something like a dedicated 5750 GPU and get far superior graphics performance.
bogie170 27th January 2011, 19:07 Quote
How would Nvidia ION2 benchmarks compare against these chips?
Scootiep 27th January 2011, 19:08 Quote
Just to flaunt my ignorance, I simply cannot get over this obsession with Minecraft. I'm not arguing that it's a game worth frittering away some hours on, because the gameplay itself has merit. But for analyzing graphics performance of hardware? COME ON! It's like going back to my old NES copy of Dragon Warrior and saying to myself "You know what would really be great? If every texture in this game was more blocky!" Who in their right mind thinks that it's actually worth it to review the graphics performance of hardware based upon this game? It's simply ridiculous. Just because it gives you a simulated three-dimensional world to run around in doesn't excuse its complete lack of any decent texturing. Again, I'm NOT SAYING Minecraft is a bad game! I'm simply saying that it's a terrible game GRAPHICALLY.
Timmy_the_tortoise 27th January 2011, 19:16 Quote
Quote:
Originally Posted by Scootiep
Just to flaunt my ignorance, I simply cannot get over this obsession with Minecraft. [snip] I'm simply saying that it's a terrible game GRAPHICALLY.

I've never played it, but as far as I know you can build some pretty huge stuff with it. I'd imagine that could push a graphics chip pretty hard.
Kingsley813 27th January 2011, 19:16 Quote
Quote:
Originally Posted by Scootiep
Just to flaunt my ignorance, I simply cannot get over this obsession with Minecraft. [snip] I'm simply saying that it's a terrible game GRAPHICALLY.

Yeah, agreed. Oh, and while they're at it, dig out a copy of Wolfenstein 3D and test the GFX on that too.
Narishma 27th January 2011, 20:10 Quote
A good thing about Minecraft is that it uses OpenGL, and as you can see in the benchmarks, Intel still has poor support for it in their drivers compared to Direct3D.
SimonStern 27th January 2011, 20:33 Quote
Quote:
Originally Posted by wuyanxu
Is Z68 a well-known upcoming product, or an educated guess?

None of the leaked slides mention it :(

I'd like to know when those Z boards are coming out myself. I see the "second quarter of this year" part, but I wonder when that might be. I haven't pulled the trigger on Sandy Bridge yet, and when I do (soon, I hope) I'll be wondering whether I should wait for those or just go with a P67 and lose the use of the integrated graphics.

I still think it's pretty stupid they did it like that.
mi1ez 27th January 2011, 20:39 Quote
24Hz support?
jimmyjj 27th January 2011, 21:36 Quote
How about subjective image quality tests? Previous Intel chips have not performed well there.
Elton 28th January 2011, 06:59 Quote
Well, it's quite good, seeing as Intel has never had anything worth buying in terms of IGPs.

This will definitely be an interesting part for laptops, seeing as it shouldn't be too difficult to implement.
yakyb 28th January 2011, 09:53 Quote
Quote:
Originally Posted by Scootiep
Just to flaunt my ignorance, I simply cannot get over this obsession with Minecraft. [snip] I'm simply saying that it's a terrible game GRAPHICALLY.

Some laptops/netbooks are unable to run it. Personally, I would like the one I bought to be able to, as that's probably the only game I would put on a netbook, so benchmarking it is very useful to me.
Scootiep 28th January 2011, 15:14 Quote
Quote:
Originally Posted by yakyb
Quote:
Originally Posted by Scootiep
[snip]

Some laptops/netbooks are unable to run it. Personally, I would like the one I bought to be able to, as that's probably the only game I would put on a netbook, so benchmarking it is very useful to me.

But the point here is that they're already throwing Starcraft II and L4D2 at it. Putting Minecraft into the mix does nothing for the testbed. If it can run either of the first two titles, being able to run Minecraft is a foregone conclusion. Take a look at the test results. In every single case, the performance numbers for Minecraft are far and away better than SC2 and L4D2. Adding Minecraft to the testing does nothing useful.
xinaes 28th January 2011, 19:44 Quote
I agree that Intel need to focus on their drivers... as a Java OpenGL developer, I'd argue that the Java part of the equation is not really relevant to graphics drivers; it's just OpenGL they need to sort out to get Minecraft, for example, running better.

Minecraft has a stronger graphic style for being bold and retro than it would do if it had mediocre amateurish last-gen graphics. It's also a classic example of the kind of game that people who might use integrated graphics are likely to want to play.
Penfolduk01 29th January 2011, 01:02 Quote
Probably showing my ignorance here, but Intel hamstringing all but the K series in terms of graphics performance may be for two reasons. Firstly, they want to make a killing selling the K series to punters.
Secondly, I wouldn't be surprised if they are deliberately lowering the horsepower to avoid any charges of anti-competitive behaviour. Hell, it might be to do with the recent Robocop-style "classified directive" agreement between Intel and nVidia: Intel may have agreed not to cut nVidia's legs from under it, so the two can share technology to go up against AMD's ATI expertise.
Wwhat 4th February 2011, 06:22 Quote
On-board (and on-die) graphics are for business use (and the very poor, maybe?), and in business you don't allow games, if only because all of them introduce DRM that completely ruins the security of a system.
But I can't deny I was curious about the game performance, as I'm sure many others were too.