bit-tech.net

Intel Haswell vs AMD Richland - the GPU test

Comments 1 to 25 of 53

GuilleAcoustic 2nd July 2013, 12:12 Quote
Nice comparison. I'd be interested to see the same test with faster memory (2400MHz, for example). The 1600MHz RAM rather limits the AMD APU's IGP.
maverik-sg1 2nd July 2013, 12:24 Quote
GuilleAcoustic makes a good point in the sense that knowing how to choose and set up the correct RAM for AMD's APU will have a positive net effect on the performance of the package (maybe the same for Intel?).

However, having to purchase more expensive RAM is something that might affect the decision-making process... that is to say, a more cost-effective solution might be buying an i3 and a £60 GPU off eBay (GTX 560?).

It's worth investigating
rollo 2nd July 2013, 13:02 Quote
With RAM prices on the rise again, is 2400MHz RAM really readily available and cheap enough to recommend for the budget system we're talking about here? It's a £20 premium for the 2133MHz stuff (£68 for 2133MHz vs £48 for 1600MHz, 8GB), and £77 for the 2400MHz stuff on Scan. That's over half as much again as the 1600MHz stuff.

A £300-£400 budget doesn't allow you to nearly double the memory cost for the same capacity, and that's the budget range for an AMD 6800K system. (That assumes you already have a monitor, mouse, keyboard and OS, as it's difficult enough to build even that system on such a budget without needing those items as well.)

If you approach £500, a cheap i3 with a 7850 GPU is going to blow away any onboard option either way.
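As a quick arithmetic check on the RAM premiums rollo quotes (a minimal sketch; the prices are the July 2013 Scan figures from the comment above, not current ones):

```python
# RAM prices for 8GB kits as quoted in the comment above (Scan, July 2013)
prices = {"1600MHz": 48, "2133MHz": 68, "2400MHz": 77}

base = prices["1600MHz"]
for speed, cost in prices.items():
    premium = cost - base
    # show the absolute premium and the price ratio over the 1600MHz baseline
    print(f"{speed}: £{cost} (premium over 1600MHz: £{premium}, {cost / base:.2f}x)")
```

The 2400MHz kit works out at roughly 1.6x the price of the 1600MHz kit, which is a steep premium at this budget even if it isn't quite double.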
Stanley Tweedle 2nd July 2013, 13:17 Quote
Giant LOL at the 17fps maximum on Heaven. At this rate we'll be looking at five years before integrated GPUs reach playable frame rates in HD.
jrs77 2nd July 2013, 13:19 Quote
The IGP still isn't powerful enough if you want to play games at 1080p - and I'm not talking about browser games or very old titles.
For office and work the IGP isn't that interesting either, as most stuff is still processed by the CPU.

So in the end the best option is still to go with an intel-CPU and pair it with a dedicated GPU, if you want to play some games from time to time.
Funky 2nd July 2013, 14:20 Quote
Quote:
Originally Posted by GuilleAcoustic
Nice comparison. I'd be interested to see the same test with faster memory (2400MHz for example). The 1600MHz RAM quite limits the AMD APU IGP.

Me too. I think 1866MHz memory would have been the sweet spot for price/performance though.
SchizoFrog 2nd July 2013, 14:34 Quote
I am still not convinced that building a 'gaming machine' of any description is worthwhile with onboard graphics. I would much rather have an Intel CPU and a separate GPU, even if it is a little more expensive. The long-term benefits make it worthwhile to me.
teppic 2nd July 2013, 15:18 Quote
1866MHz RAM is doable in a budget machine. Amazon has 2x2GB 1866MHz at £23.

These APUs do perfectly fine at medium settings on most games, and at high settings on many older ones. There's certainly a market for that.

The FX-6300 is a better option if a midrange graphics card is wanted.
schmidtbag 2nd July 2013, 16:32 Quote
Though I don't know the transistor counts of the GPUs, it's nice to see that AMD uses a larger fabrication process and a lower frequency while still outperforming Intel.
AlienwareAndy 2nd July 2013, 16:40 Quote
Why is it that reviewers and magazines insist on doing things with Crossfire knowing that it is broken?

Seriously, it's starting to piss me the chuff off now. How can you review something that simply doesn't work properly without doing a FCAT analysis and showing what's [I]really happening?[/I]

A couple of months ago I bought a magazine from WHSmith (I was at the hospital). Anyway, in this magazine there was a huge write-up about multiple-GPU setups, and it was pretty much SLI vs CFX for three screens. Now this is an area that everyone knows (or should, if they have an hour spare to read PCPer's FCAT articles) AMD is truly, utterly borked in.

Not only does CFX suck on one screen, but the problem becomes even worse on three. Yet this article totally trashed SLI and boot-licked CFX, suggesting that everyone use a pair of 7970s.

Why? FFS, don't they realise it just doesn't work properly?

Bit-tech. You have been around for years now. Maybe it's time to invest in a capture card and a RAID array of SSDs so that you can actually keep up with the times and use FCAT to tell your loyal readers the real deal? If not, then it'll be just another article where my palm meets my face.

Get with the times FFS.
schmidtbag 2nd July 2013, 16:50 Quote
Quote:
Originally Posted by AlienwareAndy
Why is it that reviewers and magazines insist on doing things with Crossfire knowing that it is broken? ... Get with the times FFS.

I don't think Crossfire, or SLI for that matter, is anywhere near as bad as you think. If you want to bring up things "FFS", I'm using Crossfire in Linux, and you don't hear me complaining! While there's no way to force-enable it for all games, the only issue I encounter is that I can't enable IOMMU and use CF at the same time. In Windows I don't get many issues either, unless I force-enable it for games that don't need it. Nvidia has supposedly fixed its microstutter problem recently, and AMD is about to as well. Microstutter is, from what I've heard, the main usability problem with multi-GPU systems.
AlienwareAndy 2nd July 2013, 17:09 Quote
Quote:
Originally Posted by schmidtbag
I don't think Crossfire, or SLi for that matter, are anywhere near as bad as you think. If you want to bring up things "FFS", I'm using Crossfire in Linux, and you don't hear me complaining! While there's no way to force-enable it for all games, the only issue I encounter is I can't enable IOMMU and use CF at the same time. In Windows, I don't get many issues either, unless I force-enable it for games that don't need it. Nvidia has supposedly fixed their microstutter problem recently, and AMD is about to as well. Microstutter is, from what I heard, the main usage problem with multi-GPU systems.

I run SLI.

Crossfire, however, is completely borked and will be until AMD get a driver out that can fix the runt and dropped frames - note - none of which were mentioned during testing.

So that means that at a FRAPS level CFX displays some impressive frame counts. However, what you see isn't what you get.

That's the point I'm making and until it's confirmed that AMD have sorted it out once and for all I think that showing any kind of Crossfire set up in a positive light is just plain wrong.

Mostly because it encourages people to spend money on something that doesn't work which is awfully annoying. I know this only too well because I bought a pair of 5770s at launch (for £260) and they were absolutely bloody awful, yet, shown by reviewers to beat a 5870 'easily'.

So my point remains. As of right now showing Crossfire X in any positive light whatsoever is wrong.
DbD 2nd July 2013, 17:57 Quote
In reality there is basically zero market for ultra-budget IGP gaming desktops. Either you don't really care and any integrated graphics is fine, or you care and you buy a cheap discrete card. Even poor gamers would be best off stumping up the cash for an i3 and a cheap discrete GPU (even if they have to buy it off eBay). The upgrade path is so much better, as if in the future you have a few £££ you can easily plug in a much faster i5 or a higher-end GPU.

Most of the market wouldn't even consider that - they'd just buy a console. Why fight with the settings when you can buy something that will play games perfectly every time?

The only gaming market for onboard GPUs is budget laptops - that's about the only place these make sense. Even then the market is pretty limited.
Gareth Halfacree 2nd July 2013, 18:05 Quote
Quote:
Originally Posted by DbD
In reality there is basically 0 market for ultra budget igp gaming desktops. Either you don't really care and any integrated is fine, or you care and you buy a cheap discrete card.
I beg to differ. You've missed off a small market segment there - and I know you have, because I'm in that very segment: people who want a low-power system capable of playing a few games.

My desktop is an AMD A10-5800K, because I wanted something that drew less power than my old Intel and GeForce combo yet would still allow me to play games from the Humble Bundle and similar - even if I have to turn the settings down a bit, or run at a non-native resolution. In that, it excelled: for £500 I got a system with specifications that matched my requirements (A10-5800K, motherboard with at least one PCI slot for a legacy card I use, 16GB of reasonable-speed RAM, SSD, SATA optical drive to replace my old IDE one, a nice quiet case and a few extra cooling fans), and it performs brilliantly. More importantly, it draws significantly less power than my old system, both at idle and under load.

Job done, happy customer. If it starts to struggle with the type of game I play, I may stick a cheap graphics card in there - a passively cooled one, for preference, as my rig is near-silent at the moment and I wouldn't want that to change - but for now I'm gold.
Shirty 2nd July 2013, 18:08 Quote
I read a statistic somewhere that 86% of personal computers in the world are not used for gaming in any meaningful capacity. Of the 14% that are, only 21% have a dedicated GPU.

DISCLAIMER: I may not have read this anywhere, but it's probably true.
schmidtbag 2nd July 2013, 18:24 Quote
Quote:
Originally Posted by Gareth Halfacree
I beg to differ. You've missed off a small market segment there - and I know you have, because I'm in that very segment: people who want a low-power system capable of playing a few games. ... Job done, happy customer.

I completely agree with you. Even Intel IGPs are ok for casual games and indie games. As the benchmarks in this article have shown, the A10 was still very capable of playing L4D2, which is graphically modest. Besides, not everyone cares about having max details or playing a game at >60FPS. If AMD can improve their CPU power efficiency, APUs will make excellent mid-range gaming laptops, and with a discrete GPU in CFX, APUs make great mid-range desktops.
teppic 2nd July 2013, 18:33 Quote
I agree too. The performance of the APU means access to most games at playable rates, ones which aren't an option with the i3 IGP. If you're thinking about buying a £150+ graphics card, this obviously isn't the target market.

Also, with a mild overclock and faster RAM, the scores would have been boosted quite noticeably.
PCBuilderSven 2nd July 2013, 18:38 Quote
Quote:
Originally Posted by AlienwareAndy
Crossfire, however, is completely borked and will be until AMD get a driver out that can fix the runt and dropped frames ... So my point remains. As of right now showing Crossfire X in any positive light whatsoever is wrong.

That's just wrong: I've got a laptop with an A8-3520M and a Radeon 7450M in Crossfire, and I have no problems with it. So either dual graphics on APUs is different (I don't believe it is), has better drivers, or the problems have been fixed.
Meanmotion 2nd July 2013, 18:57 Quote
Quote:
Originally Posted by AlienwareAndy
Crossfire, however, is completely borked and will be until AMD get a driver out that can fix the runt and dropped frames ... So my point remains. As of right now showing Crossfire X in any positive light whatsoever is wrong.

As schmidtbag said, you're blowing it completely out of proportion. I've run Crossfire systems for ages without any issues. Sure there may have been the odd stutter but not detrimentally so. Much more of a problem still is whether games/apps support dual-GPU at all. But moreover, given the context of this feature, talking about FCAT in Crossfire is totally overblown.
teppic 2nd July 2013, 19:15 Quote
In the bit-tech review of the A10, I saw figures given for the APU overclocked to 4.7GHz, and for the APU at stock but with 2133MHz memory. Each gives a sizeable (~18%) increase, so combined they should give a very good boost. Any chance of seeing that in the benchmarks?
billysielu 2nd July 2013, 19:53 Quote
Frame rate for dual graphics? Really?
...

Frame rating...
rollo 2nd July 2013, 20:21 Quote
Not really sure how this got to discussing CF and SLI issues. Bit-tech's testing on SLI and CF needs updating, but they do say in the article that microstutter is an issue.

There is a market for this sort of stuff, but it's a small market. The market for dual graphics will be close to zero, though. What does the AMD CPU plus that GPU cost, and does it cost more than a budget Intel plus a 7770? I'd imagine they're close, but the Intel solution will be a lot faster for pure gaming. (The Intel i3-3220 is £94 and the AMD CPU is approx £113. The cheapest in-stock 6670 is £51, for a total of approx £164; the cheapest 7770 is £81, for a total of £175.)

Now that's a chart I'd like to see. For £11 more, the Intel solution - at least in the benches I've seen - will blow away any onboard GPU, including Intel's new 5200 R chip that's OEM only. So your target market does not include any gamer, as I'd spec the latter Intel system every day.

A 7850 is only £30 more than the 7770, would offer near enough 5-6 times the performance of these onboard chips, and would actually allow 1080p gaming for a start.
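The system-cost comparison rollo lays out can be tallied in a few lines (a minimal sketch using only the July 2013 part prices quoted in the comment above):

```python
# Component prices in £ as quoted in the comment above (July 2013)
amd_apu = 113       # AMD A10 APU
radeon_6670 = 51    # cheapest in-stock 6670 (the dual-graphics partner card)
intel_i3 = 94       # Intel Core i3-3220
radeon_7770 = 81    # cheapest 7770

amd_total = amd_apu + radeon_6670      # £164
intel_total = intel_i3 + radeon_7770   # £175

print(f"AMD APU + 6670:  £{amd_total}")
print(f"Intel i3 + 7770: £{intel_total}")
print(f"Intel premium:   £{intel_total - amd_total}")  # the £11 in the comment
```

The totals bear out the comment's point: the two builds are within £11 of each other, so the comparison comes down to gaming performance rather than price.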
Harlequin 2nd July 2013, 20:41 Quote
Thank you, bit-tech.

As someone else said, the majority of PCs sold in the world have onboard (now on-chip) graphics, and your article shows that yes, the on-chip IGP (HD 8670) can do light gaming - and when coupled with a 6670 can actually get reasonable frame rates in some titles! Thank you again for the time.



Just because we have PCs with Radeon ultra bollox or Nvidia uber crap doesn't mean most people who have a PC do.


BTW, on the RAM speed 'thing': the 5800K only officially supports 1866MHz RAM; the 6800K raises that to 2133MHz. It would be awesome to see 2133MHz RAM tested, but CAS9 1866MHz is the sweet spot now IMO.
jrs77 2nd July 2013, 20:47 Quote
Quote:
Originally Posted by Shirty
I read a statistic somewhere that 86% of personal computers in the world are not used for gaming in any meaningful capacity. Of the 14% that are, only 21% have a dedicated GPU.

DISCLAIMER: I may not have read this anywhere, but it's probably true.

Nah, you're totally right about this one actually.

The market for dedicated GPUs is only about 5% of the whole PC-market (including laptops and desktops that is).

I've been trying to tell the enthusiasts this for almost 10 years now on several forums and sites, but people are really subjective when it comes to this.

PC gaming is a really small niche, especially when talking about games that require dedicated GPUs. Most people buy a PC for €/$300-500 and that's it. These days more laptops are sold to end consumers than desktops anyway. Most gamers have played on a console since the days of the Atari 2600. Even back then people laughed at me when I was playing around with a C64/C128 instead.

The IGP will get there sooner or later, especially for games that don't try to mimic realistic graphics. Good games don't rely on realistic graphics anyway, but on good game concepts, and you can clearly see that when looking at the MMORPG market, where WoW is still the biggest one although its graphics don't even match what can be done on an iPad these days.

Anyway, the biggest problem with the AMD APU at the moment is the lack of CPU power compared to Intel, and this is keeping a lot of people from switching to AMD although their IGP is currently way better.
teppic 2nd July 2013, 20:50 Quote
If you wanted a lower-cost gaming system with a 7850, you could get an FX-6300 for less than an i3, and it'd easily outperform it in games. It also overclocks massively, while the i3 is locked.

The APUs offer dual graphics as an extra, it's not the focus of the marketing. I'd agree it's not worth it for many.

As for 1080p gaming, the A10-6800K APU can do >30fps in most games at medium settings and stock clocks, and overclocked it should manage that for almost any game.