bit-tech.net

AMD announces its first 5GHz CPU

AMD's new FX-9590 is the company's - though not the world's - first 5GHz processor, although it reaches these heady heights only under Turbo Core conditions.

AMD had a surprise for those attending the Electronics Entertainment Expo (E3) last night: the unveiling of two new flagship entries in its FX family of processors, including what the company claims is the world's first 5GHz processor.

Now, before you get too excited, there's a caveat there: while the Piledriver-based eight-core FX-9590 does indeed state on its box that it runs at 5GHz, it does so only under conditions suited to AMD's Turbo Core 3.0. The speed at which the chip runs when Turbo Core 3.0 can't be used - such as when all eight of the processing cores are fully loaded, or when the temperature of the chip reaches too high a level - is somewhat lower than that headline-grabbing figure.

It's also not the world's first 5GHz processor, despite AMD's claims to the contrary: back in 2007 IBM released a dual-core 5GHz POWER6 chip for the high-performance computing (HPC) market, followed by the 5.2GHz IBM z196 in 2010. 'This is another proud innovation for AMD in delivering the world’s first commercially available 5GHz processor,' crowed AMD's Bernd Lienhard at the event, unfortunately neglecting to point out its status as the world's first commercially available 5GHz x86 processor.

Those clarifications aside, on to AMD's announcement of what is still a pretty impressive achievement. 'At E3 this week, AMD demonstrated why it is at the core of gaming,' claimed Lienhard. 'The new FX 5 GHz processor is an emphatic performance statement to the most demanding gamers seeking ultra-high resolution experiences including AMD Eyefinity technology.'

The FX-9590, the company's flagship entry in the FX series, is the model with the headline-grabbing 5GHz Turbo Core clockspeed and an as-yet unconfirmed base clock of 4.7GHz. The FX-9370, meanwhile, runs at 4.7GHz under Turbo Core and a rumoured 4.4GHz otherwise. Both models include eight Piledriver processing cores, unlocked multipliers and 8MB of cache memory, with neither including accelerated processing unit (APU)-style integrated graphics. Thermal design power (TDP) for either chip has yet to be revealed.

If you're hankering for an upgrade, however, there's a disappointment to come: AMD is concentrating on releasing the parts to original equipment manufacturers (OEMs) first, with no news yet as to when the parts will come to retail nor how much they will cost when they do. Systems based on both chips are expected to appear from the usual companies this summer, meaning a retail release for the CPUs themselves is unlikely to occur before the third quarter.

79 Comments

tonyd223 12th June 2013, 11:17 Quote
#bored
Parge 12th June 2013, 11:20 Quote
Quote:
Originally Posted by tonyd223
#bored

Cool story bro.

I literally can't wait to find out the TDP on this chip. To be honest it almost makes me want one so my watercooling loop actually has something to do!
Snips 12th June 2013, 11:23 Quote
Yet it still gets owned by an i3-2120 ;)
Shirty 12th June 2013, 11:26 Quote
Quote:
Originally Posted by Parge
Cool story bro.

I literally can't wait to find out the TDP on this chip. To be honest it almost makes me want one so my watercooling loop actually has something to do!

How about 220W?
jinq-sea 12th June 2013, 11:26 Quote
Quote:
Originally Posted by Shirty
How about 220W?

Could cook an egg on that...
SAimNE 12th June 2013, 11:36 Quote
... so any architecture changes? Cuz the 8-core 5GHz is what overclockers have had for a while now... what was broken was the chip itself, not its clock :|
Parge 12th June 2013, 11:54 Quote
Quote:
Originally Posted by Shirty
How about 220W?

I heard that, then this article....
Harlequin 12th June 2013, 13:42 Quote
Hmmm, 8350s @ 5GHz aren't 220W TDP chips though, barely breaking 200.
ChromeX 12th June 2013, 15:06 Quote
Quote:
Originally Posted by Harlequin
Hmmm, 8350s @ 5GHz aren't 220W TDP chips though, barely breaking 200.

The TDP for an 8350 at stock is much closer to 100 than 200 watts. But guess what, this isn't an 8350, it's the 9590! Just because the older model operates around 200W at 5GHz doesn't mean the new one will.
AlienwareAndy 12th June 2013, 15:08 Quote
Quote:
Originally Posted by ChromeX
The TDP for an 8350 at stock is much closer to 100 than 200 watts. But guess what, this isn't an 8350, it's the 9590! Just because the older model operates around 200W at 5GHz doesn't mean the new one will.

Well in fairness the spec does state a 220w TDP....

Not sure what to make of it really. I guess reviews and benchmarks will tell :)
Shirty 12th June 2013, 15:12 Quote
Check the author's update at the end of this.

It's going to be a beast to tame.
AlienwareAndy 12th June 2013, 15:21 Quote
Quote:
Originally Posted by Shirty
Check the author's update at the end of this.

It's going to be a beast to tame.

Any uber CPU needs taming. The 990x for example chomped through power and needed pretty extreme cooling to overclock it well. IIRC Bit Tech used their chiller.... They also used the Rampage Extreme.

But no doubt haters gonna hate because it's an AMD. *sigh*.

Personally I don't care. I've already danced with the devil and put a FX 8320 in my PC and quelle surprise ! it's actually a damned good CPU, was cheap as the proverbial and games like a monster.
Shirty 12th June 2013, 16:13 Quote
I have no hate for AMD, and if they release a chip which is objectively better than Intel's offering at the same price point I'll make the jump back (assuming I need to upgrade).
Parge 12th June 2013, 16:26 Quote
Quote:
Originally Posted by AlienwareAndy

But no doubt haters gonna hate because it's an AMD. *sigh*.

People don't hate AMD, they just make inferior processors at the kind of level most enthusiasts are buying. If anything, most people want AMD to succeed.

I guess you are a little sensitive because you've had a lot of people say your CPU isn't as good as theirs, which makes it feel like you are being 'hated' on.
rayson 12th June 2013, 17:59 Quote
Quote:
Originally Posted by jinq-sea
Quote:
Originally Posted by Shirty
How about 220W?

Could cook an egg on that...

I doubt it. I can't even boil water properly on my 1500W kettle, and that thing is meant to heat water.
(I know you were being sarcastic. I really wish you could fry eggs on that, since I bought a bucketload of eggs for half price.)
tonyd223 12th June 2013, 18:15 Quote
Quote:
Originally Posted by Parge
Quote:
Originally Posted by AlienwareAndy

But no doubt haters gonna hate because it's an AMD. *sigh*.

People don't hate AMD, they just make inferior processors at the kind of level most enthusiasts are buying. If anything, most people want AMD to succeed.

I guess you are a little sensitive because you've had a lot of people say your CPU isn't as good as theirs, which makes it feel like you are being 'hated' on.

All my PCs at home (except 1 I inherited *cough*) are AMD - I really want them to succeed, but they have no game! Look at the power usage (as stated by EVERYONE) for the performance... Intel make them look silly. If it was cars, you'd never buy an AMD...
AlienwareAndy 12th June 2013, 22:44 Quote
Quote:
Originally Posted by Parge
People don't hate AMD, they just make inferior processors at the kind of level most enthusiasts are buying. If anything, most people want AMD to succeed.

I guess you are a little sensitive because you've had a lot of people say your CPU isn't as good as theirs, which makes it feel like you are being 'hated' on.

Are you a psychologist, perchance?

You're wrong, of course, on pretty much all counts.

See, as a bit of a nerd, and one that refuses to follow fashion, I know that technically, from a pure geek sort of prowess, AMD's CPUs are only inferior if one takes an extremely skewed attitude when comparing them.

I'm not the sort to compare, say, a one-legged horse to one with four legs. What I'm saying is that I'm not a short-sighted moron who refuses to read the signs or the writing on the wall.

As for the rest of your psychological profiling of me? Sorry to inform you, but you're wrong there too.

I also own a Sandybridge CPU and the AMD beats it in every game released in the past six months.

But you hang onto that single-core performance for a bit longer if it makes you feel like you belong.
Shirty 13th June 2013, 00:40 Quote
Let's have the numbers then :)

Convince me to ditch my Sandy rig for AMD, because I really want to.
AlienwareAndy 13th June 2013, 00:56 Quote
Quote:
Originally Posted by Shirty
Let's have the numbers then :)

Convince me to ditch my Sandy rig for AMD, because I really want to.

I'm away from home armed only with a crap phone atm. Check out the 8 core thread in the hardware section as I more than made my point there.

In short, though, whilst Intel hold the single-threaded crown, one of their cores does not equal two of AMD's.

No point in you upgrading, given you paid cash for the ability to overclock, but that said, 8-core unlocked fun is cheap, so maybe have a go anyway?

I posted some 8 core supported benchmarks and game results and they are undeniably impressive.
CrapBag 13th June 2013, 01:09 Quote
As usual the achievement is overlooked by the Intel vs AMD argument.

A 5GHz, 8-core processor is pretty impressive, have Intel done anything similar??

P.S. I jumped ship from AMD to Intel about a year ago but I'm still impressed!!!!
Shirty 13th June 2013, 01:09 Quote
I'll take a good look ;)
Otis1337 13th June 2013, 01:59 Quote
Quote:
Originally Posted by AlienwareAndy
Are you a psychologist, perchance?

You're wrong, of course, on pretty much all counts.

See, as a bit of a nerd, and one that refuses to follow fashion, I know that technically, from a pure geek sort of prowess, AMD's CPUs are only inferior if one takes an extremely skewed attitude when comparing them.

I'm not the sort to compare, say, a one-legged horse to one with four legs. What I'm saying is that I'm not a short-sighted moron who refuses to read the signs or the writing on the wall.

As for the rest of your psychological profiling of me? Sorry to inform you, but you're wrong there too.

I also own a Sandybridge CPU and the AMD beats it in every game released in the past six months.

But you hang onto that single-core performance for a bit longer if it makes you feel like you belong.

Weren't you temp banned (or at least got your thread locked) a while ago for talking total sh!t before?
If I remember correctly it had to do with your fanboyism for Alienware... Try not to make the same mistake again...
Anfield 13th June 2013, 03:06 Quote
Quote:
Originally Posted by AlienwareAndy

I'm away from home armed only with a crap phone atm. Check out the 8 core thread in the hardware section as I more than made my point there.

Let's just up the difficulty a bit: prove how AMD CPUs are better when the difference between running an OC'd AMD CPU and an OC'd Intel CPU comes in the form of a £30 difference in the monthly electricity bill.
sub routine 13th June 2013, 07:38 Quote
Pushing to I boats the market is something intel seem to have forgotten. The big q is how much cooling will be required to max this at 5ghz all the time. I'm very interested to find out.
sub routine 13th June 2013, 07:41 Quote
Hmmm I boat = innovate. Please excuse the fat thumbs.
Hustler 13th June 2013, 11:21 Quote
so... core for core, clock for clock, a 5GHz FX chip is only as fast as a 3570K/4670K @ 4GHz.

..not very impressive when you look at it like that is it.

Pointless CPU from an increasingly pointless CPU manufacturer unfortunately, I want them to succeed, I really do, we desperately need the competition, but releasing stuff like this will not help them or the consumer.
benji2412 13th June 2013, 12:02 Quote
Quote:
Originally Posted by AlienwareAndy
Are you a psychologist per chance ?

You're wrong of course on pretty much all counts.

See, as a bit of a nerd and one that refuses to follow fashion I know that technically from a pure geek sort of prowess that AMD's CPUs are only inferior if one takes an extremely skewed attitude when comparing them.

I'm not the sort to compare,say, a one legged horse to say, one with four legs. What I'm saying is that I'm not a short sighted moron who refuses to read the signs or the writing on the wall.

As for the rest of your psychological profiling of me ? Sorry to inform you, but you're wrong there too.

I also own a Sandybridge CPU and the AMD beats it in every game released in the past six months.

But you hang onto that single core performance for a bit longer if it makes you feel like you belong.

This sort of attitude really shouldn't be welcome on this forum, it's usually a decent debate, this is just pure attitude. I'm assuming you're not a scientist, otherwise you'd have supported your argument with fact and reason instead of saying you're on a 'crappy' phone.

So I found it out for you:
Quote:
Originally Posted by AlienwareAndy

I can tell you now I put my 8320 up against a 2550k in purely cherry picked benchmarks* and not once did the 2550k win. My mate even pushed it to 4.9ghz and still couldn't beat me in Firestrike.

* Given that very few of the fun apps and games we use support 8 cores we decided for a laugh to run Firestrike out of 3dmark 13 and Crysis 3. I won every time.

It's going to be an interesting six months, that's for sure. Whilst you won't get huge gains out of switching to AMD you will get proper support for the architecture, and let's face it this isn't the first time AMD have released something two years too early (see also X64 CPUs, here we are now all running X64 Oses) and it won't be the last.

They have a very autistic way of going about things, but usually get there in the end ;)

So you biased your test to favour the CPU you're a fan boy for?

Seems legit. Oh and I find the final sentence slightly offensive with reference to people who are autistic.
Shirty 13th June 2013, 12:30 Quote
Thanks for doing my job for me, I just couldn't be bothered in the end.
benji2412 13th June 2013, 12:35 Quote
Quote:
Originally Posted by Shirty
Thanks for doing my job for me, I just couldn't be bothered in the end.

No problem, I can smell crazy a mile away.
spolsh 13th June 2013, 15:50 Quote
Hard to comment on how good it is (or anything else) until we get benches. But 5GHz out of the box is a substantial jump, and if they've made some other tweaks too, this could just make them competitive in performance and not just price. I'm looking forward to more info, and hope AMD have created something phenom-inal (he he)
AlienwareAndy 13th June 2013, 17:19 Quote
Holy crap ! Smithers ! release the hounds !
Quote:
Originally Posted by Otis1337
Weren't you temp banned (or at least got your thread locked) a while ago for talking total sh!t before?
If I remember correctly it had to do with your fanboyism for Alienware... Try not to make the same mistake again...

No actually I wasn't. I was set upon by a pack of rabid hounds for voicing my distaste at Coolit for selling coolers that leak, but no bans and no Alienware arguments because arguing over what case some one decides to use is rather stupid.

Thanks for your warning though very helpful (note the hint of sarcasm there) I actually received some heart warming PMs from people during that time.
Quote:
Originally Posted by Anfield
Let's just up the difficulty a bit: prove how AMD CPUs are better when the difference between running an OC'd AMD CPU and an OC'd Intel CPU comes in the form of a £30 difference in the monthly electricity bill.

Let's just up the crap you mean? Let me put this scientifically and logically instead of blowing it out of all exaggerated proportion shall I?

Last time I checked, a 100W light bulb costs 32p to run for 24 hours, going on kWh. Now it's also fair and quite scientifically correct to say that an overclocked 8-core AMD CPU uses roughly 100W more at full load than the comparative Intel CPU when it too is overclocked. So, if you ran your PC at full load it would use -

32p a day, 24 hours a day, 365 days a year = £116 a year. However, let's break that down to more realistic figures shall we?

I game for three hours a day, pretty much every day. So, let's find out how much it costs an hour to run a 100w more power hungry CPU under full load, shall we?

32 divided by 24 hours is 1.3 pence per hour. So if we gamed using 100w more per hour than a comparative Intel CPU we would use 3.9p per day.

3.9p x 365 days in a year = £14.23 a year. Given that the price differential between the i5 and the 8320 is £87, it would take £87 ÷ £14.23-per-year - about six years, off the top of my head - to make up the difference and give your power argument any weight at all.

Look, if you're going to come out with such statements then have the science and the logic on tap to back them up.
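(For anyone who wants to sanity-check that arithmetic, here's a quick Python sketch; the 100W delta, the 32p-per-24h bulb cost, the three hours' daily gaming and the £87 price gap are the figures quoted in the post above, not measured or official numbers:)

```python
# Running-cost differential, using the thread's own figures (assumptions,
# not measurements): ~100 W extra draw under load for the AMD chip.
extra_watts = 100
pence_per_kwh = 32 / 2.4        # 32p for 24h of a 100 W bulb => ~13.3p/kWh
hours_per_day = 3               # daily gaming time from the post above
price_gap_pounds = 87           # quoted i5 vs 8320 price difference

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost_pounds = extra_kwh_per_year * pence_per_kwh / 100   # ~£14.60/yr
years_to_break_even = price_gap_pounds / extra_cost_pounds     # ~6 years
```

Which lands on the same "about six years" figure, so on those assumptions the break-even sums do hold up.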
Quote:
Originally Posted by Hustler
so... core for core, clock for clock, a 5GHz FX chip is only as fast as a 3570K/4670K @ 4GHz.

..not very impressive when you look at it like that is it.

Pointless CPU from an increasingly pointless CPU manufacturer unfortunately, I want them to succeed, I really do, we desperately need the competition, but releasing stuff like this will not help them or the consumer.

That isn't true either. Clock for clock (as in GHz), fact dictates that the AMD, when all 8 cores are being used, is faster than the i5 3570K and the Sandybridge 2500K. I have not had a chance to compare any data for Haswell yet, but given the right circumstances the 8350 can duke it out with the 3770K, so I will assume it can put up a decent fight against Haswell, again when supported properly and all 8 cores are being used.

It's never been a secret that logically and scientifically AMD's 8 core CPUs, when given the correct software, have been very good. In games? not so good. Not up until we were told we were getting 8 cored consoles, and that all of the people writing the games for those consoles would be using the same techniques needed for AMD's CPUs. You can read about that here.

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

And from my own findings when using any apps or games that make use of all of those threads? the results are the same. The AMD on sheer core count out muscles the comparative Intel CPUs costing nearly twice as much.

So I guess it depends on what you see as impressive.
Quote:
Originally Posted by benji2412
No problem, I can smell crazy a mile away.

Can you really? why haven't you offered up any facts then? There's a thread on this forum that I've been posting facts and figures to based from my own testing that no one has bothered to respond to. If you want to prove it clock your CPU to 4.2ghz and then run the same tests and the same benchmarks. In fact, given your extremely ironic reply below, don't bother.
Quote:
Originally Posted by benji2412

Seems legit. Oh and I find the final sentence slightly offensive with reference to people who are autistic.

I'm going to ignore everything else you've said based on that final line there. If you find that sentence offensive to people with autism, I wouldn't worry, given that, you know, I'm autistic (well, actually Asperger Syndrome, AKA ASD, AKA Autistic Spectrum Disorder). Which would be why I said what I did, given that AMD released an 8-core CPU *IMO* two years early, when it really wasn't needed or utilised. That, however, is set to change.
I find it quite astonishing that you DARE get upon your high horse and "stick up" for people with autism when you have the audacity to go around saying things such as -
Quote:
Originally Posted by benji2412
No problem, I can smell crazy a mile away.

Which as someone who is autistic I find INCREDIBLY offensive and rude given that it's the usual thick ass Neurotypical attitude to some one such as myself.
If you can't think in science and logic don't bother talking to me. Random false facts and lies about AMD won't wash, given that I am able to see that 8 AMD cores are clearly faster than 4 Intel ones no matter how good they are.
Harlequin 13th June 2013, 17:31 Quote
ASD is a spectrum - imagine a rainbow; some days you might have a lot of red tendencies, others it might be blue... but it's a wide range.


and I hate the labels that society puts on things other than the considered `normal`.
Shirty 13th June 2013, 17:39 Quote
I think the problem you are going to have is to convince the predominantly gamer community on here that AMD is a better proposition. Sure, if you work a lot with productivity apps which are heavily multithreaded, then your CPU is a no-brainer - especially at the price point. But when it comes to gaming, there are only a handful of games that can even address four cores, let alone more. The majority still only really benefit from dual core. So the per-core argument really holds water here.

But if it does what you need it to then good for you! Don't let other people's opinions put you off enjoying your rig.

Personally I'd like to see an architecture overhaul from AMD, where they offer six or eight cores with similar per-core performance to Intel, for a similar price. Then they will get my money again.
AlienwareAndy 13th June 2013, 17:43 Quote
Quote:
Originally Posted by Harlequin
ASD is a spectrum - imagine a rainbow; some days you might have a lot of red tendencies, others it might be blue... but it's a wide range.


and I hate the labels that society puts on things other than the considered `normal`.

There is no such thing as 'normal'. I've met many people in my life from different countries and god knows what else and I can attest that I've met some quite frankly barking mad people running businesses and god knows what else.

When AMD released Bulldozer I saw what it could do technically and I was very impressed. Technically as in a scientific experiment sort of way. Then a part of me said "Dear lord, what on earth are you doing?!?!"

The world simply wasn't ready for 8 core desktop processors.

Now though? things are changing. Even if we look back at Bulldozer and see what it could do in say, Winzip, we were clearly given a taste of what it would and could do when supported properly.

THAT is how I view 8 core AMD CPUs. I can't sit here and lie and say they cost £30 more a month because the fact is they just don't.

One thing I have learned about technology during my life is that it never sits still. Buying something and then hoping it will remain top of the pile never happens, we have to look forward.

Given that it's absolutely no secret AT ALL - and that I hope we can all agree that we have been hand fed second hand console slop for YEARS now, then we can all find it logically fair to assume that once the console games are being developed for an 8 core AMD CPU which is very similar then THAT is what we will get.

Will they take the time to ensure that Intel CPUs are used fairly ? haha, have they bothered supporting SLI or Crossfire or anything else that a console doesn't use? A PC gamer will always have to just take the second hand slop handed over by the game devs. PC games are nothing but a second pay day and one they want minimal fuss or work to get their hands on.
Quote:
Originally Posted by Shirty
I think the problem you are going to have is to convince the predominantly gamer community on here that AMD is a better proposition. Sure, if you work a lot with productivity apps which are heavily multithreaded, then your CPU is a no-brainer - especially at the price point. But when it comes to gaming, there are only a handful of games that can even address four cores, let alone more. The majority still only really benefit from dual core. So the per-core argument really holds water here.

But if it does what you need it to then good for you! Don't let other people's opinions put you off enjoying your rig.

Personally I'd like to see an architecture overhaul from AMD, where they offer six or eight cores with similar per-core performance to Intel, for a similar price. Then they will get my money again.

When every single game developer asked says - "I'd go with the AMD" then there has to be logic and reason for it. THIS was what I was referring to with my 'writing on the wall' comment.

Those devs know better than nearly all of us, given they have their hands on the tech that is forthcoming.

Just like Winzip, there are a tiny handful of other apps and games that support 8 cores and use them all. And when they do? that's when we find out that Intel's incredibly superior single threaded performance falters (taking close note that I am acknowledging facts? IE - Intel's single threaded performance is quite incredible?)

Bottom line? as I keep saying, over and over, Intel's single threaded performance may be amazing and may batter AMD's. Just a fact, that. However, it's being shown slowly but surely that one of their superior cores does not equal or better two of AMD's.

So when it comes down to the most important thing of all - PRICE - AMD are kicking ass.
benji2412 13th June 2013, 18:20 Quote
Scientifically correct to work out your power consumption working backwards from an approximate cost of running a 100W bulb for 24hrs?

No, it isn't. You'd look up the price per kWh and calculate it correctly. Stop talking s***
Harlequin 13th June 2013, 18:26 Quote
energysavingtrust calculates current average electricity cost at 15.3p per kwh

so a 100w bulb switched on for 10 hours costs 15.3p.
AlienwareAndy 13th June 2013, 18:27 Quote
Quote:
Originally Posted by benji2412
Scientifically correct to work out your power consumption working backwards from an approximate cost of running a 100W bulb for 24hrs?

No, it isn't. You'd look up the price per kWh and calculate it correctly. Stop talking s***


Given that the power differential is 100W or thereabouts between the two CPUs, there is nothing wrong with my calculations, based on the fact that I DID calculate that 100W differential by using kWh to figure it out, based on current charges for electricity.

I could have laid it out per hour yes, but what's the point? if a 100w bulb costs 32p a day to run for 24 hours then dividing up that 32p by 24 works perfectly well.

Fact is, if a 100w bulb costs 32p to run for 24 hours then when broken down it costs 1.3p per hour.

If you don't have anything else to add or anything worth reading then kindly, you know? stop replying to me.

I find you incredibly rude and offensive.
AlienwareAndy 13th June 2013, 18:31 Quote
Quote:
Originally Posted by Harlequin
energysavingtrust calculates current average electricity cost at 15.3p per kwh

so a 100w bulb switched on for 1 hour costs 15.3p.

British Gas charge 12.745p apparently. I'm checking it now, and a kWh is 1000W for one hour. A 100W light bulb uses 1/10 of that, so 0.1kW x 10h = 1kWh.

That means it costs 12.745p for ten hours, so my 32p a day was rather generous. It's actually less than 32p.

Edit. Actually, using your rate (the first one I used before looking on BG) it works out nearer 37p for 24 hours, a touch above my 32p.
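(A quick Python sketch of the unit maths in the last few posts, using the two per-kWh rates quoted in-thread; both are 2013 figures, not current tariffs:)

```python
# A 100 W bulb draws 0.1 kW, so ten hours of it is exactly 1 kWh.
bulb_kw = 0.1
kwh_for_10h = bulb_kw * 10                  # = 1.0 kWh

bg_rate = 12.745                            # British Gas, pence per kWh
est_rate = 15.3                             # Energy Saving Trust average

cost_24h_bg = bulb_kw * 24 * bg_rate        # ~30.6p per day
cost_24h_est = bulb_kw * 24 * est_rate      # ~36.7p per day
```

So at the British Gas rate the 32p-a-day figure was slightly generous, while at the Energy Saving Trust rate it falls a little short of the true cost.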
benji2412 13th June 2013, 18:34 Quote
I'm replying because you're using the word scientific a lot when it's not. It's very annoying, so is your condescension.
Harlequin 13th June 2013, 18:36 Quote
Quote:
Originally Posted by AlienwareAndy
British Gas charge 12.745 apparently. I'm checking it now and a kwh is 1000w per hour. A 100w light bulb uses 1/10 of that, so 0.1kW x 10h = 1 kWh.

That means it costs 12.745 for ten hours, so my 32p a day was rather generous. It's actually less than 32p.

Edit. Actually using your rate (the first one I used before looking on BG) it's pretty much bang on 32p for 24 hours.

I changed the pricing since my math was out, as it's kilowatt-hours not watt-hours..... therefore recheck my post ;)
AlienwareAndy 13th June 2013, 18:39 Quote
Quote:
Originally Posted by benji2412
I'm replying because you're using the word scientific a lot when it's not. It's very annoying, so is your condescension.

Really? well given that I wasn't initially responding to you then it doesn't matter, does it?

What I was responding to was this.
Quote:
Originally Posted by Parge
People don't hate AMD, they just make inferior processors at the kind of level most enthusiasts are buying. If anything, most people want AMD to succeed.

I guess you are a little sensitive because you've had a lot of people say your CPU isn't as good as theirs, which makes it feel like you are being 'hated' on.

Which is terribly childish and condescending. Why you felt you had to get involved when my reply was not toward you OR had anything to do with you is a bloody mystery. However, when you did you then started with the insults.

Maybe the next time you get stuck into some one you'll figure out if it has anything to do with you first.
benji2412 13th June 2013, 18:44 Quote
Actually I thought you misread Parge's comment entirely. It's quite common for people to defend their purchase, just the way you are. He was just highlighting that.

Anyway I just didn't like the way you seem to justify your arguments with the word scientifically. Especially in that thread for arguing the pros of an 8 core AMD CPU you biased your method to validate your hypothesis.
Ljs 13th June 2013, 18:48 Quote
Arguing on the internet is cool again apparently.
AlienwareAndy 13th June 2013, 18:53 Quote
Quote:
Originally Posted by benji2412
Actually I thought you misread Parge's comment entirely. It's quite common for people to defend their purchase, such the way you are. He was just highlighting that.

Anyway I just didn't like the way you seem to justify your arguments with the word scientifically. Especially in that thread for arguing the pros of an 8 core AMD CPU you biased your method to validate your hypothesis.

And what you need to understand is that I am a 40 year old man who is perfectly capable of making his own mind up.

Usually in the face of adversity and pure ignorance, when people give you the "There there !" routine when they see you as of subnormal intelligence given you are autistic.

As I have said over and over, to you and anyone else - I own both an AMD FX 8320 *and* an Intel Xeon that is based on the i5 2400. It has more cache than the i5, however, and no annoying underpowered GPU aboard.

The comparative chip from Intel currently costs around the £140 mark. I paid £149 for mine in a sale ages ago.

In not one single test can it compete with the AMD (edit - when the test is fair). In fact, I saw the AMD for what it is - an upgrade.

As for my method? no, sorry I don't see anything wrong with that, given that the CPU I have compared it to costs more than the AMD part. I've welcomed people to run the same benchmarks as I have (and even posted mine and links to them) and they haven't bothered.

Which there wouldn't be a lot of point to given that my findings are quite correct. IE - using 3Dmark Firestrike as an example of something that eats all 8 cores? then clock for clock the FX is faster than the Intel part.

Which would be down to core count, which is something that AMD have not yet had the credit for. So far all people have been able to say is "Well the AMD uses about three thousand pounds more electricity !" and "Intel have the better single core performance !".

Some of which is true, some of which isn't. However, it's no secret that when an application or game can make use of all of those threads? then AMD most certainly have a place in the market.

Times are changing. Give it a year? it'll be "But Intel simply can't compete with AMD's sheer amount of cores" and "But the AMD only costs about 1p more a decade to run !".

You know? false myths and all that crap that tends to happen.

I've never, in all of my 40 years, EVER been a fanboy. I don't care who makes what, I'm just passionately in love with the technology. And technically the AMD FX CPUs are pretty incredible.

First to market again! Just like the first x64 desktop CPU that everyone (Intel included) laughed at, yet here we all are sitting on x64 OSes with 16GB of RAM.

Funny that !
AlienwareAndy 13th June 2013, 18:55 Quote
Oops DP sorry !
Otis1337 14th June 2013, 01:40 Quote
Are you saying that bit-tech are lying about their benchmarks then? Because AMD are at the bottom in everything, all of the time.
AlienwareAndy 14th June 2013, 02:12 Quote
Quote:
Originally Posted by Otis1337
Are you saying that bit-tech are lying about their benchmarks then? Because AMD are at the bottom in everything, all of the time.

All of the time? Funny, that, because I just checked over the review of the 8350 again and the only two games they tested it in were TESV (famed for hating more than two cores) and Total War: Shogun 2.

There are no other games tested there, not even BF3, which is known to support eight cores and fare much better on the AMD chips.

If I were to rewind the clock back to November 2012, when they initially reviewed the 8350, there were one or two games then that supported the AMD CPUs properly. Neither was tested at the time, but looking around, BF3 is an eight-core lover, as is Crysis 3.

Not only that, but since then two hotfixes have been released for Windows 7 (Windows 8 has them natively) that drastically improved performance. They were to do with core parking and caching issues - basically, the eight-core AMDs were dumping their cache mid-operation, causing lag. You can find them both here.

http://support.microsoft.com/kb/2645594

Currently, the CPU scheduling techniques that are used by Windows 7 and Windows Server 2008 R2 are not optimized for the AMD Bulldozer module architecture. This architecture is found on AMD FX series, AMD Opteron 4200/4300 Series, and AMD Opteron 6200/6300 Series processors. Therefore, multithreaded workloads may not be optimally distributed on computers that have one of these processors installed in a lightly-threaded environment. This may result in decreased system performance for some applications.

And.

http://support.microsoft.com/kb/2646060

Which, I would hazard a guess, were not installed at the time of the review, given they don't download automatically. AMD's fault? Yeah, pretty much - releasing a CPU that isn't supported properly, then releasing a fix.

However, what I am focusing on more is what has happened since then. We're slowly seeing more and more apps and games come along that DO support eight cores - 3DMark (2013), for example. When pitted against the quad-core Intels, the AMD comes out ahead. Not by miles, but then the AMD CPUs cost almost half what the Intels do. Then, as I mentioned, there is BF3, where AMD do not suffer at all, and in Crysis 3? Take a look here.

http://www.pcgameshardware.de/Crysis-3-PC-235317/Tests/Crysis-3-Test-CPU-Benchmark-1056578/

The 8350 comes out ahead of the 3770k.

Then of course one needs to realise that pretty much every console port we get will be fully optimised to see and use AMD's architecture properly. It doesn't end there, either. AMD have been spending their time and money getting into bed with as many game devs as possible, getting them aboard their "Gaming Evolved" strategy.

Don't think for one minute that I'm saying anyone should dump their Intel CPU and get an AMD one; I'm just saying that as a new purchase they are a choice again, instead of something to avoid. They're also far cheaper than the comparative Intel solution.
Spreadie 14th June 2013, 09:58 Quote
Quote:
Originally Posted by jinq-sea
Could cook an egg on that...
Quote:
Originally Posted by rayson
I doubt I can even boil water properly on my 1500W kettle, and that thing is meant to heat water.
(I know you were being sarcastic - I really wish you could fry eggs on that, since I bought a bucketload of eggs at half price.)
I read that and thought of this :)
jinq-sea 14th June 2013, 11:51 Quote
Quote:
Originally Posted by Spreadie
I read that and thought of this :)

I'm sure I saw that years ago actually - it's brilliant! That brightened up my morning - cheers Dave!
rayson 14th June 2013, 12:33 Quote
How do I cry in tapatalk
AlienwareAndy 14th June 2013, 14:24 Quote
Couple of videos.. First up, Battlefield 3, 8350 vs 3570k.

a2EcXrgJLY0

The video is dated pre-hotfix, but whatever - there's 1fps in it. Of course, it's not worth buying the 8350 when the 8320 is about £40 less and pretty much identical. You probably wouldn't get the same extreme overclock out of the 8320, but you'd definitely get a good "day to day" OC.

Then on to Crysis 3 and Far Cry 3 (both using the eight-core-munching CryEngine 3): 3770K vs 8350, both overclocked.

rIVGwj1_Qno

And again there's literally nothing in it to separate the two.

Which pretty much goes along with the Crysis 3 tests I performed.
Shirty 14th June 2013, 14:57 Quote
AMD have definitely closed the gap a fair bit for the first time since the Core 2 architecture revolutionised the game, and there is a lot to like about their latest processors (particularly the cost). But be completely honest with yourself: if they were the best processors at their price points, do you not think the general consensus across review sites, hardware mags and user opinion would reflect this?

As it stands, the overwhelming majority still come back to recommending Intel for the time being, despite how much we all want AMD to come out on top. However hard you try to defend your purchase, you are always going to come up against this brick wall unfortunately.
AlienwareAndy 14th June 2013, 15:25 Quote
Quote:
Originally Posted by Shirty
AMD have definitely closed the gap a fair bit for the first time since the Core 2 architecture revolutionised the game, and there is a lot to like about their latest processors (particularly the cost). But be completely honest with yourself: if they were the best processors at their price points, do you not think the general consensus across review sites, hardware mags and user opinion would reflect this?

Right now, as of this second? Depends. If websites took another look at the eight-core CPUs and then decided to prospect a little, as I have, then yes, right now you can say, hand on heart, that certain AMD CPUs are worth having.

PC Format now recommend the 6300 over the i3 for two reasons: one, forthcoming (and existing, though very small!) core support, and two, it can be overclocked.

I'm not being funny here, but when you have a six-core unlocked CPU that costs less than the locked, derped i3?

http://www.aria.co.uk/SuperSpecials/Other+products/AMD+%28Piledriver%29+FX-6300+3.50GHz+%284.10GHz+Turbo%29+Socket+AM3%2B+6-Core+Processor+-+Retail+?productId=52724

vs

http://www.aria.co.uk/Products/Components/Processors/Intel+CPUs/Core+i3+-+Socket+1155+%28Ivy%29/Intel+Core+i3-3220+3.30GHz+%28Ivy+Bridge%29+Socket+LGA1155+Processor+-+Retail+?productId=52002

There really is no contest at all. Then we move up to their next offering, the eight-core 8320, priced around £113. The cheapest locked, entry-level i5 costs around £130.

The eight-core AMD can be easily pushed, even by an overclocking amateur, to 4.2GHz on a crap board with 4+1 power stages. At that level (as I have shown), when eight cores are in use the FX 8320 absolutely wees over the Intel price-for-price counterpart (that being my Xeon E3 1220, or i5 2400, or the Ivy equivalent - they're all about the same).

Spend a little more on the board? 4.6GHz is usually a given on the FX 8320, given that both the multiplier and FSB can be used for overclocking. Me? I used the FSB as I find it much more challenging, and here we have my overclock, stable 24/7.

http://valid.canardpc.com/2811220

And when I overclocked my Xeon.

http://valid.canardpc.com/2786471

You can tell by the similarities and the usernames that they are both legit and both belong to me. And the fact is, the 8320 absolutely smashes the Xeon to bits in games that support eight cores, and even dukes it out with the Intel (thanks to the free 700MHz overclock!) in games that support fewer threads.

Linus from LTT did a couple of great 3570K vs 8350 videos (pre-patch, but whatever) and again there was hardly anything in it - six of one, half a dozen of the other, and the AMD always provided enough FPS to cover your minimums more than acceptably.
Quote:
Originally Posted by Shirty
As it stands, the overwhelming majority still come back to recommending Intel for the time being, despite how much we all want AMD to come out on top. However hard you try to defend your purchase, you are always going to come up against this brick wall unfortunately.

I can completely understand that. I'm not trying to get people to dump their Intel systems and move over to AMD (some will out of sheer boredom though). All I'm trying to do here is -

1. Show the true hard facts about what happens when AMD CPUs are being used properly and

2. Try to get the message across that times have changed. Within a year, every game released (simply because the devs are bloody lazy!!) will support eight AMD cores, fully supporting their architecture and not running quite as well on Intel CPUs.

I also think (regardless of the power consumption, because nobody cares!*) that we should be excited that AMD are pushing their technology to new-found levels of speed (the 5GHz barrier), and more excited that we will all be able to afford their offerings, unlike Intel's range where you either buy one of their two top-end models or end up with a derped, locked-down processor.

As of the news today? They've now removed their turbo-bin overclock (four bins), meaning you can't even get 400MHz out of a non-K CPU. They're pushing people to AMD.

* Intel 9xx series: my 950 used an eye-watering 208W under Prime95 load when overclocked.

Nvidia Fermi: yes, they revised it, but even bit-tech had to report that the 590 broke the PCIe spec - tore it up and threw it out of the window - by using more than 300W.

Nobody cares, man. All they want is fun with overclocking and brute force performance.
captain caveman 14th June 2013, 19:41 Quote
Slightly puzzled, as I haven't yet seen any official comparisons of the AMD FX-9590 to an Intel chip - can somebody provide a link?
Harlequin 14th June 2013, 19:54 Quote
Haven't seen any reviews, anywhere, of the 9590 yet.
AlienwareAndy 14th June 2013, 21:07 Quote
That would be because it's not out yet :D

Not sure if they're even going to send out any review samples.. They seem to have been really weird with that. Finding a review of the 8320 is like trying to find rocking horse poo.
rollo 14th June 2013, 21:15 Quote
That's because it's a paper launch that has not hit retail channels - same old AMD, different week. There aren't even engineering samples of said chips; it will be another ultra-rare, ultra-cost chip. They made one before, at £300+, that hit a pretty high level of GHz.

Wish Apple/Samsung would just buy them out already, so we don't need to listen to their various supposed launches that arrive 2-4 weeks after said launch.

Andy is pro-AMD enough, yet he has still to post the core activity logs for his CPUs for some reason - wonder if that's because every game he tested capped at four cores. I think most would like to see him disable SLI and run the same tests on a single 670, so that others with an Intel CPU and a single 670 could beat his scores.

Is AMD's 8-core still effectively four real cores and four smaller cores? Or are we finally in real eight-core land?
Harlequin 14th June 2013, 21:23 Quote
same old NVidia with paper launches.... or


AMD have said it's OEM-only, so no chance of seeing it at retail.....


Steamroller looks very interesting, with the on-die ARM core for low-power work.
AlienwareAndy 14th June 2013, 21:29 Quote
Quote:
Originally Posted by rollo
That's because it's a paper launch that has not hit retail channels - same old AMD, different week. There aren't even engineering samples of said chips; it will be another ultra-rare, ultra-cost chip. They made one before, at £300+, that hit a pretty high level of GHz.

Wish Apple/Samsung would just buy them out already, so we don't need to listen to their various supposed launches that arrive 2-4 weeks after said launch.

Andy is pro-AMD enough, yet he has still to post the core activity logs for his CPUs for some reason - wonder if that's because every game he tested capped at four cores. I think most would like to see him disable SLI and run the same tests on a single 670, so that others with an Intel CPU and a single 670 could beat his scores.

Is AMD's 8-core still effectively four real cores and four smaller cores? Or are we finally in real eight-core land?

The 3DMark physics score for me was actually higher when running one 670 - I have no idea why. It also isn't affected by turning it up to Extreme mode. Thus there isn't much point, given that the physics score is the one we should be focusing on here, not the graphics score.

I posted benchmarks showing the difference between the AMD CPU and the Intel price-for-price equivalent (oh, go on then, not quite equivalent, given the Intel still costs more), showing Crysis 3.

CryEngine 3 has been developed mostly for the PS4 and Xbone. As such, it comes as no surprise that when AMD eight-core CPUs are used the results are very favourable. I posted two videos today - go and take a look. Both support eight cores and both support what I have said in this thread. If the 8350 can duke it out with a 3770K in a game that supports it properly, then there's really no need for me to keep posting benchmarks of my Xeon, given they'll be considerably lower than an overclocked i7 3770K.

However, just in case you wanted to see it again, here is the physics score from a £130+ Intel CPU that has been forcibly overclocked.

http://s72.photobucket.com/user/timmahtiburon/media/670SLIbenchmarks/3DM13.jpg.html

Note the score of 5844, and that the CPU is forced to run at 3.4GHz when it runs at 3.2GHz on four cores at stock.

Now let's move to the eight-core AMD with the 4.2GHz overclock.

http://s72.photobucket.com/user/timmahtiburon/media/Alienstuff/firestrike.jpg.html

The physics score is 8163 and is completely unaffected by SLI.

3DMark is free to download, and you are free to run your CPU, whatever it may be, at 4.2GHz and post your results, paying attention to the physics score.

But as I said earlier, when eight cores are being used then, clock for clock, the AMD gets higher scores.
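The "clock for clock" claim above can be sketched with the two physics scores quoted in this thread. This is only a back-of-the-envelope illustration - Fire Strike physics scores aren't strictly linear in clock speed, and it measures total throughput across all cores, not per-core speed:

```python
# Rough score-per-GHz comparison using the Fire Strike physics scores
# quoted in this thread. Scores don't scale perfectly linearly with
# clock speed, so treat points-per-GHz as a rough metric only.
results = {
    "FX 8320 (8 cores @ 4.2GHz)": (8163, 4.2),
    "Xeon E3 1220 (4 cores @ 3.4GHz)": (5844, 3.4),
}

for name, (score, ghz) in results.items():
    print(f"{name}: {score / ghz:.0f} points per GHz")
```

On those numbers the FX comes out ahead per GHz (roughly 1944 vs 1719 points), which is the point being made - largely thanks to having twice the cores in play.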
AlienwareAndy 14th June 2013, 21:32 Quote
Quote:
Originally Posted by Harlequin


Steamroller looks very interesting, with the on-die ARM core for low-power work.

And will use the existing AM3+ socket. Which was a huge part of the reason why I decided to jump ship because I was sick of having to buy new bloody boards all the time.
AlienwareAndy 14th June 2013, 21:49 Quote
Right here you go then.

http://s72.photobucket.com/user/timmahtiburon/media/Alienstuff/singlegpu.jpg.html

The score has dropped by 100 points, which is pretty much margin of error. I wasn't about to run it over and over again for the sake of 100 points.

So there - feel free to run the benchmark (trying to stay within the same GHz range) on any CPU, and let's see how it stacks up to the £113 8320.

There's no point in my benchmarking my game collection, as:

I have hundreds, including all of the latest.

Games cost money; 3DMark is free and a good way to get a rough idea of how a CPU performs via its physics score.

I've posted conclusive proof (at least I see it that way, given I'm not dishonest) that my 8320 clearly beats my Xeon when eight cores are being used. Which stacks up with the videos I posted today.
Harlequin 14th June 2013, 22:33 Quote
http://www.3dmark.com/fs/260551

physics score of 7870 - on my `old` X6 oO
rollo 14th June 2013, 23:17 Quote
An X6 + 7870 beats an FX 8350 and a 670? Hmm, WTF, lol. Even if it is by a crappy 80 points.

Harl is ahead on the averages - that's pretty surprising.
Harlequin 14th June 2013, 23:35 Quote
that card wasn't an `LE` either just a regular 7870

would like to see a full compare link from andy
AlienwareAndy 14th June 2013, 23:43 Quote
Quote:
Originally Posted by rollo
An X6 + 7870 beats an FX 8350 and a 670? Hmm, WTF, lol. Even if it is by a crappy 80 points.

Harl is ahead on the averages - that's pretty surprising.

Nice to know you've paid attention to this thread. I don't run an 8350, I run an 8320.

My CPU scores higher, so I'm not sure what you mean. I also don't have my GPU overclocked, as there's no point what with having SLI and all. He's got his overclocked to within an inch of its life :D

Edit. 3570k @ 4.2ghz.

http://s72.photobucket.com/user/timmahtiburon/media/Alienstuff/3dmark.png.html

See? Dead level. The only difference is the price.
rollo 15th June 2013, 00:08 Quote
is all you care about physics score?
CrapBag 15th June 2013, 00:16 Quote
Yeah, I too have noticed the over-emphasis on the physics score.

I'm far more interested in the overall GPU score than physics.
AlienwareAndy 15th June 2013, 00:17 Quote
Quote:
Originally Posted by rollo
is all you care about physics score?

Seeing that it's the score that pertains to the CPU, yes.

I said earlier in this thread that, clock for clock, when all eight cores on AMD's CPUs are used they are faster than the four cores of the Intel 3570K at the same clock speed.

The same translates into games, too. The 8320 and 8350, clock for clock, are up there with the i7 when supported properly.

Which has been my point throughout this entire thread. Well, that and this -

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

We approached a number of developers on and off the record - each of whom has helped to ship multi-million-selling, triple-A titles - asking them whether an Intel or AMD processor offers the best way to future-proof a games PC built in the here and now. Bearing in mind the historical dominance Intel has enjoyed, the results are intriguing - all of them opted for the FX-8350 over the current default enthusiast's choice, the Core i5 3570K.

And of course this.

As Avalanche Studios' Chief Technical Officer, Linus Blomberg, tells us:

"I'd go for the FX-8350, for two reasons. Firstly, it's the same hardware vendor as PS4 and there are always some compatibility issues that devs will have to work around (particularly in SIMD coding), potentially leading to an inferior implementation on other systems - not very likely a big problem in practice though," he says.

Quote:
Originally Posted by Harlequin


would like to see a full compare link from andy

That's a bit of a faff for me because my desktop isn't connected to the internet - that's why my scores aren't being sent up. I'll see what I can do, but getting it online is a major faff.
AlienwareAndy 15th June 2013, 01:42 Quote
Well after a serious faff I managed to get the PC online.

http://www.3dmark.com/3dm/780243

Which says two things.

1. Disabling SLI in a profile doesn't work properly (I forcibly disabled it for that run)

2. As I mentioned earlier my physics score is higher when I only run one card. It's now where it should be vs the I5.
captain caveman 15th June 2013, 02:14 Quote
Quote:
Originally Posted by Harlequin
Haven't seen any reviews, anywhere, of the 9590 yet.

Thought as much; just puzzled by all the comments saying it will be slower than the Intel 3570.
I do hope it's good, if nothing else to kick Intel out of its mediocre ways and give us faster chips.
AlienwareAndy 15th June 2013, 02:28 Quote
Quote:
Originally Posted by captain caveman
Thought as much; just puzzled by all the comments saying it will be slower than the Intel 3570.
I do hope it's good, if nothing else to kick Intel out of its mediocre ways and give us faster chips.

It will be faster than the 3570 by miles, when supported properly.

So expect lots of reviews slagging it off for poor single-core performance, because at the end of the day it is an 8350 with a stupidly high stock speed.

I mean, don't get me wrong, I wouldn't expect people to only review it using multi-threaded apps and software, but give it time. Might be a bit premature for some, but once the consoles come along it'll come into its own :)

I would just Google 8350 5ghz. There are plenty out there running at that speed :)
Harlequin 15th June 2013, 09:12 Quote
Quote:
Originally Posted by AlienwareAndy
Well after a serious faff I managed to get the PC online.

http://www.3dmark.com/3dm/780243

Which says two things.

1. Disabling SLI in a profile doesn't work properly (I forcibly disabled it for that run)

2. As I mentioned earlier my physics score is higher when I only run one card. It's now where it should be vs the I5.

thank you for uploading a compare :D

Seems the latest patch for 3DMark bumped physics scores up - mine's at 8048 now (yours being 8215), but then again I am two cores and 500MHz lower!
konstantine 15th June 2013, 14:39 Quote
This is such a cheap move by AMD. Instead of wasting all that silicon on producing those crappy chips, why don't they use GloFo's 32nm to manufacture GPUs, instead of buying more expensive silicon from TSMC?

The Kabini quad core is pretty complex, logic-wise, and consumes quite a bit less power than the competing dual cores.

Actually, why don't they just dump their moronic design philosophy and build quad-issue cores with hyper-threading and no shared parts between neighbouring cores? A 3-issue 10.5 Stars core is quite a bit faster than a Piledriver core at the same clock speed. I mean, hyper-threading is such a great approach to maximising performance at very low cost.

AMD is becoming the degenerate of the industry. I hope they survive, for the sake of keeping GPU prices down. Not that we're going to get any decent games in the future, 'cos I've seen those shitty upcoming games and they all look crippled, with alright visuals and crappy, primitive mechanics.
Battlefield 4 particularly was such a disappointment.
AlienwareAndy 15th June 2013, 16:48 Quote
Quote:
Originally Posted by Harlequin
thank you for uploading a compare :D

Seems the latest patch for 3DMark bumped physics scores up - mine's at 8048 now (yours being 8215), but then again I am two cores and 500MHz lower!

Not surprising, given that your cores are faster than Piledriver's :)

Obviously they're all being used properly too. Shame AMD couldn't get support in for their mass-cored CPUs earlier, really.

Never mind. Do you still have it? Haha, it'll be worth hanging onto, given it'll be right up there with the best for gaming :)
Shirty 16th June 2013, 11:49 Quote
The 220W figure proves to be true. Also, OEM-only confirmed.
rollo 16th June 2013, 12:43 Quote
http://www.3dmark.com/3dm/787665?

My old i7 950 rig for a comparison on physics; CPU at 3.06GHz stock speed (despite what it says about 2.8GHz, lol - 3DMark is still fubar for me).

For all of AMD's improvements, it's less than 200 points ahead of a four-year-old CPU.

If all you do is gaming, would anyone upgrade for 200 points? Which is the point many have made: why spend money on a CPU that you don't need when you can upgrade your GPUs and push the graphics higher and higher?
AlienwareAndy 16th June 2013, 14:56 Quote
Quote:
Originally Posted by rollo
http://www.3dmark.com/3dm/787665?

My old i7 950 rig for a comparison on physics; CPU at 3.06GHz stock speed (despite what it says about 2.8GHz, lol - 3DMark is still fubar for me).

For all of AMD's improvements, it's less than 200 points ahead of a four-year-old CPU.

If all you do is gaming, would anyone upgrade for 200 points? Which is the point many have made: why spend money on a CPU that you don't need when you can upgrade your GPUs and push the graphics higher and higher?

But no one is telling anyone to spend money on a CPU they don't need, dude. Let's say you were buying a new PC, or upgrading something like a Core 2 Duo:

Intel i5 = £204 (as of checking Aria this minute)
AMD FX 8320 + ASRock 990FX Extreme3 = £200

In gaming? Well, we can surely agree that the most important part is the GPU, yes? So the roughly £100 you save by going with the FX 8320 and a board that will allow a 4.2GHz overclock (remember, the i5 still needs a board on top of its £204) can then be put into a GPU - something like a 7950 instead of the 7850 you would get if you went with Intel.

And the overall difference? so small (in CPU terms in gaming) that you'd have to be Clark Kent to tell the difference.

The i7 950 at one point cost nearly £300 - I should know, I paid £270 for mine just as SB was about to launch (I was put off by the chipset problems with SB). The FX 8320 cost £130 at launch, and the i7 is still one hell of a CPU; more than enough for gaming, I'm sure we can agree.

So if you can get a board and CPU that perform the same, why spend £100 more?
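The budget arithmetic above can be sketched like this. The CPU and bundle prices are the ones quoted in the post; the Intel board price is my assumption (the post compares a bare i5 against an AMD CPU-plus-board bundle, so the board cost is implied rather than stated):

```python
# Budget sketch using the prices quoted in the post. The Intel
# motherboard price is an assumption (not given in the post);
# the comparison is a bare i5 vs an AMD CPU + board bundle.
INTEL_I5 = 204      # GBP, CPU only (quoted in the post)
INTEL_BOARD = 100   # GBP, assumed price for a comparable Intel board
AMD_BUNDLE = 200    # GBP, FX 8320 + ASRock 990FX Extreme3 (quoted)

saving = (INTEL_I5 + INTEL_BOARD) - AMD_BUNDLE
print(f"Money freed up for a bigger GPU: ~£{saving}")
```

With a ~£100 board assumed on the Intel side, the saving lands around the £100 the post talks about putting toward a 7950 instead of a 7850.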

I'm pretty certain that when devs all make the switch and start to fully optimise their engines for AMD's architecture that the difference will become even more pronounced. In which case why spend a ton more for something the same or not as good?

It's just like the AMD XP days. Cheap CPUs able to duke it out with, or better, Intel's far more expensive CPUs.

And something about your score doesn't add up either, given this.

http://www.3dmark.com/3dm/81405
rollo 16th June 2013, 15:35 Quote
Older version, Andy - they patched it recently and the physics scores have gone up a lot since; no idea why.
AlienwareAndy 16th June 2013, 15:41 Quote
Ah, OK, thanks for that!

I did some digging and yeah, it appears there was an HT issue with the first revision.
Quote:
Originally Posted by Shirty
220W proves to be true. Also, OEM only confirmed.


I didn't doubt it if I'm honest. I must get myself a power meter...