bit-tech.net

AMD FX-8120 review

Comments 1 to 25 of 59

Harlequin 27th July 2012, 08:18 Quote
A question - did you apply both Bulldozer hotfixes to Windows?
longweight 27th July 2012, 08:25 Quote
Oh dear AMD, I'd love it if they could produce something that at least provides a bit of competition for Intel!

Nice review :)
SMIFFYDUDE 27th July 2012, 08:28 Quote
Intel could go on holiday for 5 years, come back and still be beating AMD's new CPUs.
How old is the i7 920 now?
Shame seeing the FX brand used on chips like this.
greigaitken 27th July 2012, 08:45 Quote
Saw the AMD review and thought 'oh dear, let's see what they've got this time' and clicked the link. Turns out it's still poor little AMD who can't climb up to the tree house.
Harlequin 27th July 2012, 08:46 Quote
http://ark.intel.com/products/37147/Intel-Core-i7-920-Processor-(8M-Cache-2_66-GHz-4_80-GTs-Intel-QPI)


3 1/2 years old ;)

Whilst at the top end AMD haven't got a chance - with APUs the roles are reversed - Intel is still playing catch-up to AMD.


edit:

OP

http://support.microsoft.com/kb/2646060

^^ First apply this Bulldozer hotfix, then

http://support.microsoft.com/kb/2645594

^^ then apply this second hotfix
themassau 27th July 2012, 08:47 Quote
Actually, in my country an i5 costs as much as an 8150, but I bought the 8120 because it was 50 euros cheaper. Also, AM3+ motherboards are higher quality at the same price point, so I bought the Sabertooth.

I also want to learn parallel programming, so I can check how it scales across more cores. My previous CPU was an Athlon 6000+, so it feels like an upgrade.

But I will upgrade again when DDR4 is out and when decent APUs are on the market.
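Since the post above mentions wanting to test how code scales across more cores, here's a minimal sketch of such an experiment in Python. Everything in it (the workload, worker counts and sizes) is illustrative and not from the thread; Python needs processes rather than threads here because the GIL serialises CPU-bound threads.

```python
import time
from multiprocessing import Pool

def burn(n):
    # CPU-bound toy workload: sum of squares up to n.
    return sum(i * i for i in range(n))

def timed_run(workers, jobs=4, n=50_000):
    # Time how long `workers` processes take to chew through `jobs` tasks.
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(burn, [n] * jobs)
    return time.perf_counter() - start

if __name__ == "__main__":
    base = timed_run(1)
    for w in (2, 4):
        print(f"{w} workers: speedup {base / timed_run(w):.2f}x")
```

On a chip marketed as eight cores you'd hope for near-linear speedup up to the module count; in practice shared resources per module cap it well below that.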
MjFrosty 27th July 2012, 10:42 Quote
FX 55, what a CPU that was.

They need to recruit some new engineers, methinks.
.//TuNdRa 27th July 2012, 11:42 Quote
It's not the engineers. AMD is now using automatically synthesised CPU silicon designs, not the original hand-designed, carefully planned pathways that wound up more efficient and smaller, which is letting them down in some areas.

So I suppose it is the engineers, or at least the lack of them, that is causing issues like this.

Bulldozer's ridiculously long development cycle didn't help the chip. Considering AMD had to take a shot at the future from, what, six years ago? They've not fallen that far off the mark.

I'm amazed that, for once, a reviewer didn't get a chip that'd do crazy things, though. I can hit 5GHz on this chip, albeit not 100% stably. 4.8-4.9GHz seems to be the ceiling, so far as I've tested.

I'm never going to hear the end of this review, though. I could break out the usual "Oh! Cinebench is optimised for Intel chips!" and all that crap, but it wouldn't change the other results. What really ruins performance in things like the image editing test is probably the memory speed. Even the i3 2100 can achieve higher memory bandwidth than the FX 8120.
Harlequin 27th July 2012, 12:28 Quote
http://www.insideris.com/amd-spreads-propaganda-ex-employee-speaks-out/
Quote:
Now, we did some digging and found out quite a few interesting things. Here is a quote from the AMD ex-employee himself, who posted a few comments about the whole situation some time ago:


On paper bulldozer is a lovely chip. Bulldozer was on the drawing board (people were even working on it) even back when I was there. All I can say is that by the time you see silicon for sale, it will be a lot less impressive, both in its own terms and when compared to what Intel will be offering. (Because I have no faith AMD knows how to actually design chips anymore). I don’t really want to reveal what I know about Bulldozer from my time at AMD.


What did happen is that management decided there SHOULD BE such cross-engineering, which meant we had to stop hand-crafting our CPU designs and switch to an SoC design style. This results in giving up a lot of performance, chip area, and efficiency. The reason DEC Alphas were always much faster than anything else is they designed each transistor by hand. Intel and AMD had always done so at least for the critical parts of the chip. That changed before I left – they started to rely on synthesis tools, automatic place and route tools, etc. I had been in charge of our design flow in the years before I left, and I had tested these tools by asking the companies who sold them to design blocks (adders, multipliers, etc.) using their tools. I let them take as long as they wanted. They always came back to me with designs that were 20% bigger, and 20% slower than our hand-crafted designs, and which suffered from electromigration and other problems.

That is now how AMD designs chips. I’m sure it will turn out well for them [/sarcasm]



BTW, you ask how AMD could have competed? Well, for one thing, they could have leveraged K8 and the K8 team’s success and design techniques instead of wasting years of time on a project that eventually got cancelled using people that had never achieved any success. It took Intel years to come out with Nehalem, and AMD could have been so far ahead by that point that they’d have enough money in the bank that they wouldn’t have to accept a low-ball settlement offer in the antitrust suit and they wouldn’t have to sell off their fabs.

Gives a totally different perspective on why Bulldozer failed, doesn’t it?


The italic parts are from 2010 - and the article I linked to is from 2011!
bagman 27th July 2012, 12:30 Quote
Quote:
Originally Posted by .//TuNdRa

Bulldozer's ridiculously long development cycle didn't help the chip. Considering AMD had to take a shot at the future from, what, six years ago? They've not fallen that far off the mark.

Intel starts developing its chips 10 years in advance. But you also have to consider that the Bulldozer chips were made during a time of uncertainty at AMD (wasn't it getting new CEOs every few months because the shareholders weren't happy?).

What I don't understand is how inefficient the Bulldozer chips are: 579W at 4.65GHz for a chip far from the performance of an i7 920 at 4GHz, which draws 411W.
Jhodas 27th July 2012, 12:46 Quote
AMD made some weird design choices. Follow this link for a comparison of the X6 1100, X4 980 and FX 8150. The L1 data cache is smaller on the new FX for a start (16KB vs 64KB).
AMD has also gone for massive 64-way associativity on the FX's L3 cache. Associativity reduces cache misses, but increases cache latency. Ivy's L1 and L2 caches are 8-way and its L3 is 16-way, which appears to be a better compromise (the FX 8150 has 4-way L1 data, 2-way L1 instruction and 16-way L2 caches).

Overall, Intel's cache design seems to be simpler and more effective. I think this is one of the main issues Bulldozer has besides the memory controller.
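Taking the figures quoted above, a quick sketch shows how cache size, line size and associativity interact: fewer ways means more sets, so each address competes with fewer neighbours but has fewer candidate slots. This is an illustrative model only, not AMD's actual implementation, and the 64-byte line size is an assumption.

```python
def num_sets(cache_bytes, line_bytes, ways):
    # A set-associative cache is split into sets of `ways` lines each.
    return cache_bytes // (line_bytes * ways)

def set_index(addr, line_bytes, sets):
    # Which set a given byte address maps to.
    return (addr // line_bytes) % sets

# FX 8150 L1 data cache per the comparison above: 16KB, 4-way.
fx_sets = num_sets(16 * 1024, 64, 4)
# Older X4/X6 (K10) L1 data cache: 64KB, 2-way.
k10_sets = num_sets(64 * 1024, 64, 2)

print(fx_sets, k10_sets)                  # 64 sets vs 512 sets
print(set_index(0x12345, 64, fx_sets))    # set for an example address
```

The extreme case in the post is the FX's 64-way L3: far fewer sets, so conflict misses almost vanish, but every lookup has to compare against 64 tags, which costs latency.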
Bkid 27th July 2012, 13:17 Quote
It will be interesting to see if AMD can make a dual-core processor with a 4 or 5GHz stock speed, without any overclocking, because I'll be the first one to buy it. :-)
Hustler 27th July 2012, 13:49 Quote
Still only a Quad Core with some fancy new Hyper Threading thrown in.

Dishonest marketing from AMD.
Zinfandel 27th July 2012, 13:55 Quote
Why do they even bother?
Harlequin 27th July 2012, 13:57 Quote
Quote:
Originally Posted by Zinfandel
Why do they even bother?

Well, if they didn't, your fancy new Ivy Bridge i5 replacement would start at £500 and head upwards...
[USRF]Obiwan 27th July 2012, 14:46 Quote
The biggest mistake AMD made was building these new CPUs for a new socket. Instead they could have made them work on AM3 (non-plus), and a lot of customers would have bought one as a replacement.
tad2008 27th July 2012, 15:00 Quote
AMD's stuff has certainly been a bit hit and miss over the years, and I desperately feel the need to upgrade, as I'm still using an AMD 64 X2 6000+ CPU on the AM2 socket with a mere 2GB of RAM and a Radeon HD 6850 graphics card under XP.

I had thought of going with AMD's FM1 socket and the Athlon II X4 651K Black Edition CPU until I realised that the FM1 socket is already obsolete and would leave no upgrade path, and I simply can't afford the extra outlay for an Intel i3 2100, let alone an i5 2400.

Soo frustrating...
.//TuNdRa 27th July 2012, 15:13 Quote
Quote:
Originally Posted by Hustler
Still only a Quad Core with some fancy new Hyper Threading thrown in.

Dishonest marketing from AMD.

Not entirely. Some parts of it are completely doubled - it's more like 1.7 cores per module. It's the floating-point units that aren't doubled (each module has two integer cores sharing one FPU), which means Bulldozer tanks in floating-point tests that stress more than four cores.

Quote:
Originally Posted by bagman
Intel starts developing its chips 10 years in advance. But you also have to consider that the Bulldozer chips were made during a time of uncertainty at AMD (wasn't it getting new CEOs every few months because the shareholders weren't happy?).

What I don't understand is how inefficient the Bulldozer chips are: 579W at 4.65GHz for a chip far from the performance of an i7 920 at 4GHz, which draws 411W.

Chip production issues. This current batch is very, very badly produced. Bulldozer would have been much better if it didn't suffer this much silicon leakage. The new stepping AMD is planning to roll out at some point (although I'm not sure if that's been canned in favour of Piledriver) is said not to suffer as badly from these issues, but I don't hold out much hope.

I'm committed now, with the amount of money I've sunk into this, and I will fight tooth and nail to prove that it's not crap, but I'm never, ever going to claim it's the best, or anywhere near.
Action_Parsnip 27th July 2012, 17:05 Quote
There is no B3 stepping of Bulldozer coming. The next version will be Piledriver.
.//TuNdRa 27th July 2012, 17:11 Quote
I suspected that was the case, given the absolute lack of information I'd heard about a different stepping while AMD was busy pushing Piledriver as hard as they could.
Anfield 27th July 2012, 17:11 Quote
A 3770K OC'd to 4.8GHz uses 335W (!) less than an 8120 running at 4.5GHz. Yes, the 3770K was in an mATX board, but even if you knock off 20W that's still more than a 300W difference.
Sure, if you're in a country with cheap electricity that's not relevant, but in Northern Ireland, for example, that's pretty much the equivalent of Bulldozer being banned from sale, as you'd need a second mortgage to pay for the electricity it wastes.
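To put that ~300W gap in money terms, here's a back-of-the-envelope sketch. The 300W figure is from the post above; the hours of use and the tariff are assumed illustrative values, not real Northern Ireland prices.

```python
# Extra running cost of a ~300W power-draw gap between two overclocked chips.
extra_watts = 300          # gap quoted in the post above
hours_per_day = 4          # assumed daily load time
price_per_kwh = 0.20       # assumed tariff in GBP - purely illustrative

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, roughly £{extra_cost:.2f}/year")
```

Not quite second-mortgage territory at these assumptions, but over a few years it does eat into the FX's price advantage.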
bulldogjeff 27th July 2012, 17:35 Quote
I was reading this and thinking there's no reason to throw my 1100T in the bin just yet, and then I got to the last sentence, which summed it all up. Meh!
Somer_Himpson 27th July 2012, 17:38 Quote
Why the **** do AMD even bother with this shite anymore?
They never, ever compare to Intel on price, performance, or pretty much anything else that matters.

Are the engineers deluded when they make and try to market this crap?
Harlequin 27th July 2012, 17:46 Quote
Again, without AMD in this market Intel would charge whatever they wanted to - so how does a £500 i3 sound to you?
.//TuNdRa 27th July 2012, 18:10 Quote
Wouldn't happen, Harlequin. Any time AMD is at risk of going under, Intel takes the rap for it. Intel needs to outperform AMD, but it literally cannot afford to shut them out of the market. Intel is not allowed to become the sole supplier of CPUs, otherwise they'd do exactly as you said: sell existing parts at exorbitant prices.

As it is, AMD keeps going in the vain hope that perhaps they can turn the Orochi architecture into something good, but Bulldozer isn't, and I'm disappointed to have bought it, even if the whole setup was cheaper than a 2500K and an equivalent SLI-capable P67 board at the time I purchased it.