bit-tech.net

AMD says Fusion CPU and GPU will ship this year

We'll see Fusion CPUs in the shops in 2011, but system builders should get them before Dec 31st 2010.

Fusion is AMD’s forthcoming combined CPU and GPU product, and until we caught up with AMD at a recent briefing we believed it was on track for a 2011 launch. It now appears the company is confident it will be able to release the chips this year.

AMD has been talking about Fusion for years, and Intel has already beaten AMD by launching its Clarkdale Core i3 and Core i5 CPUs with integrated graphics, but these use separate CPU and GPU dies combined in a single CPU package.

AMD, however, is determined to combine everything into one piece of silicon. We asked spokesperson Bob Grim why that was: "I don’t think there’s a simple answer to that," said Grim. "If you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together.

"There was a fair bit of engineering work involved too, but we just have a tradition of building a piece of silicon from the ground up, in fact the only MCM (Multi-Chip Module) solution I’m aware of that we’ve ever done is on the server side with our 12-core product (the Opteron 6174). [Regardless], what we’ll be launching with Fusion is definitely all on one die."

When asked if building a CPU/GPU hybrid chip on a single piece of silicon would yield any advantages beyond speed, Grim replied, "We hope so. We’ve just got the silicon in and we’re going through the paces right now – the engineers are taking a look at it. But it should have power and performance advantages. Going to the 32nm [manufacturing process] is also going to help. I believe Dirk (Meyer, CEO) has gone on record saying we’re going to ship some to customers this year, so hopefully we’ll be able to deliver on that promise. I’m confident that we will be - the silicon looks good. I don’t see any reason why we wouldn’t [hit Meyer’s target] based on where Fusion is now."

What do you think? Do you care that all the elements of a CPU are combined in a single piece of silicon or does it not matter as long as the thing works? And are you excited by the prospect of AMD's Fusion CPU? Thoughts in the forums please.

58 Comments

Teq 15th May 2010, 10:35 Quote
I'm keeping an eye on this project, it could drop the cost of an HTPC a little with possible performance gains; too early to say though, but I'm optimistic :)
MrGumby 15th May 2010, 10:36 Quote
Surely this whole CPU/GPU package concept is best consigned to the laptop/HTPC market?
azrael- 15th May 2010, 10:37 Quote
Well, Fusion (and Fusion-like tech) will definitely spell the end for integrated graphics. Apart from that I can't quite see what kind of impact it'll have on computer systems. It'll probably make it cheaper to do entry-level systems, though. Right now, I'm mostly having a "meh" moment.
NuTech 15th May 2010, 10:46 Quote
I can definitely see a use case for this technology in desktop PCs.

If they make a great chip for gaming that allows you to disable the on-board GPU, when it comes time to upgrade your CPU/motherboard, you can enable the GPU and turn it into a server rig or second computer.

Actually I would like to see more motherboard manufacturers integrate graphics on their high end products for the same reason.
Adnoctum 15th May 2010, 10:46 Quote
Am I excited? Hell yes!

This is the new FPU. Integrating a faster, more capable math-crunching unit into all AMD CPUs. If developers can rely on all CPUs having decent GPGPU capabilities, can you imagine how software and its use will change the way we use the computer?

Best thing? Intel will be FORCED to create graphics that doesn't blow!
Second best thing? AMD has been saying that the Fusion graphics core will be annually updated, so the core will be near current. No more ancient GMA950 in your netbook.
Adnoctum 15th May 2010, 11:18 Quote
Quote:
Originally Posted by azrael-
Apart from that I can't quite see what kind of impact it'll have on computer systems. <snip> Right now, I'm mostly having a "meh" moment.

I bet there were many people who were going "Meh" when the FPU was being integrated, but where would you be now without one?

I think too many people are looking at this as an integrated graphics core, and not as the stream-processing core it is.
The fact is we still don't really know where we are going with GPGPU and what we can do with it. It isn't all HD encoding and transcoding. I think that the brake on development has been the poor state of integrated graphics (ie. Intel) in 75% of systems.

Imagine every computer, from bottom to top, having a capable GPGPU core? Software developers would be able to count on it being there, just like they can count on an x86 CPU having an FPU.

On a side note: what of Nvidia's GPGPU strategy when every full-fat Opteron has one or more of these cores on die? What of Intel's when such Opterons are spanking the Xeons in database operations (Larrabee to the rescue...)?

I think this is the most exciting CPU development in a long while. The fact that we don't really know what is going to happen is great. It means that there is room for this to change everything, not just a speed bump or a process shrink.

Or maybe it will fall flat on its face? :?
rickysio 15th May 2010, 11:21 Quote
Intel's current batch of graphics are already on the level of AMD's.

I do wonder whether SandyBridge will launch earlier or not.
LightningPete 15th May 2010, 11:23 Quote
AMD integrated graphics solutions usually perform better than Intel's; the AMD HD 3200 integrated chip versus the Intel X4500 chip, for example.
High end part of fusion could see an entry level gaming system?
Would be nice to get entry level systems down in price though. System builders are still charging like 350-500 for basic level systems.
Bindibadgi 15th May 2010, 11:42 Quote
HD 3200 versus X4500 wasn't far off, only drivers separated them but in terms of video playback Intel's ClearVideo is fantastic.

As for GMA-HD, imo it's ahead of the latest 880G from AMD overall, so I hope AMD pulls something great out the bag with the Fusion GPU core.
Pete J 15th May 2010, 11:55 Quote
Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual cores - and it destroyed anything AMD had to offer.
firex 15th May 2010, 12:03 Quote
Why are we talking about the Fusion GPU core as if it's like an integrated graphics core? I'm pretty sure I've read time and again that it will be using the ATI 4000 or 5000 series core...that would definitely beat GMA-HD or any integrated graphics solution for the foreseeable future...

AMD's approach of building X cores on a single piece of silicon brings lots of advantages (on paper). However, the original Phenoms lagged behind Core 2 Quads because AMD 'reused' the old K8 architecture, whereas Intel used a brand new architecture in Core 2. And AMD's very late launch of Barcelona made the performance difference look worse than it actually was.
aussiebear 15th May 2010, 12:05 Quote
Quote:
What do you think? Do you care that all the elements of a CPU are combined in a single piece of silicon or does it not matter as long as the thing works? And are you excited by the prospect of AMD's Fusion CPU?

Well, let's look at AMD's first Fusion processor: Currently codenamed Llano.

From what I know...

(1) It will be aimed at the mainstream. In fact, it replaces the Athlon II line in 2011, which suggests it will be reasonably affordable for many.

(2) Based on a highly tweaked version of the Phenom II for its CPU part (32nm process). They've dropped the L3 cache and upped the L2 cache to 1MB per core. It will start at 3GHz or higher, and it will come in dual, triple, and quad-core versions operating at 0.8V to 1.3V.

(3) Will introduce power gating (similar to that of the Core i-series) and other power-saving features. I hear the whole processor is rated at a TDP of 20W to 59W. (Starts at 20W for notebook versions, while desktop versions will start at 30W.)

(4) The IGP element of the processor is said to be based on Radeon HD 5xxx consisting of 400 stream processors. So I'm guessing we can expect Radeon HD 55xx to 56xx performance from it. Somewhere around there.

(5) It will require a new motherboard, as the entire northbridge is now on the CPU. The motherboard will only house the "Hudson-D" southbridge.


I'm excited for a number of reasons.

* It sets the first step for an affordable heterogeneous processor that can actually do GPGPU work.

Intel's HD Graphics (found in current Clarkdale CPUs) is really an enhanced X4500-series IGP. It offers very little GPGPU capability. Intel's next-generation "Sandy Bridge" uses an enhanced version of the current HD Graphics found in Clarkdale. So again, it has little GPGPU capability, but it will be very good in an HD playback role. (As that is what Intel is focusing on with their IGPs.)

...And while 2nd-generation Larrabee is still being worked on (as the first generation has missed its window of opportunity), I doubt we'll see an IGP variant until 2+ years later.

* This processor would be perfect for OpenCL. (As that doesn't care what type of processor is available, as long as it can be used; see the sketch at the end of this post.) ...ATI's Stream SDK for software developers is being improved to support Llano for a reason. ;)

* It's also the first step towards gradually reducing the FPU in favour of GPU-like stream processors. AMD's 2nd generation (2015?) will actually combine core elements of the GPU into the CPU. There won't be any distinct GPU and CPU modules in the future.
=> http://www.xbitlabs.com/news/cpu/display/20100512150105_Second_Iteration_of_AMD_Fusion_Chips_Due_in_2015_AMD.html

The way the "Bulldozer" architecture is arranged, I'm guessing AMD will eventually replace the K10.5 cores in Llano with "Bulldozer" in the next 2 years.

* While I don't expect Llano to best "Sandy Bridge" (let alone the current Intel Clarkdale processors) clock for clock, as it's still based on the Phenom II, I do expect that AMD will raise the bar for IGP performance. It means Intel is going to have to up their IGP game...Result? End-users will benefit from improved IGPs! (Game developers will have more room to play with!)

* AMD makes a better attempt at addressing its fundamental issue for the mobile market...Power consumption and resulting battery life.

...While I don't expect it to match Intel's notebook solutions in battery life, I do expect to see a notable improvement over the current AMD-based notebook solutions.

* Assuming AMD follows the current pricing trend they have with the Athlon II line, AMD's first Fusion processor will be affordable. It'll be a stepping stone to encourage software developers to start looking at OpenCL, DirectCompute, etc. more seriously.

And lastly...
* I'm still hanging on to this dinky little single-core 1.6GHz@2.0GHz Sempron (Socket 754, 65W).

I want to upgrade it to a quad-core (at least 3.2GHz) that is rated at 45W TDP. :)

I think it's possible with 32nm, given that AMD will release the Athlon II X4 615e by the end of this year. (That's a 2.5GHz quad-core rated at 45W TDP.)
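
To illustrate the OpenCL point above (that the API doesn't care whether the device is a CPU, a discrete GPU, or an on-die GPU like Llano's), here is a minimal, purely illustrative sketch in plain C against the standard OpenCL 1.x host API. It isn't Llano-specific and assumes only that an OpenCL runtime and headers are installed; it simply lists whatever devices the runtime exposes.

/* Purely illustrative: enumerate whatever OpenCL devices exist.
 * Assumes an OpenCL 1.x runtime and <CL/cl.h>; error handling kept minimal. */
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        /* CL_DEVICE_TYPE_ALL: a CPU, a discrete card and an integrated
         * GPU core all show up through the same interface. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8,
                           devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256];
            cl_device_type type;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
            printf("%s [%s]\n", name,
                   (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other");
        }
    }
    return 0;
}

The same binary would pick up a Fusion-style on-die GPU, a discrete Radeon, or just the CPU, which is why developers could target "whatever is there" once chips like Llano are everywhere.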
aussiebear 15th May 2010, 12:16 Quote
Quote:
Originally Posted by Pete J
Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual cores - and it destroyed anything AMD had to offer.

Because AMD designs processors for the server/supercomputing roles. Integration is especially important where you start scaling up to 4, 8, 16, etc processor sockets.

These features don't mean crap to the typical desktop user because:
(1) They only use 1 CPU socket.
(2) They don't use their computers in an intensive manner such that it requires huge bandwidth.

Intel knows this, so its cheaper/quicker to slap together things and push it into market.

AMD tries to design things elegantly from an engineering standpoint; as they don't have the resources to throw around. (With the K8/K10/K10.5 series; they made one architecture, and then trickled it down to different markets.)

Whereas Intel throws huge engineering resources/talent at the problem and brute-forces a solution with the best features they can shove in. (Then they pull them out again to address the affordable/low-end markets.)...Of course, they also have enough resources to accommodate multiple architectures at the same time.

AMD couldn't do this previously. But that looks to change in 2011.
Low end (netbook/nettop) => Bobcat
Mainstream (Desktop/Notebook) => Llano
Enthusiast/Performance/Workstation/Server => Bulldozer
StoneyMahoney 15th May 2010, 12:19 Quote
Intel's decision to integrate CPU and GPU into the same package cemented the Intel 1-2 combination in the future of every PC sold to a business for the next god-only-knows-how-many years. That's where the real money is - it's how Intel sold the overwhelming majority of all GPUs last decade - and every initiative AMD has come up with to crack into the volume corporate market has only got as far as a brief flirtation.

How much of that is up to performance economics and how much is up to Intel being naughty (and thus pulling in a world record-breaking anti-competition fine in the EU courts) is questionable, but the fact remains that AMD has been hopping along behind Intel for some time now and can do nothing but react to Intel's releases.

Integrating the CPU and GPU does nothing significant to performance, so it's purely a business/economics decision. Until some killer must-have GPU-accelerated business app appears (I'm thinking some kind of real-time business intelligence analysis software?), GPU performance will continue to be irrelevant to the majority of the PC market. Even when something does turn up, how will AMD take advantage of its appearance when the difference in performance between Intel's and their own integrated GPUs is so marginal, especially compared to the performance of an add-in card?
Arj12 15th May 2010, 12:39 Quote
Well, seeing as the CPU and GPU are going to be on a smaller die compared to Intel's current ones (well, the GPU anyway!) this should mean the chip will be more power efficient and produce less heat =D Can't wait for the release now as I am in the market for a new laptop soon!
Autti 15th May 2010, 12:55 Quote
Sorry, but what is the difference between having the two chips on one piece of silicon, compared to having the two chips on two different pieces but still linked together?
I don't get why this is so big... in fact, unless there is an interface boost it is a very bad idea, as it's more expensive.
The failure rates of GPU creation and CPU creation will now be combined in a single piece of silicon, whereas with Intel each part is independent, and hence yields during fabrication are higher.
rickysio 15th May 2010, 13:02 Quote
So I went back to : http://www.bit-tech.net/hardware/cpus/2010/04/21/intel-sandy-bridge-details-of-the-next-gen/1

Seems that 2011 will be the Year of Integrated Graphics. :/
l3v1ck 15th May 2010, 14:07 Quote
This has me worried. All AMD want to talk about is the integrated GPU. I thought part of fusion was that they'd be ditching their K8 (or at least K8 circa 2003 derived) architecture, and bringing out a totally new one. The fact that they're not talking a lot about a new architecture makes me think there isn't one. Or if there is, it's not good enough to compete with Intel's Nehalem. Either way it's bad news for AMD and consumers.
javaman 15th May 2010, 14:27 Quote
I'm excited about this but I'm worried about upgrades. While integrating basic graphics into the CPU is a great idea for lower power usage, come higher-end gaming machines, if you want more GPU horsepower you have to upgrade the whole processor. I don't feel total integration is the way to go. I also wonder if these will also offer something similar to hybrid CrossFire.
Zurechial 15th May 2010, 14:55 Quote
If both AMD and Intel were to integrate decent GPUs in the majority of their lineup then the PC gaming industry would get a huge boost by not being hampered by the lowest common denominator of shitty Intel GMA chips.

Alas, I can't see that happening.
Burnout21 15th May 2010, 15:52 Quote
Definitely interested in this for a future server/HTPC I've been drooling to build ever since the i3/i5 integrated GPU was released. However, the H55 platform just seems so expensive for the performance given.

Future laptops, mmmmm, hopefully AMD will think fast and build a CPU with integrated FirePro workstation-level graphics for mobile workstation laptops.
Phil Rhodes 15th May 2010, 15:59 Quote
If you do an MCM you can combine speed grades more easily, which means you don't have to sell a whole slice of silicon as a B just because one part of it is too slow to be an A.

On the other hand you can probably make a single chip go faster and soak less power.

Choices, choices.
greigaitken 15th May 2010, 16:50 Quote
Right now you can dissipate 200W on the CPU and 400W on the graphics, with those numbers going up. It would be really tricky to combine CPU and graphics under one cooler at the high end in future, so I guess this will be mainstream-only for many years.
Skiddywinks 15th May 2010, 19:02 Quote
I should imagine native designs also carry a decent advantage in terms of latency.
Blackie Chan 15th May 2010, 19:22 Quote
Psssh, we settled this argument years ago. We don't need discrete GPUs, just single powerful processors. Right? right?

Anyway, I'm gonna go play some StarCraft.
technogiant 15th May 2010, 19:26 Quote
I think it will be used in different ways on different platforms. For the low end it will be used to provide lower power consumption, lower prices and improved graphics capability. On high-end machines a discrete GPU will still be needed, in which case the Fusion GPU will be used as an OpenCL co-processor, freeing the discrete GPU for graphically oriented tasks.
Dragunover 15th May 2010, 19:38 Quote
I look forward to getting one of these systems for my lowest end desktop PC.
Looking forward to a specific date, availability, and pricing.
technogiant 15th May 2010, 20:19 Quote
Read here: http://www.tech-news-daily.info/analysis-amd-bulldozer-the-fightback-begins-in-2011.html that
the Bulldozer architecture may be lacking in floating-point power when compared to Sandy Bridge. Is Bulldozer getting the Fusion makeover? If so, a 5000-series GPU integrated into its structure could well be the answer to that problem.
NethLyn 15th May 2010, 20:49 Quote
Quote:
Originally Posted by MrGumby
Surely this whole CPU/GPU package concept is best consigned to the laptop/HTPC market?

I wish it was; it's something the mag could benchmark, and then we'd have some idea how good a portable you're getting.

When looking around for laptops, an AMD Turion doesn't tell me jack about its potential performance compared to a desktop Athlon II or Phenom II, same for the QLxx series, hence sticking with Intel for my rellies' portables, where I knew it would be good as long as they stuck with the Core 2 family. Next gen there will be a Core i5 in the middle or even a high i3; immediately I can go back to an old issue and compare the benchmarks. If the graphics performance is good, Fusion will sell by the truckload.
Iorek 15th May 2010, 21:23 Quote
For mobile, I guess it's a good thing... for office/media PCs, maybe, if it saves on power/heat etc. Small yet still usable office computers/media machines can't be a bad thing - it's not like a media PC needs a full-on graphics card so long as it can output certain formats.

For gaming PC's, certainly not - I'd much rather have separate chips for both bits - that way one can be upgraded without the other.

My worry, I think, is the enforced "you will buy a CPU with this" and then disable it if you don't want it... so long as there continues to be the option to buy a chip without it.
HourBeforeDawn 15th May 2010, 22:37 Quote
Well, with AMD and ATI doing the graphics, I'm sure this will be an area where AMD will do MUCH better than what Intel is offering in terms of on-board graphics, so I'm looking forward to this and to some implementations of it in a netbook-style device.
Action_Parsnip 15th May 2010, 23:39 Quote
Quote:
Originally Posted by Pete J
Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual cores - and it destroyed anything AMD had to offer.

Well, Intel went the same route too, so I can't really see what point you're making there. Plus, on the server side even 65nm Phenoms often bested 45nm Intel quads. Communicating between separate dies can, under certain circumstances, be a real drrrrraaaaaaggggggggggg ug ug ug ug
Action_Parsnip 15th May 2010, 23:42 Quote
Quote:
Originally Posted by Phil Rhodes
If you do an MCM you can combine speed grades more easily, which means you don't have to sell a whole slice of silicon as a B just because one part of it is too slow to be an A.

On the other hand you can probably make a single chip go faster and soak less power.

Choices, choices.

I think in this case though, these chips will be relatively eensy weensy; with a mega omg lol wtf ftw chip, grading 2 slices would be an idea.
Pharago 16th May 2010, 05:09 Quote
I don't know how they are going to work yet, but I can see an advantage if the integrated GPU is fast enough, with enough stream processors, basically concerning access to RAM.

DDR3 RAM is cheap and people are starting to be able to use 16GB etc. That's a lot of room, and it could be shared by the GPU without having to store data first in VRAM (1-2GB) and then pass it through the PCIe channel before the CPU can process it.
Adnoctum 16th May 2010, 09:52 Quote
Quote:
Originally Posted by Autti
Sorry, but what is the difference between having the two chips on one piece of silicon, compared to having the two chips on two different pieces but still linked together? I don't get why this is so big... in fact, unless there is an interface boost it is a very bad idea, as it's more expensive.

1) Cost of manufacture. An MCM processor requires extra processes during manufacture. A two or more chip design costs more for a motherboard manufacturer, and the consumer.
2) Interconnect speeds between dies are slower than data transfer within a single die.
Quote:
Originally Posted by Autti
The failure rates of GPU creation and CPU creation will now be combined in a single piece of silicon, whereas with Intel each part is independent, and hence yields during fabrication are higher.

What happens with CPUs and GPUs now in the event of a flaw? The area will be fused off or bypassed and the product sold with different specifications. This is a non-issue.
Quote:
Originally Posted by l3v1ck
All AMD want to talk about is the integrated GPU. I thought part of fusion was that they'd be ditching their K8 (or at least K8 circa 2003 derived) architecture, and bringing out a totally new one.

Have a look at the rest of this thread, this is what most people are talking about. Naturally, AMD is addressing this issue, as most people, even enthusiasts, aren't looking at the potential change this may have on how data is processed.

Aussiebear has written elsewhere about Bulldozer, which is AMD's new high-end CPU to take on Intel. Eventually, Fusion will come to the high end, but the need is greatest at the mainstream end. Llano, which is AMD's first Fusion chip and the one discussed here, is for mainstream desktops and laptops. It is just a Phenom II core shrunk to 32nm with a GPU core integrated. The original design was for it to use the Bulldozer architecture, but for whatever reason AMD has chosen to debut with the Phenom II core, probably for time, cost or resource reasons. They probably don't want Llano to delay the Bulldozer introduction by splitting resources.
Is it really that big a deal? Does anyone really need more than a Phenom II-class dual/quad processor in their laptop/mainstream desktop?
Quote:
Originally Posted by Iorek
For mobile, I guess its a good thing.. for office / media pc's... maybe if it saves on power / heat etc.
For gaming PC's, certainly not - I'd much rather have separate chips for both bits - that way one can be upgraded without the other.

But you are still focusing on "integrated graphics". It is that, but it is also a very capable math "co-processor" for stream processing tasks. Discrete GPUs are not going to disappear, and these aren't designed to replace them. But it will replace all those HDx200 - HDx500 cards. As for gaming, how about your Fusion CPU doing all those physics calculations without impacting FPS?
Quote:
Originally Posted by Iorek
My worry, I think, is the enforced "you will buy a CPU with this" and then disable it if you don't want it... so long as there continues to be the option to buy a chip without it.

Why would you disable it? You wouldn't go and disable the FPU on your CPU, or your Memory Controller, or your L2 cache. This is Fusion Mk1, where the GPU side is still somewhat separate. In future, when Fusion is built into Bulldozer, the integration will be complete and it will be utilised for...well, that is up to developers.

We all have a limited concept on how this is going to pan out, and that is why AMD is selling the "integrated graphics" part. GPGPU computing really hasn't taken off yet, and it is hard to see where it is going to go and what we are going to do with it. But having a GPGPU-capable module within all CPUs will allow developers to really show us something new.

At the moment Intel is like many people here, only looking at the integrated graphics. AMD is looking past that, and if Larrabee is any indication, then perhaps Intel is too. If Intel can't sort Larrabee out, perhaps they should licence Nvidia graphics. Maybe Nvidia can be the ARM of the graphics world? ;)
technogiant 16th May 2010, 10:45 Quote
^ QFT
EvilRusk 16th May 2010, 11:13 Quote
If you are a gamer, an on-chip GPU could maybe mean disabling discrete GPU when not gaming (massive heat/power saving) as well as giving you a built-in physics processor when you are gaming with the discrete card. It would be great for games! We might see developers doing more with physics like the promises of a couple of years back!
bogie170 16th May 2010, 11:38 Quote
Will they use these chips in netbooks? If so I'm excited!
javaman 16th May 2010, 14:13 Quote
I've seen rumours Apple could be looking to use Llano
dyzophoria 16th May 2010, 18:31 Quote
Quote:
Originally Posted by javaman
I've seen rumours Apple could be looking to use Llano

Most probably. Llano will ultimately cost less to build, so Apple will then be able either to sell their systems at the same price as their offerings today and still maintain the same graphics (or a bit of an improvement over their current integrated graphics solutions), or to sell them for much more because it's new. lol, don't worry, it will be "awesome" though.
crazyceo 16th May 2010, 20:39 Quote
Hang on, this is just more AMD hot air and white noise about nothing

"if you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together"

And that little glue job from Intel kicked your ass so bad, you still haven't recovered from it. Q6600 anyone?


I'll take notice when Bit-Tech get their hands on it and give us the REAL deal instead of the AMD BS PR machine.
Elton 16th May 2010, 21:42 Quote
Quote:
Originally Posted by crazyceo
Hang on, this is just more AMD hot air and white noise about nothing

"if you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together"

And that little glue job from Intel kicked your ass so bad, you still haven't recovered from it. Q6600 anyone?


I'll take notice when Bit-Tech get their hands on it and give us the REAL deal instead of the AMD BS PR machine.

You just love banging on about AMD, don't you? I'll admit their performance isn't up to par any more, but they're still doing something, although not enough to be honest.
javaman 16th May 2010, 23:36 Quote
Quote:
Originally Posted by crazyceo
Hang on, this is just more AMD hot air and white noise about nothing

"if you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together"

And that little glue job from Intel kicked your ass so bad, you still haven't recovered from it. Q6600 anyone?


I'll take notice when Bit-Tech get their hands on it and give us the REAL deal instead of the AMD BS PR machine.

Yea, Pentium D's worked really well
Adnoctum 17th May 2010, 04:35 Quote
Quote:
Originally Posted by bogie170
Will they use these chips in netbooks? If so i'm excited!

Probably, but netbooks could also use a different chip based on the architecture codenamed "Bobcat", which will debut with the Ontario core (they say 1H 2011, but...?). It is a smaller, lighter CPU in the <1-10W class for UMPCs, handhelds and tablets, set-top boxes and all the way up to netbooks; basically an Atom competitor. You might get Atom-like performance with a battery life of 12+ hours.

I'd prefer Llano, TBH.
Quote:
Originally Posted by crazyceo
<SNIP>

I think we can all agree that it is a load of marketing-speak that means little. Thanks for pointing it out or I might have missed it and been taken in by AMD's lying PR!!

Only, they are technically correct, aren't they? :) And as we have all learnt from Bureaucrat 1.0 in Futurama, technically is the best kind of correct.
crazyceo 17th May 2010, 08:26 Quote
but when it comes to Futurama, I only follow one law......and that's "Brannigan's Law!"
streamcomputing 17th May 2010, 08:43 Quote
A week ago I blogged about it (my company does OpenCL acceleration with GPUs): http://www.streamcomputing.nl/blog/2010-05-04/x86-soc-and-gpgpu

I did not expect this to happen so soon, although it looks like we still have to wait half a year. This is really HUGE for GPU-accelerated computing!
alpha0ne23 17th May 2010, 09:13 Quote
I've been waiting for this for too long now and it's getting hard not to look at an Intel m-ITX solution using the H55 chipset and an i3 CPU

Please hurry AMD
crazyceo 17th May 2010, 09:32 Quote
Quote:
Originally Posted by alpha0ne23
I've been waiting for this for too long now and its getting hard not to look @ an Intel m-itx solution using the h55 chipset and i3 CPU

Please hurry AMD

I had to build a rig from scratch last month and was about to use the Bit-Tech budget build as the basis. This currently uses, and has for a few months used, the Athlon II X2 250 and a 770-C45 board. However, when I checked the CustomPC guide it showed the i3-530 and the H55M-UD2H. When I priced it up from a number of sources the i3 option only added an extra £54 to the build but offered so much more in performance.

Now the anti-AMD in me was in a bit of a pickle. Do I put my tail between my legs and go and stand in the naughty corner and buy the 250 and C45 combo or go for the new kid on the block i3 - H55M?

Trust me it wasn't an easy decision to make but the blue colours just tilted it in their favour.

Will Fusion make a difference to that decision in future?

Well, nothing above, nor when it was first stated a few months back, will change my thinking until we have REAL independent bench tests from Bit-Tech.
Jamie 17th May 2010, 09:59 Quote
Question is, if this takes off in a big way, could we see Apple moving to AMD in their Macbooks?
rickysio 17th May 2010, 11:20 Quote
Quote:
Originally Posted by Jamie
Question is, if this takes off in a big way, could we see Apple moving to AMD in their Macbooks?

Perhaps, perhaps not.

SJ's reality distortion field is screwing with my prediction abilities.
frontline 17th May 2010, 18:50 Quote
This looks promising, look forward to seeing some reviews soon
javaman 17th May 2010, 23:22 Quote
Quote:
Originally Posted by Jamie
Question is, if this takes off in a big way, could we see Apple moving to AMD in their Macbooks?

To quote myself
Quote:
Originally Posted by javaman
I've seen rumours Apple could be looking to use Llano


Source

Finally found the article =) Guess that means Apple have had access to AMD's road maps and future plans beyond what we know. Let's hope they keep it as safe as their prototype iPhone 4Gs
Sloth 17th May 2010, 23:51 Quote
Quote:
Hang on, this is just more AMD hot air and white noise about nothing

"if you look at the history of AMD, when we came out with dual-core processors, we built a true dual-core processor. When we came out with quad-cores, we built a true quad-core processor. What our competitors did was an MCM solution – taking two chips and gluing them together"

And that little glue job from Intel kicked your ass so bad, you still haven't recovered from it. Q6600 anyone?


I'll take notice when Bit-Tech get their hands on it and give us the REAL deal instead of the AMD BS PR machine.
One day you're going to realize the irony of complaining about AMD propaganda. Which company dumps millions into spewing out commercials to trick people into buying Celerons?

The fact is that Intel was not able to engineer an architecture capable of being a single chip multi-core solution until some time later. AMD has some bright engineers and is proud of their forward thinking and dedication to creating a complete solution rather than a quick fix. A year from now Intel's probably going to be waving around their own single chip solution, AMD's just getting their time to shine until then.

This news also makes me excited for my dreams of a mini-ITX machine, portable desktop light gaming machine ftw! Some extra performance over current solutions would be welcome.
Adnoctum 18th May 2010, 09:04 Quote
Quote:
Originally Posted by crazyceo
but when it comes to Futurama, I only follow one law......and that's "Brannigan's Law!"

You mean you do it hard and fast?
I am more the technical type who pays attention to the details. To each his own!

The Fusion graphics core is based on the HD 5000-series (Evergreen) and the specification rumours are suggesting that it will have 400 stream processors. Based on this and nothing else (it could be anything. Remember AMD sandbagging the specs on the HD48xx before release?), this would put the performance at a similar level to an HD 5600-series discrete graphics card. Do you think Intel will have anything close to this in the near future?
Quote:
Originally Posted by alpha0ne23
I've been waiting for this for too long now and its getting hard not to look @ an Intel m-itx solution using the h55 chipset and i3 CPU. Please hurry AMD

If you are already eyeing an i3/H55 setup, then you aren't going to wait for Llano. AMD has already released the Llano schedule: Engineering Samples to partners now, full production Q4 2010, availability Q1 2011. Can you wait that long?
crazyceo 18th May 2010, 09:34 Quote
Quote:
Originally Posted by Sloth
The fact is that Intel was not able to engineer an architecture capable of being a single chip multi-core solution until some time later. AMD has some bright engineers and is proud of their forward thinking and dedication to creating a complete solution rather than a quick fix. A year from now Intel's probably going to be waving around their own single chip solution, AMD's just getting their time to shine until then.

Well, the irony here is that AMD are shouting from the rooftops about their so-called excellence at not doing "a quick fix!", then DOING "a quick fix!" on their Opteron 6174, which is their most expensive product line, just to get there first! So why is it OK to allow the, what did someone else say, "drrrrraaaaaaggggggggggg ug ug ug ug" there, but Intel can't for their older chips, which despite having that "drrrrraaaaaaggggggggggg ug ug ug ug" still hammered the competition (used lightly)?

As to the marketing of Intel's Celeron range, what about AMD's complete Phenom II/Athlon II/Turion II range, which has left AMD with no control over any marketplace in the industry? Hang on, hasn't that all just been put under the "Marketing" BS of AMD VISION, VISION Premium, VISION Ultimate, VISION Black? Looks like with all that lack of real vision they should have gone to Specsavers!
Sloth 18th May 2010, 17:05 Quote
Quote:
Originally Posted by crazyceo
Well, the irony here is that AMD are shouting from the rooftops about their so-called excellence at not doing "a quick fix!", then DOING "a quick fix!" on their Opteron 6174, which is their most expensive product line, just to get there first! So why is it OK to allow the, what did someone else say, "drrrrraaaaaaggggggggggg ug ug ug ug" there, but Intel can't for their older chips, which despite having that "drrrrraaaaaaggggggggggg ug ug ug ug" still hammered the competition (used lightly)?

As to the marketing of Intel's Celeron range, what about AMD's complete Phenom II/Athlon II/Turion II range, which has left AMD with no control over any marketplace in the industry? Hang on, hasn't that all just been put under the "Marketing" BS of AMD VISION, VISION Premium, VISION Ultimate, VISION Black? Looks like with all that lack of real vision they should have gone to Specsavers!
That's not exactly a hidden scandal; the article even admits the Opteron 6174 was a slap-together fix. The difference being that it's a slap-together 12-core versus no other 12-core. In all other cases, one party has had an MCM solution while the other has had a single-chip solution. If Intel throw out a single-chip 12-core then they will lay claim to the crown for 12 cores instead.

For Vision, congratz, they're now catching up to the level that Intel's been at for years. A small badge is pretty mild and nothing worse than, say, the i3/i5/i7 series badges. Does Average Joe know what it means? No, but bigger numbers are good! Just like Vision's labelling, which seems to be taking a note from gas stations (Vision aka 82 Octane, Vision Premium aka 86 Octane, Vision Ultimate aka 91 Octane).
alpha0ne23 19th May 2010, 04:08 Quote
Quote:
Originally Posted by crazyceo

As to the marketing of Intels Celeron range, what about AMD's complete Phenom II/Athlon II/Turion II range which has left AMD with no control over any market place in the industry. Hang on, hasn't that all just been put under the "Marketing" BS of AMD VISION, VISION Premium, VISION Ultimate, VISION Black. Looks like with all that lack of real vision they should have gone to specsavers!

Not wanting to sound like an AMD apologist but I can tell you with all seriousness that 95% of ppl would not know the difference if they are using an Athlon x2 240 or an Intel i7 960 :|

Even 'lowly' Athlon/Phenom cores offer more than enough performance to satisfy whatever 95% of ppl throw at them; we don't all spend hours in front of a screen torturing, err, I mean benching.........well at least not any more :D
azrael- 19th May 2010, 08:17 Quote
Quote:
Originally Posted by alpha0ne23
Not wanting to sound like an AMD apologist but I can tell you with all seriousness that 95% of ppl would not know the difference if they are using an Athlon x2 240 or an Intel i7 960 :|

Even 'lowly' Athlon/Phenom cores offer more than enough performance to satisfy whatever 95% of ppl throw at them, we dont all spend hours in front of a screen torturing err I mean benching.........well at least not any more :D
You can use statistics to prove anything. 40% of all people know that. :)