bit-tech.net

AMD unites CPU and GPU development teams


AMD is now focusing its efforts on developing CPUs with integrated graphics, which was the original goal of the Fusion project before it got hijacked by the marketing guys.

Nearly three years after AMD revealed its plans to buy ATI, it looks as though the two companies have now officially been united. The result is a new, streamlined business model for AMD, which will see a single division working on both CPUs and GPUs.

The new product group is one of four groups that form a part of AMD’s new business structure. The three other groups will focus on technology, marketing and customers. According to AMD, the union of the graphics and processor groups will “better optimise AMD’s operations to drive industry-leading performance graphics and microprocessors and further integrate the company’s x86 processor and graphics technologies.”

AMD’s president and CEO, Dirk Meyer, explained that “The next generation of innovation in the computing industry will be grounded in the fusion of microprocessor and graphics technologies. With these changes, we are putting the right organization in place to help enable the future of computing.”

The new product group will be headed by Rick Bergman, who worked for ATI before the takeover and now has the job of fusing AMD’s CPU and GPU development groups into a single organisation. AMD has also announced that Randy Allen, the former senior vice president of AMD’s Computing Solutions Group, has decided to leave the company, although no other details were given about the resignation. Meyer commented on Allen’s departure, saying that he “has been an important engineering and business leader who has played a key role in many of AMD’s most significant achievements in recent years.”

It’s clear that AMD is now committed to its original Fusion vision of creating CPUs with integrated GPUs. Intel has already demonstrated its own CPUs with integrated graphics, but AMD could potentially come up with something more interesting with ATI’s expertise on the GPU side. With support for GPGPU APIs such as ATI Stream and OpenCL, an AMD CPU with ATI graphics could offer some worthwhile benefits over Intel’s equivalent CPU.
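The OpenCL angle is what makes that integration interesting in practice: OpenCL exposes CPUs and GPUs through a single API, so software could target an integrated AMD part with the same code path as a discrete card. As a rough illustration (our sketch, not anything from AMD), here is a minimal Python snippet using the third-party pyopencl bindings to list whatever OpenCL devices a system exposes; it assumes an OpenCL runtime is installed and simply returns an empty list otherwise.

```python
def available_devices():
    """List (platform, device, type) for every OpenCL device found.

    Returns an empty list when pyopencl or an OpenCL runtime is absent,
    so the sketch degrades gracefully on machines without GPGPU support.
    """
    try:
        import pyopencl as cl  # third-party: pip install pyopencl
        platforms = cl.get_platforms()
    except Exception:
        return []
    found = []
    for platform in platforms:
        for dev in platform.get_devices():
            found.append((platform.name, dev.name,
                          cl.device_type.to_string(dev.type)))
    return found

if __name__ == "__main__":
    # On a Fusion-style system this would list a CPU device and a GPU
    # device behind the same API.
    for plat, name, kind in available_devices():
        print(f"{kind:>4}  {name}  ({plat})")
```

The point of the sketch is that nothing in the calling code needs to know whether the GPU it finds lives on a card or on the CPU die.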

Of course, this also raises the question of whether AMD will still want to commit significant resources to developing high-end GPUs in the future. We also wonder how AMD’s CPU architecture might change with input from the ATI side.

Is AMD right to concentrate on developing CPUs with integrated graphics? Let us know your thoughts in the forums.

17 Comments

D-Cyph3r 7th May 2009, 15:16 Quote
Fusion baby.
Star*Dagger 7th May 2009, 15:21 Quote
I have an ATI Radeon HD 4870x2. Riddle me this: how can you possibly put that on a motherboard?!
If it is doable, do it; if not, I am happy to buy GPU and CPU separately.
I daresay that the integrated CPU/GPU will be for the low-end casual gamer, while the Big Boys will still be taking the separate-CPU-from-GPU approach. I can't see how they would cool such a beast if it was a high-end solution.

S*D
Xtrafresh 7th May 2009, 15:27 Quote
I think they are bang on the money here.

Knowing how long it takes to develop discrete graphics cards (or any other form of high-end chip), I think we can safely assume that the HD 6xxx series is already on the drawing board. After that, I think miniaturisation will have come to a point where integrating CPU and GPU offers vast advantages over building the two separately. I think that by the time the first CPU/GPU unit from AMD hits the market, they'll have built a complete high-end PC on an mATX-sized piece of silicon.
seveneleven 7th May 2009, 15:30 Quote
Quote:
Originally Posted by Star*Dagger
I have an ATI Radeon HD 4870x2. Riddle me this: how can you possibly put that on a motherboard?!
If it is doable, do it; if not, I am happy to buy GPU and CPU separately.
I daresay that the integrated CPU/GPU will be for the low-end casual gamer, while the Big Boys will still be taking the separate-CPU-from-GPU approach. I can't see how they would cool such a beast if it was a high-end solution.

S*D

Folks in the 70s: A separate controller for graphics?! Surely there will never be a need for such a thing!
bowman 7th May 2009, 15:37 Quote
Quote:
Originally Posted by Star*Dagger
I have an ATI Radeon HD 4870x2. Riddle me this: how can you possibly put that on a motherboard?!
If it is doable, do it; if not, I am happy to buy GPU and CPU separately.
I daresay that the integrated CPU/GPU will be for the low-end casual gamer, while the Big Boys will still be taking the separate-CPU-from-GPU approach. I can't see how they would cool such a beast if it was a high-end solution.

S*D

Of course it's just integrated. Fusion will be the equivalent of 780G, just on the CPU instead.

Ssh, don't let the marketing people hear you, though. TEH FOOTOOR IS FUUSHUN!
lp1988 7th May 2009, 16:30 Quote
Quote:
Originally Posted by Star*Dagger
I have an ATI Radeon HD 4870x2. Riddle me this: how can you possibly put that on a motherboard?!
If it is doable, do it; if not, I am happy to buy GPU and CPU separately.
I daresay that the integrated CPU/GPU will be for the low-end casual gamer, while the Big Boys will still be taking the separate-CPU-from-GPU approach. I can't see how they would cool such a beast if it was a high-end solution.

S*D

In the first many years we will probably only see these in laptops or PCs for businesses.
cheeriokilla 7th May 2009, 17:06 Quote
Awesome news! I have two rigs... This could put some interesting stuff in their future development.
JyX 7th May 2009, 17:24 Quote
Think of this... a GPU on the CPU that handles WDDM (Windows), but in games you can switch to dedicated graphics... that's the idea. Now, since the CPU is properly cooled, compared to the northbridge, which is mostly passively cooled, a GPU there could yield better performance in HD decoding and GPGPU applications, not to mention that powering off the discrete GPU could allow for lower overall power usage.

This could also free chipset development from needing to make separate IGPs, leaving just standalone chipsets. This would be beneficial on mobile platforms and also on servers.

Besides, it's inevitable... Intel markets Westmere as a CPU-GPU multi-chip package, so people will follow and demand the same thing from AMD, even though AMD's should perform better at a lower power draw, based on the 780G/790GX TDP of 11W compared to Intel IGPs at 28W.
mayhem 7th May 2009, 17:29 Quote
What would be good would be a motherboard with a socket for the CPU and a socket for the GPU; then you have a scale of GPUs as well as CPUs, and you just throw in whatever GPU you fancy.....
EvilRusk 7th May 2009, 18:52 Quote
Quote:
Originally Posted by mayhem
What would be good would be a motherboard with a socket for the CPU and a socket for the GPU; then you have a scale of GPUs as well as CPUs, and you just throw in whatever GPU you fancy.....

Why stop there? You could put the GPU with its own dedicated RAM on its own board and plug it into a slot... oh wait...
HourBeforeDawn 7th May 2009, 20:53 Quote
This isn't anything really new; in a way it's old news. Everyone knew this is where AMD was heading to begin with, but it's nice to see them cracking down and really pushing forward with it.
JyX 7th May 2009, 22:25 Quote
That was before their stock degraded to its current levels... and as the article says, "AMD is now focusing its efforts on developing CPUs with integrated graphics, which was the original goal of the Fusion project before it got hijacked by the marketing guys." They turned Fusion into some application suite that optimises the current-gen AMD platform for games... when in fact it's this.
Kudos 8th May 2009, 06:48 Quote
Granted, in the early stages after (if?) this happens it'll be low-end graphics... business use, granny checking emails, etc.

But what's to stop them from, say, dropping two cores from a quad and replacing them with two graphics cores as the tech develops? Heat will surely be an issue, but this would likely be an enthusiast GPU/CPU, so they'd expect watercooling at the very least.

Could be interesting to watch how it develops.
DaMightyMouse 8th May 2009, 09:08 Quote
Quote:
Originally Posted by EvilRusk
Quote:
Originally Posted by mayhem
What would be good would be a motherboard with a socket for the CPU and a socket for the GPU; then you have a scale of GPUs as well as CPUs, and you just throw in whatever GPU you fancy.....

Why stop there? You could put the GPU with its own dedicated RAM on its own board and plug it into a slot... oh wait...

LOL!
Crazy Buddhist 19th November 2009, 17:19 Quote
Saw this coming some time ago. ( ref: http://www.thebestcasescenario.com/forum/showthread.php?t=15096 )

Next BIG thing ..... ? My $ goes on ...

Within 18 months from now (Q2 2011) Nvidia will be marketing powerful CPU/GPU combination chips.
Turbotab 19th November 2009, 17:26 Quote
It would be ironic if Intel, after spending a fortune on Larrabee, ended up with products featuring a superior CPU yet a vastly inferior GPU, just like the days of Intel GMA.
Crazy Buddhist 19th November 2009, 19:05 Quote
Turbo
...

I think that is the way it will be. Their knowledge of leveraging parallel processing was certainly improved on the software level when they bought Havok, but on the hardware side of parallel processing they are kids compared with both AMD and Nvidia, thanks to their competitors' outstanding GPU teams.

It would not surprise me to see Intel hit a technology limit (transistors can only get so small) that slows their leading edge, only to be surpassed in the mid term by better integration technologies from Nvidia and AMD/ATI. Two to three years from now, everyone might not be buying Intel.

Matthew