bit-tech.net

AMD to lay off 10 per cent of its workforce

AMD's latest releases have failed to live up to expectation.

AMD today announced that it’s ‘optimizing its cost structure’ - a nice way of saying that it’s going to make 10 per cent of its global workforce redundant while terminating ‘existing contractual commitments’.

The reduction in staff is expected to save the company up to $118 million over the course of 2012 which, when added to the $90 million AMD says it’s going to save through ‘operational efficiencies’, means the company should be shaving over $200 million from its costs next year.

AMD has stated that it expects the restructuring plan ‘to take place primarily during the fourth quarter of 2011, with some activities extending into 2012’. Interestingly, it also announced that it expects the plan to actually cost $105 million to implement, a cost presumably made up of redundancy cheques and relocation expenses.

Obviously, AMD is remaining externally upbeat about the whole thing, saying the plan will ‘strengthen the company’s competitive positioning’ and ‘rebalance [its] global workforce skillsets, helping AMD to continue delivering industry-leading products’.

It’s unclear whether this move has been in the works for a while or if it’s a direct result of the worldwide panning that AMD’s new Bulldozer-based FX processors endured. This, coupled with what we’re hearing is low demand for its Llano-based chips (which we really like, by the way), doesn’t make for a rosy outlook for the company.

Do you think AMD's belt-tightening will keep it competitive? Will its new Bulldozer-based Opterons save its bacon? Let us know your thoughts in the forum.

Spreadie 4th November 2011, 12:37 Quote
This was inevitable, given how poorly Bulldozer was received.

The redundancies are a tragedy all the same, but many of us have been through that in the last few years. I hope the cost savings help them rebuild and become more competitive.
bulldogjeff 4th November 2011, 12:57 Quote
I think it's a bit hard to read too much into this one; it might have been something that was on the cards no matter how good or bad Bulldozer was. It does, however, look like a knee-jerk reaction to a poor showing from all the new products.
Spreadie 4th November 2011, 13:08 Quote
Quote:
Originally Posted by bulldogjeff
I think it's a bit hard to read too much into this one; it might have been something that was on the cards no matter how good or bad Bulldozer was. It does, however, look like a knee-jerk reaction to a poor showing from all the new products.

Yes, they could've been waiting for Bulldozer before deciding just how many to lay off. Perhaps if it didn't receive such a panning, they would be looking at a smaller cull.
bulldogjeff 4th November 2011, 13:17 Quote
I still think they'll get it right with Bulldozer or something based on it. It's a brand-new architecture; they might not bust Intel's balls, but they could end up offering a very good all-round chip.
azazel1024 4th November 2011, 13:51 Quote
They might get it right eventually, but they definitely don't have it right, right now.

2 billion transistors versus about a billion for Sandy Bridge... and it is quite a bit slower in a lot of the measures of performance that matter to most desktop users, and even a fair number of server users.
PQuiff 4th November 2011, 13:57 Quote
AMD are tools. It doesn't take a rocket scientist to work out that if you keep people waiting for a product that is vastly inferior to your competitors', you're gonna end up in financial trouble.

AMD need a chip that blows Intel out of the water; this will get its name back on top. Then it needs to start releasing chips that compete with, or are better than, Intel's for a reasonable price. Releasing a server chip for a consumer market that still relies on fast single-threaded apps is madness.

I wish them luck, but unfortunately I don't see them keeping up at this rate.
Lenderz 4th November 2011, 14:06 Quote
Quote:
Originally Posted by PQuiff
AMD are tools. It doesn't take a rocket scientist to work out that if you keep people waiting for a product that is vastly inferior to your competitors', you're gonna end up in financial trouble.

AMD need a chip that blows Intel out of the water; this will get its name back on top. Then it needs to start releasing chips that compete with, or are better than, Intel's for a reasonable price. Releasing a server chip for a consumer market that still relies on fast single-threaded apps is madness.

I wish them luck, but unfortunately I don't see them keeping up at this rate.

Now now, don't you think that that's a little bit harsh?

AMD has vastly fewer resources than Intel, and Intel has even played dirty in the past to stop a superior AMD product selling (and has paid some compensation as a result). Making a processor is hard; they can't just "make it fast and brilliant and stuff". It takes years of R&D, planning and work to make a good chip, and they didn't set out to disappoint with BD. The architecture even shows promise; it's just not done yet, and as it stands Intel is a generation or two ahead.

I think that those people working at AMD work to make competitive products as best they can and in this instance they didn't get it right first time. Calling them tools whilst they or their colleagues are losing their jobs is in poor taste. None of them set out to launch a disappointing product.
Jehla 4th November 2011, 14:15 Quote
Quote:
Originally Posted by PQuiff

AMD need a chip that blows Intel out of the water; this will get its name back on top. Then it needs to start releasing chips that compete

And Intel needs a competitor to ARM; they can't just pull it out of a hat, though.

Can Intel afford to let AMD dwindle and die, or can we expect something like the MS/Apple deal?
Madness_3d 4th November 2011, 14:17 Quote
Come on AMD... You can do it...
Xir 4th November 2011, 15:21 Quote
Does this include GlobalFoundries or not?

I mean, they've already outsourced their manufacturing and the accompanying R&D, who's left to lay off that isn't desperately needed?
benji2412 4th November 2011, 15:32 Quote
Perhaps they should focus on making a new CPU with, I don't know, better performance than an intel CPU for the same price?
B1GBUD 4th November 2011, 15:39 Quote
Well they do insist on fighting Intel with one hand for CPU market, and Nvidia with the other hand for the GPU market.

They appear to be losing on both counts.
Lenderz 4th November 2011, 16:00 Quote
Quote:
Originally Posted by benji2412
Perhaps they should focus on making a new CPU with, I don't know, better performance than an intel CPU for the same price?

Genius! Why didn't they think of that? They should make you CEO post-haste!

Please read my previous comment: making CPUs isn't easy; it takes a long time and a lot of resources. They are trying to do just that. If you know something they don't about making a fast, cheap CPU, please let us all know.


Sent from my iPhone using Tapatalk
tonyd223 4th November 2011, 16:11 Quote
OK, this may not be such bad news for competition in the CPU market - if you don't produce products that people want to buy, then you go bust.

C'mon ARM!
Lenderz 4th November 2011, 16:52 Quote
Quote:
Originally Posted by tonyd223
OK, this may not be such bad news for competition in the CPU market - if you don't produce products that people want to buy, then you go bust.

C'mon ARM!

The limited X86 licences suggest it would be bad news: since Cyrix died and VIA has proven unable to compete, nobody would be able to compete with Intel at the desktop level at all. No competition at all is bad, and even Intel knows this. It might give Nvidia or some other company a crack at an X86 licence, though, I guess. But that should have happened 5 years ago.


Sent from my iPhone using Tapatalk
bobwya 4th November 2011, 18:51 Quote
Oh great, so Radeon drivers are going to get even worse now - no doubt... :-(
Intel's "dirty tricks" with OEM's is leaving a very sour taste now - for everyone...
Aragon Speed 4th November 2011, 19:09 Quote
Quote:
Originally Posted by Xir
who's left to lay off that isn't desperately needed?
The 'stupid new names for our chips' committee?
Lazy_Amp 4th November 2011, 20:25 Quote
Quote:
Originally Posted by Xir
Does this include GlobalFoundries or not?

I mean, they've already outsourced their manufacturing and the accompanying R&D, who's left to lay off that isn't desperately needed?

GloFo is a separate company, so no, AMD can't directly fire anyone from there.

Still, there's been a lot of pushback against Germany recently, since they are still having MAJOR yield issues on 32nm related to process problems. GloFo has really never been criticized by Bit-Tech, probably because they're a privately held company that doesn't have to be in the spotlight.

It doesn't excuse Bulldozer I know, but AMD needs a lot more Llano in the marketplace to bring up margins. There is demand for Llano, but it's been tempered by a failure of GloFo to meet supply, which affects how laptop manufacturers feel about adding more Llano products to their range.
greypilgers 4th November 2011, 20:37 Quote
Quote:
Originally Posted by bobwya
Oh great, so Radeon drivers are going to get even worse now - no doubt... :-(
Intel's "dirty tricks" with OEM's is leaving a very sour taste now - for everyone...

Indeed, is it just me or does it seem as if for every small laptop with an AMD APU chip in it you can buy, there are ten or twelve other models with Atoms and i3s? AMD APUs are like hen's teeth!
mediapcAddict 4th November 2011, 20:56 Quote
Sorry, but AMD's processor business is just dying.

Here's my reckoning on why Llano is failing.

Firstly, Llano is failing because its graphics performance is very dependent on fast memory. AMD should have made an optional separate lane for faster DDR3 graphics memory. That would have kept the cost of a build down.

Secondly, they should have released it to market with separately configurable overclocking for GPU and CPU. You have to spend £160+ with Intel to get overclocking. Intel is gifting the whole budget overclocking market to AMD, and AMD does nothing.

Thirdly, they should have had a faster graphics processor than the 6550D. The 6550D isn't good enough for a dedicated gamer or even a media PC (IMO), and who outside of gamers and a few others knows which graphics chips matter?

Admit it: who saw Llano and thought "yeah, 6550D, that's my next build"? Had it been a 6850, they would have had a chip to challenge Intel (and needed a £40 cooler included in the retail packaging), but nonetheless they would be back in the game.

It sucks that people are getting laid off, but I can't say I'm shocked. Just saddened.
The Infamous Mr D 4th November 2011, 22:19 Quote
Given that desktop sales and laptop sales are considerably down, it's no surprise that 'traditional' processors aren't as much in demand. People are upgrading less, buying less new equipment and making do with their current kit much longer. Aside from gamers, tablets and smartphones are soaking up most of the casual technology sales and usage. I've been running an overclocked Q6600 with an HD5870 for over 18 months; it's doing me just fine, and I'm not planning an upgrade until it's really necessary.
Snips 4th November 2011, 23:08 Quote
You can't blame Intel's, as some have put it, "dirty tricks with OEMs leaving a very sour taste" - that's old news, for which Intel was forced to pay AMD many millions of dollars after anti-competitiveness rulings in the US and Europe.

The problem now is purely in AMD's court. If they lived up to their marketing department's hype, there wouldn't be a problem now. Under-promise and over-deliver: it's a simple strategy, AMD; you may as well try it, since nothing else is working.
Bede 5th November 2011, 00:29 Quote
I'm seeing a lot of lazy posting in this thread. One of the most amazing things humans create is the desktop CPU. They are now so outrageously complex and do so many things that we should be amazed that we even have them at all.

I am no fanboy for AMD - I like Intel chips and nVidia graphics cards - but to say that AMD are stupid is ridiculous; their engineers are good. They don't have as much money as Intel to sink into R&D, and with something as complex as a new microchip architecture that is where the difference is made - one man does not design a chip.

It is sad that they have to make so many people redundant however, with any luck, this will give them more money to put towards R&D.
Lenderz 5th November 2011, 01:11 Quote
Quote:
Originally Posted by Bede
I'm seeing a lot of lazy posting in this thread. One of the most amazing things humans create is the desktop CPU. They are now so outrageously complex and do so many things that we should be amazed that we even have them at all.

I am no fanboy for AMD - I like Intel chips and nVidia graphics cards - but to say that AMD are stupid is ridiculous; their engineers are good. They don't have as much money as Intel to sink into R&D, and with something as complex as a new microchip architecture that is where the difference is made - one man does not design a chip.

It is sad that they have to make so many people redundant however, with any luck, this will give them more money to put towards R&D.

Well said - you managed to put what I was trying to say much more eloquently. Thank you, sir.
Krayzie_B.o.n.e. 5th November 2011, 01:41 Quote
AMD CPUs suck! End of story.
They were in the game for a while, offering good performance at a good price, but now Intel has a $200 CPU that kills anything AMD has ever made.

AMD GPUs are still a great value, but soon (unless they get lucky and Apple buys them) AMD will be a GPU-only company. I'm sure those laid off can get jobs at Nvidia's Tegra division or ARM.
rogerrabbits 5th November 2011, 02:46 Quote
Quote:
‘strengthen the company’s competitive positioning’ and ‘rebalance [its] global workforce skillsets, helping AMD to continue delivering industry-leading products’.

So in reality, "we are seriously struggling :/"
fluxtatic 5th November 2011, 06:36 Quote
I think, if they get through to the second generations of Llano and BD, they'll be all right. Don't get me wrong - god knows I posted a couple of times here how disappointed I was with the BD release. However, both archs show a lot of promise. Win 8's optimizations will help immensely - it has been benchmarked and wasn't that impressive, but keep in mind that was the Developer Preview. I had a chance to play with it the past few days, and it's pretty rough. Vishera (2nd-gen BD) should be much better than BD has been so far. Same with Llano. Trinity will be based on the Piledriver core, same as Vishera, with a 6xxx GPU (the current Llano uses the Redwood core, so it's really a 5xxx GPU). Vishera will also have a competitor for Quick Sync. Ripping off Intel? Maybe, maybe not. All I know is, I want fast hardware transcoding, but I won't buy Intel to get it.

Not to call Lenderz wrong, but I doubt now Nvidia's looking for an x86 license. If they really wanted it, they would buy Via - they've got the cash to do it. But why jump into that? They have one of the most popular ARM processors available today. They'll also be first to market with a quad-core ARM (it's actually got five cores - interesting reading if you haven't seen it yet.) Unless you've got brilliant x86 engineers, it's not worth it trying to compete with Intel in that arena. Nvidia obviously knows what they're doing with ARM, so why go another new direction now?

This news just sucks, though. Economy the way it is all over the world, I feel for these people. I wonder what positions are 'redundant', though? (God I hate that term - just tell us you're laying them off. 'Being made redundant' sounds even worse...although then it kinda sounds like it's not your fault, right? If you're laying them off, you must be struggling. If they're 'redundant', it's their fault for being useless duplicates, yeah?)
Snips 5th November 2011, 10:15 Quote
If AMD don't have the money to do R&D properly, then quit what you are crap at and only focus on what you are good at. Remind me again, what's AMD good at?
Marvin-HHGTTG 5th November 2011, 12:08 Quote
Quote:
Originally Posted by Snips
If AMD don't have the money to do R&D properly, then quit what you are crap at and only focus on what you are good at. Remind me again, what's AMD good at?

I can only assume you're deliberately trolling here, same as whoever thought Llano should have had a 6850 onboard.

Consistently over the last few generations AMD's GPUs have been fast, well priced, and fairly frugal. Nvidia's have been generally slightly faster, but more expensive, hotter and with frankly ridiculous power consumption.

To many that's far more important than having the absolute fastest card, as system builders would testify. Hence why AMD's GPU market share is now well above Nvidia's for the first time.
Paradigm Shifter 5th November 2011, 13:04 Quote
Quote:
Originally Posted by Bede
I'm seeing a lot of lazy posting in this thread. One of the most amazing things humans create is the desktop CPU. They are now so outrageously complex and do so many things that we should be amazed that we even have them at all.

<snip>
Agreed.

...

Other than that, it's refreshing to see that someone at Bit-Tech understands the correct meaning of 'decimation' - it is used wrongly in far too many places... the mainstream media being particularly guilty of it over the last 15 years or so. :)
Xir 5th November 2011, 13:21 Quote
Quote:
Originally Posted by Bede
It is sad that they have to make so many people redundant however, with any luck, this will give them more money to put towards R&D.
Well, that's what I meant: they've already reduced themselves to R&D and chip design
(since manufacturing and manufacturing R&D are at GloFo).
Who are they reducing now that doesn't hurt their R&D or chip design?
Quote:
Originally Posted by Aragon Speed
The 'stupid new names for our chips' committee?
Hmmm, possibly - all manufacturers seem to have overhead in the BS department ;)
Bindibadgi 5th November 2011, 13:27 Quote
I'm waiting for the announcement on the 9th. I highly suggest people read the Icrontic and Anandtech news posts on this too, and I would check to see if AMD-exclusive GPU partners are nervous...
Bede 5th November 2011, 14:51 Quote
Quote:
Originally Posted by Bindibadgi
I'm waiting for the announcement on the 9th. I highly suggest people read the Icrontic and Anandtech news posts on this too, and I would check to see if AMD-exclusive GPU partners are nervous...

That is a little disturbing; AMD have been really winning on the GPU front recently (as far back as the 5870, IMO). I hope they don't do anything stupid.
Lenderz 5th November 2011, 15:35 Quote
Quote:
Originally Posted by fluxtatic

Not to call Lenderz wrong, but I doubt now Nvidia's looking for an x86 license. If they really wanted it, they would buy Via - they've got the cash to do it. But why jump into that? They have one of the most popular ARM processors available today. They'll also be first to market with a quad-core ARM (it's actually got five cores - interesting reading if you haven't seen it yet.) Unless you've got brilliant x86 engineers, it's not worth it trying to compete with Intel in that arena. Nvidia obviously knows what they're doing with ARM, so why go another new direction now?

Sorry, perhaps I wasn't clear: long before Nvidia started playing with ARM (I'm talking 5-10 years ago or so) there were a series of rumours that Nvidia wanted into the X86 market and was trying to get hold of a licence. Nothing ever came of those rumours, but Intel is notoriously difficult to get to play ball in this regard. I did wonder if something might come of this recently when Nvidia and Intel signed a deal sharing certain technologies, but nothing's happened in that regard.

Anyway, I wasn't saying Nvidia is looking for the licence, just that I feel we need competition in the X86 market, that's all, and we need a capable team of engineers with a decent R&D budget to keep Intel on their toes; AMD did that for a long time where Cyrix and VIA didn't manage it. I just don't see anyone other than Nvidia being able to move into the market, and I don't see them wanting to either, with their concentration on ARM. I was trying to say "if that was a move they were going to make, they should have made it 5-10 years ago".

I wonder if Intel might prop up AMD for a while just so that it has a competitor like MS did with Apple all those years ago.


Edit
An example of the rumours I mention:

http://www.bit-tech.net/news/hardware/2009/03/04/nvidia-reveals-plans-for-x86-cpu/1
Arghnews 5th November 2011, 16:33 Quote
There have been examples in the past where architectures have improved hugely with updates. Hopefully Bulldozer can match this.

The main problem, though, is that Bulldozer is a server architecture. It's just not designed for the consumer market. Simple.

Someone mentioned the possibility of Windows 8 utilising the threads better, but the problem is games. It's so difficult to design games to utilise 8 cores; more raw processing power with fewer cores beats less power with more cores, IMO.
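That "fewer fast cores vs more slow cores" intuition can be put into rough numbers with Amdahl's law. This is an illustrative sketch, not the poster's own working: the 60% parallel fraction and the 30% per-core speed advantage are arbitrary assumptions standing in for a typical game workload, not measured figures.

```python
# Amdahl's-law sketch of the trade-off: the serial part of a frame runs on
# one core, the parallel part spreads across all cores. Per-core speed
# scales both parts.

def speedup(parallel_fraction: float, cores: int, per_core_speed: float = 1.0) -> float:
    """Relative throughput for a workload with the given parallel fraction."""
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / cores)

P = 0.60  # assumed parallelisable fraction of a game's frame time (a guess)

four_fast = speedup(P, 4, per_core_speed=1.3)   # 4 cores, 30% faster each
eight_slow = speedup(P, 8, per_core_speed=1.0)  # 8 slower cores

print(f"4 fast cores: {four_fast:.2f}x")   # ~2.36x
print(f"8 slow cores: {eight_slow:.2f}x")  # ~2.11x
```

With a 60% parallel fraction, four faster cores win; the more of the frame time a game can actually parallelise, the more the extra cores pay off.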
rogerrabbits 5th November 2011, 21:31 Quote
Well, AMD have ups and downs. For a while nobody cared about them, and then they released the first mainstream dual-core, which was pretty popular. Then Intel took the lead again with Conroe, and then I think AMD took the lead again in low-energy chips for HTPCs and stuff. So this could just be another slump and maybe they will take the lead again soon. Or maybe this is the beginning of the end.
Aragon Speed 6th November 2011, 07:00 Quote
TBH the only thing that lets BD down is its low(er) IPC, as far as I can see in most reviews. An improvement in that area alone for the next chip should see it pulling its weight in the desktop arena, IMO.
Bindibadgi 6th November 2011, 07:23 Quote
Quote:
Originally Posted by Bede
That is a little disturbing; AMD have been really winning on the GPU front recently (as far back as the 5870, IMO). I hope they don't do anything stupid.

It's not stupid from a business perspective. This man is all about saving cost and improving profitability. Look at the list of people who have left/were made redundant recently and look at the ROI of the GPU market in the last decade. The trend path is not good for graphics.

Even if you look at the trend Nvidia and AMD have been pushing publicly the last two years - it's ALL to do with general computing first, graphics second (it's just that graphics is more popularised in the media due to its relevance to our audience). The two have played friendly so far, but very soon I expect they will need to diverge as the GPGPU market grows and makes it more worthwhile than chasing gamers. In that sense AMD could get away with only making "average" low-power CPUs with IGPs to act as IO hubs for far more powerful GPGPU PCIe cards in servers.
Bede 6th November 2011, 14:11 Quote
At the same time though, Bindi (and I'm not really arguing with you, as TBH my knowledge of the market is really quite limited :D), the market for discrete desktop graphics has grown. I think Nvidia's marketing before BF3 was very interesting; it seemed to me that they were testing how much worth there is left in the desktop market.

We may yet see what you fear come to pass, and the consumer market be offered rubbish graphics cards, but I think it would be a fool who ignored the potential of our market - after all, those who can afford an £800 computer can probably afford to (and are interested in) upgrading their graphics card every year or two.

There is also the next generation of consoles to consider - if they can push console graphics to current desktop PC levels then our systems will need to get more powerful as the absence of direct-to-metal coding on PC means we are a lot more inefficient. I'm actually quite looking forward to seeing how the new gen of consoles are put together :)
mediapcAddict 6th November 2011, 15:57 Quote
Quote:
Originally Posted by Marvin-HHGTTG
I can only assume you're deliberately trolling here, same as whoever thought Llano should have had a 6850 onboard.

actually when I suggested getting a 6850 on board I was serious.

If you don't believe me here's my thinking.

----------------------------
the problem is that the 6550 is in no man's land. It's not powerful enough for gamers, yet it's overpowered for the average Joes who just buy from Dell etc.

The main problem is too much heat, right? I don't design chips, obviously (who here does?), but here is my rough working.

http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

The 6570 is 44 watts and the 6850 is 127 watts - that's an additional 83 watts. Llano is a 100-watt chip, so we're talking about releasing a guesstimated 200-watt chip, right? Too much?

There are two power savings that could bring this down. First, the memory would be on the motherboard and not "with the chip", and second, the 6850 on 32nm (instead of 40nm) would consume, I guess, about 20% less power. So you're looking at a 170/180+ watt chip before overclocking - still too much? For a £5 cooler, ouch, yes, too much. BUT for a serious cooler...

The Thermaltake Frio can cool "up to 220 watts":
http://thermaltakeusa.com/Product.aspx?S=1319&ID=1956

Even the modest coolers on the 6950 and 6970 cards are dealing with TDPs of 200 and 250 watts respectively.

If you still have doubts, check this page out:

http://www.bit-tech.net/hardware/cpus/2010/04/27/amd-phenom-ii-x6-1090t-black-edition/7

It's a 125-watt TDP chip. The 1090T Black at stock (3.2GHz) at full load has the whole system using 195 watts; when overclocked to 3.87GHz it consumes 338 watts - that's 140 watts OVER the 125W TDP. Yes, roughly 30% of that is the rest of the system, BUT the cooler, a Titan Fenrir, can probably deal with 200 watts, which is enough to cope with a Llano chip AND a 6850 on 32nm.

Seriously, I reckon it's possible if AMD starts thinking outside the box.

If they had a separate lane for the fastest possible DDR3 graphics memory, and separate GPU and CPU overclocking (Black Edition-style easy overclocking), AMD might have a chip gamers would actually buy.

I know it's never been done before - but that's exactly what AMD does when it's at its best.

If it worked, maybe AMD would be hiring more people.
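The back-of-envelope numbers in the post above can be sketched out like this. Everything here follows the post's own assumptions - published board TDPs and a guessed 20% saving from a 40nm-to-32nm shrink - so treat the result as a rough illustration, not a measured figure.

```python
# Rough TDP estimate for a hypothetical Llano with 6850-class graphics,
# mirroring the reasoning in the post above. All inputs are approximate
# rated TDPs; the 20% die-shrink saving is a pure guess.

LLANO_TDP_W = 100    # Llano A8 rated TDP (includes its 6550D-class IGP)
HD6570_TDP_W = 44    # discrete 40nm card roughly matching the 6550D
HD6850_TDP_W = 127   # discrete 40nm 6850

# Extra power from swapping the 6550D-class IGP for 6850-class graphics
extra_gpu_w = HD6850_TDP_W - HD6570_TDP_W    # 83 W
naive_chip_w = LLANO_TDP_W + extra_gpu_w     # 183 W, "about 200 W" in the post

# Guessed saving if the 6850 part were built on 32nm instead of 40nm
SHRINK_SAVING = 0.20
adjusted_chip_w = naive_chip_w - HD6850_TDP_W * SHRINK_SAVING

print(f"naive estimate:   {naive_chip_w} W")
print(f"with 32nm shrink: {adjusted_chip_w:.0f} W")
# ~158 W here; the post rounds 183 W up to 200 W first, landing on 170/180+ W.
```

Either way, the estimate lands in the range where a serious air cooler (rather than a £5 stock unit) would be required, which is the post's point.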
Snips 6th November 2011, 19:28 Quote
Quote:
Originally Posted by Marvin-HHGTTG
Quote:
Originally Posted by Snips
If AMD don't have the money to do R&D properly, then quit what you are crap at and only focus on what you are good at. Remind me again, what's AMD good at?

I can only assume you're deliberately trolling here, same as whoever thought Llano should have had a 6850 onboard.

Consistently over the last few generations AMD's GPUs have been fast, well priced, and fairly frugal. Nvidia's have been generally slightly faster, but more expensive, hotter and with frankly ridiculous power consumption.

To many that's far more important than having the absolute fastest card, as system builders would testify. Hence why AMD's GPU market share is now well above Nvidia's for the first time.

Why was that trolling? It's a valid question, and you're deluding yourself if you think AMD has been the overall choice of GPU, since the entire last series still has boot marks left by Nvidia all over its backside. Your generalised statement of "hotter, and with frankly ridiculous power consumption" is outdated. You only have to look at the recent ASUS GTX580 ROG Matrix Platinum to see that.

So, remind me again, what's AMD good at?
Hovis 7th November 2011, 03:47 Quote
AMD will get back into the game I hope. Competition is vital for the consumers. Without competition we'd get absolutely screwed.
Xir 7th November 2011, 09:38 Quote
Quote:
Originally Posted by Arghnews
The main problem, though, is that Bulldozer is a server architecture. It's just not designed for the consumer market. Simple.
That's not the problem; it's the core of their business.
AMD (mostly) makes money on business systems, not (so much) on consumer products.
They're just not as much in the spotlight.
Kilmoor 8th November 2011, 08:42 Quote
Wake up! It's not about the chips!
You're reading too deeply into this by focusing on the technology.

This is capitalism, plain and simple. The company will cut $200 million by making the labour force redundant. I promise you, with a new generation of chips flowing, the company will gross the same... That means $200 million to dole out to the shareholders as profit. Profit at the cost of the labourer. Simple as that.
Bindibadgi 8th November 2011, 08:51 Quote
Quote:
Originally Posted by Bede
At the same time though, Bindi (and I'm not really arguing with you, as TBH my knowledge of the market is really quite limited :D), the market for discrete desktop graphics has grown. I think Nvidia's marketing before BF3 was very interesting; it seemed to me that they were testing how much worth there is left in the desktop market.

We may yet see what you fear come to pass, and the consumer market be offered rubbish graphics cards, but I think it would be a fool who ignored the potential of our market - after all, those who can afford an £800 computer can probably afford to (and are interested in) upgrading their graphics card every year or two.

There is also the next generation of consoles to consider - if they can push console graphics to current desktop PC levels then our systems will need to get more powerful as the absence of direct-to-metal coding on PC means we are a lot more inefficient. I'm actually quite looking forward to seeing how the new gen of consoles are put together :)

The discrete graphics card market is shrinking. Intel's processor graphics are better than ever (think of that statement in terms of your parents or a business using a PC), and AMD's Llano chips helped keep the company afloat last quarter - in fact, they were DOMINANT in the Chinese market (because they are fast enough for all those LAN cafes that play low-detail MMOs/RPGs). The expected market for discrete cards is shrinking in laptops as well, as trends move towards thinner and thinner notebooks and tablets that just don't use them.

I expect the discrete market will remain indefinitely, but the timescale between updates will get larger and the costs will increase, unless there's some investment offset in other sectors like GPGPU. Ideally we could see Microsoft make DirectX more GPGPU-like so the change in architecture still applies to benefit gaming too ;)

Kilmoor is almost there, but with the highly cynical view. AMD will likely have debts from bad quarters to pay off (I'm not sure how much it would have had in the bank), while a company needs to show it is growing and on the right path to encourage further investment in order to continue on that 'right path'. Shareholders are simply not patriotic to product lines, but they can also be knee-jerk about trends (as ever, it's a double-edged sword). Investment is necessary for the massive R&D costs high-tech companies face. From what I know from before working where I do now, AMD did have serious internal conflicts that it needs to sort out, but the high-profile losses were not the ones causing the conflicts, AFAIK. I don't talk to AMD anymore since taking my current position.
stanonwheels 8th November 2011, 18:48 Quote
Sempron was ****, Phenom was ****, Athlon II was ****, Phenom II was ****, Bulldozer is ****.
AMD are indestructible. Long live hopeless chips with better names than intel.
mediapcAddict 10th November 2011, 02:40 Quote
Quote:
Originally Posted by Hovis
AMD will get back into the game I hope. Competition is vital for the consumers. Without competition we'd get absolutely screwed.

I think we're already seeing the signs of weak competition when you have to spend £160 on an Intel chip before you can overclock properly.
I would guess that if AMD had a chip competitive with Sandy Bridge, we would see cheaper dual-core K-series Sandy Bridge chips.

Besides, does anyone hold out any hope for AMD processors or its employees?

I mean, currently AMD can't manufacture as well as Intel (Intel is on 22nm and AMD is just beginning on 32nm).

They can't design as well as Intel. By almost any measure (instructions per clock, output per watt, frame rates in most games, encoding video), Intel wins.

It's only price AMD can compete on.

To put it another way: what would you say or do to convince someone to buy AMD instead of Intel?
TC93 11th November 2011, 18:57 Quote
A company doesn't have to have the absolute fastest processor to make money. People will still buy it, depending on the value you get for your money. I know I never buy the absolute fastest, as that usually means it's also the worst value (jacked-up price) of any.

I also much prefer AMD/ATI Video cards.

My current AMD 1055T 6 core cpu is plenty fast enough for everything I do. I also have a AMD 6950 video card.
TC93 11th November 2011, 19:00 Quote
I should also add that AMD/ATI is usually ahead of Nvidia when it comes to adding technology to their cards. That is fact.

My next cpu whenever that is, most likely will be AMD again. You get the most for your money from AMD.