
How it all shakes out

scq 24th July 2006, 20:24
I don't like the idea of GPU/CPU integration. While it may be easier for mainstream consumers to be able to play new games with little fuss, the GPU would no longer be easily upgradable, unless you upgrade the processor as well.

It would be like buying a $1000 FX with an integrated 7900GTX, and then having to buy another $1000 FX which is only slightly faster, for an 8900GTX (whatever it may be called by then).
aggies11 24th July 2006, 20:30
Quote:
Originally Posted by scq
I don't like the idea of GPU/CPU integration. While it may be easier for mainstream consumers to be able to play new games with little fuss, the GPU would no longer be easily upgradable, unless you upgrade the processor as well.

Agreed. But the GPU-on-chip solution isn't really geared for the high-end "niche" market that Wil talks about. It's more for the low-end mainstream up to the middle of the market. These are the kind of people who don't upgrade often to begin with.

Look at the transistor count of the current GPU kings. They dwarf CPUs in complexity. You can't reasonably expect to "sneak" that thing onto the CPU. It's arguably *more* complex. It's gonna be stand-alone for a while.

The end of the article is actually the most interesting part, I think: the potential for this to make high-end gaming smaller, more niche.

I think this is exactly the sort of thing Epic's CEO was talking about recently when he identified Intel as gaming's biggest threat.

If they throw crappy GPUs on all their chips:
- Everybody can play games now, TONS more people, and the gaming market explodes
- These GPUs suck and can't play high-end games. If you're a business/developer, do you make your game for the 1 million high-end gamers, or the 100 million mass market? Bigger markets = more money. Gaming becomes dumbed down for the mainstream and high-end gaming as we know it dies. I hope you like Sudoku... :p

Aggies
fev 24th July 2006, 20:42
I don't like the idea of AMD + ATi
it's always been, at least in my eyes,
green 'n' green (nVidia + AMD)
blue 'n' red (figure it out)
Da Dego 24th July 2006, 20:45
Quote:
Originally Posted by fev
I don't like the idea of AMD + ATi
it's always been, at least in my eyes,
green 'n' green (nVidia + AMD)
blue 'n' red (figure it out)
But now it's Christmas all year long!
RTT 24th July 2006, 21:03
I'm biased, but I thought this article rocked. I consider myself informed!
LoneArchon 24th July 2006, 21:09
Quote:
Originally Posted by scq
I don't like the idea of GPU/CPU integration. While it may be easier for mainstream consumers to be able to play new games with little fuss, the GPU would no longer be easily upgradable, unless you upgrade the processor as well.

It would be like buying a $1000 FX with an integrated 7900GTX, and then having to buy another $1000 FX which is only slightly faster, for an 8900GTX (whatever it may be called by then).
Well, one way they could take the tech is to have a dedicated GPU socket, or to make the GPU use the same socket type as the CPU and give it a dedicated chip. AMD has talked about this, allowing a co-processor to plug into the spare socket on a dual-socket motherboard and communicate over the HyperTransport channel. If they go down this road it could make upgrades less expensive.
Reins 24th July 2006, 21:09
Quote:
Originally Posted by fev
I don't like the idea of AMD + ATi
it's always been, at least in my eyes,
green 'n' green (nVidia + AMD)
blue 'n' red (figure it out)

I agree. I can't find these ads anymore, but I can remember seeing ads that said "...and remember, the 7900GTX runs best with an AMD processor," or something to that effect.

Hopefully things won't shake out for gamers the way the article says they will, because that prospect makes me sad. :'(
Rising prices of GPUs = bad
Gaming becoming more niche = bad

Perhaps as time goes on there won't be any more high-end PC gaming and all of the high-end gaming will be done on consoles. :shrug: I guess we'll always have them. Perhaps by that time I'll be out of my gaming phase.

I can't wait to see what Nvidia does; they have a couple of options in front of them as to how they want to play this.
K.I.T.T. 24th July 2006, 21:39
Like a lot of the people who have already posted, I hope that what is suggested in the article doesn't come true under any circumstances, because I personally think it's a poor idea. Fair enough, it will make workstations and your average family system more compact for the capability, but realistically, despite the minuscule number of high-end gamers, I don't think the companies could kill off high-end and bleeding-edge PC gaming; besides, the graphics systems would still have to be around and be developed, because someone has got to make the systems for consoles. If it does happen, though, it will be a sad time, since everyone knows that to play a good FPS you can't beat a keyboard and mouse, and console gaming with a KB and mouse just isn't the same. Furthermore, even high-def TVs don't have the resolutions that computer TFTs and CRTs have, and they require even more power to be spent on AA to make the game look half decent after being massacred by scaling.


Nvidia don't have that many options... they have only two long-term futures as I see it. They can either continue producing graphics systems with no mobos (or very few) to put them in, because all the other companies have gone integrated, and eventually be forced to close down or go integrated and mid-range like the rest... OR... they can start producing their own CPUs, mobos and graphics solutions that work together. But due to the incompatibility with other hardware, that route would have to be very expensive, and they'd probably be forced into my 'alternative' future in the end if they didn't want to close down.
Bursar 24th July 2006, 22:08
Joy. So now my £200 mid-range CPU becomes a £300 mid-range CPU with a low-end GPU tacked on...

And if you thought processor naming was confusing now, wait until you have four variations of the same CPU that include different GPU cores.
Jhonbus 24th July 2006, 22:17
I'm not sure I like this much. Sure, I want AMD and ATI both to stay in the game for the sake of competition (in fact, I use AMD and ATI at the moment). But if this comes at the cost of sacrificing our ability to choose our processors and graphics solutions independently, it's a very bad thing.
suicidal-kid 24th July 2006, 22:26
Well, as an ATI-Intel person, I'm out of luck.
valium 24th July 2006, 22:46
The High-End crowd is basically having their PC turned into an upgradeable console.

When this takes off, I predict that AMD will decline even further in sales, because the merger's GPU/CPU unification no longer allows us to choose our favourite brand of GPU.

On another note, what does this say about ATi? Are nVidia's GPU sales really hurting them so badly that they need to be assimilated by a bigger CPU vendor just to keep breathing? Has Crossfire failed to succeed in what is now a multi-GPU world? (imo, yes it has)

The question now is, will Intel forgo its onboard GPUs and ally itself with nVidia? And if it does, how much control will the nVidia team have over its design/R&D/etc.?
jjsyht 24th July 2006, 22:53
Quote:
Originally Posted by suicidal-kid
Well, as an ATI-Intel person, I'm out of luck.
Same for me. AMD-nVidia.

Personally I wouldn't mind having the CPU+gfx combination, where the gfx is comparable to a 7600GT. Since a 7300-class part is already available integrated (the 6150 chipset), the 7600GT seems a good guess for CPU integration.

EDIT: I just re-realised... it's AMD+ATI, so it'd be a mid-range ATI gfx, as in 'I don't want an ATI gfx!!!'. A Conroe+7600GT in one package = ;)
CAL3H 24th July 2006, 23:05
Bah, I'm a green+green person and don't like the idea of the merger. It looks like ATi have ended up just annoying Intel for the time being, and nVidia are left in the dark with respect to their AMD link - 'you make the CPUs, we'll make the chipsets and graphics'. Isn't this partly demonstrated in their refusal to provide Intel with SLi ability for the 965/975 boards?

I know it's probably irrelevant here, but look at the Mac community as well: the current MacBook Pro line contains X1600s, so future driver concentration goes out the window, and the new line-up of MacBook Pros that emerges will have either integrated Intel or some nVidia chip in them. I know there have been a few Macs with nVidia inside, but the vast majority seem to have been ATi. If there is really going to be nVidia in the iPod (a rumour which may have come from this very merger originally being a rumour), then maybe Apple will switch to nVidia for serious graphics power. I see this as unlikely, however, as if Intel get their way I'm sure the Macs will all have Intel GPU/CPU combos - a modified Kentsfield with 2x CPU and 2x GPU - is this a possibility?

The idea of a fixed CPU & GPU manufacturer partnership would take all the fun out of PC building - no more options (to a certain extent). I wonder how long this move has been in the waiting. For AMD it looks like a temporary checkmate. How soon will it backfire when Intel launch Conroe at an even lower price, with more cores, and nVidia give in to Intel's wishes? Conroe SLi on Intel boards - sure, nVidia lose some chipset sales, but they could capitalise on the SLi + Intel combination, smile at their integration into iPods (arguably the fastest-selling gadget) and find their way into the Mac market (which sticks with one brand for a long period).

AMD may have decided this one on an alcohol-induced evening...
Flibblebot 24th July 2006, 23:08
Quote:
Originally Posted by aggies11
I think this is exactly the sort of thing Epic's CEO was talking about recently when he identified Intel as gaming's biggest threat.

If they throw crappy GPUs on all their chips:
- Everybody can play games now, TONS more people, and the gaming market explodes
- These GPUs suck and can't play high-end games. If you're a business/developer, do you make your game for the 1 million high-end gamers, or the 100 million mass market? Bigger markets = more money. Gaming becomes dumbed down for the mainstream and high-end gaming as we know it dies. I hope you like Sudoku... :p

Aggies
But this has already happened. The vast majority of gamers (and I'm not talking hard-core here, I'm talking the kind of people who buy a PC to do the accounts on, then buy the odd game here and there) are playing games with integrated graphics. Remember when Sims2 came out? BBs everywhere were inundated with people trying to get it to work on their crappy Intel integrated graphics.
The kind of people who buy graphics cards are almost power users by default.

I don't think we're going to see games developed only for the bottom end of the market - everyone likes a challenge, and that includes developers!
K.I.T.T. 24th July 2006, 23:31
Quote:
Originally Posted by Flibblebot
But this has already happened. The vast majority of gamers (and I'm not talking hard-core here, I'm talking the kind of people who buy a PC to do the accounts on, then buy the odd game here and there) are playing games with integrated graphics. Remember when Sims2 came out? BBs everywhere were inundated with people trying to get it to work on their crappy Intel integrated graphics.
The kind of people who buy graphics cards are almost power users by default.

I don't think we're going to see games developed only for the bottom end of the market - everyone likes a challenge, and that includes developers!

Then again, you must bear in mind that no company, no matter how environmentally friendly and no matter how many baby foxes leap forth from its energy bills, is going to make a game 'as a challenge' in the knowledge that it will lose money... it just doesn't happen.
specofdust 24th July 2006, 23:36
Well, like I said in the other thread, I'm worried. The article was great; it really explained why this was done and how it'll affect the relevant parties, to an extent. But there is plenty of unknowable stuff, and there are plenty of things I'm concerned about.

The Nvidia nForce + AMD combination has produced some of the finest overclocking seen in recent years - will that now stop? ATi, from what I understand, have been a bit... well... crap at making mobo chipsets - will AMD now solely use them? For the few generations of tech that I've been into PCs, people have just had to choose their CPU; the rest worked in any mobo. Things split a bit with SLI and Crossfire, but most of us don't use dual cards, so again that wasn't a huge deal. My major concern now would be that we start to see platforms, as Wil talked about, that lock you into partner technologies or other products from the same company.

So much is up in the air; even with all the talk, this came totally unexpected to me, and I imagine to many others. It'll be very interesting to see how the dust settles, but until it does I'll be worried that we may be seeing the end of an era of unparalleled flexibility and choice.
r4tch3t 24th July 2006, 23:38
I am also a green-green fan. Although Conroe is the king right now, AMD will come back. But now, with the merger, we will have less choice.
I wouldn't have thought it would be AMD-ATI; if anything, I thought AMD-nVidia would be more likely, with the nForce chipsets.
The GPU on a CPU... deja vu. First off, CPUs were good for everything; then graphics became too much for them and the GPU daughter was born; now it's moving back in?
DON'T MOVE BACK IN WITH YOUR PARENTS!!!!!
Well, that's my illogical $0.02
jjsyht 24th July 2006, 23:45
True (re: KITT's post), but currently only the lowest-end gfx are integrated. With OSes (Vista & MacOS) asking for high 3D capabilities, integration will involve 'better' gfx.
If Intel keep pushing their crappy graphics chipsets, it's gonna kill the games market.
If Intel improve their graphics offerings (e.g. CPU+GFX integration), it's gonna explode the gaming market, with proper GPUs capable of running new games at 'acceptable' gfx quality. Thus we can all have our one-time fix of NFS: Most Wanted.

As the article points out, it was the same for the co-processor, etc. First it was too much for the CPU, but now it can be integrated. High-end GPUs will NOT be integrated (well... who knows), but the other GPUs could be, and hopefully will.
Tim S 24th July 2006, 23:52
spec - I'm just waiting for answers to a few questions from NVIDIA at the moment. I might give Intel and a few others a call tomorrow and see what they have got to say about the deal. :)
RotoSequence 24th July 2006, 23:58
Interesting view, Wil; while it's an intriguing vision, I perceive things going slightly differently. x86's weakness right now is its singular, monolithic design. The movement has been to multiple cores in order to enable higher performance via multiple data requests. Sun's Niagara did this, and its tons of stupid cores seem to work pretty well. Now, Intel is hell-bent on pushing parallelism. Why?

From a raw executable standpoint, Intel's push doesn't appear to make sense. I've mentioned previously that no one is going to use more than four cores effectively with today's processing patterns; the rest will be idle. However, the predicted next generation of computer processors is going to represent a technical left turn that others are predicting to be comparable to the introduction of the Pentium processor itself. The Inquirer brought up the fact that you can refresh a product a lot more often when you create a relatively simple, efficient little design that can be cut and pasted in different patterns to deliver higher performance. GPUs have been doing this since their inception, and Intel wants to cash in on it; monolithic CPUs take a tremendous amount of effort and years of Research and Development busywork to conceive. Conroe, for example, probably took roughly five years to develop.

With the jump to 64-bit processing, we are no longer dealing with the imposed 4GB memory addressing limit. We can now fling information right, left and center, and we are starting to see, in the console wars, integrated memory architectures shared by the graphics and the processor. Conceivably this will come to PCs too, and the RAM for the processor and the graphics will be the same. There won't be a memory wall between the two, either.

Within a few years, graphics themselves will be sucked into the metaphorical black hole that is the CPU. This is where I start to disagree with you, Wil. CPUs aren't going to pull in a monolithic graphics chip just for the sake of unity; graphics itself will become x86-based altogether. I don't foresee a processor that contains a GPU; I see a processor that is so parallelised that the GPU concept can't hold a candle to the raw x86-based processing power the new "general processing unit" would have to offer, thanks to Intel's parallelism efforts. The video accelerator won't be necessary because the processor has the grunt to do all the work on its own.
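To picture what "graphics as plain x86" could mean, here's a toy C sketch (all names, sizes and the thread count are invented for illustration; this is nothing like any real Intel design) of a software "rasteriser" that splits scanline shading across ordinary threads - exactly the sort of embarrassingly parallel work a very wide x86 chip could soak up:

/* Toy sketch: "shade" a frame by splitting scanlines across plain x86
 * threads. Build with: gcc -pthread sketch.c */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define WIDTH  640
#define HEIGHT 480
#define NUM_THREADS 4            /* imagine dozens of simple cores instead */

static uint32_t framebuffer[WIDTH * HEIGHT];

typedef struct { int first_row, last_row; } slice_t;

/* Each thread shades its own band of rows; no writes are shared, so the
 * work is embarrassingly parallel, just like per-pixel shading on a GPU. */
static void *shade_slice(void *arg)
{
    slice_t *s = (slice_t *)arg;
    for (int y = s->first_row; y < s->last_row; y++)
        for (int x = 0; x < WIDTH; x++)
            framebuffer[y * WIDTH + x] = (uint32_t)(x ^ y); /* dummy shading */
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_THREADS];
    slice_t slices[NUM_THREADS];
    int rows = HEIGHT / NUM_THREADS;

    for (int i = 0; i < NUM_THREADS; i++) {
        slices[i].first_row = i * rows;
        slices[i].last_row = (i == NUM_THREADS - 1) ? HEIGHT : (i + 1) * rows;
        pthread_create(&threads[i], NULL, shade_slice, &slices[i]);
    }
    for (int i = 0; i < NUM_THREADS; i++)
        pthread_join(threads[i], NULL);

    printf("shaded a %dx%d frame on %d threads\n", WIDTH, HEIGHT, NUM_THREADS);
    return 0;
}

The point of the sketch is only that pixel work divides cleanly with no communication between threads, so throwing more (even "stupid") cores at it scales almost for free.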

That's why AMD would need ATI to survive. AMD has great engineers, but they can't create parallel technology in such an extreme manner. ATI can; they've been doing it with graphics for years. Intel has been developing it for years the hard way; AMD is just going to buy its way in and get the job done more easily. It's much like Malcolm's line from Jurassic Park: "you stood on the shoulders of geniuses, and you took the next step". Why earn the knowledge the hard way? In business, doing things the hard way destroys you.

It's a big gamble, but AMD/ATI could conceivably become hugely powerful over the next five to ten years, with Intel following suit. Where does this leave Nvidia? I honestly don't know; that is still up in the air. In the meantime, though, things are going to continue much as they always have - that is, until Core 2's successor, Nehalem, or the generation after that sees the light of day.

Sorry if this is a bit rambling :o
Silver51 24th July 2006, 23:59
I may be alone in this, but I have a feeling that integrating a GPU onto the CPU instead of the motherboard won't change things that dramatically in the near future. Add-in graphics solutions with features such as dedicated memory should be in demand for serious gaming for a while.

Actually, I’d be interested to know what Microsoft thinks about this.
jjsyht 25th July 2006, 00:10
I think MS would like to have a proper GPU in any PC sold - no limit on the abilities of Vista.
The GPU/CPU integration may be for the far future, but in the near future, isn't AMD just trying to sell a 'platform' like Intel?

Imagine a Conroe/7600 integrated, on a nano-ITX board: HD video, HD audio, Gigabit Ethernet - an above-mid-range gaming PC smaller than a laptop.
Reins 25th July 2006, 00:20
Quote:
Originally Posted by r4tch3t
I am also a green-green fan. Although Conroe is the king right now, AMD will come back. But now, with the merger, we will have less choice.
I wouldn't have thought it would be AMD-ATI; if anything, I thought AMD-nVidia would be more likely, with the nForce chipsets.
The GPU on a CPU... deja vu. First off, CPUs were good for everything; then graphics became too much for them and the GPU daughter was born; now it's moving back in?
DON'T MOVE BACK IN WITH YOUR PARENTS!!!!!
Well, that's my illogical $0.02

Lol, well put.
DXR_13KE 25th July 2006, 00:57
Quote:
Intel's GPU on CPU is also modular, meaning you can effectively create 'SLI cores' - for example, a 6-core processor could have four CPU cores and two graphics cores working in 'SLI'. This technology could also be spun off into a discrete card if Intel decided it wanted to move into that space to compete with ATI-AMD.

By this, do you mean that I could buy a... say... 6-core Intel CPU and use all of the cores for processing power, plus an extra graphics card for higher-end gaming; or use 5 cores for processing and 1 for low-end graphics; or 4 cores for processing, 1 for graphics and 1 for physics; or 5 cores for processing, 1 for physics and an extra graphics card for high-end graphics... etc.... correct?

Then this means that this rocks: you could have 3 cores for CPU and 3 cores for GPU and have enough power to play just about any game. Kewl.

Or even... 3 CPU cores, 1 PPU core, 2 GPU cores + an uber graphics card. Could it work?

This looks promising; it may be my future processor.
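For what it's worth, you can already play at this kind of role-split on today's multi-core chips by pinning ordinary threads to particular cores. Below is a rough, Linux-only C sketch (the 3 CPU / 2 gfx / 1 physics split and the role names are just borrowed from the post above as an assumption; real "graphics cores" obviously wouldn't be exposed like this):

/* Hypothetical sketch: dedicate specific cores to specific jobs using the
 * GNU/Linux pthread affinity extensions. Build with: gcc -pthread pin.c */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Worker just reports which core it landed on. */
static void *worker(void *arg)
{
    printf("%s thread running on core %d\n", (const char *)arg, sched_getcpu());
    return NULL;
}

/* Create a thread pinned to a single core via its creation attributes. */
static pthread_t spawn_pinned(const char *role, int core)
{
    pthread_t t;
    pthread_attr_t attr;
    cpu_set_t set;

    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_attr_init(&attr);
    pthread_attr_setaffinity_np(&attr, sizeof(set), &set);
    pthread_create(&t, &attr, worker, (void *)role);
    pthread_attr_destroy(&attr);
    return t;
}

int main(void)
{
    /* The 3/2/1 split from the post above, with ordinary threads
     * standing in for general, graphics and physics cores. */
    const char *roles[6] = { "cpu", "cpu", "cpu", "gfx", "gfx", "physics" };
    pthread_t threads[6];

    for (int i = 0; i < 6; i++)
        threads[i] = spawn_pinned(roles[i], i);
    for (int i = 0; i < 6; i++)
        pthread_join(threads[i], NULL);
    return 0;
}

Whether a shipping CPU would let the OS repartition cores this freely is exactly the open question in the article; the sketch only shows that the scheduling side is nothing exotic.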