bit-tech.net

Interview: ATI and the Xbox 360

Comments 1 to 10 of 10

rupbert 10th June 2005, 11:46 Quote
Interesting.
M_D_K 10th June 2005, 17:18 Quote
Project Gotham 3 looks fricking sexy :), I'll definitely be importing an Xbox ;)

morgan.
Da Dego 10th June 2005, 20:10 Quote
Yeah, I have to say Gotham 3 looks almost lifelike. :)

I'm not sure about this whole unified architecture thing, though... I mean, aren't we about due to replace this whole system anyway (I keep hoping someone will listen and start developing raytracing GPUs ;))? Honestly, though: is it that any one pipe can do either task, and there are more pipes overall? Or is it that there aren't as many pipes once you add in all the feature-specific ones on your average graphics card?
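A toy throughput model can illustrate the trade-off being asked about here. This is an illustrative sketch only, with invented op and pipe counts, and it models no real GPU: the point is just that dedicated pipes sit idle when the workload mix is lopsided, while a unified pool load-balances.

```python
# Illustrative sketch (invented numbers, no real GPU modelled): compare a
# fixed vertex/pixel pipe split against a unified pool where any pipe can
# take either kind of work, at one op per pipe per cycle.

def cycles_fixed(vertex_ops, pixel_ops, vertex_pipes, pixel_pipes):
    """Cycles with dedicated pipes: the more loaded side dominates."""
    v = -(-vertex_ops // vertex_pipes)   # ceiling division
    p = -(-pixel_ops // pixel_pipes)
    return max(v, p)

def cycles_unified(vertex_ops, pixel_ops, total_pipes):
    """Cycles when any pipe can do either task: the work pools together."""
    return -(-(vertex_ops + pixel_ops) // total_pipes)

# A pixel-heavy frame: 100 vertex ops vs 700 pixel ops, 16 pipes total.
print(cycles_fixed(100, 700, 8, 8))    # 88 -- vertex pipes mostly idle
print(cycles_unified(100, 700, 16))    # 50 -- the pool load-balances
```

With a vertex-heavy pass the numbers flip the same way; in this simplified model the unified pool never does worse than the fixed split.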
rog27 11th June 2005, 21:45 Quote
This is an ATI marketing guy... he is full of sh*t. The truth of the matter is that Nvidia has six more months of development time to finalise specs and make modifications. The other part of the truth is that the hardware development community has been aware of unified shaders for some time now; the idea is not new. The part where he says that "PS3 will most certainly be slower and less powerful" is 100% laughable, as it will be a far newer, more capable piece of hardware (logic implies newer technology always is).

It is also worth noting that RSX is NOT the same as the PC-based Nvidia architecture. The shaders, while not unified, are different from, and much, much more powerful and efficient than, any shaders that have ever been used before. Hardware shaders (unlike the software-driven unified shaders found in ATI's pipelines), while less flexible, are many times more powerful than their software counterparts. And the reason ATI feels the future will be unified is because Microsoft is a monopoly and will write the DirectX software to support its new partner in crime, not because it's the best way. The memory architecture on the RSX is also unified, similar to Microsoft and ATI's new offering. Truth be told, ATI's is a much weaker, slower (maybe slightly more flexible) GPU.

This is how the article should have read:

"The Xbox 360 will certainly be slower and less powerful than the PS3." Because, let's face it, everyone, logic tells us it will be this way.
Piratetaco 11th June 2005, 21:54 Quote
Quote:
Originally Posted by rog27
This is an ATI marketing guy...he is full of sh*t.


Bingo, give that man a prize. Unfortunately, everyone is going to have to wait until the PS3 and Xbox 360 release dates to see which is the most powerful. It's like every other console launch:

"Ours is the best. The other guy's stuff sucks."

Replace Sony with Microsoft and PlayStation 3 with Xbox 360.
Tim S 12th June 2005, 14:29 Quote
Quote:
Originally Posted by rog27
It is also worth noting that RSX is NOT the same as the PC-based Nvidia architecture. The shaders, while not unified, are different from, and much, much more powerful and efficient than, any shaders that have ever been used before. Hardware shaders (unlike the software-driven unified shaders found in ATI's pipelines), while less flexible, are many times more powerful than their software counterparts. And the reason ATI feels the future will be unified is because Microsoft is a monopoly and will write the DirectX software to support its new partner in crime, not because it's the best way. The memory architecture on the RSX is also unified, similar to Microsoft and ATI's new offering. Truth be told, ATI's is a much weaker, slower (maybe slightly more flexible) GPU.

This is how the article should have read:

"The Xbox 360 will certainly be slower and less powerful than the PS3." Because, let's face it, everyone, logic tells us it will be this way.
Hi and welcome to the forums...

FWIW, G70 and RSX are from the same parent architecture.
blackerthanblack 12th June 2005, 23:07 Quote
Quote:
Originally Posted by rog27
"PS3 will most certainly be slower and less powerful"
He said "almost certainly".
rog27 13th June 2005, 20:22 Quote
Quote:
Originally Posted by Tim S
Hi and welcome to the forums...

FWIW, G70 and RSX are from the same parent architecture.

Thank you for the welcome...

They are based on the same parent technologies (core feature set), but the RSX differs drastically in the area of I/O (the ability to address both pools of RAM simultaneously, with Cell being able to do the same thing) and has some added functionality that allows it to work more synergistically with the Cell processor. These have been alluded to by Ken himself (too lazy to find the article now... but it's on GameSpot or IGN). The additions have not been clearly defined, I'm guessing because they afford some kind of competitive advantage.
Da Dego 13th June 2005, 20:59 Quote
Quote:
Originally Posted by rog27
These have been alluded to by Ken himself (too lazy to find the article now... but it's on GameSpot or IGN). The additions have not been clearly defined, I'm guessing because they afford some kind of competitive advantage.

Or disadvantage. By your own logic, the same people you're bashing over ATI you're now using as a reference for Nvidia. ;)

Just thought I'd throw that out there... I'm not stating an actual opinion on one being better than the other (I agree with taco), so no flames please. ;)
TheAnimus 13th June 2005, 21:24 Quote
Quote:
Originally Posted by Da Dego
Or disadvantage. By your own logic, the same people you're bashing over ATI you're now using as a reference for Nvidia. ;)

Just thought I'd throw that out there... I'm not stating an actual opinion on one being better than the other (I agree with taco), so no flames please. ;)

I find it funny how the graphics people are saying "our cards are more exciting than your multicore CPUs", or words to that effect.

Polyhedral-style compilation techniques show how easily some rather difficult problems can be broken down for multiple processors.

Now if Sony buys enough people, and works hard enough on making nice parallel compilers, their AI will kick arse and they'll be able to produce some amazingly clever games. No matter what the graphics people say, something being spat out on a non-HD screen is going to be ugly; but having said that, 720p HDTV isn't *that* bad.
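The "break the problem down for multiple processors" idea can be sketched as a data-parallel map over independent per-agent AI work. Everything here is hypothetical for illustration (`score_agent` is a stand-in workload, and a real parallelising compiler would derive the split from dependence analysis rather than by hand); threads are used for portability, though CPU-bound gains on real hardware come from one task per physical core.

```python
# Hedged sketch: data-parallel decomposition of independent per-item work.
from concurrent.futures import ThreadPoolExecutor

def score_agent(state):
    # Stand-in for independent per-agent AI work: a pure function of input.
    return state * state % 97

def parallel_scores(states, workers=4):
    # Workers pull items from the shared iterable; map preserves order.
    with ThreadPoolExecutor(workers) as pool:
        return list(pool.map(score_agent, states))

print(parallel_scores(range(8)))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because each `score_agent` call touches only its own input, no locking is needed and the decomposition is trivially correct; it's exactly the dependence-free case that parallelising compilers find easy.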