bit-tech.net

Farewell to DirectX?

Despite what delusional forum chimps might tell you, we all know that the graphics hardware inside today's consoles looks like a meek albino gerbil compared with the healthy tiger you can get in a PC. Compare the GeForce GTX 580's count of 512 stream processors with the weedy 48 units found in the Xbox 360's Xenos GPU, not to mention the ageing GeForce 7-series architecture found inside the PS3.

It seems pretty amazing, then, that while PC games often look better than their console equivalents, they still don't beat console graphics into the ground. Part of this is undoubtedly down to the fact that many games are primarily developed for consoles and then ported over to the PC. However, according to AMD, this could change if PC games developers were able to program PC hardware directly at a low level, rather than having to go through an API such as DirectX.

The Xbox 360's Xenos GPU has less than a tenth of the processing power of a top-end PC GPU, so why don't PC games look ten times better?

'It's funny,' says Richard Huddy, AMD's worldwide developer relations manager for its GPU division. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad – mostly good – DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'

'I certainly hear this in my conversations with games developers,' he says, 'and I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all.'

Of course, there are many definite pros to using a standard 3D API. It's likely that your game will run on a wide range of hardware, and you'll get easy access to the latest shader technologies without having to muck around with scary low-level code. However, the performance overhead of DirectX, particularly on the PC architecture, is apparently becoming a frustrating concern for games developers speaking to AMD.

AMD's head of GPU developer relations, Richard Huddy, says games developers are asking AMD to 'make the API go away'

'Wrapping it up in a software layer gives you safety and security,' says Huddy, 'but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.'

Hold on, you might be thinking, weren't shaders supposed to enable developers to be more innovative with their graphics anyway? Indeed they were, and the ability to run programs directly on the graphics hardware certainly enables some flexibility, particularly once we got past the restrictive early shader models of DirectX 8. However, with the exception of a few quirky-looking indie titles, there's no denying that many PC games look very much like one another.

'The funny thing about introducing shaders into games in 2002,' says Huddy, 'was that we expected that to create more visual variety in games, but actually people typically used shaders in the most obvious way. That means that they've used shaders to converge visually, and lots of games have the same kind of look and feel to them these days on the PC. If we drop the API, then people really can render everything they can imagine, not what they can see – and we'll probably see more visual innovation in that kind of situation.'