Interview: AMD on Game Development and DX11

January 6, 2010 | 12:08

Tags: #11 #5850 #5870 #batman #console #consolification #cypress #developer #directx #dirt2 #dx #gpu #huddy #interview #manager #physics #relations #richard #saboteur

Companies: #amd #games

bit-tech: Do you see DirectX 11 taking off like DirectX 9 did?

RH: More rapidly than DirectX 9 did. When I went to GDC in March 2009 I was impressed by the number of software developers who came to me and said, "DirectX 11 looks like it solves a few problems, so what do we need to do?" I've never had that kind of conversation before; it's always been, "Look, here's DirectX-number for you, let me explain."

DX11 solves at least two big problems. Number one is that it lets you get rid of previous versions of DirectX, because DirectX 11 runs on DirectX 11, 10 and 9 hardware. When you're thinking about shipping, you no longer have to have a version for each separate operating system: 9 for XP, 10 for Vista, 11 for Windows 7. The number of builds is reduced to just one, unlike the period when Vista and DirectX 10 were competing with XP and DirectX 9.

There's no additional version of DirectX that has to be dealt with now - dump DirectX 10, go straight to DirectX 11, do the feature level support and you're away. So instead of not knowing how popular Vista would be at launch - which was the case with DirectX 10 - you've now got a Vista install base and a Windows 7 one. If you were a company like Crytek, which did decide to do a DirectX 10 version of Crysis, that meant waiting five months after DirectX 10 had been released before launching the game. With DirectX 11 there's no reason to wait, because we know that on day one we've got some crazy big number of PCs already supporting DirectX 11, thanks to an update for Vista Service Pack 2 and the popularity of Windows 7 pre-orders and the release candidate.
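To make the single-build point concrete, here's a minimal sketch (in C++, with illustrative function and variable names of our own) of how a Direct3D 11 application asks for a device against a list of feature levels. The runtime hands back the highest level the installed GPU actually supports, so one binary covers DirectX 9-, 10- and 11-class hardware:

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create a device from one binary, letting the runtime pick the highest
// feature level the installed GPU supports.
HRESULT CreateDeviceAnyHardware(ID3D11Device** device,
                                ID3D11DeviceContext** context,
                                D3D_FEATURE_LEVEL* achieved)
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // DirectX 11-class GPUs (e.g. Radeon HD 5870)
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,  // DirectX 10-class GPUs
        D3D_FEATURE_LEVEL_9_3,   // DirectX 9-class GPUs
        D3D_FEATURE_LEVEL_9_1,
    };

    return D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,  // real hardware, not a software rasteriser
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        device, achieved, context);
}
```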

Microsoft, operating systems and CPUs
The DirectX 11-compatible Radeon HD 5870 has been helped by the success of Windows 7

bit-tech: Do you think that Microsoft's success with Windows 7 is also pivotal to the DirectX 11 uptake as well?

RH: Microsoft realised it had created too many difficulties with Vista and the transition from XP to Vista, and it needed to get it right with Windows 7. Windows 7 is a great operating system - I went six months before I met my first bluescreen, which is unprecedented. I've bluescreened Linux more quickly than that, and I use Ubuntu a bit at home. From a performance perspective Windows 7 is much better in how it uses resources, so I'm not having to dedicate a colossal amount of video memory to images which are duplicated - now they're just 'in the right place'. It's a swift and responsive operating system, and it finally doesn't spend all its time indexing in a way that confuses and annoys me.

With DirectX 11 you can take most of the lessons from DX10 and even DX9, and with a relatively small amount of rejigging you've got most of the knowledge to do a good job. The feature level support is also a stroke of genius, and something Microsoft has always resisted before - making DirectX 11 work on DirectX 9 hardware. Not by faking a software tessellator, because that would be horrendous, but by saying, "Look, here's what you can do on DirectX 9/Shader Model 3 and DirectX 10/Shader Model 4, and then finally the full glory of DirectX 11/Shader Model 5, where we expose everything." That's excellent - previously Microsoft just wanted the new DirectX to run only on the new hardware.
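The tiers Huddy describes then map onto a branch in a single code path rather than separate builds. A hedged continuation of the sketch above (the Shader Model groupings are approximate; the 9_x feature levels expose a DirectX 9-era subset):

```cpp
// Branch on the feature level actually obtained instead of shipping one
// build per API version ('achieved' comes from the creation call above).
void ChooseRenderPath(D3D_FEATURE_LEVEL achieved)
{
    if (achieved >= D3D_FEATURE_LEVEL_11_0) {
        // Shader Model 5: hardware tessellation, compute shaders, everything.
    } else if (achieved >= D3D_FEATURE_LEVEL_10_0) {
        // Shader Model 4 path: no hull/domain shader stages.
    } else {
        // DirectX 9-class path: Shader Model 2/3-era feature set.
    }
}
```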

bit-tech: But Microsoft had to start again anyway with DirectX 10 because of the driver change in Vista and how extra features could be added to DirectX 9?

RH: Well it's arguable that [Microsoft] could have gone in and changed the driver infrastructure on XP so that it could have supported DirectX 10, but they would have had to come up with a new driver model.

bit-tech: But it wasn't worth updating an OS from 2001 that would potentially affect sales of Vista.

RH: Yes, I agree, and they did something else too on the graphics side - they embraced multicore and multithreaded CPUs, so there's now true multithreaded support in DirectX for running on as many cores as you like - certainly we've seen scaling up to 24 cores, which takes multi-processor as well as multicore systems into account. All these threads can undertake graphics-related work at the same time, and that work then gets consolidated into a single thread feeding the GPU. That makes for a much more efficient way of talking, in CPU-land, to the graphics card. Like I say, on the graphics side all I care about is the pixels that are rendered, but as a CPU company we're especially delighted about Microsoft's threading work, because it means there are real-world benefits to buying the two, three and four-core CPUs we're shipping already.
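The threading model Huddy describes corresponds to Direct3D 11's deferred contexts and command lists. A minimal sketch, with error handling elided and our own function names: worker threads record commands on deferred contexts, and a single thread consolidates them on the immediate context that talks to the GPU.

```cpp
#include <d3d11.h>

// Each worker thread records rendering commands into its own deferred
// context; the results are consolidated on the single immediate context.
void RecordScenePortion(ID3D11Device* device, ID3D11CommandList** outList)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... issue draw calls and state changes on 'deferred' here ...

    // FALSE: don't restore the deferred context's state after recording.
    deferred->FinishCommandList(FALSE, outList);
    deferred->Release();
}

void SubmitOnMainThread(ID3D11DeviceContext* immediate,
                        ID3D11CommandList* list)
{
    // Consolidation step: the immediate context replays the recorded work.
    immediate->ExecuteCommandList(list, FALSE);
    list->Release();
}
```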

bit-tech: As a (current) performance leader in graphics that must give you significant leverage in the industry. However, on the CPU front you're... struggling more, compared to the competition. Do you work in the same way with games developers for CPU optimisations as well?

RH: We do, although games developers tend to see that a CPU is a CPU is a CPU, and they're not willing to invest a colossal amount of effort in optimising for a particular one.

bit-tech: What about particular cache architectures, instruction sets or the way it addresses memory?

RH: Well, it's a bit hellish for them because they don't know what our cache architecture is going to be for the next generation, and we've already got plenty for them to deal with-

bit-tech: -But for the last few generations - K8 (Athlon), K10 (Phenom), K10.5 (Phenom II) - they're all very closely related, and that's a huge install base already-

RH: Yes, it is, you're right-

bit-tech: -so why not say "this works, people bought it, you can optimise for this"?

RH: We do, but the biggest benefit we push is just to say, "use the cores." Take account of how many cores you have access to and write the code so it scales properly - map the work across two, three, four cores and so on, and handle that. That's the single best way to get scaling out of CPUs; whether it's an Intel or AMD architecture makes much less of a difference. There are some things you can do - you can take advantage of the improved SSE instructions, which we keep extending, but they're primarily designed so you can do cool stuff with video. Generally the most important thing to be aware of with CPUs is to be efficient with memory use and use the cores so they don't fight.
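As a rough illustration of the "use the cores" advice (modern C++ for brevity; the job-list structure is our own, not anything AMD prescribes): size the worker pool from the machine's actual core count, so the same code scales from two cores upwards without a rebuild.

```cpp
#include <atomic>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Scale worker threads to the cores actually present instead of
// hard-coding a count, so one build uses 2, 3, 4 or 24 cores.
void RunGameJobs(const std::vector<std::function<void()>>& jobs)
{
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 2;  // the count can be unknown; pick a floor

    std::atomic<std::size_t> next{0};
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i) {
        workers.emplace_back([&] {
            // Each worker pulls the next unclaimed job, so there's no
            // fixed partitioning and the cores don't fight over work.
            for (std::size_t j; (j = next++) < jobs.size(); )
                jobs[j]();
        });
    }
    for (auto& w : workers) w.join();
}
```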