bit-tech.net

AMD will be first to DirectX 11

According to sources in Taiwan, AMD will beat Nvidia to market with the world’s first DirectX 11-capable graphics cards.

COMPUTEX 2009: Taiwanese sources close to both AMD and Nvidia have confirmed that AMD will be first to market with DirectX 11-capable graphics cards, which are currently expected to arrive in October of this year – right in line with Windows 7’s expected release.

The sources said that AMD is “essentially ready” to release the new family of GPUs – and has been for some time – but the company is waiting for the problems with TSMC’s 40nm process to be ironed out.

The new family of GPUs – with the flagship rumoured to be called RV870 (although not confirmed by our sources) – will follow the same strategy that AMD employed with the Radeon HD 3000 and 4000 series. This means we can expect AMD to double up and use a pair of its fastest GPUs to create a dual-GPU flagship product of a similar ilk to the Radeon HD 4870 X2, which dominated the high-end for many months.

Nvidia, on the other hand, is expected to release another big GPU, but it is unlikely to be a conservative effort like GT200 – we’re told the focus will very much be on maximising performance and efficiency when switching between graphics and general computing tasks (i.e. using the Compute Shader). It’s unclear whether it’ll be enough of a brute to match the performance of two smaller Radeon GPUs on a single board though.
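For those wondering what “using the Compute Shader” means in practice: Direct3D 11 lets the same device context that renders a frame dispatch general-purpose work on the GPU. As a rough, hypothetical C++ sketch (the compiled shader bytecode in csBlob is assumed to come from elsewhere, and error handling is omitted):

#include <d3d11.h>

// Minimal sketch: create and dispatch a DirectX 11 Compute Shader.
// Assumes 'csBlob' holds pre-compiled compute shader bytecode.
void RunComputeShader(ID3D11Device* device,
                      ID3D11DeviceContext* context,
                      ID3DBlob* csBlob)
{
    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(csBlob->GetBufferPointer(),
                                csBlob->GetBufferSize(), nullptr, &cs);

    // The same context that just issued draw calls can switch to compute work;
    // how cheaply the hardware makes that switch is the efficiency question.
    context->CSSetShader(cs, nullptr, 0);
    context->Dispatch(64, 1, 1);  // launch 64 thread groups along X

    context->CSSetShader(nullptr, nullptr, 0);  // unbind when done
    cs->Release();
}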

That said, our sources indicated that GT300 has taped out, although Nvidia is being quite cagey about a release timeframe. The chip is manufactured on TSMC’s 40nm node – the same process AMD has been having a lot of trouble with, as RV740 chips are in “very short supply.” If the problems with the process aren’t ironed out, it could affect both companies, which wouldn’t be good for us consumers.

18 Comments

l1nk45 29th May 2009, 11:58 Quote
Woot, this is gonna be fun to see them battle again. Looking forward to getting the best graphics card of the generation =]
EvilRusk 29th May 2009, 12:14 Quote
But surely DirectX 11.1 will be released one week later and the Radeons will lose out as they don't support it! :P
mjm25 29th May 2009, 12:20 Quote
It was nVidia that had all the issues with 10.1, deferred rendering and anti-aliasing...
Goty 29th May 2009, 13:05 Quote
See, the difference in that situation would be that NVIDIA would start paying developers to include the features instead of paying them to NOT include them, just so they look better in comparison again.
sear 29th May 2009, 13:09 Quote
Quote:
Originally Posted by mjm25
It was nVidia that had all the issues with 10.1, deferred rendering and anti aliasing...
Really? Then how come there are games that don’t support anti-aliasing at all on the AMD side, e.g. Dawn of War II? I don’t want to throw mud, but a lot of it depends on what developers choose to support – and most haven’t even bothered with DirectX 10.1. The only game I can think of is Assassin’s Creed, and in that case they actually removed the support in a later patch because of the glitches it introduced.

We have made major strides in getting equal image quality and performance between different cards due to the adoption of APIs like DirectX and OpenGL, and in fact in most games you will be hard-pressed to tell any differences between cards these days outside of raw framerate numbers, the differences between which are often so small as to be unnoticeable. It really comes down to support, drivers and additional software. The advantages of going one way or the other are pretty slim these days outside of what driver control panel you want and whether or not you care about PhysX on your GPU (according to a recent Anandtech poll, most don't).
Goty 29th May 2009, 15:13 Quote
Quote:
...in that case they actually removed the support in a later patch because of the glitches it introduced.

Glitches it introduced on NVIDIA hardware, you mean.
knutjb 29th May 2009, 15:21 Quote
It appears to have been a good decision by AMD to experiment with the 40nm process so they can work the bugs out while readying the next-gen chip. The crown has bounced back and forth between AMD/ATI and Nvidia over the years, and just because Nvidia has the very high end today doesn’t mean it will in October. They might still have the top card in the new bunch, but AMD has been moving into the mid-to-low range where most people buy. Having the high-end flagship certainly seems to have helped Nvidia, so will AMD follow suit?

I buy from whoever has the best bang for my buck at the time, not the fanboy thing. Though I do want to see AMD put out very competitive cards, because that keeps prices competitive and that’s good for all of us.

Can someone show me why Folding@home is such a big deal in a graphics card? Does it improve my game playing in any way? So far I can’t see its importance to me or the reason for focusing on it; it looks like a gee-whiz marketing gimmick.
smithyandco 29th May 2009, 15:28 Quote
I'm still on DX 9.0c!
Stopped my plans for an upgrade for now... then again, as soon as I’ve bought a DX11 card MS will bring out DX12! :(
Evildead666 29th May 2009, 16:06 Quote
A lot of the titles that accept AA on Nv but not on ATI are TWIMTBP games, e.g. Dawn of War II...
It’s in NV’s interest to say “look, AA doesn’t work on ATI”, mainly because Nv hardware uses deferred rendering, so it is impossible to compare direct screenshots between the manufacturers...

The deferred rendering came in about when Nv was getting the sh*t kicked out of it for image quality issues....
thehippoz 29th May 2009, 19:06 Quote
And doesn’t AMD hold the new crown for single GPU?
HourBeforeDawn 29th May 2009, 19:47 Quote
ATI is always first to adopt a new standard, just like AMD is typically first to try something new on the CPU side; nVidia and Intel just sit back and see how it does before copying it.

Either way, awesome.
Star*Dagger 29th May 2009, 20:37 Quote
Quote:
Originally Posted by smithyandco
I'm still on DX 9.0c!
Stopped my plans for an upgrade for now... then again as soon as I've bought a DX 11 Card MS will bring out DX 12! :(

Yeah, it might just be best to wait until HX7 is out; that’s the DirectX for holodecks!

Never wait in computing; you just shorten the amount of time your system is on top.

I went from the 8800GTX to the Radeon HD 4870 X2, and I will get one of these dual monsters in October!

S*D
Skiddywinks 29th May 2009, 20:41 Quote
It’s usually best to just set a rough upgrade schedule. I base it around always going for the final version of a technology when I can (usually a smaller manufacturing process, lower voltages and temperatures, and higher clocks).
outlawaol 30th May 2009, 00:05 Quote
This is why I haven't upgraded from my 8800GTX yet. DX11 cheesecake! Crappy high prices! Me have no money!

:)
cyrilthefish 30th May 2009, 01:38 Quote
Should be interesting when it comes out.

http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture

(I know it’s the Inquirer, so it’s potentially iffy info.)
Essence of the story:

- Nvidia whined at Microsoft until they got some pretty big features removed from the DX10 spec so their cards would support it.
- ATI cards are designed to support the original, higher spec.

Which leads to now:
- ATI cards are roughly DX11-compliant in hardware (DX11 is mostly what DX10 was until Nvidia threw a tantrum).
- Nvidia cards will need to dedicate a lot of general-purpose shaders to DX11 features they don’t have the hardware support for (see the sketch below), or completely redesign the chip in an incredibly short timeframe.

All in all, I’m quite worried about a monopoly forming in the graphics area if Nvidia gets that far behind – not to mention that if Intel’s Larrabee turns out decent, it could give Intel a monopoly of such magnitude it’ll be terrifying to watch o.O
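(For reference, the headline DX11 hardware feature under discussion is tessellation, which adds two new programmable pipeline stages around a fixed-function tessellator unit. A hypothetical C++ sketch of the D3D11 calls involved – the hsBlob/dsBlob bytecode blobs are assumed to be compiled elsewhere:)

#include <d3d11.h>

// Minimal sketch: bind the two DX11 tessellation shader stages.
// A fixed-function tessellator sits between them in DX11-class hardware;
// a GPU without that unit would have to emulate the work on its shaders.
void BindTessellationStages(ID3D11Device* device,
                            ID3D11DeviceContext* context,
                            ID3DBlob* hsBlob, ID3DBlob* dsBlob)
{
    ID3D11HullShader* hs = nullptr;
    ID3D11DomainShader* ds = nullptr;

    // Hull shader: runs per patch and outputs tessellation factors.
    device->CreateHullShader(hsBlob->GetBufferPointer(),
                             hsBlob->GetBufferSize(), nullptr, &hs);
    // Domain shader: evaluates the surface at each tessellated point.
    device->CreateDomainShader(dsBlob->GetBufferPointer(),
                               dsBlob->GetBufferSize(), nullptr, &ds);

    context->HSSetShader(hs, nullptr, 0);
    context->DSSetShader(ds, nullptr, 0);
}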
Tim S 30th May 2009, 03:50 Quote
It's written by Charlie, who lives in his own little world where ATI can do no wrong and Nvidia gets everything wrong. My favourite article of his was when he claimed R600 would be better than G80 and would 'kill it' (or words to that effect) in every benchmark. We all know how that turned out.

He takes a lot for granted in that article, and I'd take it with a huge bucket of salt; he basically says that there's only a doomsday scenario and that Nvidia is dead in the water - I'm yet to see any evidence of that. That said, I am a little worried about their DX11 part and how it's going to turn out because it's quite a risky chip.
Elton 30th May 2009, 05:47 Quote
The only worry I have for Nvidia is the lack of a multi-purpose shader for the tessellator...

At any rate, though, Larrabee, even if it’s successful, won’t really catch on until it’s affordable. Right now it looks really expensive (production-wise), and if it were to continue, I’m guessing it would end up like the Caustic GPU.
docodine 31st May 2009, 07:24 Quote
Quote:
Originally Posted by Tim S
It's written by Charlie, who lives in his own little world where ATI can do no wrong and Nvidia gets everything wrong. My favourite article of his was when he claimed R600 would be better than G80 and would 'kill it' (or words to that effect) in every benchmark. We all know how that turned out.

He takes a lot for granted in that article, and I'd take it with a huge bucket of salt; he basically says that there's only a doomsday scenario and that Nvidia is dead in the water - I'm yet to see any evidence of that. That said, I am a little worried about their DX11 part and how it's going to turn out because it's quite a risky chip.

Haha, Google auto-suggests "Charlie Demerjian hates nVidia" when you search for his name.

I’m just hopeful that I can safely skip a couple of generations of graphics hardware. The GTX 4xx and HD 68xx will be the cards for me.