bit-tech.net

AMD R9 and R7 series graphics cards announced


The AMD Radeon R9 290X.

AMD has taken the wraps off its latest range of graphics cards, revealing the R9 and R7 series.

The full lineup of entry-level to flagship products is led by the new R9 290X, AMD's fastest ever graphics card, and finishes with the R7 250, a sub-$89 card.

The new flagship card is the world's first 5 TFLOPS graphics card, marking an 11 percent increase over Nvidia's 4.5 TFLOPS GTX Titan card.

The card also features 300GB/sec memory bandwidth, which AMD claims allows for over 100 layers of complex rendering effects in real time. It can also generate 4 billion triangles per second from its geometry engine.

All of this is built from 6.2 billion transistors.
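As a rough sanity check on the 5 TFLOPS figure: GCN's peak single-precision throughput is shader count × 2 FLOPs per cycle (one fused multiply-add) × clock speed. The 2816-shader count and ~888MHz clock used below are assumptions drawn from pre-launch leaks, not from AMD's announcement.

```python
# Hypothetical sanity check of AMD's "5 TFLOPS" claim for the R9 290X.
# Both the shader count (2816, from leaks) and the clock (~888 MHz) are
# assumptions -- neither was confirmed in the announcement itself.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPS: shaders x 2 FLOPs/cycle (FMA) x clock."""
    return shaders * 2 * clock_ghz / 1000.0

r9_290x = peak_tflops(2816, 0.888)   # ~5.0 TFLOPS with these assumptions
gtx_titan = 4.5                      # Nvidia's quoted figure

print(f"R9 290X (assumed specs): {r9_290x:.2f} TFLOPS")
print(f"Advantage over Titan:    {(r9_290x / gtx_titan - 1) * 100:.0f}%")
```

With those assumed specs the arithmetic lands on both numbers in the article: roughly 5 TFLOPS, about 11 percent ahead of the Titan.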

As revealed by our sneak peek yesterday, the new card is a typical-looking dual-slot model with a large blower-style cooler shroud that features an intriguing array of grooves, which may be there to improve airflow.

The card also uses both a six-pin and an eight-pin auxiliary power input and features two Dual-link DVI video outputs along with a full-size DisplayPort and full-size HDMI.

No CrossFire connectors are present on the card, hinting at a possible connector-less solution for multi-GPU configurations.


Few details of the chip at the core of the R9 290X have been revealed but we expect to hear more as the announcement progresses.

The full lineup of new cards is as follows:
  • R7 250 - <$89
  • R7 260X - $139
  • R9 270X - $199
  • R9 280X - $299
  • R9 290X

63 Comments

Kruelnesws 25th September 2013, 21:39 Quote
Go get em AMD!
... and first
Skiddywinks 25th September 2013, 22:32 Quote
My wallet is going to flinch when it hears the prices.
Xploitedtitan 25th September 2013, 22:36 Quote
In a table on GD Nerdy's story, it was speculated to be $599 for the R9 290X. Might force nVidia to drop its prices.

Yummy, GTX780 at 450€.
Fizzl 25th September 2013, 23:22 Quote
So.. length?
Dave Lister 25th September 2013, 23:44 Quote
I'm really hoping mantle takes off and replaces direct x
Panos 25th September 2013, 23:46 Quote
R9 280X looks the most promising at $299, given that we might be able to buy two of these at the price of one R9 290X!!!!!

And if the 290X is 30-34% faster than the GTX 780 and 10% faster than the Titan, the WORST a 280X can be is on par with the 770!!!! At that price, it's a steal!

http://static.techspot.com/images2/news/bigimage/2013-09-25_15-49-54.jpg
jrs77 26th September 2013, 01:26 Quote
R7-250 shouldn't be a dual-slot design, but single-slot. The price tag is pretty nice for a 7750 successor tho, and slightly cheaper.
ssj12 26th September 2013, 03:54 Quote
I'm curious if their tests were 100% accurate.
DbD 26th September 2013, 09:41 Quote
Quote:
Originally Posted by Panos
R9 280X looks the most promising at $299, given that we might be able to buy two of these at the price of one R9 290X!!!!!

And if the 290X is 30-34% faster than the GTX 780 and 10% faster than the Titan, the WORST a 280X can be is on par with the 770!!!! At that price, it's a steal!

http://static.techspot.com/images2/news/bigimage/2013-09-25_15-49-54.jpg

I hope it had a slight edge on a GTX 770. Price wars here we come :)
Harlequin 26th September 2013, 09:42 Quote
Quote:
Originally Posted by ssj12
I'm curious if their tests were 100% accurate.

6800 in Firestrike is bang on the money for a 7970 GHz or so
Dave Lister 26th September 2013, 10:18 Quote
I might have to upgrade to the R9 280X for that price.
1. I should see a decent jump in graphics over my 5870

2. It should take part of the CPU load away using its new sound processor - increasing my CPU's (AMD 1100T) useful lifespan.

Of course I'll be waiting for a bit-tech review and to see some benchmark results first!
GuilleAcoustic 26th September 2013, 10:24 Quote
Jeez, each time I'm pretty certain about what to buy ... there's a new announcement :(. Those new cards look gorgeous, but what I like most is that new API. Planning a development + gaming system and learning a new API could occupy my spare time :D.

Return of Glide? What OpenGL should have been? I hope that Mantle gets used and doesn't end up like PhysX. With that and SteamOS around the corner, I hope gaming on Linux will increase.
loftie 26th September 2013, 10:33 Quote
Any ideas what the difference between the 290 and the 290x are?
GuilleAcoustic 26th September 2013, 10:46 Quote
No idea, but it has 2x 6-pin while both the 280X and 290X have 6+8-pin. Prolly higher freq?
Kovoet 26th September 2013, 10:55 Quote
I can see my two 7970's going for sale soon. Looking forward to seeing the price of the 290x

Sent from my GT-I9505 using Tapatalk 2
runadumb 26th September 2013, 11:00 Quote
I have been with Nvidia for around 4 generations now which is the longest I've ever stuck with 1 brand but it looks like I will be switching over very soon. Get out those benchmarks!
GuilleAcoustic 26th September 2013, 11:24 Quote
The only thing holding me back from changing my PC now is: will SteamOS be more nVidia or AMD oriented?
shrop 26th September 2013, 11:31 Quote
I wonder what the pricing is for the 290X. Has it been hinted at or even announced? The 280X is an awesome price too.
cookie! nom nom 26th September 2013, 11:52 Quote
Quote:
Originally Posted by Kovoet
I can see my two 7970's going for sale soon. Looking forward to seeing the price of the 290x

Sent from my GT-I9505 using Tapatalk 2

That's what I'm waiting for! :D 7970 sales
Spuzzell 26th September 2013, 12:00 Quote
The cards are sexy, but the Mantle API is the real game changer.

If it works even a tenth as well as AMD claim, anyone with an Nvidia card will be significantly disadvantaged.
Panos 26th September 2013, 12:21 Quote
Given that the next-gen consoles are going to force devs to use Mantle if they want to pull really good graphics from the hardware provided, that the new AMD cards are going to support it (future Nvidias could too), and that on top of the performance boost it's going to be OS-agnostic, I see DX going down the drain.

I was ready to upgrade my GTX570 to a 7970 at the end of this month, however now I find it pointless no matter the price. The 7970 would have to be very cheap (even on eBay) for me to grab two for the price of one 280X, and even then I'd think about it, given the other extra benefits.

I foresee the same crash in current-gen GPUs as when the 8800 came out all those years ago.
They were worthless the week after it went on sale.
DbD 26th September 2013, 12:21 Quote
Quote:
Originally Posted by Xploitedtitan
In a table on GD Nerdy's story, it was speculated to be $599 for the R9 290X. Might force nVidia to drop its prices.

Yummy, GTX780 at 450€.

The chart has the R9 290X getting 8k in Firestrike (not extreme). A GTX 780 gets about 8.5k in Firestrike. That puts it slightly behind the 780, and its price is slightly behind the 780. I see no price war there :(
GuilleAcoustic 26th September 2013, 12:22 Quote
Quad-channel APU ... AMD, can you hear me? Kaveri's IGP is based on GCN, so Mantle should work, right? Quad channel would solve the memory bandwidth issue ... add unified memory + low-level programming (through Mantle) and this would be a nice little system.
Harlequin 26th September 2013, 12:39 Quote
Quote:
Originally Posted by DbD
The chart has the R9 290X getting 8k in Firestrike (not extreme). A GTX 780 gets about 8.5k in Firestrike. That puts it slightly behind the 780, and its price is slightly behind the 780. I see no price war there :(

what chart? no firestrike score has been mentioned for the 290x
jrs77 26th September 2013, 13:59 Quote
Quote:
Originally Posted by GuilleAcoustic
Quad-channel APU ... AMD, can you hear me? Kaveri's IGP is based on GCN, so Mantle should work, right? Quad channel would solve the memory bandwidth issue ... add unified memory + low-level programming (through Mantle) and this would be a nice little system.

I hope that Kaveri improves first and foremost on the CPU side, as we don't need more powerful IGPs. If you need more GPU power, then you can simply buy the new R7-250 for under $100.
AMD needs to improve their CPU performance before I consider buying any of their APUs.
GuilleAcoustic 26th September 2013, 14:08 Quote
Quote:
Originally Posted by jrs77
I hope that Kaveri improves first and foremost on the CPU side, as we don't need more powerful IGPs. If you need more GPU power, then you can simply buy the new R7-250 for under $100.
AMD needs to improve their CPU performance before I consider buying any of their APUs.

I'm looking at an i7-4765T (35W 4C/8T) + R7-250. This should run very well from a pico PSU.
suenstar 26th September 2013, 14:37 Quote
Loving the look of the shroud on that example photo.
I'm already tempted to go for one when they release for the simple fact that replacing those red strips with a light and having it on a breathing effect would be gorgeous.
rollo 26th September 2013, 14:55 Quote
Quote:
Originally Posted by Spuzzell
The cards are sexy, but the Mantle API is the real game changer.

If it works even a tenth as well as AMD claim, anyone with an Nvidia card will be significantly disadvantaged.

If it works as advertised, Nvidia will not exist in the PC desktop gaming marketplace, just as the companies 3dfx battled all went bankrupt on the back of their Glide stuff. The performance difference between Glide and its counterparts at the time was huge; even low-end 3dfx cards bested top-end cards from competing manufacturers thanks to Glide.

Nvidia would not be able to offer anything that gets close to Mantle's performance figures if it takes off in a major way.

Mantle being a low-level API is the game changer. Neither OpenGL nor DirectX can compete against it performance-wise, as both are high-level APIs.

For anyone who doesn't realise, a low-level API basically means coding to the hardware, as on consoles, and if you know the exact GPU specification you can get a lot more performance than you could using DirectX.

That's what makes Mantle, like Glide before it, such a game changer if it works. The difference could be 80-90% performance gains for Mantle over DirectX, maybe more in fact.

I think both Microsoft and Nvidia will be thinking to themselves "we are in trouble here if this takes off". Nvidia would probably not be in the PC sector any more, and their move towards mobile would likely happen even faster.

Microsoft's cash is still in the enterprise sector, as it has been for years, so this would not affect them at all.
jrs77 26th September 2013, 16:13 Quote
Quote:
Originally Posted by GuilleAcoustic
I'm looking at an i7-4765T (35W 4C/8T) + R7-250. This should run very well from a pico PSU.

You can't run a dedicated GPU from a picoPSU, even if it's only something like the 7750. The 3.3V line is strained by the PCIe card and your picoPSU will overheat because of it. A PCIe card usually draws 3A over the 3.3V line, so the picoPSU will always be running at 50% from that alone. So your picoPSU won't live long without active cooling.

That's why I decided against a picoPSU and went for an SFX-PSU instead for my upcoming workstation.

Read the manual -> http://resources.mini-box.com/online/PWR-PICOPSU-150-XT/PWR-PICOPSU-150-XT-manual.pdf
Quote:
Precautions for operating this DC-DC converter:
- For fanless operation de-rate the output of the 3.3 and 5V rails by ~35% or ensure PSU surface temperature should not exceed 65°C, whichever comes first.
- Combined and sustained output should not exceed 65% of total power or ensure PSU surface temperature should not exceed 65°C, whichever comes first.
- Input current should not exceed 8A. For higher current loads, we suggest using a 2x2 mini-FIT JR as an input connector.
- Peak load for individual rails should not exceed 60 seconds.
- For long life operation, PSU surface temperature should not exceed 65°C.
GuilleAcoustic 26th September 2013, 16:30 Quote
Damned ! What about an M4 : http://www.mini-box.com/M4-ATX-HV?sc=8&category=981 ? Maybe this is the same and I'll have to glue some heatsink on it :(
DbD 26th September 2013, 16:31 Quote
Quote:
Originally Posted by Harlequin
Quote:
Originally Posted by DbD
The chart has the R9 290X getting 8k in Firestrike (not extreme). A GTX 780 gets about 8.5k in Firestrike. That puts it slightly behind the 780, and its price is slightly behind the 780. I see no price war there :(

what chart? no firestrike score has been mentioned for the 290x

This chart:
http://img94.imageshack.us/img94/7588/a599.png
debs3759 26th September 2013, 16:37 Quote
Quote:
Originally Posted by loftie
Any ideas what the difference between the 290 and the 290x are?

The only difference I know for sure is that the 290X has 2816 SP and the 290 only has 2560. I haven't searched enough yet to know the rest of the 290 specs :)
jrs77 26th September 2013, 16:42 Quote
Quote:
Originally Posted by GuilleAcoustic
Damned ! What about an M4 : http://www.mini-box.com/M4-ATX-HV?sc=8&category=981 ? Maybe this is the same and I'll have to glue some heatsink on it :(

I've settled for this -> http://www.chieftec.eu/de/gehaeuse/itx-tower/bt-02b.html
Harlequin 26th September 2013, 16:48 Quote
Quote:
Originally Posted by DbD
This chart:
http://img94.imageshack.us/img94/7588/a599.png

page doesn't load??
GuilleAcoustic 26th September 2013, 16:51 Quote
Quote:
Originally Posted by jrs77
Quote:
Originally Posted by GuilleAcoustic
Damned ! What about an M4 : http://www.mini-box.com/M4-ATX-HV?sc=8&category=981 ? Maybe this is the same and I'll have to glue some heatsink on it :(

I've settled for this -> http://www.chieftec.eu/de/gehaeuse/itx-tower/bt-02b.html

Nice one, thanks for the link.
DbD 26th September 2013, 17:52 Quote
Quote:
Originally Posted by Harlequin
Quote:
Originally Posted by DbD
This chart:
http://img94.imageshack.us/img94/7588/a599.png

page doesn't load??

That's annoying, it's the slide from the AMD presentation with a graph that has all the cards along the X axis and firestrike scores on the Y axis. Most places reporting on the cards have it somewhere, here's the [H] link to their article:
http://www.hardocp.com/news/2013/09/25/amds_gpu_14_product_showcase_webcast
Panos 26th September 2013, 18:00 Quote
Quote:
Originally Posted by rollo
If it works as advertised, Nvidia will not exist in the PC desktop gaming marketplace, just as the companies 3dfx battled all went bankrupt on the back of their Glide stuff. The performance difference between Glide and its counterparts at the time was huge; even low-end 3dfx cards bested top-end cards from competing manufacturers thanks to Glide.

Nvidia would not be able to offer anything that gets close to Mantle's performance figures if it takes off in a major way.

Mantle being a low-level API is the game changer. Neither OpenGL nor DirectX can compete against it performance-wise, as both are high-level APIs.

For anyone who doesn't realise, a low-level API basically means coding to the hardware, as on consoles, and if you know the exact GPU specification you can get a lot more performance than you could using DirectX.

That's what makes Mantle, like Glide before it, such a game changer if it works. The difference could be 80-90% performance gains for Mantle over DirectX, maybe more in fact.

I think both Microsoft and Nvidia will be thinking to themselves "we are in trouble here if this takes off". Nvidia would probably not be in the PC sector any more, and their move towards mobile would likely happen even faster.

Microsoft's cash is still in the enterprise sector, as it has been for years, so this would not affect them at all.


Glide & Mantle were published at different times and from different positions of the companies that created them.

When Glide came out in the late 90s, the whole computer market was still in its infancy, and it came from a company whose only product was an "add-on" card to the existing PC graphics card, used by very "savvy" people. In that period the games that used it could be counted on one hand, and only UT99 & Dark Omen supported it fully. That was also a period when a CD-ROM drive on a PC was a luxury, Linux was still for geeks like me, and the Mac was for people who made no sense by choosing one.

Fast forward to 2013. AMD has the main next-gen consoles in their pocket for the next 7 years. That is around 160,000,000 units. Plus half the discrete GPU market, a rising APU market share without actual competition at the top end, and they are ready to put on sale their own ARM APU for mobile devices and PCs of course. In addition, Macs are a pretty hefty share of home computers.

So Mantle makes sense, because it can be used in everything above, plus it is open, not closed like Glide was back then, restricted to a single company. Plus now the variety of OSes (Win, Linux, Android, OSX) makes it a one-way street if you need to improve phones, pads, PCs and consoles at the same time. Consider that because the market is so big on phones & pads, they are actually soon going to play proper games, and game studios like that idea. No more Java games. :)

So good riddance to DX & Windows; let's put whatever OS we want on our systems and not be forced to use a single product.
Harlequin 26th September 2013, 18:06 Quote
Quote:
Originally Posted by DbD
That's annoying, it's the slide from the AMD presentation with a graph that has all the cards along the X axis and firestrike scores on the Y axis. Most places reporting on the cards have it somewhere, here's the [H] link to their article:
http://www.hardocp.com/news/2013/09/25/amds_gpu_14_product_showcase_webcast

the bar is full? over 7000 ? with no indication of just how far it goes??
Corky42 26th September 2013, 18:44 Quote
Quote:
Originally Posted by rollo
If it works as advertised Nvidia will not exist in the pc desktop gaming market place, Just as the companies that 3dfx battled with all went bankrupt on the back of there Glide stuff. The performance dif between Glide and its counter parts at the time was huge even low level 3dfx cards bested top end cards by competing manufactures due to Glide.
<Snip>

AFAIK Mantle is open source; if anything, Mantle combined with SteamOS (not using DirectX) spells the death knell for Windows as the main gaming OS.
AlienwareAndy 26th September 2013, 19:35 Quote
Quote:
Originally Posted by Corky42
AFAIK Mantle is open source; if anything, Mantle combined with SteamOS (not using DirectX) spells the death knell for Windows as the main gaming OS.

Yeah I'm pretty sure that TressFX is open source too. I was really, seriously impressed with that. Amazing the difference hair can make.

It just made Tomb Raider so much more believable, rather than the main character having a judge's wig welded to her head.
.//TuNdRa 26th September 2013, 19:40 Quote
Mantle is going to take a while to get rolling. While it's going to be given a good shove by Battlefield 4, it's not going to be entirely viable till the whole market is on Mantle-compatible hardware. If it were possible to back-patch Mantle to older hardware and keep working away on it without issue, then it may take off.

I, for one, welcome our new low-level programming.
ssj12 27th September 2013, 01:28 Quote
Nice cards, still going to buy Nvidia cards made by EVGA.
siliconfanatic 27th September 2013, 01:40 Quote
Bleh. I've finally joined the red side. Only reason I see to buy nV now is for dedicated physX.
Panos 27th September 2013, 06:59 Quote
Quote:
Originally Posted by siliconfanatic
Bleh. I've finally joined the red side. Only reason I see to buy nV now is for dedicated physX.

Indeed, which is working on how many titles published this year? 8!!!!
Corky42 27th September 2013, 08:45 Quote
Yea, AMD are looking a lot stronger than they did a few months ago: being in all the next-gen consoles, the Mantle API, the R9 expected to beat the Titans. Then there is SteamOS and OEMs building Steamboxes. So much has been happening this week it feels like the whole gaming environment has been thrown up in the air; I wonder where everything will land.
DbD 27th September 2013, 11:17 Quote
Quote:
Originally Posted by Harlequin
Quote:
Originally Posted by DbD
That's annoying, it's the slide from the AMD presentation with a graph that has all the cards along the X axis and firestrike scores on the Y axis. Most places reporting on the cards have it somewhere, here's the [H] link to their article:
http://www.hardocp.com/news/2013/09/25/amds_gpu_14_product_showcase_webcast

the bar is full? over 7000 ? with no indication of just how far it goes??

Not much. Here's a post from someone who does know:
http://forums.overclockers.co.uk/showpost.php?p=25009819&postcount=1

So basically it's just a bunch of rebadges:
7950 -> R9 270X for about the same price
7970 GHZ -> R9 280X for about the same price

And one new card:
R9 290X is similar to GTX 780 for about the same price.
Harlequin 27th September 2013, 12:09 Quote
Gibbo doesn't know the spec or price of the R9 290X, as it hasn't been released to retail or even put under NDA to retail, so that's just poopoo on his part
memeroot 27th September 2013, 20:21 Quote
Did you just totally ruin my day?
AlienwareAndy 27th September 2013, 21:31 Quote
Gibbo is just using his weight to make sales.

Especially the bit about the other cards being pretty much the same as the 7950 and 7970. That post is nothing but sales patter.
ssj12 28th September 2013, 00:19 Quote
I wonder if Nvidia will answer with the GTX Titan Ultra. haha
AlienwareAndy 28th September 2013, 00:23 Quote
Quote:
Originally Posted by ssj12
I wonder if Nvidia will answer with the GTX Titan Ultra. haha

If it's priced by a brain damaged chimp then they can announce whatever they like.
siliconfanatic 28th September 2013, 00:28 Quote
True, But they'd probably do it, too, if just to brag that they've still got the crown. Pricing lies in whether nVidia feels like being obliterated by AMD or not.
himaro101 30th September 2013, 13:35 Quote
Am I the only one skeptical of what will effectively be a 'first-gen' product?
Yes, oh my god, the specs look good etc etc and the price is a damn steal, but new tech and new standards always have teething issues.
I'd wait for the 'second gen' personally.
AlienwareAndy 30th September 2013, 13:52 Quote
Quote:
Originally Posted by himaro101
Am I the only one skeptical of what will effectively be a 'first-gen' product?
Yes, oh my god, the specs look good etc etc and the price is a damn steal, but new tech and new standards always have teething issues.
I'd wait for the 'second gen' personally.

I've really learned my lesson with buying anything that's new. It usually tends not to work properly. I bought a 7970 at launch and had no end of trouble with it. Even though my screen was endorsed by AMD, half of it would become a black hole if you let the PC sleep. Then I would get artefacts on the start button and all over my windows, so in the end I had to disable sleep completely. So much for saving power... Anyway, in the end I got rid of it. 3D on AMD using TriDef was pretty crap, so I swapped it with some guy for two GTX 480s. Loved 'em.
damien c 30th September 2013, 14:31 Quote
Waiting for Maxwell, which is why I have decided to water cool my current 680's.
PingCrosby 1st October 2013, 09:04 Quote
'It can also generate 4 billion triangles per second from its geometry engine', mmmmmmm I love triangles me
Corky42 1st October 2013, 09:51 Quote
Quote:
Originally Posted by ssj12
I wonder if Nvidia will answer with the GTX Titan Ultra. haha

Nope, it looks like they may counter with a price cut..
http://www.techspot.com/news/54182-rumor-nvidia-to-cut-gpu-prices-in-november-to-compete-with-amd.html
ObsCure 1st October 2013, 13:39 Quote
This Mantle API sounds like what the PC industry has been begging for for years. Having all that power and not being able to utilise it properly is just stupid.

I mean, look at GTA5 on a 360 or PS3 with their "ancient" seven-generation-old chips. What these people are able to pull off by coding to the hardware is bonkers.

And on the PC side we have people buying £800 of gfx cards, water cooling them, just so they can play Crysis on 3 screens, and waiting for the next gen to upgrade because they still need more power. Sad really.

I think I'll keep my second hand HD5870 for a bit more, save my hard earned :DOSH: and see what Lord Gaben and AMD bring in 2014.
siliconfanatic 3rd October 2013, 21:10 Quote
Great. So AMD had to pull THAT kind of complete and utter bullshit on us.
Now we've got no clue what the price will be, and AMD is forcing us to pay blindly if we want top end graphics. Way to pull an nVidia off, AMD.
ObsCure 4th October 2013, 00:04 Quote
Quote:
Originally Posted by siliconfanatic
Great. So AMD had to pull THAT kind of complete and utter bullshit on us.
Now we've got no clue what the price will be, and AMD is forcing us to pay blindly if we want top end graphics. Way to pull an nVidia off, AMD.

Yes, that is right. AMD is holding a gun to your head and is forcing you to buy a pre-order slot, instead of waiting a week or two longer, like the rest of us dirty peasants.
Harlequin 4th October 2013, 00:47 Quote
Quote:
Originally Posted by siliconfanatic
Great. So AMD had to pull THAT kind of complete and utter bullshit on us.
Now we've got no clue what the price will be, and AMD is forcing us to pay blindly if we want top end graphics. Way to pull an nVidia off, AMD.

Newegg had the price up earlier - it's $729.99 for the BF4 edition.....
Kovoet 13th October 2013, 20:39 Quote
24 hours then we find out