bit-tech.net

Nvidia GeForce GTX 580 to launch soon?

If this photo is accurate, Nvidia's new card will be roughly 290mm (11.5in) long - cripes!

Two stories doing the rounds strongly indicate that Nvidia is preparing a new range of graphics cards and will probably launch them soon. VR-Zone has been practising its Chinese and has gathered some shots and details of the new card from the PCInLife forums. Meanwhile, Tech Connect Magazine reports that the GeForce GTX 580 is mentioned explicitly in a new developer driver released on 27 October, along with seven other new or unknown products.

Unfortunately Chrome’s translation of the PCInLife forum thread is utter garbage, so we’ll have to rely on VR-Zone when it says that, ‘At first glance, the GTX 580 looks much more like the GTX 470 than the GTX 480. It features similar contours to the GTX 470, but is longer and features a larger fan.’ This is much more helpful than the ‘580 to the power of hope, long time useless N card, like a change of taste’ comment that Chrome believes lit_pclife made.

Using our own eyes, we can at least say that the card doesn't sport the same exposed-heatsink cooler design as the GeForce GTX 480 1.5GB. Using our fingers, we can also say that the card looks long – the distance between a graphics card's backplate and the end of the 16x PCI-E connector is 145mm, and the GTX 580 looks to be twice that length. The PCB of a GTX 480 is 270mm (10.5in) long, while the GTX 580 looks like it'll be 290mm (11.5in) long – we're not sure that'll fit in our Antec Twelve Hundred graphics test PC case!
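
That estimate is just simple scaling against a known reference. For the curious, here's a rough sketch of the arithmetic in Python – the pixel values are made-up placeholders rather than measurements from the leaked photo:

```python
# Estimate card length by scaling its apparent size in the photo against the
# known backplate-to-PCI-E connector distance (145mm). Pixel values are hypothetical.
REFERENCE_MM = 145.0  # backplate to the end of the 16x PCI-E connector

def estimate_length_mm(card_px: float, reference_px: float) -> float:
    """Scale the card's apparent length against the 145mm reference."""
    return REFERENCE_MM * (card_px / reference_px)

# If the card spans roughly twice the reference distance in the shot:
print(estimate_length_mm(card_px=580.0, reference_px=290.0))  # ~290mm
```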

The card features the usual display outputs of a GeForce 400-series card – an HDMI port and two DVI ports – so we're not expecting it to do any three-screen gaming without help.

While the GeForce 500 series could well just be an updated GF100 GPU with all its stream processors enabled and some extra clock speed, those tweaks alone could make it 30 per cent faster than a GTX 480 1.5GB. That could be enough to give ATI's forthcoming Radeon HD 6900 cards a serious challenge in the performance stakes. While the longer PCB will allow Nvidia to fit a larger heatsink, it remains to be seen whether the reference GTX 580 cooler will avoid the problems of noise and heat build-up that plague the reference GTX 480 1.5GB cooler. At least the GTX 580 doesn't use two 8-pin PCI-E power connectors, although it does keep the 8-pin and 6-pin power input configuration of the GTX 480 1.5GB.
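
As an aside, the power connector layout also puts a spec-rated ceiling on board power. A quick sketch of that sum, assuming the standard PCI-E limits of 75W from the slot, 75W per 6-pin connector and 150W per 8-pin connector:

```python
# Spec-rated board power ceiling implied by a card's PCI-E power connectors.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def max_board_power(six_pin: int, eight_pin: int) -> int:
    """Sum the slot limit plus the auxiliary connector limits."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

print(max_board_power(six_pin=1, eight_pin=1))  # 300W - the GTX 480/580 configuration
print(max_board_power(six_pin=0, eight_pin=2))  # 375W - the dual 8-pin layout it avoids
```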

Do you think the GTX 580 looks like a Radeon killer, and do you care if it’s an updated GF100 as long as it performs well? Leave your thoughts in the forums.

Via The Tech Report.

96 Comments

B1GBUD 29th October 2010, 16:56 Quote
By crikey! I presume it'll fit in an Akasa Eclipse 62?
Redbeaver 29th October 2010, 16:57 Quote
ooooooooooh....................

/drool

580 vs 69xx radeon.... oh yeah, bring on the gloves! :D
wuyanxu 29th October 2010, 17:10 Quote
/want!

but no displayport :(
mi1ez 29th October 2010, 17:11 Quote
fold-tastic!
douglatins 29th October 2010, 17:14 Quote
Sign me up i want one or 2
Lazy_Amp 29th October 2010, 17:15 Quote
Alright, but unless they've made tweaks to bring the GF100 more in line with the GF104 architecture, how can it not run hotter than a furnace?
sakzzz 29th October 2010, 17:16 Quote
You forgot to mention that it needs a 6-pin and 8-pin power connection hooked up to a 1000V, 10A commercial power supply
bobwya 29th October 2010, 17:23 Quote
Am I looking at my next GPU... :-)
The_Beast 29th October 2010, 17:36 Quote
Yay, I can afford a 4xx series now
schmidtbag 29th October 2010, 18:07 Quote
damn both ati and nvidia are doing a crappy job lately. ati doesn't seem to be making any significant improvements other than a smaller fab size, and nvidia is just slapping more stuff on which imo is dangerous considering how the gtx400 series has enough power and heat issues.

i'm sure the gtx580 will perform the best but seriously, nvidia NEEDS to fix their power and thermal issues first. this is their chance - i don't think the hd6000 series is going to be much of a success and i feel like nvidia is just taking their sweet-ass time tacking on more things when they have stuff that needs improvement
r3loaded 29th October 2010, 18:59 Quote
Quote:
Originally Posted by Lazy_Amp
Alright, but unless they've made tweaks to bring the GF100 more in line with the GF104 architecture, how can it not run hotter than a furnace?
I think that's exactly what they've been doing. The new chip is thought to be codenamed GF110 btw.

We won't know until the TDP is announced, but the GTX 470 style cooler points towards less heat. Rather unlike the Radeon 6970 with its rumoured 255W TDP..
CowBlazed 29th October 2010, 19:00 Quote
nvidia can't "fix" how much power Fermi uses, it's already been set in stone that they're power hungry chips. Their only option would be to reduce power by dropping below 40nm, which isn't going to happen for at least another generation due to the fabs having issues.

Which is the same as AMD who also were unable to drop their fab size, and all improvements and power savings were made in other ways on the 6000 series. So AMD never dropped their fab size and nvidia is just trying to release a card with the original high end fermi specs.
Snips 29th October 2010, 19:14 Quote
How can you possibly assume from one picture that this card will over heat and cost a fortune to run?

You can't, so shut up until the guys and gals here at Bit-Tech tell you otherwise.
Telltale Boy 29th October 2010, 19:44 Quote
Quote:
Originally Posted by Snips
How can you possibly assume from one picture that this card will over heat and cost a fortune to run?

You can't, so shut up until the guys and gals here at Bit-Tech tell you otherwise.

No one's assuming anything from the picture. It's from the fact that the GTX 480 runs pretty damn hot and hungry as it is, and, as it appears that the 580 will have all the stream processors unlocked & have higher clocks, logical thinking says that the 580 should run even hotter unless Nvidia's managed to significantly improve the architecture.

They even suggest it themselves in the article: "While the longer PCB will allow Nvidia to fit a larger heatsink, it remains to be seen whether the reference GTX 580 cooler will avoid the problems of noise and heat build-up that plague the reference GTX 480 1.5GB cooler."
thehippoz 29th October 2010, 20:19 Quote
cue the whoopass music while Huang swings from a rope on stage with wood screws
cheeriokilla 29th October 2010, 20:25 Quote
one of two things should happen nowadays... either ATi hires a group of driver devs that aren't a bunch of monkeys, or nVidia should win the performance war, therefore making it easier for customers... You want a 5970? sure its just 100$ more than a 480GTX and a lot of performance gain, so whats the con? well, they haven't released a good driver for that card in 6 months?
Claave 29th October 2010, 20:26 Quote
Quote:
Originally Posted by r3loaded
Quote:
Originally Posted by Lazy_Amp
Alright, but unless they've made tweaks to bring the GF100 more in line with the GF104 architecture, how can it not run hotter than a furnace?
I think that's exactly what they've been doing. The new chip is thought to be codenamed GF110 btw.

We won't know until the TDP is announced, but the GTX 470 style cooler points towards less heat. Rather unlike the Radeon 6970 with its rumoured 255W TDP..

I really don't understand why everyone seems so worried about a TDP of 255W - the GeForce GTX 480 had a maximum power draw of 299W which seems in the same ball-park. It's not as if ATI is breaking new ground, is all I'm saying!
Telltale Boy 29th October 2010, 20:36 Quote
Quote:
Originally Posted by cheeriokilla
one of two things should happen nowadays... either ATi hires a group of driver devs that aren't a bunch of monkeys, or nVidia should win the performance war, therefore making it easier for customers... You want a 5970? sure its just 100$ more than a 480GTX and a lot of performance gain, so whats the con? well, they haven't released a good driver for that card in 6 months?

I was getting ready to disagree with you for hating on ATI's drivers, but fair point about the 5970; judging by the thread on here it has a lot of BIOS/Driver issues.
Quote:
Originally Posted by Claave
I really don't understand why everyone seems so worried about a TDP of 255W - the GeForce GTX 480 had a maximum power draw of 299W which seems in the same ball-park. It's not as if ATI is breaking new ground, is all I'm saying!

Very true, I've seen quite a few people worrying about this & the same thought struck me. It's very likely to perform better than the 480 and with 25% less heat & power, that really doesn't sound all too bad to me.
Teelzebub 29th October 2010, 20:37 Quote
The GTX 480s are not as bad as they are made out to be anyway; mine don't run very hot at all.
Jasio 29th October 2010, 20:45 Quote
So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?

Sorry nVidia - AMD's going to stay on top for a few more quarters.
Telltale Boy 29th October 2010, 20:55 Quote
Quote:
Originally Posted by Bumsrush
The GTX 480s are not as bad as they are made out to be anyway; mine don't run very hot at all.

I'm pretty sure the mountain mods case must play more than a small part in that. ;)
ssj12 29th October 2010, 21:30 Quote
Quote:
Originally Posted by Jasio
So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?

Last time I checked, aren't the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and if the GTX580 outperforms it, nvidia basically wins.

Still, truthfully I have my GTX480 running at 50c now running F@H on air so... yeah... it's actually not that hot of a GPU in the first place.

As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. But I'm going Intel for my next upgrade, so I can't afford a GTX580. Just maybe another GTX480.
Sloth 29th October 2010, 22:08 Quote
Quote:
Originally Posted by ssj12
Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

Still truthfully I have my GTX480 running at 50c now running F@H on air so... ya...its actually not that hot of a GPU in the first place.

As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. So I'm going Intel next upgrade so I can't afford a GTX580. Just maybe another GTX480.
There's a couple different things to go and muddle all of that up:

-The 299W GTX480 draws 17% more power than the rumoured 255W 6970. If the GTX580 uses the same 299W then it will need to perform a full 17% better just to match the 6970's performance per watt (see the quick sketch below).
-Maximum TDP isn't a perfect measurement of heat. Ambient temperature, case design, cooler design, and any number of other smaller factors can wildly change the same component's temperature depending on its environment. Your own report of 50C could be in a well air-conditioned room, or a case with above average airflow. You could have a non-reference cooler design which performs much better. Perhaps F@H isn't pushing it as hard as other tests which have shown the card to get hotter. Temperatures really only apply when such variables are accounted for.
-The entire factor of cost. The two primary factors for most video card buyers are performance and price. Power draw, thermals and noise are all secondary – extra factors that help buyers make a decision when the primary two are too close.
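
To put a number on that first point, here's a quick back-of-the-envelope sketch using the figures quoted in this thread (the 299W figure is a GTX 480 maximum draw and the 255W TDP is only a rumour, so neither is a confirmed GTX 580 spec):

```python
# Performance-per-watt break-even: how much faster a 299W card needs to be
# than a 255W card just to match its efficiency.
gtx_power_w = 299.0     # GTX 480-class maximum draw
radeon_power_w = 255.0  # rumoured Radeon 6970 TDP

break_even = gtx_power_w / radeon_power_w - 1.0
print(f"Extra power drawn: {break_even:.1%}")                         # ~17.3%
print(f"Performance lead needed for equal perf/W: {break_even:.1%}")  # same ~17.3%
# Anything less than ~17% faster and the 255W card wins on efficiency.
```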
Teelzebub 29th October 2010, 22:25 Quote
Quote:
Originally Posted by Telltale Boy
I'm pretty sure the mountain mods case must play more than a small part in that. ;)

They didn't run much hotter when they were in the 830 Stacker TBH; they were maxing in the mid 70s, certainly not hot enough to cremate my granny.

Of course they would get hot when running the Heaven benchmark – mid 80s – but who sits and runs benchmarks all day? Hardly real-life usage, is it.
dangerman1337 29th October 2010, 22:30 Quote
You people are forgetting that TDP =/= Actual power consumption.
fingerbob69 29th October 2010, 22:57 Quote
Wow ...the nVidia fanbois are all over this! lol

Seriously; this was on Fud TWO days ago, and we all know how reliable he can be... right?

If Bit-tech reckon this card to be 11.5in from the 'pixelised' image then the only people with a rig big enough to house it are the Democratic People's Republic of China
Kúsař 29th October 2010, 23:15 Quote
We can expect yet another epic battle between a powerful (hopefully and finally) optimised nVidia GPU and two power-efficient ATi GPUs (not announced yet, but I bet they'll be out before the end of the year).
Bring it on! I could use some cheap upgrade :D
steve30x 29th October 2010, 23:23 Quote
My 800D will house a GPU that size
drunkenmaster 29th October 2010, 23:38 Quote
Why would a 6.5% shader increase (480 to 512) plus, very generously, a 10% clock speed bump give 30% more performance? Answers on a postcard. If it is indeed just the fully enabled 512SP cards they've been building up stock of, then most likely they'll be INSANELY expensive and in incredibly short supply, much like the 285gtx last year. Not that they couldn't make large numbers of those; after they EOL'd it the price went up to pretty much match the 5870, because that way the few that are left stay on shelves – no one wants a 285gtx for £300 when you can get a 5870 for £300, or a 5850 for £200, which is significantly faster.

I expect most likely the same, short supply of whatever it is, massive price, but they can keep a few on shelves to make it look like all is fine.
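
For what it's worth, the naive linear-scaling arithmetic behind that point looks like this – the 10% clock bump is the generous assumption from the post above, not a confirmed spec:

```python
# Best-case scaling from enabling all shaders (480 -> 512) plus a clock bump.
# Real games rarely scale linearly with either, so this is an upper bound.
shaders_gtx480, shaders_full_gf100 = 480, 512
clock_scaling = 1.10  # assumed 10% clock increase

shader_scaling = shaders_full_gf100 / shaders_gtx480       # ~1.067, i.e. ~6.7%
best_case_gain = shader_scaling * clock_scaling - 1.0       # ~0.173
print(f"Best-case theoretical gain: {best_case_gain:.0%}")  # ~17%, well short of 30%
```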
Skiddywinks 30th October 2010, 00:01 Quote
Quote:
Originally Posted by ssj12

Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

This raises one large question;

Where the **** have you been checking? 'Cause it certainly isn't BT, or anywhere else where they know what they are doing. See for yourself.

nVidia have produced a massive, hot, power hungry chip that is considerably underpowered when compared to what ATI have managed to achieve within much tighter limits. Hell, unless the 580 has a true architectural change, we will only be seeing what they originally delayed all those times from over a year ago. The fact that people can still love nVidia is madness. It is likely that this is simply the Fermi that nVidia were boasting about all that time ago. The card we were meant to get.

Of course, there is an argument for simply having the best single GPU on the market, even if it is only by not so much. But anyone who thinks nVidia did anything other than trip over their own arrogance this round, especially after they knew what to expect after the shock of the 4 Series from ATI, is out of their mind.

I
frontline 30th October 2010, 00:52 Quote
fingerbob69 30th October 2010, 01:02 Quote
Actually, I expected better of Bit-tech readers:

The gtx480 was a castrated 512 part because fermi512 shaders simply couldn't be made; too hot and too few. Google it if you doubt me.

So what we are now expected to welcome, 8 months after gf100 is son of gf100 ...gf110... gf100AsItWasMeantToBeOnlyWe****edUpEnjoyItNow!

or gf100-AIWMTBOWFUEIN

(I would just point out that AMD took 12 months to come up with their 6xxx series development of their successful 5xxx series while we are now to believe that nVdia have righted all wrongs in their current series in just 6 months... am I wrong to ...doubt?).
ssj12 30th October 2010, 01:06 Quote
Quote:
Originally Posted by Skiddywinks
Quote:
Originally Posted by ssj12

Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

This raises one large question;

Where the **** have you been checking? 'Cause it certainly isn't BT, or anywhere else where they know what they are doing. See for yourself.

nVidia have produced a massive, hot, power hungry chip that is considerably underpowered when compared to what ATI have managed to achieve within much tighter limits. Hell, unless the 580 has a true architectural change, we will only being seeing what they originally delayed all those times from over a year ago. The fact that people can still love nVidia is madness. It is likely that this is simply the Fermi that nVidia were boasting about all that time ago. The card we were meant to get.

Of course, there is an argument for simply having the best single GPU on the market, even if it is only by not so much. But anyone who thinks nVidia did anything other than trip over their own arrogance this round, especially after they knew what to expect after the shock of the 4 Series from ATI, is out of their mind.

I

You do know you just pointed me at mid-range 68xx cards vs high-end GTX4xx cards in power consumption, right? The only comparison between your argument and mine is that the GTX460 uses like 10w more power to run, which makes no sense anyway since I was speaking of the 69xx cards, which have an estimated TDP of 255w (which really, like all cards, will end up somewhere north of that by at least 20w).

At least Sloth gave me a real response that made sense to my comment.

@Sloth, you have a point, but since the GTX580 is expected to be between 15 - 25% stronger than the GTX480 I think what you said is possible and it will be well worth it. Even if it lands at only 16% stronger versus your stated 17%, that is well within margin (and a bit of OCing can solve that percentage difference anyway).
glaeken 30th October 2010, 01:08 Quote
Quote:
Originally Posted by Skiddywinks
This raises one large question;

Where the **** have you been checking? 'Cause it certainly isn't BT, or anywhere else where they know what they are doing. See for yourself.

nVidia have produced a massive, hot, power hungry chip that is considerably underpowered when compared to what ATI have managed to achieve within much tighter limits. Hell, unless the 580 has a true architectural change, we will only being seeing what they originally delayed all those times from over a year ago. The fact that people can still love nVidia is madness. It is likely that this is simply the Fermi that nVidia were boasting about all that time ago. The card we were meant to get.

Of course, there is an argument for simply having the best single GPU on the market, even if it is only by not so much. But anyone who thinks nVidia did anything other than trip over their own arrogance this round, especially after they knew what to expect after the shock of the 4 Series from ATI, is out of their mind.

I

I'd tend to agree with you if you're looking at the Fermi from just a gaming perspective. However, Fermi is not just a gaming GPU, it was designed to be much more general purpose than AMD's 5/6 series. Fermi is a researching beast. It simply blows AMD out of the water (in hardware and software) when it comes to GPGPU applications.
frontline 30th October 2010, 01:24 Quote
Quote:
Originally Posted by glaeken

Fermi is simply a researching beast. It simply blows AMD out of the water when it comes to GPGPU applications.

Because that is what the majority will be buying a £300 - £400 GPU for, yes?
glaeken 30th October 2010, 01:32 Quote
The majority of what? Gamers? Perhaps not. Factor in researchers and businesses/corporations and then the majority are going with Nvidia/Fermi for GPU based HPC applications. And this is where the money lies.
fingerbob69 30th October 2010, 01:36 Quote
Quote:
Originally Posted by ssj12
Quote:
Originally Posted by Jasio
So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?

Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

Still truthfully I have my GTX480 running at 50c now running F@H on air so... ya...its actually not that hot of a GPU in the first place.

As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. So I'm going Intel next upgrade so I can't afford a GTX580. Just maybe another GTX480.

The 69xx is yet to be released/pictured/benched, so as to whether it runs "as hot and uses as much power as the GTX480/580"... absolutely no one, you included, can knowledgeably comment.

And you've got a 480 to run at 50c flat out? How?
mute1 30th October 2010, 01:47 Quote
Quote:
Originally Posted by glaeken
The majority of what? Gamers? Perhaps not. Factor in researchers and businesses/corporations and then the majority are going with Nvidia/Fermi for GPU based HPC applications. And this is where the money lies.

Except they won't be buying the Geforce versions of the GPU, will they, since they are for gamers. They will get the professional versions - the one with ECC and all the other necessary stuff - which cost a lot more anyway.
smc8788 30th October 2010, 01:57 Quote
Quote:
Originally Posted by mute1
Except they won't be buying the Geforce versions of the GPU, will they, since they are for gamers. They will get the professional versions - the one with ECC and all the other necessary stuff - which cost a lot more anyway.

Yeah, but they're still based on the same underlying GPU architecture. Nvidia is far more interested in the HPC market these days since that's where the majority of the money is, so they made a GPU that primarily performed well in GPGPU applications. The consumer GeForce version of the cards, while being what we're most interested in, were more of a secondary priority since the high-end gaming market is much, much smaller.
wafflesomd 30th October 2010, 02:47 Quote
Sweet, another high performance GPU to run games that previous gen cards can run maxed at 60+ fps!

We need some software to actually use all this hardware on the market...
general22 30th October 2010, 02:52 Quote
Two words.

Paper launch
rickysio 30th October 2010, 04:30 Quote
Quote:
Originally Posted by wafflesomd
Sweet, another high performance GPU to run games that previous gen cards can run maxed at 60+ fps!

We need some software to actually use all this hardware on the market...

Err... Crysis?
wafflesomd 30th October 2010, 04:44 Quote
Quote:
Originally Posted by rickysio
Quote:
Originally Posted by wafflesomd
Sweet, another high performance GPU to run games that previous gen cards can run maxed at 60+ fps!

We need some software to actually use all this hardware on the market...

Err... Crysis?

Yes, let's make cards just so we can run the pile of mediocrity that is Crysis.
jrs77 30th October 2010, 05:07 Quote
I'd rather see them put more development into energy savings than develop cards that only a handful of people might need to play games at 2560x1920 with max details.

I'm still waiting for a GPU as powerful as a G80 or R580 (8800GTS or x1950) with only some 25 Watt maximum TDP.
In times of energy getting more and more expensive, a PC capable of playing a game like Left4Dead or Call of Duty 5 at 1280x1024 with medium settings shouldn't draw more than a maximum of 100 Watts altogether at load.

Hopefully the Llano-thingies can live up to the expected GPU performance of a HD5650 so that we're getting at least near that 100 Watt mark while having a somewhat decently performing system.
ssj12 30th October 2010, 05:36 Quote
Quote:
Originally Posted by fingerbob69
Quote:
Originally Posted by ssj12
Quote:
Originally Posted by Jasio
So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?

Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

Still truthfully I have my GTX480 running at 50c now running F@H on air so... ya...its actually not that hot of a GPU in the first place.

As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. So I'm going Intel next upgrade so I can't afford a GTX580. Just maybe another GTX480.

The 69xx is yet to be released/pictured/benched so as to it running " as hot and use as much power as the GTX480/580" ...absolutely no one who can comment, you included, can knowledgeably comment.

And you've got a 480 to run at 50c flat out? How?

I'm basing my assumptions off scaling the 68xx numbers and AMD's leaked/confirmed info.

And I cleaned up and reapplied new thermal grease, got an EVGA high-flow bracket, and have a PCI slot fan under it. So it hovers 1 - 3c around 50c at stock speeds. Still, the rounded base average is 50c.
Elton 30th October 2010, 06:51 Quote
Quote:
Originally Posted by jrs77
I'd rather see them developing more into energy-savings then to develop cards that only a handful of people might need to play games at 2560x1920 with max details.

I'm still waiting for a GPU as powerful as a G80 or R580 (8800GTS or x1950) with only some 25 Watt maximum TDP.
In times of energy getting more and more expensive a PC capable of playing a game like Left4Dead or Call of Duty 5 at 1280x1024 with medium settings shouldn't draw more then a maximum of 100 Watt alltogether at load.

Hopefully the Llano-thingies can live up to the expected GPU-performance of a HD5650 so that we're getting atleast near that 100 Watt mark while having a somewhat decent performant system.

This is something that really needs to be addressed. If we can reach that kind of performance, even X1950XTX performance, at incredibly low power consumption rates, we'd have a hell of a breakthrough; although our top-end performance would suffer, technology overall would still progress much more favorably.

If only eh?
Grape Flavor 30th October 2010, 07:22 Quote
Quote:
Originally Posted by ssj12
Last time I checked, isnt the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null and of the GTX580 outperforms it, nvidia basically wins.

Still truthfully I have my GTX480 running at 50c now running F@H on air so... ya...its actually not that hot of a GPU in the first place.

As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. So I'm going Intel next upgrade so I can't afford a GTX580. Just maybe another GTX480.

480 is said to run at 90˚- 97˚F, and you got it to run at 50˚C = 122˚F?

Either I'm somehow not comprehending what you said or you sure did a crap job of "upgrading" your cooling.
SlowMotionSuicide 30th October 2010, 08:25 Quote
Quote:
Originally Posted by Grape Flavor
480 is said to run at 90˚- 97˚F, and you got it to run at 50˚C = 122˚F?

Either I'm somehow not comprehending what you said or you sure did a crap job of "upgrading" your cooling.

Obviously you aren't. 90˚C equals 194˚F, so he did a bloody good job upgrading.

So good in fact, it beats my EK fullcover block hands down, which only manages to keep the card at a measly 70˚C (or 158˚F) under Furmark.
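
For anyone keeping score, the disagreement above is just a units mix-up – GPU sensors report degrees Celsius. A quick conversion sketch:

```python
# Celsius to Fahrenheit for the temperatures mentioned in this thread.
def c_to_f(celsius: float) -> float:
    return celsius * 9.0 / 5.0 + 32.0

for c in (50, 70, 90, 97):
    print(f"{c}C = {c_to_f(c):.0f}F")
# 50C = 122F, 70C = 158F, 90C = 194F, 97C = 207F
```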
zr_ox 30th October 2010, 09:37 Quote
Quote:
Originally Posted by wafflesomd
Yes, let's make cards just so we can run the pile of mediocrity that is Crysis.

Well said!

Crysis has become nothing other than a great benchmark tool. It's 4 years old soon and I really don't get why people are still so eager to find a GPU which plays it well. Chances are you have already played it, numerous times. Would you really want to relive the experience just to see if you still drop below 30 fps from time to time?

My ATi 4870 plays it with all of the candy turned on, at 1600x1200, without any trouble whatsoever – a few dips here and there but nothing to spoil the experience. I added a second 4870 and it never skipped a beat.

To be honest I expect more from what we are seeing from the current generation of cards. There is so little to be gained that upgrading seems pointless. Even with 24in screens, if you have a GPU from the last 2 generations then chances are you're fine.

I'm still running an Intel 9550 (@stock), 8GB memory with a 4870 (@stock) and it runs everything I can throw at it. Sure, it's got everything to do with my native screen resolution, but I will not be upgrading anytime soon because it's simply not worth it.

But let's play devil's advocate and blame the game developers, because at the end of the day, if they don't build anything demanding then hardware developers will never be forced to innovate!
Mraedis 30th October 2010, 10:46 Quote
Quote:
Originally Posted by Grape Flavor
480 is said to run at 90˚- 97˚F, and you got it to run at 50˚C = 122˚F?

Either I'm somehow not comprehending what you said or you sure did a crap job of "upgrading" your cooling.

You just got your Fahrenheit and Celsius mixed up in the "said to run at" there. ^^

WHY would they whine about a 'hot card' if it ran at 90 Fahrenheit...
Paddy4929 30th October 2010, 10:58 Quote
I may get one of these cards pending review from Bit-Tech. Been saving my money for quite some time so I can upgrade from my GTX 295. My only gripe is that the next game I will be buying is COD: Black Ops, which will more than likely run perfectly smoothly on my GTX 295. Maybe I should save my dough.
confusis 30th October 2010, 11:09 Quote
Quote:
Originally Posted by glaeken


I'd tend to agree with you if you're looking at the Fermi from just a gaming perspective. However, Fermi is not just a gaming GPU, it was designed to be much more general purpose than AMD's 5/6 series. Fermi is a researching beast. It simply blows AMD out of the water (in hardware and software) when it comes to GPGPU applications.

nVidia is only in that position because they pay out researchers (stanford, etc) to program for CUDA and opengl gets second place for development time..
Xtrafresh 30th October 2010, 12:44 Quote
Quote:
Originally Posted by glaeken
The majority of what? Gamers? Perhaps not. Factor in researchers and businesses/corporations and then the majority are going with Nvidia/Fermi for GPU based HPC applications. And this is where the money lies.
Well, not MY money, that's for sure. Last time I checked this was a website targeted at consumers, gamers mostly. So nVidia prioritising GPGPU development is actually another reason for me to go ATI.

About this 580 card... I'll believe it when I see it, but so far it's just nVidia trying their hardest to steal some marketing thunder and keep the fanbois on board before the 69xx series comes.

Also, if it's just Fermi with 512 cores/shaders/thingies/whatever enabled, it should be called the GTX 490. Have we gotten so used to nVidia rebranding their chips that nobody calls them out on it anymore?

Anyway, since it's nVidia, see first, believe later.
fingerbob69 30th October 2010, 13:15 Quote
"But let'splay the devils advocate and blame to game developers, because at the end of the day if they dont build anything demanding then hardware developers will never be forced to innovate!"

Isn't the problem the over way round? The hardware is moving on, performance up 10-20% pa while games are released that still in DX9 while other software has to be 32 and 64 bit compatible and run in XP?

Maybe someone BIG, like a M$ or Valve to say that they're leaving some of these relics behind which would drive software innovation and of course, hardware sales.
Snips 30th October 2010, 15:14 Quote
Quote:
Originally Posted by Telltale Boy
Quote:
Originally Posted by Snips
How can you possibly assume from one picture that this card will over heat and cost a fortune to run?

You can't, so shut up until the guys and gals here at Bit-Tech tell you otherwise.

No one's assuming anything from the picture. It's from the fact that the GTX 480 runs pretty damn hot and hungry as it is, and, as it appears that the 580 will have all the stream processors unlocked & have higher clocks, logical thinking says that the 580 should run even hotter unless Nvidia's managed to significantly improve the architecture.

They even suggest it themselves in the article: "While the longer PCB will allow Nvidia to fit a larger heatsink, it remains to be seen whether the reference GTX 580 cooler will avoid the problems of noise and heat build-up that plague the reference GTX 480 1.5GB cooler."

So it's pure speculation then? Exactly the point I was making.
Telltale Boy 30th October 2010, 15:31 Quote
Quote:
Originally Posted by Snips
So it's pure speculation then? Exactly the point I was making.

The point I was making was that while it is speculation, it isn't completely unfounded - there is a cause for concern. And it definitely wasn't speculation based on the picture as your post seemed to suggest.

As they made similar speculation in the article, I don't see why we aren't allowed to speculate too.
Krayzie_B.o.n.e. 30th October 2010, 17:22 Quote
Whether you're an Nvidia loyalist or an AMD loyalist, either way I don't see the GTX 580 running faster than two GTX 460s in SLI or two HD 6870s in CF, so what's the point? Unless you plan on buying two GTX 580s for SLI, which then again is pointless because the price drop will make a GTX 480 SLI setup more affordable.

So far no news on any re-engineering of this better-late-than-never 512 CUDA core card, so it's safe to say it will require its own nuke reactor to power and create more heat than the Sun. GTX 580 SLI: "have fun buying that 2500 watt PSU".

GTX 580, the Hummer of video cards.
Xtrafresh 30th October 2010, 17:44 Quote
Quote:
Originally Posted by Telltale Boy
The point I was making was that while it is speculation, it isn't completely unfounded - there is a cause for concern. And it definitely wasn't speculation based on the picture as your post seemed to suggest.

As they made similar speculation in the article, I don't see why we aren't allowed to speculate too.
You are only allowed to speculate that it will be awesome and that ATI might as well not launch the 6900 series now. ;)
Telltale Boy 30th October 2010, 17:50 Quote
Quote:
Originally Posted by Xtrafresh
You are only allowed to speculate that it will be awesome and that ATI might as well not launch the 6900 series now. ;)

Someone who understands! Now I'm going to go back to my tri-SLI GTX 465s... What's that you say? The GTX 465 is awful? But it's Nvidia.... duh....
frontline 30th October 2010, 18:27 Quote
The GTX 580 v 6900 series is turning into a bit of 'you show me yours and I'll show you mine' – can't wait until the cards are on the table :)
Snips 30th October 2010, 19:52 Quote
Quote:
Originally Posted by Telltale Boy
Quote:
Originally Posted by Snips
So it's pure speculation then? Exactly the point I was making.

The point I was making was that while it is speculation, it isn't completely unfounded - there is a cause for concern. And it definitely wasn't speculation based on the picture as your post seemed to suggest.

As they made similar speculation in the article, I don't see why we aren't allowed to speculate too.

So it's still unfounded speculation then? Nothing in the article gives any fact on performance, heat and cost to run for you to possibly be able to substantiate your post.

Again, let's just wait for the Bit-tech review before the knives come out.
Xtrafresh 30th October 2010, 20:57 Quote
Snips, Why is this so important to you? Nobody here is making hard claims, and if they were, they would be easily disproven and laughed away. There's nothing wrong with speculation. And if you say "misinformation", i'm going to die of laughter, as the companies themselves are much better at that.
Telltale Boy 30th October 2010, 21:35 Quote
Quote:
Originally Posted by Xtrafresh
Snips, Why is this so important to you? Nobody here is making hard claims, and if they were, they would be easily disproven and laughed away. There's nothing wrong with speculation. And if you say "misinformation", i'm going to die of laughter, as the companies themselves are much better at that.

Thanks Xtra.

I realise that none of this speculation is backed up by fact, I'm just saying that there's nothing wrong with speculation and your first post seemed a bit aggressive towards anyone who was speculating.
adidan 31st October 2010, 09:01 Quote
Quote:
Originally Posted by Mraedis
You just got your Fahrenheit and Celsius mixed up.
A GPU Mars Climate Orbiter incident?
I-E-D 31st October 2010, 11:10 Quote
http://www.gnd-tech.com/main/content/409-NVIDIA-GeForce-GTX-580-Full-Specs-and-Release-Date

GND-Tech has the full release date and where you can get it from.
frontline 31st October 2010, 19:50 Quote
Quote:
Originally Posted by I-E-D
http://www.gnd-tech.com/main/content/409-NVIDIA-GeForce-GTX-580-Full-Specs-and-Release-Date

GND-Tech has the full release date and where you can get it from.
Quote:
At the moment, no GTX 580 samples have been given out, but they will be given out on November 9 which also happens to be the release date. EVGA will receive 2,500 of these cards on November 9

So, samples for reviewers but no actual stock until later? I hope bit-tech's review will include details of general availability at launch.
Snips 31st October 2010, 19:52 Quote
So it's still unfounded speculation then?

Yet again, let's wait for the official benchtests from Bit-Tech!
frontline 31st October 2010, 19:57 Quote
Quote:
Originally Posted by Snips
So it's still unfounded speculation then?

Yet again, let's wait for the official benchtests from Bit-Tech!

Benchtests that i hope are of a card on general release at the time of the review!

In the meantime lets have some of Charlie D's speculation :)
Quote:
With that in mind, we are told GTX580's 'launch' will be pulled in to November 8, a few days before the Q3 financial conference call. It is now meant as a spoiler for uncomfortable analyst questions aimed at Dear Leader, not at AMD's parts. The problem is that in either case, you won't be able to buy parts until late January, best case, at the earliest. I wonder if any analysts will ask about that in an SEC governed venue?
knutjb 1st November 2010, 04:37 Quote
Other than a few grounded in reality, most of you are WAGing. Maybe Nvidia did this or that, blah blah blah...

They have a propensity for dragging out releases with a few sample photos to give speculators something to drone on about.

I hope they have figured out the 40nm process and will release a card that won't spin the meter too fast.

The good: a new fast card keeps competition up and prices down.

The bad: I just hope the 580 isn't a 480 rerun like the 8800..., massaged but not so much as to call it new, or even a significant revision like Barts. Until *production cards from vendors' shelves* are benchmarked I have my doubts.
memeroot 1st November 2010, 11:33 Quote
Given the cancellation of 32nm, all we can hope for is a re-spin... which will be fine if it's good enough.
xaser04 1st November 2010, 12:20 Quote
Quote:
Originally Posted by jrs77


I'm still waiting for a GPU as powerful as a G80 or R580 (8800GTS or x1950) with only some 25 Watt maximum TDP.
In times of energy getting more and more expensive a PC capable of playing a game like Left4Dead or Call of Duty 5 at 1280x1024 with medium settings shouldn't draw more then a maximum of 100 Watt alltogether at load.

In theory this is where Gaming laptops are excellent.

The Alienware M11x for example pulls only 22-27 watts when gaming - (NBR Alienware M11x review). It can quite happily run most games at medium settings, (yes even Crysis) and is reasonably priced.

Yes it isn't the fastest machine on the planet but it is certainly efficient for what it can offer.
jrs77 1st November 2010, 15:41 Quote
I didn't choose the 100 Watt mark randomly. The PS3 Slim is sucking ~100 Watts at load when playing in 1920x1080.

So that's the mark a gaming-PC has to compete with imho.
wuyanxu 1st November 2010, 15:47 Quote
Quote:
Originally Posted by jrs77
I didn't chose the 100Watt-mark randomly. The PS3 Slim is sucking ~100 Watts at load when playing in 1920x1080.

So that's the mark a gaming-PC has to compete with imho.
but for 400w, your PC can render more than 4 times as much detail as your PS3.

problem is just lack of software (aka games) to take advantage of it.
jrs77 1st November 2010, 16:08 Quote
The question is, do you really need all that amount of details to make it look hyperrealistic....

In ARMA 2 for example, I turn the graphics down on purpose, because the high-detailed environment distracts too much and makes the game less playable.

The graphics achieved in BF2 on a PS3 are totally enough to make the game look good. And the best and most fun games don't even go for hyperrealistic graphics... look at GTA4, Borderlands, etc...
javaman 1st November 2010, 16:56 Quote
surely this will be the GTX6xx series if Nvidia keep their current name scheme
wuyanxu 1st November 2010, 19:46 Quote
Quote:
Originally Posted by jrs77

The graphics achieved in BF2 on a PS3 are totally enough to make the game look good. And the best and most fun games don't even go for hyperrealistic graphics... look at GTA4, Borderlands, etc...

ah ha! :D BF2, a game that came out over 5 years ago. So what graphics card do we need to max out BF2? Something like a GT420 with 48 unified shaders (the 7800GTX in 2005 had 24 pixel and 8 vertex shaders) and a TDP of 50w, perhaps coupled with an i3, and that will be about 100w of power usage while gaming. With more detail and a higher resolution than the PS3, mind you.

The same GT420 will easily be able to handle GTA4 at a tiny 720p resolution with no view distance, as on the PS3 version of GTA4. I've played PS3 GTA4 before the PC version came out; it was horrible.

You are looking at graphics cards that are pushing resolution, anti-aliasing and drawing distance boundaries. If you want just-enough, any of the lowest graphics cards of this generation will be more than enough to handle any console game with good power efficiency.
frontline 1st November 2010, 19:57 Quote
Quote:
Originally Posted by jrs77
The question is, do you really need all that amount of details to make it look hyperrealistic....

In ARMA 2 for example, I turn the graphics down on purpose, because the high-detailed environment distracts too much and makes the game less playable.

The graphics achieved in BF2 on a PS3 are totally enough to make the game look good. And the best and most fun games don't even go for hyperrealistic graphics... look at GTA4, Borderlands, etc...

I agree that playing a game on a current gen console compared to playing it on a top of the range PC isn't as big a difference as say when comparing the DX7 mode of a game to DX9 mode. However, when you see games like BFBC2 with all the detail options cranked up to max at a decent resolution, it just makes the console version look shoddy in comparison.

I purchased an Xbox 360 recently, however the only games I can stand playing on it are the Xbox Live Arcade type games, where graphics capabilities aren't particularly important (Limbo springs to mind).

We'll see next year what AMD's Fusion APU is capable of churning out in terms of a gameplay experience at (hopefully) low power requirements.
SchizoFrog 1st November 2010, 19:59 Quote
Quote:
Originally Posted by jrs77
The question is, do you really need all that amount of details to make it look hyperrealistic....

In ARMA 2 for example, I turn the graphics down on purpose, because the high-detailed environment distracts too much and makes the game less playable.

I loved this comment... Maybe we should all poke ourselves in one eye as life is too realistic and takes away from the fun of it?
jrs77 1st November 2010, 20:58 Quote
Quote:
Originally Posted by SchizoFrog
I loved this comment... Maybe we should all poke ourselves in one eye as life is too realistic and takes away from the fun of it?

Yeah, you clearly can compare PC-games to real life... :(
AcidJiles 1st November 2010, 21:14 Quote
Ridiculous size. I buy big cases but even standard sizes don't give much room to work around and are awkward.
Sloth 1st November 2010, 21:51 Quote
Quote:
Originally Posted by jrs77
Yeah, you clearly can compare PC-games to real life... :(
Let me fix his analogy, while also making it a bit more sarcastic:

We should all play checkers. It uses single bit colors (black and white is enough, red is a luxury!), only needs a single 8x8 texture for the background and only four sprites for pieces!

It's partly a matter of taste, and partly a matter of game design. Some games, such as the wildly popular Minecraft, look terrible on purpose and that ugliness doesn't suit some people's tastes. Looking beautiful was part of the developer's plan when making Arma 2, though it doesn't suit your tastes.


To the argument of a PS3's power draw being less, it's quite true that it's actually doing less as well. As mentioned earlier, view distances are a clever way of developers making a game still look decent on consoles while demanding far less. You look at a specific object for comparison and won't notice much of a difference, yet when you look at the bigger picture you notice how many fewer other objects there are on screen! Another thing is anti-aliasing. You'll notice your console games can't use it, so even when textures and polygons are of a similar quality as PC, the console polygons are jagged on the edges. Whether you prefer playing games on a console or PC is totally personal choice, but comparing the two graphically is not direct unless you drop your PC settings and hardware to match.
robots 1st November 2010, 22:05 Quote
It would take something miraculous to make me change from my new 5870. This thing lets me crank every game I have, it runs very cool, and to my surprise, it's completely silent. Even when I'm playing Crysis the fan doesn't go over about 35% which is the exact same volume as when I manually turn the fan off.
xaser04 1st November 2010, 22:35 Quote
Quote:
Originally Posted by jrs77
I didn't chose the 100Watt-mark randomly. The PS3 Slim is sucking ~100 Watts at load when playing in 1920x1080.

So that's the mark a gaming-PC has to compete with imho.

This makes the comparison to something like an Alienware M11x more relevant. The gaming quality provided by both is similar (the M11x will actually be able to run at comparably higher settings and a higher resolution) yet the M11x only requires 1/4 of the power.
chimeradog 2nd November 2010, 05:46 Quote
Guys!!! See This!!!! I'm getting excited now!!!!!
robots 2nd November 2010, 05:50 Quote
I wanna see the finished thing.
Snips 2nd November 2010, 09:10 Quote
I hate leaks of this nature with the ridiculous comments of "but it won't beat the AMD dual GPU card!", which, let's be honest, no one expects it to.

Come on Bit-Tech, give us some more meat on the bone!
jrs77 2nd November 2010, 15:11 Quote
Yeah.... 250-300 Watt TDP

My whole system (mainboard + E8400 + 460GTX + 2x HDD) doesn't use this much power :)
Sloth 2nd November 2010, 17:45 Quote
Quote:
Originally Posted by jrs77
Yeah.... 250-300 Watt TDP

My whole system (mainboard + E8400 + 460GTX + 2x HDD) doesn't use this much power :)
65W+160W=225W TDP for just CPU and GPU, you're walking a fine line. My own CPU+GPU is 65W+188W=253W TDP, still less than the 300W max. Know what we both have in common? Neither of us has the same rendering power as the GTX 580. You desire low power, so you laugh at the power draw of higher end GPUs, yet those who need more power than a single GTX 460 similarly laugh at your underpowered system. Neither is objectively better, only suited to different needs.

Where you should laugh is at situations of equal rendering power with differing power consumption, or equal power consumption with differing rendering power.
thehippoz 2nd November 2010, 17:52 Quote
wow nov 9th really.. this should be fun to see- maybe wood screws was a bit hasty
Cyberpower-UK 2nd November 2010, 18:11 Quote
Can't wait, get me four
Telltale Boy 2nd November 2010, 19:20 Quote
This article at Kitguru says that it's expected to draw at least 35w more power than the 480, which is the opposite of what the Fudzilla article says. I guess we'll just have to wait for some official specs, but either way, this definitely looks like it could be a good water-cooler's card.
Hakuren 3rd November 2010, 16:34 Quote
Since I moved to Mountain Mods cases I couldn't care less about height, length and weight. Horizontal motherboards rock. Bring the GTX580 on!

Still, I think for such cards manufacturers should break with the height norm. Nobody will squeeze such a mammoth into a media centre, low-noise or SFF case, so why not add 2-3cm up top and make a shorter card? It is so simple. I've only seen this done once, when MSI released the R5870 Twin Frozr. It can be done, so why is nobody else bothered?

All the talk about high power consumption is funny. Are you playing 24/7/365 or something like that? Usually most of us (in particular if you are not a kid) play ~1-4h/day, many people even less.

Of course the power drain is noticeable if you run at full tilt permanently, but for a casual gamer it is not a big deal (and at idle, the power drain/temps of the GTX4xx are much lower than, for example, the 8800 series). Problems appear when:

- you have a UPS and there is a power outage. Such a monster at full power will suck every battery dry very quickly – in particular if the UPS is loaded over 50%.
- you're folding with quad-SLI. Now if you're running such a setup then most probably you also own a nuclear reactor nearby. :D
memeroot 3rd November 2010, 16:49 Quote
+1 couldn't care less about power... except will my 850 run 2 of them?
ssj12 9th November 2010, 19:36 Quote
Seems the GTX580 runs quite a bit cooler than the GTX480.