bit-tech.net

Nvidia GTX 580 power limit bypassed

The power limitation function of the GTX 580 GPU, which prevents the card's power draw from exceeding 200W, can now be bypassed with a test build of GPU-Z.

Nvidia's GTX 580 GPUs include a clever power limiter that keeps the power draw low when executing intensive operations such as the Furmark benchmark. Now, however, GPU-Z developer W1zzard has released a way to disable the restriction - but warns that it's not for the faint-hearted.
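
To get an intuition for what such a limiter does, here is a minimal sketch of a power-limit control loop. This is purely illustrative - the GTX 580's real mechanism lives in hardware and the driver and its details aren't public - so every name and number below is a made-up placeholder.

    # Illustrative only: a naive power-limit loop with hypothetical names and numbers.
    # If measured board power exceeds the budget, step the core clock down; once power
    # falls comfortably below the budget, step it back up.
    POWER_LIMIT_W = 200      # assumed power budget enforced by the limiter
    CLOCK_STEP_MHZ = 25      # assumed throttle step
    MIN_CLOCK_MHZ = 405      # assumed floor for the core clock

    def limiter_tick(read_power_w, read_clock_mhz, set_clock_mhz):
        """One polling iteration of the hypothetical limiter."""
        power = read_power_w()
        clock = read_clock_mhz()
        if power > POWER_LIMIT_W and clock - CLOCK_STEP_MHZ >= MIN_CLOCK_MHZ:
            set_clock_mhz(clock - CLOCK_STEP_MHZ)   # throttle until back under budget
        elif power < POWER_LIMIT_W * 0.9:
            set_clock_mhz(clock + CLOCK_STEP_MHZ)   # restore clocks when headroom returns

Disabling the restriction effectively skips the throttle branch, which is why power draw in Furmark-style loads can climb so far past the budget.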

The test build, available for download on the TechPowerUp forums, includes an option to disable the clockspeed restriction logic. Although this makes no difference during everyday use, in Furmark and similar intensive benchmarking applications - where the limiter would normally kick in - the maximum power draw of GTX 580-based cards jumps from 200W to 350W.

W1zzard warns users that 'exceeding the power limitation of the card may result in damage to card and/or motherboard,' but for those who want to push their GTX 580-based cards to the limit it's a way to squeeze the last drop of performance out of the GPU - provided adequate cooling is in place.

The GTX 580's power-limiting system could potentially have averted the overheating issues that plagued early versions of StarCraft 2. When we investigated the issue back in August, we found that the lack of a framerate limiter in the game meant that simple scenes, such as the menu, could over-stress the GPU - a problem that the GTX 580's power limiter neatly circumvents.
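
For context, a framerate limiter of the kind StarCraft 2 initially lacked amounts to a few lines in a game's main loop. The following is only a rough sketch of the general idea (Python, with an arbitrary cap), not Blizzard's actual fix:

    import time

    TARGET_FPS = 120                    # arbitrary cap; a menu doesn't need more
    FRAME_BUDGET = 1.0 / TARGET_FPS     # seconds allowed per frame

    def run_menu(render_menu, menu_is_open):
        """Render the menu, sleeping off any spare frame time instead of re-rendering."""
        while menu_is_open():
            start = time.perf_counter()
            render_menu()
            spare = FRAME_BUDGET - (time.perf_counter() - start)
            if spare > 0:
                time.sleep(spare)       # idle the GPU rather than spinning at ~1000fps

Without a cap of this sort (and with vsync off), the loop simply re-renders the trivial menu scene as fast as the GPU allows, which is what pushed cards so hard in early builds of the game.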

The news that Nvidia's GTX 580 can draw up to 350W when the power limitation logic is switched off will come as no surprise to rival GPU manufacturer AMD, which famously mocked its competitor on YouTube for the high heat output of the Fermi architecture. However, the much more reasonable 200W limit of a card with the power limiter still in place will be good news for those looking to upgrade from a power-hungry early Fermi board.

Are you impressed to see Nvidia's power-saving logic circumvented already, or just wondering why anyone would risk burning out their GPU for the sake of a synthetic benchmark score? Share your thoughts over in the forums.

40 Comments

PT88 15th November 2010, 15:58 Quote
Time to turn on Overcharge on my Personal Nuclear Power Plant!
V3ctor 15th November 2010, 16:15 Quote
Quote:
Originally Posted by PT88
Time to turn on Overcharge on my Personal Nuclear Power Plant!

+1 :D
Zurechial 15th November 2010, 16:24 Quote
It's interesting to note that StarCraft 2 is far from the only game that exhibits that problem, so anyone using this should be wary. Many games lack framelimiting in the menus or loading screens (or any time when the game world is not being rendered).

Crysis, for instance, doesn't seem to limit frames while in the menu or loading screens. I can see the framerate jumping up to just under 1000fps on my GTX275 at the menus in Crysis and all the way until it gets in-game. Not that I need to see it, as my GTX275 tends to capacitor-squeal when working like crazy.

Many other games exhibit this behaviour, so players be warned - StarCraft 2 isn't the only game that could drive a GTX580 crazy with this hack. :)
fingerbob69 15th November 2010, 16:51 Quote
Cue a malware-writing eejit thinking wouldn't it be fun to write this into his latest trojan as an immediate executable when a GTX 580 is detected!
Paradigm Shifter 15th November 2010, 16:51 Quote
As you say, Zurechial, Starcraft 2 isn't the only game that causes the GPUs to 'overwork' in menus - Magic: the Gathering: Duels of the Planeswalkers does the same if Vsync is disabled. I imagine lots of others do, too, but I tend to run with Vsync forced on, as tearing (even a little bit) drives me up the wall...
GoodBytes 15th November 2010, 16:55 Quote
Batman AA is another game that uses no frame limiter in the menu or pause screen.
Notice how your GPU fan spins like no tomorrow in the menu or when you pause the game, while in the game, where you would think you stress the GPU, the fan is fairly quiet in comparison.
MaverickWill 15th November 2010, 17:05 Quote
Nobody else noticed this is comfortably outside ATX spec?
Pete J 15th November 2010, 17:15 Quote
Can we tweak it so it'll go to, say, 300W?
perplekks45 15th November 2010, 17:18 Quote
So what exactly does this thing do? It just limits the maximum power draw? Is there any performance gain to be had? And how will you supply a card with 350 Watts?! PCI-E slots give you 75 Watts, each 8-pin connector another 150. That adds up to 375... pretty close if you ask me. Unless they added a 3rd 8-pin that I missed.
thelaw 15th November 2010, 17:21 Quote
<-------'Going out to get marshmallows'
Elledan 15th November 2010, 17:22 Quote
Quote:
Originally Posted by perplekks45
So what exactly does this thing do? It just limits the maximum power draw? Is there any performance gain to be had? And how will you supply a card with 350 Watts?! PCI-E slots give you 75 Watts, each 8-pin connector another 150. That adds up to 375... pretty close if you ask me. Unless they added a 3rd 8-pin that I missed.

As the article says, there's no benefit to this modification with real-life usage, only with benchmarks. It could be useful with OCing, but you would risk burning out the card/mainboard any time you hit something like that non-framerate limited menu of SC2 :)
fingerbob69 15th November 2010, 17:34 Quote
but you would risk burning out the card/mainboard any time you hit something like that non-framerate limited menu of SC2 :)

hasn't that been patched yet?
thehippoz 15th November 2010, 17:43 Quote
Quote:
Originally Posted by MaverickWill
Nobody else noticed this is comfortably outside ATX spec?

150w + 75w + 75w

8pin + 6 pin + pcie slot

you'd think something gotta give- water cooling at 350w would probably be a good idea to keep temps down on the gpu
Cool_CR 15th November 2010, 19:40 Quote
Hardcore, but I can see some very sad people if this starts fragging motherboards and tripping out PSUs.
wuyanxu 15th November 2010, 19:44 Quote
It's like those 150mph limiters on modern saloons - why would you want to take it off if you're not going to take it on the track?

Allowing it is just dangerous; it should only be done by reviewers and extreme overclockers. Normal users should not be allowed to have such a simple tool.
Tulatin 15th November 2010, 20:14 Quote
I, for one, cannot wait for the GTX 595, with its "Conservative" 400W limiter and six 8-pin PCI-E connectors.
GoodBytes 15th November 2010, 20:22 Quote
Quote:
Originally Posted by Tulatin
I, for one, cannot wait for the GTX 595, with its "Conservative" 400W limiter and six 8-pin PCI-E connectors.

You mean a 24pin motherboard connector :)
Anakha 15th November 2010, 22:56 Quote
Quote:
Originally Posted by fingerbob69
but you would risk burning out the card/mainboard any time you hit something like that non-framerate limited menu of SC2 :)

hasn't that been patched yet?

Yes. See that wonderful little checkbox labelled "VSync"? Click that and the problems are gone forever.
Tulatin 16th November 2010, 00:16 Quote
Anakha, games should not need to limit their needs to account for poorly designed, power hungry chips.
Bakes 16th November 2010, 00:43 Quote
Tulatin, to add to that, games should not stress the graphics card willy-nilly where it is unnecessary - although the card should not be damaged by the extra stress.
Sloth 16th November 2010, 00:59 Quote
Quote:
Originally Posted by Tulatin
Anakha, games should not need to limit their needs to account for poorly designed, power hungry chips.
Correction, games should not have sections of unlimited frame rates. There's no viable reason for frame rates beyond 120fps, so setting a cap at 120 (in reality, just above is safer to ensure a smooth 120+fps for 120Hz monitors) should be standard. A frame rate cap at a decent level has no downside and can save power/noise from fans/heat/card death.

I've got a buddy studying game programming at the moment. He puts caps on all of his games, even ones that take just a couple hours to make. No reason for Blizzard, or any company, to leave one out.
Tulatin 16th November 2010, 02:52 Quote
It's not the fault of game developers that the cards are being damaged by "Extra stress". Processors can run at 100% indefinitely, you know. If a graphics card cannot maintain its temperatures when running at its designed frequencies, then something's been done wrong.
Lord-Vale3 16th November 2010, 03:43 Quote
Is it really that poorly designed, though, if the designer put a control system on it that keeps it from reaching that power draw? It only becomes dangerous if somebody takes that control system off - a lot of other things become dangerous too when control systems are removed. Nuclear reactors (I guess this analogy doesn't help at all, lol) are pretty scary with loss of a control system, but that doesn't make them 'poorly designed'.

The 580 is not poorly designed because nVidia put a control system on it. Only someone who took it off and then played a game with no limit to framerate would be in trouble - but that would be their own fault.
dyzophoria 16th November 2010, 03:43 Quote
That's what I was thinking too. Running a processor at 100% 24/7 will definitely shorten its life (maybe shaving a year or two off its MTBF specification), but a GPU that will last only a few minutes to a few hours in basically the same scenario? As for StarCraft 2, it's not like you'll sit on the menu for a few hours, right? :D
pingu666 16th November 2010, 04:35 Quote
Might if you go AFK.
I'm sure I've left games paused for hours to go and watch a grand prix.
perplekks45 16th November 2010, 09:41 Quote
Quote:
Originally Posted by thehippoz
150w + 75w + 75w

8pin + 6 pin + pcie slot

you'd think something gotta give- water cooling at 350w would probably be a good idea to keep temps down on the gpu
And how does that add up to 350?

Even with

8-pin (150W) + 8-pin (150W) + PCI-E slot (75W)

we'd be pretty close, wouldn't we? Looks like this beast might run at -10 degrees Celsius using a Prometeia Mach II GT. :|
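
For reference, the in-spec budgets being traded back and forth here work out as follows. This is only a back-of-the-envelope sketch using the official PCI-E figures (75W from the slot, 75W per 6-pin, 150W per 8-pin); a retail GTX 580 ships with one 6-pin and one 8-pin connector.

    # Back-of-the-envelope connector budgets (official PCI-E figures).
    SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

    stock_gtx580 = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 6-pin + 8-pin, as the card ships
    dual_8pin    = SLOT_W + 2 * EIGHT_PIN_W           # the 8+8 layout assumed just above

    print(stock_gtx580)   # 300W of in-spec budget, vs. ~350W with the limiter disabled
    print(dual_8pin)      # 375W, the figure quoted above
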
leexgx 16th November 2010, 11:28 Quote
I don't think the 6-pin is 150W - I think it's 125W (8-pin + 6-pin + PCI-E slot is what a GTX 580 comes with, not two 8-pins).

The StarCraft 2 thing was related to cheaper video cards that only had just enough cooling to keep them cool; add dust and SC2 would toast them and they'd fail (I could see a lot of Nvidia-equipped Compaq or HP laptops failing due to this).

Customer call:
'Laptop will not boot up. The power light comes on.'
'Is it a Compaq?'
'Yes.'
'Does it have a green sticker that says Nvidia on it?'
'Erm...'
'Bottom right-hand corner, or on the left below the keyboard.'
'Yes.'
'The laptop has most likely died due to a fault inside it. It's a permanent fault; the laptop would most likely need replacing.'

(I would still call out just to flip the RAM and see if it was that, but normally it's dead.)
perplekks45 16th November 2010, 11:35 Quote
Who said 6pin is 150W? It's 75W. Hence you'd need at least 2 x 8pin to run the card w/o limiter.
thehippoz 16th November 2010, 18:13 Quote
yeah that's what I was sayin perp, quoting maverick.. it's out of spec
Bakes 16th November 2010, 18:52 Quote
Quote:
Originally Posted by perplekks45
Who said 6pin is 150W? It's 75W. Hence you'd need at least 2 x 8pin to run the card w/o limiter.

No - there are no hard limits on the amount that can be drawn, just the limits to remain within specification. So if you remove the limiter and it draws more power, it's not a big deal - under normal conditions, it shouldn't.
Sloth 16th November 2010, 20:35 Quote
Quote:
Originally Posted by Tulatin
It's not the fault of game developers that the cards are being damaged by "Extra stress". Processors can run at 100% indefinitely, you know. If a graphics card cannot maintain its temperatures when running at its designed frequencies, then something's been done wrong.
Two parts wrong with that:

1. You're arguing that there's nothing wrong with programs using unnecessary amounts of system resources. If your processor can run at 100% indefinitely then surely you won't mind your operating system putting it at max load 24/7. Nothing wrong with that scenario, right?

2. As shocking as this may be, even processors can overheat when run at 100%. The tricky thing is, most people with a system designed/built poorly enough to cause this won't be running anything which pushes their system so hard. Along comes Starcraft 2 with its massive popularity and relatively low system requirements. Suddenly, people who never once thought about heat are finding out that their dusty, archaic case without proper ventilation will roast their card when put under such a load.
new_world_order 16th November 2010, 20:42 Quote
Is it safe though?
Bakes 16th November 2010, 22:46 Quote
Quote:
Originally Posted by new_world_order
Is it safe though?
Quote:
Originally Posted by Original Article
'exceeding the power limitation of the card may result in damage to card and/or motherboard'
Elledan 17th November 2010, 07:31 Quote
Quote:
Originally Posted by perplekks45
Who said 6pin is 150W? It's 75W. Hence you'd need at least 2 x 8pin to run the card w/o limiter.

Actually 6 and 8 pin PCIe are both 150 Watt. There are no additional wires for 8-pin. The spec called for an additional sense wire with 8-pin so that the wattage could be increased, but PSU manufacturers instead do the sensing inside the PSU and on the ATX connector, so that a 6-pin connector does 150 Watt with ease.

If you take a look at PCIe connectors, you'll see that the 8-pin version has its two extra pins simply looped back into two existing ground wires - this is to satisfy some GPUs which actually check for their presence.
thehippoz 17th November 2010, 08:15 Quote
found this.. it might be able to supply an extra 50w without burning up the wire
Quote:
with only two 12 volt lines the standard implementation of PCI Express power cables use large enough gauge wire and a good enough connector to provide much more than the three amps per wire required to provide 75 watts. Nonetheless, the 6 pin PCI Express power cable officially provides only 75 watts. In all likelihood, however, real implementations of this power cable can provide far more than 75 watts.

I remember when they had the 6 to 8 pin adapters that came with the video cards too.. then someone pointed out it could cause a fire or something and they stopped supplying them with the card
MaverickWill 17th November 2010, 11:04 Quote
Quote:
Originally Posted by Bakes
No - there are no hard limits on the amount that can be drawn, just the limits to remain within specification. So if you remove the limiter and it draws more power, it's not a big deal - under normal conditions, it shouldn't.

No other graphics card has ever needed a limiter like this. If the card's capable of drawing such a massive amount of power that it needs to be heavily limited to fall within specification, it's doing something wrong. Very wrong. Doubly so if, as people say, it's not doing anything extra with that power.
Quote:
Originally Posted by thehippoz
found this.. it might be able to supply an extra 50w without burning up the wire

Right, so we're seeing a max potential power draw of 350W...

Wait, wut? Does this mean that if we used 2 8-pin plugs on a 580, we'd see even more? I shudder to think!
wuyanxu 17th November 2010, 12:51 Quote
Quote:
Originally Posted by MaverickWill
No other graphics card has ever needed a limiter like this. If the card's capable of drawing such a massive amount of power that it needs to be heavily limited to fall within specification, it's doing something wrong. Very wrong. Doubly so if, as people say, it's not doing anything extra with that power.

why are fast saloon cars limited to 150mph?

I don't understand why people see the limiter as an issue; as long as it produces fast enough performance, why bother fiddling with it?

When you game, it's very similar to driving a fast saloon car on a speed-limited motorway: as long as it's fast enough, with horsepower to spare, why does anything else matter?

The limiter only becomes an issue when you try to go on the track (aka benchmark), but as tests have shown, the 580 doesn't get limited in benchmarks. Besides, you can't play benchmarks.
Bakes 17th November 2010, 18:33 Quote
Quote:
Originally Posted by MaverickWill
No other graphics card has ever needed a limiter like this. If the card's capable of drawing such a massive amount of power that it needs to be heavily limited to fall within specification, it's doing something wrong. Very wrong. Doubly so if, as people say, it's not doing anything extra with that power.

So isn't it a good thing that nVidia have managed to limit power to consumption-heavy yet results-low parts of the card then? The fact that nVidia have managed to lower the temperatures yet increase the performance without fundamentally changing the core should be commended... right?
MaverickWill 18th November 2010, 05:53 Quote
Quote:
Originally Posted by Bakes
So isn't it a good thing that nVidia have managed to limit power to consumption-heavy yet results-low parts of the card then? The fact that nVidia have managed to lower the temperatures yet increase the performance without fundamentally changing the core should be commended... right?

Creating a card that had lower temperatures, faster performance, and lower power consumption would be a hat-trick. My issue comes with this:

nVidia start throttling the card heavily, should it start going past its stated power draw. The card shouldn't be able to go past its stated power draw - it should draw what it needs, and no more. Rather than saying "This card draws a maximum of 260W, because we're throttling it", they should be saying "This card draws 350W, but that shouldn't happen in gameplay." - better yet, design the card to only be able to draw 260W, not 350W and put on a cap.

To go back to the beaten (and quite frankly, broken) car analogy, it's the equivalent of Ferrari saying "Oh yes, sir, you'll get 12 miles per gallon at 150mph, all thanks to this special little gadget we've installed that cuts your engine out half the time, since otherwise it'd be 6mpg." Remember, we're talking about a wasteful, excessive and, more importantly, outside-ATX-spec design. A power-draw limiter isn't the same as a speed limiter.
Pete J 18th November 2010, 07:10 Quote
Okay, I don't get this:

Am I right in thinking that for those of us who like to overclock our GPUs, this 'limiter' will result in a lower overclock? Because surely an overclock need moar powa!?