bit-gamer.net

StarCraft 2 doesn't limit framerates

StarCraft 2 doesn't have a framerate limiter, creating overheating problems on some systems.

StarCraft 2 is apparently causing overheating issues on some systems, even frying graphics cards, thanks to the lack of a framerate cap in some areas of the game. Fortunately, Blizzard has issued a temporary fix for the problem until a patch can be rolled out via Battle.net.

According to DailyTech, the problem occurs at the main menu, where the lack of a framerate limit lets the hardware max out and potentially fail permanently.

"Certain screens make your hardware work pretty hard," Blizzard said via its tech support site.

"Screens that are light on detail may make your system overheat if cooling is overall insufficient. This is because the game has nothing to do so it is primarily just working on drawing the screen very quickly."

Blizzard's current solution is to tell customers to open Documents\StarCraft II Beta\variables.txt and add the following lines, replacing the final integers with the desired values.

frameratecap=60
frameratecapglue=30


Some fans have voiced outrage that such a big and obvious bug, one potentially capable of destroying entire systems, slipped through to the release version. As DailyTech points out, though, Blizzard isn't entirely to blame: even a maxed-out GPU will only overheat that quickly if it already has cooling problems.

Check out our StarCraft 2 review for more information on the game, then let us know your thoughts in the forums.

80 Comments

perplekks45 3rd August 2010, 09:11 Quote
Unlike Blizzard to miss something like that...
barndoor101 3rd August 2010, 09:11 Quote
so the story is that a game is making graphics cards work hard and get hot? hold the front page.

i suppose only fermi owners will need to worry about this though...
d3m0n_edge 3rd August 2010, 09:14 Quote
Hmm. I take it that my watercooled setup is doing its job then. And just a thought, wouldn't enabling vsync help? Please correct me if I'm wrong.
MajorTom 3rd August 2010, 09:23 Quote
It's an unusual thing to miss but you can't blame Blizzard for your faulty cooling. Stress test people!
r3loaded 3rd August 2010, 09:24 Quote
Wait, what? If your system is set up correctly, it shouldn't overheat in the first place. A FurMark session is much more likely to expose cooling problems in graphics cards.

@d3m0n_edge No, vsync won't help since it doesn't cap at 60fps, but at multiples of it.
liratheal 3rd August 2010, 09:25 Quote
Never had an issue..

My frame rate regularly soars into the hundreds/thousands on cut scenes on SCII and multiple games..
mjb501 3rd August 2010, 09:27 Quote
Shouldn't the GPU shut down/down clock itself to prevent it from getting cooked?
capnPedro 3rd August 2010, 09:30 Quote
Quote:
Originally Posted by r3loaded
Wait, what? If your system is set up correctly, it shouldn't overheat in the first place. A FurMark session is much more likely to expose cooling problems in graphics cards.
Quote:
Originally Posted by mjb501
Shouldn't the GPU shut down/down clock itself to prevent it from getting cooked?

Exactly. Everyone's getting on the blame Blizzard bandwagon, but I'd be much more likely to blame dodgy hardware designs. Exceeding the TDP of the chip like this is crazy.

Still, as I understand it most of the problems are occurring in laptops, probably used by non-hardcore gamers. laptop manufacturers are pretty notorious for their dodgy hot running designs.
Slavedriver 3rd August 2010, 09:57 Quote
My GPU never goes above 70C and can survive prolonged FutureMark runs however I got constant crashing in those cutscenes. I dunno if it has anything to do with overheating but something ain't right there, that's for sure.

Then again, seeing FRAPS tell you that the current scene is rendered at 600 FPS is pretty damn cool :)
impar 3rd August 2010, 10:03 Quote
Greetings!
Quote:
Originally Posted by d3m0n_edge
And just a thought, wouldn't enabling vsync help? Please correct me if I'm wrong.
It should. Dont think I ever saw more than 61 FPS in Fraps.
TomH 3rd August 2010, 10:04 Quote
Quote:
Originally Posted by perplekks45
Unlike Blizzard to miss something like that...
As any Engineer/Designer/Developer will tell you; no matter how rigorously you test your product, no matter how thoroughly you try to envision the possible permutations of how it will be utilised... The minute you release it to the general public, some sod will find a way to break it in a way that you never thought plausible.

The good news is that the Battle.net is there to enforce updates upon users, in order to protect them from their own dodgy hardware. What would you do without a mandatory Internet connection eh?

If I were to point fingers at anyone, it's the GPU designers and their partners for designing cards that CAN fail in this way. Mainly due to the fact that you would have hoped this would be a fundamental consideration of their design and testing. However, in some way I'm sure you could just refer back to the first paragraph.
Blackmoon181 3rd August 2010, 10:18 Quote
as a proud owner of an antec 1200, if my air cooled hardware started to overheat i really would start to worry.

from my experience i know a lot of people who play games/ watch movies on their laptops in their beds. no wonder the bloody things overheat in situations such as this!
b5k 3rd August 2010, 10:27 Quote
It's not the GPU designers fault. Explain why it's their fault that Blizzard told SC2 to render a basic menu like a time demo...

Having the ability to perform actions like Blizzard have on a card is paramount, it's just Blizzard didn't mean to do it.
Boogle 3rd August 2010, 10:34 Quote
Quote:
Originally Posted by b5k
It's not the GPU designers fault. Explain why it's their fault that Blizzard told SC2 to render a basic menu like a time demo...

Having the ability to perform actions like Blizzard have on a card is paramount, it's just Blizzard didn't mean to do it.

Rule #1 of hardware: Instructions sent to the hardware should not physically break it.

The fact that Furmark could destroy certain cards, and SC2 can destroy others shows corners being cut. You can remove the heatsink from a modern CPU while it's running and it won't burn up. It's like watching a certain TV show will break your TV. If the hardware is capable of something, then every part of it should be up to it. If the heatsink or VRMs aren't good enough, then it's simply corner/cost cutting.
tjh 3rd August 2010, 10:34 Quote
Quote:
Originally Posted by MajorTom
It's an unusual thing to miss but you can't blame Blizzard for your faulty cooling. Stress test people!

I was just stress testing my rig using StarCraft 2, when my GPU blew.
Siberdine 3rd August 2010, 10:38 Quote
Yeah, I was playing Starcraft 2 on max settings for about 3-4 hours nonstop when all of a sudden my computer shut down. I realized it had to be my graphics card, and turned on N-tune to see what temperatures I had. My GPU peaked at around 80 degrees celcius, so I decided to turn up my GTX 260's fan to a fixed 70% fan capacity while I was playing. This also seems to happen to me with Bad Company 2.

At least now I know that it wasn't my graphics card not being able to handle new games :D
phuzz 3rd August 2010, 10:57 Quote
Question:
How much longer would this have taken Blizzard to notice if they'd released during the (northern hemisphere) winter?

(I was really happy with how quiet my new case was, until the ambient temperature got up a bit and I had to turn the graphics card fan right up to stop in game crashes)
wuyanxu 3rd August 2010, 11:00 Quote
Quote:
Originally Posted by Siberdine
Yeah, I was playing Starcraft 2 on max settings for about 3-4 hours nonstop when all of a sudden my computer shut down. I realized it had to be my graphics card, and turned on N-tune to see what temperatures I had. My GPU peaked at around 80 degrees celcius, so I decided to turn up my GTX 260's fan to a fixed 70% fan capacity while I was playing. This also seems to happen to me with Bad Company 2.

At least now I know that it wasn't my graphics card not being able to handle new games :D
fixing fan speed is the issue here, not Blizzard, nVidia, ATI or anyone else.

download MSI Afterburner and set a dynamic fan speed with the fan maxing out at 100% at high temperature. now play any game to your heart's content, it will never overheat as long as the cooler is up to the task. (im sure it is, seeing overheating occurs mostly due to people not setting fan speed correctly)


on a side note: what is to blame on Blizzard is inexcusable omission of Anti-Aliasing in a designed-for PC game.
impar 3rd August 2010, 11:09 Quote
Greetings!
Quote:
Originally Posted by wuyanxu
on a side note: what is to blame on Blizzard is inexcusable omission of Anti-Aliasing in a designed-for PC game.
Yep.
Although in normal gameplay you usually dont notice, there are some cinematics and areas in gameplay where you just get distracted looking at the jaggies.
Applying AA in the driver control panel makes the in-game map blurry.
rickysio 3rd August 2010, 11:10 Quote
Good going, Blizzard.

I'm actually perfectly happy that this occurs - maybe more people will ACTUALLY care about their hardware/airflow now.

Then again...
Farting Bob 3rd August 2010, 11:24 Quote
Doom really stresses my card when i run it at over 9000 FPS as well.
Dreaming 3rd August 2010, 11:24 Quote
Do you guys remember when the survival mode patch came out for left4dead?

The forums were absolutely full of people complaining because some weird coalescence of game bug / hardware design flaw was just simply melting the cards.

I got a free upgrade from BFG at least :)
r3loaded 3rd August 2010, 11:32 Quote
Quote:
Originally Posted by capnPedro
Still, as I understand it most of the problems are occurring in laptops, probably used by non-hardcore gamers. laptop manufacturers are pretty notorious for their dodgy hot running designs.

Good point actually. My laptop folds almost 24/7, but only because I've taken the integrated keyboard off and have a desk fan blowing cool air across the heatsinks :P:P
DragunovHUN 3rd August 2010, 12:14 Quote
Quote:
Originally Posted by Dreaming

I got a free upgrade from BFG at least :)

Maybe this is the REAL reason they went out of business :D
TAG 3rd August 2010, 12:22 Quote
Every freaking game should feature an option to cap the framerate!
I don't need to play @ 120fps and game in my underwear in an annoyingly hot room. (open a window, with the rain I get here? not always an option)
I don't need to buy more electricity for something I don't notice.
I don't need to stress my hardware more than is required.

With current high power gaming hardware, I'm pretty sure one could save in excess of 100W of power on some of his games. Multiply that by millions and you land yourself a more eco friendly hobby too.

Please please please, make framecaps a standard option everywhere.
rollo 3rd August 2010, 12:25 Quote
you can force AA through drivers to those that miss it

and as for overheating issues buy more airflow. nobody runs any burn in tests these days

furmark and prime95 will pick up air flow problems. cable management is key
Floyd 3rd August 2010, 12:29 Quote
I noticed that last night when playing some SC2. My GPUs were at a roasting 46c! lol. God I love watercooling.
erratum1 3rd August 2010, 12:30 Quote
I wouldn't say it was anybody's fault, but if you put on a new game and it kills your gpu then you're going to be a bit upset.

We can't all afford killer water cooled setups.
g3n3tiX 3rd August 2010, 12:54 Quote
They had this issue in the beta : in the menu the FPS was uncapped resulting in 100% gpu load and fan speed (on my GTX280...)
They issued a patch, but it seems it came back.
Vsync on on my computer, no problems.
GoodBytes 3rd August 2010, 13:22 Quote
Oh common Bit-tech.. StarCraft fault? Or more like faulty graphic card heatsink, or bad air flow case design.
I have no problem with my case.. and the case only has 2 fans (120mm and 92mm) lowest speed (very quiet), and the case was not even designed for a GTX260 (too long - I had to mod the case). And my CPU heats more than any of anyone here current main computer CPU.

Not to mention that this is not the only game... Batman AA did the same thing.

I like how people blame the wrong things. "DAMN IT my computer locked up! It's all my new car fault!... Stupid Ford!"
leexgx 3rd August 2010, 13:26 Quote
vsync would seem to fix most issues, as long as it does not add input lag (think it's ok under gtx4xx cards due to the way it's made, need to test it though; works ok under wow, gtx460)

some video cards scream under high fps as well
CardJoe 3rd August 2010, 13:32 Quote
Quote:
Originally Posted by GoodBytes
Oh common Bit-tech.. StarCraft fault? Or more like faulty graphic card heatsink, or bad air flow case design.
I have no problem with my case.. and the case only has 2 fans (120mm and 92mm) lowest speed (very quiet), and the case was not even designed for a GTX260 (too long - I had to mod the case). And my CPU heats more than any of anyone here current main computer CPU.

Not to mention that this is not the only game... Batman AA did the same thing.

I like how people blame the wrong things. "DAMN IT my computer locked up! It's all my new car fault!... Stupid Ford!"

Nowhere in the article do we blame Blizzard. In fact, we explicitly agree with DailyTech - it's only an issue if you've got underlying cooling problems.
GoodBytes 3rd August 2010, 13:33 Quote
Quote:
Originally Posted by CardJoe
Nowhere in the article do we blame Blizzard. In fact, we explicitly agree with DailyTech - it's only an issue if you've got underlying cooling problems.

Daily-Tech blamed SC2, unless they edited their article since yesterday.

[edit]
Yup they edited the article...
" And while it does appear a bug (uncapped framerates) is partially to blame for killing off the cards, a card pushed to the max would generally not die instantly were it not for poorly engineered and/or defective cooling." This is new.

In any case, it's not even partial Blizzard fault. It's 100% the owner's for not stress testing their case, and not think about air flow.
[/edit]
sp4nky 3rd August 2010, 13:34 Quote
If my GPU were killed by a stress test, whether from gaming or designated testing software like Futuremark, I'd be pretty much certain that the problem was the card itself. When you buy a graphics card, it doesn't carry a warning that using it in the manner intended may cause it to fail - if graphics cards did have that warning, nobody would buy them. Therefore, if the card does fail while you're using it properly, e.g. to play Starcraft II, the card cannot be fit for purpose.
Somatic 3rd August 2010, 13:35 Quote
Crysis and S.T.A.L.K.E.R. series don't appear to have frame rate caps on the intro movies and menu screens either. With my current 5850, it causes a squealing noise from the quadruple digit frames per second. Of course, there's nowhere near that amount of FPS while in-game with the settings maxed!
knuck 3rd August 2010, 13:37 Quote
Why are we even talking about this ? Since when do we care about noobs who don't understand any PC has to be cooled regardless of what they do ?

Let them fry their hardware, it'll help the economy when they buy new parts.
TAG 3rd August 2010, 14:08 Quote
Regardless of the cooling, graphics cards are still at fault for not integrating functional overheating failsafes. They're supposed to throttle down, aren't they?
rickysio 3rd August 2010, 14:09 Quote
Quote:
Originally Posted by knuck
Why are we even talking about this ? Since when do we care about noobs who don't understand any PC has to be cooled regardless of what they do ?

Let them fry their hardware, it'll help the economy when they buy new parts.

Helps keep other vendors afloat too.
fatty beef 3rd August 2010, 14:30 Quote
I have 2 5770s and usually peak around 76C playing the Witcher, DA:O, ME2, Torchlight, and thats when its 90F outside and unfortunately my lack of AC doesnt help my situation but thats pretty acceptable.

I hit 105 last time I fired up SCII which was super surprising and for some reason before that Id hit around 90C for the first time since I changed my fan profile in Afterburner. Havent fried anything yet but thats pretty nuts. You would think there would be some type of failsafe in the driver that kills the card when it gets that hot...

At least Blizzard was upfront about it and offered a quick fix straight away.
Yslen 3rd August 2010, 14:49 Quote
So it makes your card run at 100%? Why is that such an issue? If your card dies under full load you have poor cooling or a dodgy card, it's not the fault of the game. Any game that is GPU limited on your system that runs at less than 60fps (assuming you use vsync) will do exactly the same thing. On my system that's pretty much every game I play, as I only have a 4850. It's even overclocked and it's fine at 100% for hours, never had a problem.

Kudos for Blizzard for responding to complaints, but they shouldn't be taking the blame for badly made graphics card heatsinks or dodgy case airflow.
Yslen 3rd August 2010, 14:54 Quote
Quote:
Originally Posted by Blackmoon181
as a proud owner of antec 1200 , if my air cooled hardware started to overheat i really would start to worry.

from my experience i know a lot of people who play games/ watch movies on their laptops in their beds. no wonder the bloody things overheat in situations such as this!

It's got better in recent years because a lot of low end laptops have efficient Intel CPUs, so the fan doesn't even spin up unless the machine is under heavy load. Five years ago though I was amazed at how many people would sit with a laptop smothered by their duvet then express utter bewilderment when it switched itself off. Usually they would then blame Microsoft and say they were going to buy a Mac.
TAG 3rd August 2010, 15:12 Quote
There was a similar issue with Furmark and ATI a few months ago ...
Different games and benchmarks stress hardware in different ways. We shouldn't blame the software, the card's heatsinks or the PC's general cooling. What's to blame here is firmware/driver failsafes that should throttle down cards in such hot conditions.
Now are these failsafes set riskily as high as possible to allow for longer bench stress testing? I don't know but if that's the case it's a really poor tactic that really doesn't help with anything.
karx11erx 3rd August 2010, 15:20 Quote
"Some fans have voiced outrage that such a big and obvious bug, one potentially capable of destroying entire systems, slipped through to the release version."

Isn't the root of the problem faulty cooling?

Well, it's always easier to point fingers at others than to take over responsibility for yourself.

It would be somewhat interesting to know how many of the users suffering from the problem have cluelessly self-built or overclocked rigs.
GoodBytes 3rd August 2010, 15:25 Quote
Quote:
Originally Posted by TAG
There was a similar issue with Furmark and ATI a few months ago ...
Different games and benchmarks stress hardware in different ways. We shouldn't blame the software, the card's heatsinks or the PC's general cooling. What's to blame here is firmware/driver failsafes that should throttle down cards in such hot conditions.
Now are these failsafes set riskily as high as possible to allow for longer bench stress testing? I don't know but if that's the case it's a really poor tactic that really doesn't help with anything.
If true, it helps getting a better score.
What benchmark programs should do is pause the process when this occurs, and continue once the GPU/CPU/chipset isn't throttling anymore. Another thing is that GPU makers optimize their drivers to get a better benchmark score, cutting corners to get the highest number. That is why I don't trust or care about benchmark scores in reviews. The only advantage, I guess, to using benchmarks is to see your own overclock's speed increase, or to compare a pre-overclocked version of a card with the normal one. But that's about it. I know that Intel have fun optimizing their score in the Windows Experience Index, which I find very funny. I've seen so many laptops with ATI and Nvidia GPUs that have lower WEI scores than Intel's, while Intel provides a choppy Aero experience on a large external display and the Nvidia and ATI cards run without a hitch.
sear 3rd August 2010, 15:34 Quote
This is embarrassing... not for Blizzard, but for PC gamers. If you've got high-end hardware, you need to keep it cool. If your computer is dying because you didn't have the common sense to get a decent case for it, then it's your own damn fault. I thought PC gamers knew how to build computers for themselves; apparently not.
perplekks45 3rd August 2010, 16:01 Quote
Quote:
Originally Posted by TomH
As any Engineer/Designer/Developer will tell you; no matter how rigorously you test your product, no matter how thoroughly you try to envision the possible permutations of how it will be utilised... The minute you release it to the general public, some sod will find a way to break it in a way that you never thought plausible.
Blizzard had a pretty brilliant reputation back in the days. They hardly ever had problems with bugs due to well planned testing methods.

I was just disappointed that this has changed even though it's not their fault graphics cards run hot.
Showerhead 3rd August 2010, 16:15 Quote
Does this mean that starcraft 2 is the new stress test?
delriogw 3rd August 2010, 16:51 Quote
i have to say that attitude in these comments is probably why a lot of people go to consoles.

what if you don't know huge amounts about pc hardware and just want to play the game. if your card is good enough to run the game you should expect it to be able to do so without dying - whether through shoddy hardware build or software issues.

i know enough to understand about half of what i read here, that puts me in the knowledgeable but not uber geek category. pretty much the only thing i'm not comfortable with hardware wise is fitting heatsinks. not everyone has the knowledge or an ubergeek to rely on. and the majority of gamers won't stress test anything.
GoodBytes 3rd August 2010, 17:01 Quote
Quote:
Originally Posted by delriogw
i have to say that attitude in these comments is probably why a lot of people go to consoles.
You know that consoles also fails because of games. In fact their is list of games of all consoles that you can find, that have high reports of the system overheating and failing... hope yours are in the 1 year warranty.
Quote:
what if you don't know huge amounts about pc hardware and just want to play the game.
You don't need to. The FIRST thing that is said billions of time, is STRESS Test your computer for at LEAST 1 hour. But of course, people skip that test, like the step about doing backups. And they have to learn the hard way.

You don't have to be genius to know this.
Sloth 3rd August 2010, 17:03 Quote
While reading this I was quite tempted to agree that PC gamers should be more aware of the hardware involved in high end gaming. Every PC should be properly cooled and ready to handle the worst case scenario of 100% load on all components in the middle of summer.

However, just because gamers should be ready for such a scenario doesn't mean Blizzard's free and clear after such a large oversight pushing systems far beyond what is required. I'd be pretty pissed seeing 100% load on a menu just because all of Blizzard's development team didn't think to put a limit on framerates, even if it didn't kill/damage my card.
Dreaming 3rd August 2010, 17:07 Quote
I think an important point is with the race for more graphics power, stability at peak loads has been compromised - because the people who make these cards assume that most people won't run them at full consistently anyway, and if they do, you can just blame the end user for not cooling it / providing enough power.

I agree with delriogw tbh, for the AVERAGE pc gamer, they shouldn't be required to know anything about cooling and all that. The cases are supposed to be designed to vent the hot air, and the gpus are supposed to be designed to do that too. Problem is poor designs - cards that run very hot and dump all the hot air into the case causing it to escalate and heat up.
delriogw 3rd August 2010, 17:17 Quote
Quote:
Originally Posted by GoodBytes
Quote:
Originally Posted by delriogw
i have to say that attitude in these comments is probably why a lot of people go to consoles.
You know that consoles also fails because of games. In fact their is list of games of all consoles that you can find, that have high reports of the system overheating and failing... hope yours are in the 1 year warranty.
Quote:
what if you don't know huge amounts about pc hardware and just want to play the game.
You don't need to. The FIRST thing that is said billions of time, is STRESS Test your computer for at LEAST 1 hour. But of course, people skip that test, like the step about doing backups. And they have to learn the hard way.

You don't have to be genius to know this.

i don't have a console (well i have a ps2, but that's survived years and years), merely saying that the hardcore attitude makes it understandable why people just don't bother.

and on the contrary, the majority of pc owners have them for other reasons and just happen to game on them. they expect that if a game box says your card will work, then your card will bloody well work.

and these kind of games stretch to all kinds of players. WoW players will possibly check it out, and many of those don't know a HDD from a CPU. some games extend beyond gamers in the normal idea of the word. it's unreasonable to expect people to know everything. if you had a dvd player, you wouldn't expect to have to test it depending on which dvd you were planning to watch. you would expect the hardware to do its job
paisa666 3rd August 2010, 18:00 Quote
Pfffffffff... not capping the framerates.. gotta be kidding me

If a card dies for this then its more a hardware issue rather than a software issue... Poor cooling
sheninat0r 3rd August 2010, 19:46 Quote
Quote:
Originally Posted by wuyanxu
on a side note: what is to blame on Blizzard is inexcusable omission of Anti-Aliasing in a designed-for PC game.

Deferred rendering, multiple render targets, etc. It's been beaten to death, and there is a very good technical reason for it, even if you and others don't want to read about it for fear of bursting your ActiBlizz hate bubble.

Force it in your graphics driver control panel. The new beta Catalyst has it, so please don't start on the ridiculous nV-ATi SC2 anti aliasing debate.
Hovis 3rd August 2010, 20:10 Quote
Quote:
Originally Posted by sear
This is embarrassing... not for Blizzard, but for PC gamers. If you've got high-end hardware, you need to keep it cool. If your computer is dying because you didn't have the common sense to get a decent case for it, then it's your own damn fault. I thought PC gamers knew how to build computers for themselves; apparently not.

Yeah, cos this never happened to any of the current generation of consoles. :)
frontline 3rd August 2010, 20:41 Quote
Quote:
Originally Posted by Sloth
While reading this I was quite tempted to agree that PC gamers should be more aware of the hardware involved in high end gaming. Every PC should be properly cooled and ready to handle the worst case scenario of 100% load on all components in the middle of summer.

However, just because gamers should be ready for such a scenario doesn't mean Blizzard's free and clear after such a large oversight pushing systems far beyond what is required. I'd be pretty pissed seeing 100% load on a menu just because all of Blizzard's development team didn't think to put a limit on framerates, even if it didn't kill/damage my card.

Agree with this tbh, if a game is stress testing your GPU in a menu or cutscene then it is clearly poor coding. Then again, at least the problem has been recognised and action will be taken, so kudos to Blizzard for this.
popcornuk1983 3rd August 2010, 21:33 Quote
Quote:
Originally Posted by Hovis
Yeah, cos this never happened to any of the current generation of consoles. :)

+1 to that. It's the PC's RROD!!! :D

Not Blizzards fault. Although it should have been picked up in testing by a company of this caliber you can't blame them for shoddy coolers or bad manufacturing. It should be up to the card vendors to replace defective cards.

I must admit I was getting a bit worried. I've got a powercolor 4890 and it was making some bloody racket. Seems to have eased off a bit since I applied the frame rate fix.

On a side note. Is AA completley disabled in SCII? There are no options in the menu and it doesn't make a difference when I set AA through the CCC.
GoodBytes 3rd August 2010, 21:54 Quote
Quote:
Originally Posted by popcornuk1983

On a side note. Is AA completley disabled in SCII? There are no options in the menu and it doesn't make a difference when I set AA through the CCC.

Looks like (I have the GTX 260).
You can see the jagged lines in the videos, and especially on the green bubble cursor when you roll over a character between missions.
Tulatin 3rd August 2010, 22:33 Quote
I find it funny how blame is pointed at Blizzard for this. Maybe OEMs should use coolers that aren't shite.
Skiddywinks 3rd August 2010, 23:33 Quote
Quote:
Originally Posted by sheninat0r
Quote:
Originally Posted by wuyanxu
on a side note: what is to blame on Blizzard is inexcusable omission of Anti-Aliasing in a designed-for PC game.

Deferred rendering, multiple render targets, etc. It's been beaten to death, and there is a very good technical reason for it, even if you and others don't want to read about it for fear of bursting your ActiBlizz hate bubble.

Force it in your graphics driver control panel. The new beta Catalyst has it, so please don't start on the ridiculous nV-ATi SC2 anti aliasing debate.

I heard about the reason for a lack of AA being they use a deferred renderer. Since you seem quite knowledgeable on the subject, I don't suppose you know of anywhere I might find a decent explanation of why Starcraft doesn't have AA (obviously incorporating some info on deferred rendering, multiple render targets etc).

I would just scour Google or Wikipedia, but it's hard to find exactly the information you need there.
TAG 4th August 2010, 00:19 Quote
Coolers aren't shite, they work just fine in normal operating conditions. Bench stress testing and bad programming lead to higher temps than normal operation. Failsafe temperature triggers are too high; that's where the issue is.
general22 4th August 2010, 01:03 Quote
My guess is that during the menus it is running at a 2D clock and fan speed and for some reason the GPU software doesn't spin the fan up with a temperature increase. Most people who install vendor programs that dynamically increase fan speed with temperature will be ok. Those who set high constant fan speeds will also be fine.
TAG 4th August 2010, 01:39 Quote
Doubt 2d speeds would fry a card. Moreover is this really happening in the menus or was that just an example in this thread?

All that was stated was "Screens that are light on detail"
Makaveli 4th August 2010, 04:10 Quote
Wow... where have all the real computer enthusiasts gone?

Is the internet just full of noobs these days?

I think those of you that don't understand a computer needs cooling should really just stick to consoles and leave the PC scene for those of us that know what we are doing.
wuyanxu 4th August 2010, 11:34 Quote
Quote:
Originally Posted by sheninat0r
Deferred rendering, multiple render targets, etc. It's been beaten to death, and there is a very good technical reason for it, even if you and others don't want to read about it for fear of bursting your ActiBlizz hate bubble.

Force it in your graphics driver control panel. The new beta Catalyst has it, so please don't start on the ridiculous nV-ATi SC2 anti aliasing debate.
care to explain why the StarCraft 2 Beta has an Anti-Aliasing option?
fatty beef 4th August 2010, 12:47 Quote
Update:

I'm an idiot: after updating CCC and reinstalling Afterburner, my fan curve was disabled and set to default... Haha, back to normal temps now, definitely user error. Whoops.
Psytek 4th August 2010, 13:29 Quote
I have to agree with the general sentiment that this isn't Blizzard's fault. As someone who has done quite a bit of graphics programming, I can tell you there's really nothing you can do, purposely or otherwise, that should cause a graphics card to overheat. If your application runs at a high framerate, that means you AREN'T asking the graphics card to do very much work; the graphics drivers should take care of how the card behaves when it is underutilised during the individual cycles of the main loop.
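This also explains why Blizzard's frameratecap workaround helps: capping the loop lets the card idle between frames instead of redrawing a trivial menu as fast as it can. A minimal, hypothetical sleep-based limiter; the 60 fps target mirrors the suggested frameratecap=60, and render_frame is a stand-in, not Blizzard's actual code:

```python
import time

TARGET_FPS = 60                # mirrors the suggested frameratecap=60
FRAME_TIME = 1.0 / TARGET_FPS  # seconds per frame at the cap

def run_capped(render_frame, num_frames: int) -> float:
    """Run a capped render loop; returns the average fps achieved."""
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()  # stand-in for drawing the (trivial) menu screen
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_TIME:
            # Sleep off the remainder instead of spinning the GPU flat out
            time.sleep(FRAME_TIME - elapsed)
    total = time.perf_counter() - start
    return num_frames / total
```

Enabling vsync, as d3m0n_edge suggested earlier, achieves much the same thing: the loop blocks on the display refresh rather than on a sleep.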
popcornuk1983 4th August 2010, 13:35 Quote
Quote:
Originally Posted by Makaveli
Wow... where have all the real computer enthusiast gone.

Is the internet just full of noobs these days?

I think those of you that don't understand a computer needs cooling should really just stick to consoles and leave the PC scene for those of us that know what we are doing.

Woah! Hold on a minute. So you're saying that the reason these cards are fried is because the owners are incompetent? That's a totally elitist attitude you have there, and it's what puts people off buying or tinkering with their PCs. An average user shouldn't need an in-depth understanding of how to monitor and manage their temps in order to play games.

If everyone had the attitude you do then Joe Average wouldn't be buying and building PCs and would definitely buy a console instead. Which would lead to PC game publishers releasing fewer and fewer games and sticking to console-only releases.

It's up to the manufacturers to worry about these problems not the users.

I have no understanding of how a car works. If my engine is running too hot and it blows then it's my fault? I get what I deserve? Rubbish!
wuyanxu 4th August 2010, 14:40 Quote
Quote:
Originally Posted by popcornuk1983

I have no understanding of how a car works. If my engine is running too hot and it blows then it's my fault? I get what I deserve? Rubbish!

it's your fault for not monitoring the engine temperature gauge (graphics card temperature) and for not properly maintaining coolant levels (clearing out dust in your computer case, maintaining suitable fan control).
Skiddywinks 4th August 2010, 14:42 Quote
Quote:
Originally Posted by popcornuk1983


I have no understanding of how a car works. If my engine is running too hot and it blows then it's my fault? I get what I deserve? Rubbish!

He never said that. He said those that don't understand a PC needs cooling, not those that don't understand cooling itself. Frankly, I agree with him entirely. There should be no situation, based entirely on coding, where a GPU can fry itself. If there is anything you can do, be it certain programs or certain software conditions, that causes your GPU to fry, then, well, it is unstable and it is the manufacturer's fault.

Sure, in this case it has only come to light because of an oversight in the coding, but it isn't the coding's fault that the card cannot handle any and all conditions it could face.
popcornuk1983 4th August 2010, 14:48 Quote
Quote:
Originally Posted by wuyanxu
it's your fault for not monitoring the engine temperature gauge (graphics card temperature) and for not properly maintaining coolant levels (clearing out dust in your computer case, maintaining suitable fan control).

I agree with what you're saying. These are things you should be checking on a regular basis, and I do all of the above to my car and PC. However, the point I'm making is that if my engine/GPU was defective because of poor manufacturing/poor cooling, then I shouldn't have to keep a closer eye on my temps because of it.
popcornuk1983 4th August 2010, 14:59 Quote
Quote:
Originally Posted by Skiddywinks
He never said that. He said those that don't understand a PC needs cooling, not those that don't understand cooling itself. Frankly, I agree with him entirely. There should be no situation, based entirely on coding, where a GPU can fry itself. If there is anything you can do, be it certain programs or certain software conditions, that causes your GPU to fry, then, well, it is unstable and it is the manufacturer's fault.

Sure, in this case it has only come to light because of an oversight in the coding, but it isn't the coding's fault that the card cannot handle any and all conditions it could face.

Fair enough, I misread the post slightly. But still, why should a user have to be aware of cooling? Current-gen consoles need cooling too and nobody goes on about it, even though it's obvious that the Xbox 360 had a poor cooling system which led to all the fried consoles. Again, it was a manufacturing issue, not the user's fault.
TAG 4th August 2010, 15:03 Quote
And cars actually give warnings. GPUs don't by default, they're supposed to throttle down! I said it before, it happened with furmark and ATI a few months ago. It's a driver/firmware issue. Cooling is not the issue, the cards here are pushed to the extreme (for bad reasons) and the driver/firmware failed to protect the card.
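The throttling behaviour described here (driver/firmware stepping the clocks down when the die crosses a safety limit, then stepping back up once it cools) might look roughly like this. A hypothetical sketch; the temperatures and clock steps are made up, and real firmware works at a much lower level:

```python
# Made-up clock ladder, from full speed down to a safe floor
CLOCK_STEPS_MHZ = [850, 700, 550, 400]
THROTTLE_TEMP_C = 95   # step down at or above this temperature
RECOVER_TEMP_C = 85    # step back up at or below this temperature

def next_clock(current_mhz: int, temp_c: float) -> int:
    """Pick the next core clock given the current clock and die temp."""
    idx = CLOCK_STEPS_MHZ.index(current_mhz)
    if temp_c >= THROTTLE_TEMP_C and idx < len(CLOCK_STEPS_MHZ) - 1:
        return CLOCK_STEPS_MHZ[idx + 1]   # too hot: step down
    if temp_c <= RECOVER_TEMP_C and idx > 0:
        return CLOCK_STEPS_MHZ[idx - 1]   # cooled off: step back up
    return current_mhz                    # in the hysteresis band: hold
```

The gap between the two thresholds (hysteresis) stops the card from oscillating between clocks; TAG's point is that when this protection has triggers set too high, or doesn't fire at all, the cooler becomes the only line of defence.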
wuyanxu 4th August 2010, 17:08 Quote
Quote:
Originally Posted by popcornuk1983
I agree with what you're saying. These are things you should be checking on a regular basis, and I do all of the above to my car and PC. However, the point I'm making is that if my engine/GPU was defective because of poor manufacturing/poor cooling, then I shouldn't have to keep a closer eye on my temps because of it.
that is true; one situation I can think of is 48x0 VRMs dying when running FurMark.

but StarCraft 2 taxes the actual GPU chip heavily, and the GPU chip has a huge heatsink on top of it, so surely it should never kill the GPU if there is no dust/fan problem.
Makaveli 5th August 2010, 00:35 Quote
Quote:
Originally Posted by popcornuk1983
Quote:
Originally Posted by Makaveli
Wow... where have all the real computer enthusiasts gone?

Is the internet just full of noobs these days?

I think those of you that don't understand a computer needs cooling should really just stick to consoles and leave the PC scene for those of us that know what we are doing.

Woah! Hold on a minute. So you're saying that the reason these cards are fried is because the owners are incompetent? That's a totally elitist attitude you have there, and it's what puts people off buying or tinkering with their PCs. An average user shouldn't need an in-depth understanding of how to monitor and manage their temps in order to play games.

If everyone had the attitude you do then Joe Average wouldn't be buying and building PCs and would definitely buy a console instead. Which would lead to PC game publishers releasing fewer and fewer games and sticking to console-only releases.

It's up to the manufacturers to worry about these problems not the users.

I have no understanding of how a car works. If my engine is running too hot and it blows then it's my fault? I get what I deserve? Rubbish!

Sorry if I came off as elitist, but it needs to be said. PCs are complex electronic devices that require regular maintenance, just like cars, as many have said in here. Just because people are ignorant of the inner workings of a PC doesn't give them an excuse. I 100% agree that if it's a manufacturing defect in the stock cooler, they are at fault.

Ignorance is not bliss when it comes to PCs, because it means lost or damaged hardware. If your parents are footing the bill it's whatever, but for those of us that build our own, that is unacceptable.

Some of you are obviously new to this game. GPUs have steadily gotten hotter and pulled more watts every generation at the top end. You might want to start paying attention now!
popcornuk1983 5th August 2010, 02:00 Quote
Quote:
Originally Posted by wuyanxu
that is true, a situation i can think of is 48x0 VRM dies when running Furmark.

but Starcraft 2 taxes the actual GPU chip heavily, and the GPU chip has a huge heatsink on top of it, so surely it should never kill the GPU is there is no dust/fan problem.

But as TAG said your graphics drivers should then kick into life and protect the card if it's overheating. Even if your PC was as clean as a whistle it might still overheat if the cooler/GPU was defective and not keeping up with the heat dissipation.
popcornuk1983 5th August 2010, 02:04 Quote
Quote:
Originally Posted by Makaveli
Sorry if I came off as elitist, but it needs to be said. PCs are complex electronic devices that require regular maintenance, just like cars, as many have said in here. Just because people are ignorant of the inner workings of a PC doesn't give them an excuse. I 100% agree that if it's a manufacturing defect in the stock cooler, they are at fault.

That's what the whole post was about! Shoddy cards/coolers/graphics drivers which have been exposed through a bug in a game. It wasn't about people not being clued up on PCs and how to maintain them. I don't believe people are being ignorant; some just either don't want to learn this as they have other interests, or they can't understand it. Most people that fall into this category wouldn't be building their own PCs. They would be bought from a PC vendor which uses all stock cooling.
Quote:

Ignorance is not bliss when it comes to PCs, because it means lost or damaged hardware. If your parents are footing the bill it's whatever, but for those of us that build our own, that is unacceptable.

Bah! I wish my parents would foot the bill for my upgrades! I do agree with you on this. If you build your own PCs using custom coolers etc, then yeah, it's your own fault.
Quote:
Some of you are obviously new to this game. GPUs have steadily gotten hotter and pulled more watts every generation at the top end. You might want to start paying attention now!

I've been building computers since I was 15, so I've been around for a while ;) Things have become much hotter, but we wouldn't need to start paying attention if manufacturers could build sufficient coolers to tame the heat in the first place!
Makaveli 5th August 2010, 02:20 Quote
Quote:
Originally Posted by popcornuk1983
Quote:
Originally Posted by Makaveli
Sorry if I came off as elitist, but it needs to be said. PCs are complex electronic devices that require regular maintenance, just like cars, as many have said in here. Just because people are ignorant of the inner workings of a PC doesn't give them an excuse. I 100% agree that if it's a manufacturing defect in the stock cooler, they are at fault.

That's what the whole post was about! Shoddy cards/coolers/graphics drivers which have been exposed through a bug in a game. It wasn't about people not being clued up on PCs and how to maintain them. I don't believe people are being ignorant; some just either don't want to learn this as they have other interests, or they can't understand it. Most people that fall into this category wouldn't be building their own PCs. They would be bought from a PC vendor which uses all stock cooling.
Quote:

Ignorance is not bliss when it comes to PCs, because it means lost or damaged hardware. If your parents are footing the bill it's whatever, but for those of us that build our own, that is unacceptable.

Bah! I wish my parents would foot the bill for my upgrades! I do agree with you on this. If you build your own PCs using custom coolers etc, then yeah, it's your own fault.
Quote:
Some of you are obviously new to this game. GPUs have steadily gotten hotter and pulled more watts every generation at the top end. You might want to start paying attention now!

I've been building computers since I was 15, so I've been around for a while ;) Things have become much hotter, but we wouldn't need to start paying attention if manufacturers could build sufficient coolers to tame the heat in the first place!

I agree; that is why I never trust stock coolers and always replace them with aftermarket ones, but then again I don't classify myself as a novice user.

In the end I think the manufacturers take the biggest part of the blame here: the coolers should be able to run the GPUs at 100% load at all times, not for only 2 hours a day. For me, next in the chain to play the blame game with would be the end users for not monitoring their hardware, then Blizzard last for not putting an fps cap in the menus.

However, I've seen games in the past with no fps caps in menus, so for me #1 blame still goes to the card vendors for using shitty coolers.

End users still need to pay closer attention to what's going on with their computers, regardless of whether they want to learn or not :P

If it happens to you and you burn something, the vendor is not going to just give you a new GPU; you are going to have to fight tooth and nail for it sometimes. And why put yourself through that mess? Just be a little more informed about your box and temperatures.

popcornuk1983, you have been very gracious with your replies and I want to thank you for not turning this into a flame war. Not saying you would, but it's good to have the back-and-forth banter and for it to be productive.
popcornuk1983 5th August 2010, 11:34 Quote
Quote:
Originally Posted by Makaveli
I agree; that is why I never trust stock coolers and always replace them with aftermarket ones, but then again I don't classify myself as a novice user.

In the end I think the manufacturers take the biggest part of the blame here: the coolers should be able to run the GPUs at 100% load at all times, not for only 2 hours a day. For me, next in the chain to play the blame game with would be the end users for not monitoring their hardware, then Blizzard last for not putting an fps cap in the menus.

However, I've seen games in the past with no fps caps in menus, so for me #1 blame still goes to the card vendors for using shitty coolers.

End users still need to pay closer attention to what's going on with their computers, regardless of whether they want to learn or not :P

If it happens to you and you burn something, the vendor is not going to just give you a new GPU; you are going to have to fight tooth and nail for it sometimes. And why put yourself through that mess? Just be a little more informed about your box and temperatures.

popcornuk1983, you have been very gracious with your replies and I want to thank you for not turning this into a flame war. Not saying you would, but it's good to have the back-and-forth banter and for it to be productive.

Totally valid points. It would be beneficial if people were a bit more clued up on PC maintenance, just to be on the safe side. I doubt it will actually happen though :(

I don't trust stock cooling either. I have a Radeon 4890, and I felt the stock cooler was letting the GPU get wayyyy too hot, and it sounded like a bloody jet when it was spinning @100%. The new one keeps it much cooler while also cooling the VRMs, which IMO should have had more cooling from the start!

Likewise, good banter and valid points all round :) People who flame just look for the nearest comment they disagree with and don't listen to anyone's opinion! The big drama queens ;)
Sloth 5th August 2010, 18:20 Quote
Quote:
Originally Posted by Makaveli
Wow... where have all the real computer enthusiasts gone?

Is the internet just full of noobs these days?

I think those of you that don't understand a computer needs cooling should really just stick to consoles and leave the PC scene for those of us that know what we are doing.
Oh yes, the noobs who are happily playing their game as the menus push their PCs to their limits, because someone didn't feel like adding the measly two lines which players are now being told to add themselves. It's not the game developer's fault for not adding it to their games; it's your fault, of course!

Obviously, since us PC enthusiasts are so hardcore, we should just run FurMark 24/7, because according to you and everyone else blaming gamers for this, there is no problem with running your GPU at 100% for no reason whatsoever.