Crysis - Did you upgrade?
Posted on 15th Dec 2009 at 11:05 by Antony Leather with 55 comments
Most of us had the same traumatic experience. We installed Crysis, adjusted the graphics settings to something that resembled reasonable expectations, and saw our PCs grind to a halt. It was pretty depressing to see the GeForce 8800 GT fall flat on its face when it played all other games so easily. Even the mighty 8800 GTX couldn't handle the game at max settings, and even struggled at 1,280 x 1,024.
I was prepared to throw some serious cash at my PC to enable it to play Crysis at max settings on my 20in monitor. Who wasn't? As with Far Cry, Battlefield 2, and several games before it, I had a burning desire to upgrade to get some decent frame rates and luscious visuals.
But no matter where I looked, graphics card reviews and forum threads all told the same story. Crysis was an untameable beast with current hardware, and even several 8800 GTXs in SLI struggled in the early days of the technology, with minimum frame rates regularly dipping to unsavoury figures.
Which leads me to my question. Did Crysis give a much-needed lift to the upgrade scene which had stagnated due to lack of new games and the complete flop that was ATI’s 2900 series? Or did the game hinder upgrading, as its ludicrous demands forced people to bat for the other team, and spend their cash on consoles instead?
Doubtless, many would have seen it as an insurmountable obstacle. Indeed, many an excuse I've heard from PC gamers who have defected to consoles was that they refused to spend hundreds of pounds upgrading their PCs to play one game. This is understandable; the lifespan of new hardware is never more than a couple of years at best, and Crysis would have made upgrading seem even worse value.
Was Crysis a step too far? Do you expect to be able to play all new games at max settings on the latest hardware? If the latest hardware could manage that, would you spend any more on upgrading? Let us know in the comments.