Posted on 30th Sep 2014 at 09:30 by Antony Leather
Some may disagree here, but I'd welcome more pixels than my current 24in 1,920 x 1,200 main monitor offers. As I already run two screens, I've also considered investing in a super-wide display.
There are some fantastic-sounding options in the ultra-high-resolution department too. LG and AOC have 34in 3,440 x 1,440 monitors, and Dell and LG have recently announced their own curved versions (WANT). The prospect of immersive, high-resolution gaming is pretty compelling, but the extra screen real estate is useful for all manner of other tasks as well. I've played with super-wide monitors before, as you can read about here, and despite older 30in models only sporting 1,080 vertical pixels, I didn't find this too restrictive when editing photos and the like.
However, there's one major issue stopping me from splashing some cash on a new ultra-HD monitor: I'd need to invest twice as much again in the graphics department to get playable frame rates in games. I've never been one to tone down graphics settings just to get playable frame rates; that's partly why I find myself writing about PC hardware for a living, aside from the fact I caught the upgrade bug two decades ago.
However, even if I were prepared to drop the detail settings a little, this still wouldn't be enough to allow even a £400 single-GPU graphics card to handle all the latest games, never mind my ageing GTX 660 Ti. Even Nvidia's latest effort, the GTX 980, was a long way from achieving playable frame rates in Crysis 3 in our review; you'd need to opt for a monster such as AMD's R9 295X2 to get some headroom at 4K.
Something else that concerns me, though, is that little is being done to prepare for the main issue here: higher resolutions are going mainstream. Windows 8.1 achieved a lot in terms of 4K scaling, though there are still a few issues to iron out, not least by software companies with their own program scaling.
However, we're nearly at the point where it makes absolute sense to aim for 4K in a mid- to high-end system, rather than only in a super-high-end one, as is the case at the moment. That's not to say those of us with limited wallet power won't consider splashing out £300-400 on a 4K-capable graphics card, but the fact is that once 4K monitors fall further in price, mid- and high-end enthusiasts will have a bit of a problem on their hands.
They can afford a 4K monitor, but not the graphics card(s) to power it in games. We haven't had such a big reason to upgrade our graphics cards since Crysis landed, but AMD and Nvidia need to do more to make these ultra-high resolutions attainable outside of super-expensive systems. In the past, you've needed to invest heavily to game on triple screens, for example, and I think this needs to change.
4K is waiting to take off, be it in super-wide or standard aspect ratio monitors. True 4K gaming is also something the latest consoles lack, so this is a huge opportunity for PC gaming to take a giant leap forwards and offer something tangible when it comes to a better gaming experience.
In short, what we need is a GTX 970-type graphics card that can handle the latest games at 4K, something in the region of £250-350, not the £700-odd you'd currently need for something like the R9 295X2. So come on, AMD and Nvidia: rise to the challenge and give us more reasonably priced 4K-capable graphics cards.