
PC gaming too specialised, says Nvidia, Intel

The PC gaming scene is dying and it's all Crysis' fault!

It seems like barely a month goes by without someone making an apocalyptic prediction about the state of gaming in some form or another. Sometimes it's to do with the death of a genre, sometimes it's more to do with the lack of innovation in the games market. Today: the death of PC gaming as a whole.

At the Nvidia GeForce LAN 4 event, currently under way in California, spokespeople from Nvidia, Microsoft, EA and Crytek are discussing the future of PC gaming and how to ensure the market keeps growing.

In recent years consoles have claimed a massive share of the gaming market thanks to their ability to appeal to casual and hardcore gamers alike, whilst also retaining a fixed price and a more secure shelf-life. PC gaming, meanwhile, has been in decline, with NPD statistics showing total sales in the market falling significantly from $1.5 billion to $970 million between 2001 and 2006. Even Bill Gates would notice dropping that much change.

Roy Taylor, VP of content relations at Nvidia, with whom we chatted not long ago, told Next-Gen.biz that the industry has so far combated next-gen consoles by focusing on the PC's ability to deliver increasingly hi-tech games like Crysis.

However, this could backfire on the market in the end. Roy noted how even top-end PCs can struggle with Crysis at the moment, forcing more users to upgrade.

"Something needs to be done so a person buying a PC at Wal-Mart could be a PC gamer too," said Randy Stude, director of Intel's gaming platform office. Unsurprisingly, he thinks Intel has the answer though and suggested that more support for Intel's integrated graphics chipsets could be one way to overcome the problem.

What do you think? Is PC gaming dying out or are we just having a bit of a slump? Let us know what you think in the forums.

37 Comments

toric334 20th November 2007, 11:11 Quote
Dying?! I've seen more top class PC titles come out this year than any I can remember. Why would there be any reason it would die now?
Blademrk 20th November 2007, 11:16 Quote
I haven't bought a PC game in ages, mainly because I don't think my system cuts the mustard anymore and I haven't got the spare cash laying around to upgrade at the moment (hmm, maybe I shouldn't put mustard in my 'puter then, it might last longer). But saying that, I have still got UT3 on Pre-order (mainly because it was cheap when I pre-ordered it last year and it looks like it'll scale down to my lowly 6600 graphics card).

I find myself playing on my 360 a lot more than I play on my PC these days (I seem to be on my 360 constantly at the moment)
yakyb 20th November 2007, 11:22 Quote
Quote:
Dying?! I've seen more top class PC titles come out this year than any I can remember. Why would there be any reason it would die now?
because the numbers they are actually selling are low

this is where piracy actually makes a difference. I would never pirate a game, and those that do download cracked versions of Crysis etc should be shot. This is not the music industry, where producing a title costs a relative pittance: the amount of money that goes into developing a game is astronomical, and the market is smaller than music's, so it can't support mass piracy. For those of you who do pirate games, sitting at home thinking "well, I got my game for free, why should I care, I'll let the other guys buy it" - do you not realise the potential for your actions to shape the PC gaming industry?
eldiablo 20th November 2007, 11:22 Quote
I would have to agree though. The lifecycle of a gaming pc is probably about 1 year, where the lifecycle of a console is easily 5 years and the cost is less.
Bindibadgi 20th November 2007, 12:37 Quote
Quote:
Originally Posted by eldiablo
I would have to agree though. The lifecycle of a gaming pc is probably about 1 year, where the lifecycle of a console is easily 5 years and the cost is less.

How many 360s or PS2s have people been through?

Not to mention new "HDMI" or "Slim" ones.


:P

Randy-wots-his-face at Intel needs a slap with a wet fish. Intel is the sole reason we're all being held back thanks to its craptacular integrated graphics.
[USRF]Obiwan 20th November 2007, 12:38 Quote
I still play CS and TFC after 10 years. So what's up with that then?
Delphium 20th November 2007, 13:00 Quote
Indeed, with all the releases we have had this year, along with all the other past games, I wish there were more hours in the day for more gameage.

However, despite there being a mass of great games out this year, I still find myself playing classics such as Starcraft and C&C RA2. Today's game graphics are all very well but, tbh, it's the gameplay and replayability that count; Bioshock lacks replayability, yet was still an awesome game.
Hamish 20th November 2007, 13:07 Quote
I bought more games this year than probably the last 5 put together :p

my system is about 2.5 years old now too; it's had 1 graphics card upgrade in that time (which alone cost less than a new console)
still plays everything out now, even Crysis (just :p)
Tim S 20th November 2007, 13:57 Quote
Quote:
"Something needs to be done so a person buying a PC at Wal-Mart could be a PC gamer too," said Randy Stude, director of Intel's gaming platform office. Unsurprisingly, he thinks Intel has the answer though and suggested that more support for Intel's integrated graphics chipsets could be one way to overcome the problem.

Personally, I think that Intel needs to make its integrated graphics chipsets a hell of a lot better than they are currently... at that point, I'd say it's worthwhile making good games that support Intel IGPs. At the moment, there's no way you can introduce new gameplay innovations (I'm not saying this is the solution to making better games, by the way) with Intel's graphics because they're not powerful enough to render them.

Hopefully Intel's push to make integrated graphics dramatically better than they are today in the next three years will help make Stude's vision more of a reality.
chrisb2e9 20th November 2007, 14:35 Quote
We have seen some good games come our way this year, but I really don't have anything good to look forward to in the future. And if people don't stop stealing games, that may not change a lot.
Although, it would be nice if game developers focused as much on making a really good story as they do on really good graphics. That being said, a graphics update for StarCon 2 would have me shelling out some money just to play that story again.
Aterius Gmork 20th November 2007, 14:59 Quote
I don't think games really need to run on integrated graphics. But it would be nice if I could buy a game and play it as nicely as I might on a console, even though I "only" have a 1950GT. Take Oblivion, for instance. I can play it nicely now. Nicely - but not as well as on a 360. Before, I had a 6600GT - the game ran like crap: 800x600 with everything at Medium to get a somewhat playable framerate. I bought the game because I had more than the recommended specs.

Since then I haven't bought any game right when it comes out, but later, when I don't have to pay so much and my hardware is better. Sometimes the industry seems to forget that it isn't the people with the top-notch rigs that finance the gaming industry, but people like me, who can buy a computer around every 3 years and a mid-range graphics update every 12 to 18 months. And I know many people who are fed up by the fact that they cannot even run a game at 1024x768 with settings on high after having forked over $60 for it.

In fact I would buy a next-gen console, if it weren't for the crappy gameplay using a joystick. Even a PS3 for $600 + a TV would save me much money in the long run. New titles on the PS2 - which really has weak specs - don't exactly look bad, so it seems that it is possible indeed. It would be possible on a PC too - if it weren't for companies like Nvidia or ATI, who practically demand that modern games only run well on their newest technology.

Oh yeah: this is no whining. ;) I don't buy games, that's all there is to it.
legoman666 20th November 2007, 15:38 Quote
If I could use my keyboard and mouse on the latest consoles, I might consider buying one. But until then, I'm sticking to my PC.
labsman 20th November 2007, 15:48 Quote
Well, the PC can do a lot! A lot more than any console can, now or even in the future - in theory, at least... The problem is that developers are having a hard time juicing the true potential of the PC... When I say developers, that's both the software and the hardware side - notably the lack of support for various input devices... What do we have now? A mouse, a keyboard and occasionally some nifty game controllers - still, that's way limited...
fargo 20th November 2007, 16:31 Quote
I think Intel has hit on a way to improve PC gaming in the future: much better integrated graphics, as many casual gamers buy their PCs in commercial stores where all PCs are sold with this type of graphics (and sound). Of course, hardcore gamers will always opt for custom or personally built PCs with top-notch graphics and sound.
devdevil85 20th November 2007, 16:36 Quote
Sadly, w/ PCs, both hardware and software have to be at their best to get people (myself included) to jump on the bandwagon. I want to see a "leap" in gaming in terms of both gameplay and realism (graphics). W/o those two things I am left waiting/hoping while I play on my friends' 360s/PS3s and continue hating my current PC setup due to its inability to play games even near the consoles' potential....

I want future-proofing as well, to get the most out of my $1XXX investment, because every year or so a game that I want to play comes out and requires me to replace the old card that I just spent $1XX on a year ago, and now I get to sell it for $20 on eBay. And that's not just it: my CPU as well, and how about a new mobo while we're at it... the list keeps growing. So with that said, PC gaming is ready to grow, but w/o overall innovation (that separates PC gaming from console gaming) there's no need for PC gamers like me to want to spend the money to upgrade.....

I also like how PCs can use controllers/wheels/joysticks for sports, racing & flying games, which helps sell those kinds of games on PC....

Having bought my computer 4 years ago (2500+ & 9600PRO baby!) and having bought all types of games during those first 2 years of ownership, I disagree w/ the statement that PC gaming is too "specialised"... there just needs to be a "push", and I feel that w/ DX10, games such as Crysis, Bioshock & Episode 2, and hardware innovation - if these areas continue to grow - come this summer I will have a reason to get back into PC gaming....
Quote:
Originally Posted by legoman666
If I could use my keyboard and mouse on the latest consoles, I might consider buying one.
you can use M+K on the PS3 natively, and on 360 using a 3rd-party adaptor (which is hella expensive for what it is, btw)....
labsman 20th November 2007, 16:38 Quote
Yeah, Intel knows those things are coming, but the question should be raised as to how deep the company's commitment really goes... AMD's got a better chance in this regard, so to speak, but like the blue team, its commitment isn't really some kind of welfare for non-hardcore gamers... Profit is the most important end product a company wants, IMO...
munim 20th November 2007, 16:52 Quote
In 1999, Quake 3 launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2004, Doom 3 launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2005, FEAR launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2006, Oblivion launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2007, Crysis launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.
E.E.L. Ambiense 20th November 2007, 17:32 Quote
Why, because it involves a little more than just plugging something in and pushing the 'on' switch?
Bungle 20th November 2007, 17:38 Quote
We hear the same load of ******** year in year out. Consoles have come and gone. PC gaming has remained at the forefront of innovative gaming since as far back as I can remember. If PC game sales are down, I imagine it's due partly to the upsurge of online MMORPG gaming. Single-player games are in second place now (in my book anyway) to online gaming. The market has changed, yes, but PC gaming is as strong now as it ever has been.
devdevil85 20th November 2007, 18:36 Quote
Quote:
Originally Posted by munim
In 1999, Quake 3 launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2004, Doom 3 launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2005, FEAR launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2006, Oblivion launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.

In 2007, Crysis launches and reaches a new high-water mark for graphics; some fear the advanced graphics will hurt PC gaming.
Exactly.

There needs to be a game (or games) that sets the bar for all others, which in turn makes gamers (the ones who've waited for PC gaming to regain its ground/superiority over consoles) want to invest in upcoming advanced hardware to get to a level of gaming that no console can reach. I own every one of those games, and graphics was one reason why, but it was also the gameplay that was innovative....

For me, the early Quake series had amazing gameplay/graphics along with very high FPS even at very high resolutions; w/ Doom 3 it was the realism that made the game so scary and fun to play; for FEAR it was the advanced gameplay and gorgeous graphics; w/ Oblivion it was the advanced gameplay and graphics; and w/ Crysis it's going to be both gameplay and graphics as well....

These games set the bar early, and with more games coming out (that will hopefully reach Crysis'/Bioshock's level), PC hardware/game sales will surge..... also, what about XNA? Where is that at atm? I thought it would allow for easier development/porting of PC/360 games... maybe I assumed wrong....
ChiperSoft 20th November 2007, 18:39 Quote
Personally, I think this all falls on the developers to be more careful with how they use processing power. There are plenty of examples right now of new games that don't require high-end video cards. The Source engine scales wonderfully for its graphics power; I've been able to play HL2 and Portal wonderfully on the Intel chipset. It doesn't look as pretty, but the framerates are still good. Blizzard has done an excellent job keeping the demands of World of Warcraft low, as they've done with all their games for as long as the company has been around, while still making it take advantage of higher-end graphics when they're available.

Crytek is the kind of company that got its reputation from the graphical power of the CryEngine used in Far Cry. Naturally, they have to top that to keep from seeming like a one-hit wonder. The only way to do that is to make the new game so intense that it pushes the edge of GPU performance.

There are still plenty of developers out there who are perfectly content to work within the limits of current hardware. Just because a few big names are going far, doesn't mean the whole system is in trouble.

I'll echo other people's comments that Intel needs to beef up their video chipset if they want people to use it, but it's still up to the developers to restrain themselves.
choupolo 20th November 2007, 19:31 Quote
I want PC to remain specialised. I'm a little wary of the big companies making compromises to suit a bigger audience in an attempt to get more money. One size fits all is the complete opposite philosophy to PC gaming.

I know PC devs need to make more money, and consoles are stealing a lot of revenue, but I would commit suicide the day PC gaming decided to join the bandwagon and cater mostly for the admittedly huge casual crowd.

It'd be like Valve suddenly turning round and saying, 'Hey, we give up, we're gonna make internet Flash games from now on that we can sell on XBL and PSN.'

I don't want PC devs to make games for the clueless guy in Walmart, I want them to make games for me. You know, would GTR/GT Legends have been the same if they'd said, 'No no, it's too realistic, we need more motion blur... and some NOS!'?

There must be another way!!! :'(
lamboman 20th November 2007, 20:51 Quote
It would be OK if newer games were able to run on very old systems, but most people upgrade every three years, not six.

More on the games, though: if we had titles like PGR4 or GT5 on the PC, I could see it becoming much more popular.
metarinka 20th November 2007, 21:07 Quote
The big thing is that console hardware is standardized and in most cases subsidized by the manufacturer, so on price point it's hard for the PC to compete. When I think of "gaming" in terms of the general market (kids, casual etc.), what are people gonna do: buy a $300 Wii OTD, or buy a PC or upgrade the family Dell (which has the extra caveat of all games being single-player nowadays)? I think the PC market will normalize itself; at the moment consoles are in the lead in terms of graphics, but give it another year or so and PCs will be back in front. Although, honestly, graphics aren't as big a motivator for a platform as people might think (note how the PS2 outsold the Xbox). I'm not worried that the PC market will disappear or get so small that developers won't risk it. PC use is only increasing in homes around the world; the market is opening, not closing.
sub routine 20th November 2007, 23:18 Quote
We NEED games, and there is nothing wrong with pushing the boundaries.

But I mean, come on, PCs do need a lot of cash spent on them - say £500 to a grand every 2 years to keep up with everything. Accessibility ain't a bad thing either. Do they really fully push the boundaries of what the hardware can do before they move on?

Are software companies encouraged to move onto new technologies too early? Does this push hardware sales - *cough* DX10 *cough* - heck, even software sales?
devdevil85 20th November 2007, 23:48 Quote
Quote:
Originally Posted by metarinka
I'm not worried that the PC market will disappear or get so small that developers won't risk it. PC use is only increasing in homes around the world; the market is opening, not closing.
I'm just afraid that there won't be enough 'outstanding' games available to make me want to get a PC come this summer vs. getting a console. Knowing that I'm not alone on this, if that turns out to be true in the coming months, PC gaming will lose its appeal and the market will decline, at least in terms of the more "casual" PC gamers who play multiple genres.

What's been making me kind of wishy-washy lately is that pretty much every game I want for PC has come to console, and then there are games on console that I want but that aren't on PC. The only advantage to PC gaming is that games are natively programmed for M+K support rather than a controller, but that is really the only upside I keep coming back to. Honestly, the only reason for me to want a new computer is gaming; my old computer can handle email, internet surfing, music, video, etc.

The things that will make me want to get a new gaming rig will be:
  • high-quality games (what will games offer me on PC over what is capable on console?)
  • hardware's price vs. performance (what will I get for my money)
  • future-proofing (how long will my hardware last me, and can I continue using the older technology if I upgrade to something better?)

Again, I know I'm not alone on this one, and computers have lately been getting a bad rep due to games expecting hardware upgrades every 7-8 months, which I find completely out of my budget, and with the "sell old - swap new" mentality it's just not practical; I want to see my investment (in the gaming realm) last me longer than before.
cjmUK 21st November 2007, 00:54 Quote
Quote:
Originally Posted by Bindibadgi

Randy-wots-his-face at Intel needs a slap with a wet fish. Intel is the sole reason we're all being held back thanks to its craptacular integrated graphics.

Seconded.

I hate the way he sounds like he's doing us all a favour by improving the onboard gfx, when he should be apologising for the crap they have been shipping so far.
Quote:
Originally Posted by Tim S
Personally, I think that Intel needs to make its integrated graphics chipsets a hell of a lot better than they are currently... at that point, I'd say it's worthwhile making good games that support Intel IGPs. At the moment, there's no way you can introduce new gameplay innovations (I'm not saying this is the solution to making better games, by the way) with Intel's graphics because they're not powerful enough to render them.

If I were a developer, I wouldn't bother developing for the current Intel gfx chips. But sure, if Intel made them roughly equivalent to a low/mid-range discrete card, then this would be a great way of broadening the appeal of PC gaming.
Quote:
Originally Posted by choupolo
I want PC to remain specialised. I'm a little wary of the big companies making compromises to suit a bigger audience in an attempt to get more money. One size fits all is the complete opposite philosophy to PC gaming.

Seconded, again.


I liken console gaming to karting - fairly cheap, good fun and very accessible. Higher-end PC gaming is Formula 1 by comparison - it can be very expensive and continual hard work, but it's where the glory really is.
chrisb2e9 21st November 2007, 03:29 Quote
If I could have my mouse and keyboard, and my internet, and email, and everything else that I can do on my PC, on a console, I would switch. So basically, I want a console that's the same as a PC, but with the games and price of a console.
Hey, I can dream, can't I?
Nix 21st November 2007, 03:58 Quote
I think that if the PC gaming market were dying... it wouldn't be down to the games, but the hardware.

People have started to notice recently that you can get quite a nice bit of hardware within a console at a far cheaper price than a PC. PC hardware prices have stayed roughly the same, and are even rising in some areas. I think the average user would get annoyed at splashing out at least £700 every year or two on upgrading their PC so they can play the latest craze.

Sadly, I can't see the hardware sector changing its price structure just because gaming might take a hit... unless it's such a huge hit that the hardware manufacturers take quite a big hit to their profits.

In regards to it being too specialised, I think it's a load of rubbish. The wide range of PC hardware is a good thing rather than a bad thing... my only gripe is that there aren't more mITX motherboards available at more reasonable prices :P
Otto69 21st November 2007, 04:22 Quote
Don't forget: the companies that make consoles can exercise a tremendous amount of control over the content of a game, if they choose. Such is not the case with titles for the PC.

BTW, I built my current machine about 4+ years ago and it's still adequate, but barely so: a 2.4GHz Pentium with 1 gig of memory. I upgraded the video card once, to a 7800 AGP 8x card. I've played Half-Life 2, F.E.A.R., WoW and most of The Orange Box recently, and the only thing that's sort of marginal is HL2 Ep 2.
serialnuber2012 21st November 2007, 07:27 Quote
Quote:
"Something needs to be done so a person buying a PC at Wal-Mart could be a PC gamer too,"

I disagree. A person buying a PC from Wal-Mart has much bigger problems than being a gamer.
devdevil85 21st November 2007, 08:31 Quote
Quote:
Originally Posted by chrisb2e9
If I could have my mouse and keyboard, and my internet, and email, and everything else that I can do on my PC, on a console, I would switch. So basically, I want a console that's the same as a PC, but with the games and price of a console.
Hey, I can dream, can't I?
What about a PS3 modded w/ Ubuntu and a 500GB HDD? That should suit you quite well for the price.... Sadly, FPSs that support M+K on console will be very scarce unless developers notice a trend..... so PC gaming still has its edge there.....
Constructacon 21st November 2007, 09:08 Quote
Also, how much PC hardware is being wasted on overheads from the operating system? I mean, an XB360 uses a custom GPU (not sure what it's equivalent to) with only 10MB of eDRAM and a total of 512MB of shared system RAM. A PS3 uses 256MB of system RAM and 256MB of VRAM. Both use multi-core CPUs running at around 3.2GHz.

That's not very high-end when you come to think about it. What we need is for developers to work on extracting the most out of what is there, rather than saying "well, the new cards can do more" and doing less optimizing.
heir flick 21st November 2007, 09:55 Quote
I've been reading for years about the death of PC gaming. I bought a console a few months ago, and I'm already bored of it and back gaming on the PC.
Nictron 21st November 2007, 11:02 Quote
If I may add my few cents to this discussion.

We need three things to happen:

1) Standardization of hardware specification (i.e. Low, Medium and High)
2) Ease of installation process.
3) Longer development cycles that produce bigger leaps in technological advancement.

1: Standardization of hardware specification (i.e. Low, Medium and High).

Vista had a very good idea in incorporating a rating system into the operating system to evaluate system performance, though it is limited in the tests it can run and in their accuracy. Games should use this standard benchmark to make the configuration process simpler and more accurate - i.e. if you have a rating of 5.7 in Vista, it would configure Crysis correctly so that it runs at an average of 30 FPS. This is really lacking at the moment and causes confusion. PC gaming must always allow you to tinker with your performance settings to the max, but the knowledge expected of the user should be set at a low level, not an advanced one.
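
To make point 1 concrete, here's a minimal sketch (not from the post, with made-up thresholds and settings) of the kind of score-to-preset mapping being described, assuming a hypothetical WEI-style rating:

```python
# Hypothetical sketch: map a Windows Experience Index-style score to a
# default graphics preset. The thresholds and settings below are
# illustrative assumptions, not real Vista or Crysis values.

def pick_preset(rating: float) -> dict:
    """Return default graphics settings for a given system rating."""
    presets = [
        # (minimum rating, preset name, example settings)
        (5.5, "High",   {"resolution": (1680, 1050), "detail": "high",   "antialiasing": 4}),
        (4.0, "Medium", {"resolution": (1280, 1024), "detail": "medium", "antialiasing": 2}),
        (0.0, "Low",    {"resolution": (1024, 768),  "detail": "low",    "antialiasing": 0}),
    ]
    for minimum, name, settings in presets:
        if rating >= minimum:
            return {"preset": name, **settings}
    # Anything below the lowest tier still gets safe defaults.
    return {"preset": "Low", "resolution": (800, 600), "detail": "low", "antialiasing": 0}


if __name__ == "__main__":
    # A machine rated 5.7 lands in the "High" tier; the player should still
    # be free to override every setting, as the post argues.
    print(pick_preset(5.7))
```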

2: Ease of installation process.

Plug in the disc and off you go. Halo 2 on Vista had this system in place, but for some reason we are not seeing it in games released currently. Why not? Are the developers lazy, or are Microsoft charging an arm and a leg for the implementation? Again, ease of use, with the advanced options still available for more knowledgeable users. Oh, and implementing copy protection that causes confusion and interruption is not helping matters either. When will the grey suits realise that copy protection only wastes their money and impedes the user's experience? Show me one game that has not been cracked over the last 10 years. I do not pirate at all, but I have used a cracked copy to get a game to work when the process was flawed, or when my CD-ROM drive broke or would not be compatible with a disc.

3: Longer development cycles that produce bigger leaps in technological advancement.

The fact that the graphics companies release so many cards just to keep their revenue ticking over is a pain in the butt; the development cycles should ensure that the leap in technological advancement is big enough that you only need to buy every two to three years, not every year. The current pace puts more development strain on the game developers, and it will make the customer with less money go for a console instead of a PC. If each advancement were great enough, then we would all upgrade, we would all see the benefits, and we would all enjoy that purchase more. Not the DirectX 10, then 10.1, then 10.2 advancements we have currently; that just causes more confusion and adds fuel to the fire.

------

As a final note, I would like to say that I sometimes get the feeling that the PC market is used as a testing ground for new technologies, and the best of those then filters through to the console space.
Zurechial 21st November 2007, 12:12 Quote
Am I the only one who thinks standardisation of PC performance/hardware is abhorrent to the enthusiast mentality?

CPU/GPU combo systems as proposed by AMD and others would be an interesting concept, but not for me.

If I wanted a system that is stuck in its original production-era and which would almost need to be replaced in entirety (peripherals such as disk-drives aside) instead of an upgrade, I'd buy a dross-for-the-masses console and settle for mediocrity in every other regard too.

No thanks.


I agree that Intel need to get their arses in gear with their integrated graphics solutions - If every OEM system, laptop and DELL Workstation out there that has an integrated Intel graphics processor could play games of the previous generation at mid-high settings and next-gen games at low settings, then the market for PC gaming would be massive.

Would the expense of effecting such a change really be so much for Intel? Surely they'd make it back in the long-run, as an investment in an industry that thrives on their high-end processors.
Start a non-gamer off on an OEM system or Laptop that can actually handle decent, recent games and there's a good chance that a year later that same person would be a potential customer for high-end desktop CPUs and GPUs.
(And maybe more people might see just why a joypad will NEVER compare to the Mouse & Keyboard combo for shooters and FPRPGs)

Otherwise, they'll settle for mass-market mediocrity - FIFA, Madden and Halo on overrated and underpowered gaming systems with a collection of games aimed at the lowest common denominators of the gaming market, and thus innovation and creativity will yet again fall afoul of mass-market appeal and catering to the average idiot.
May I please present.. Deus Ex: Invisible War and BioShock

I accept that some kind of optional standardisation of PC gaming would be a great thing - It'd make my above dream of a greater market for PC gaming much more likely due to accessibility for those who don't want the technical hassles, but if it isn't an optional thing, then we may as well stick with consoles and mediocrity.

I think it's a myth and misconception that PC gamers must spend ridiculous sums of money in comparison to console gamers, to enjoy their games.
Firstly, the bleeding edge for console gaming is stuck at whatever year the console is released, so 2 years later it's not really bleeding like it used to.
Secondly, a PC gamer could buy/build a PC around the same time as the release of a given console, for a similar sum of money (when factoring in the cost of a TV, peripherals and the huge price tags on the latest consoles) and simply choose to not upgrade it in the same period if they're not fooled by the marketing that convinces them that they need to.
I've been happy enough playing Crysis on mostly medium settings @ 1024x768 on my current system (C2D E6600, 2gb DDR2, 7900GT) and I feel no *need* to upgrade....and it STILL looks and plays better than anything I've seen or played on any console.

Even at medium settings and average resolutions, most PC games still look better than console games - Play them on a big, blurry TV with as much blur-filtering and post-processing as is used on consoles, and they'll look just as 'fake-anti-aliased' as they would coming from a console.
Stick a high-end GPU in the system, play it on a 1920x widescreen LCD and it'll make any console weep.
The important point is that it's optional - A matter of choice, taste and personal willpower.
You can save your money and play your games at similar visual quality to consoles, whilst retaining the versatility of PCs, or you can upgrade and play games with visuals and effects that the consoles won't match until the next-gen rolls out.

I've said above that it's a matter of choice, and that's why I don't resent the mere existence of console gaming - It's a choice for some people, it's just not for me.
My problem is with how figureheads and industry 'celebrities' speak doom and gloom about the future of PC gaming, yet they're the same people churning out dumbed-down **** for the masses without innovation, allowing the lowest common denominator of the market dictate what games should be produced, and then tailoring it for consoles, even when the game in question would often be best suited to PCs.

The console market has mass-appeal, yes, and is the obvious money-making choice - But the PC market would be a hell of a lot more profitable if they pulled their thumbs out, stopped bending over and taking it from EA and other publishers; and started to make high-quality, innovative, intelligent, ground-breaking games for the PC like they used to from 1980 to about 2004, instead of an endless stream of console ports, "multi-platform launches" and dumbed-down titles with more potential than is ever realised in the final product.

KotOR and Mass Effect should have been PC titles to begin with, Bioshock shouldn't have been dumbed-down from the SS2 formula that WORKED for PC gamers, Oblivion shouldn't have had an interface more suited to Sesame Street than a GUI, Fallout 3 should be isometric and deep, not Oblivion with radiation; Invisible War was an XBox-inclined insult to the masterpiece that was Deus Ex. And does anyone remember the original plans for Halo before Microsoft got their hands on it? Yeah, I thought so.

As far as I'm concerned, PC gaming SHOULD be specialised - If they got it right like they did in the past it'd be a perfectly viable and profitable alternative to mass-market console gaming. I'm no economist, but I do remember the golden days of PC gaming, and that makes me feel and sound bloody old.


An elitist? Me? Never. :p
Asking for too much? Me? All the bloody time.

.....End ridiculously verbose/long-winded rant. Cue flames. :o
cjmUK 21st November 2007, 16:38 Quote
Rather than standardisation, we might aim for classification: some sort of universal benchmark that manufacturers, developers and consumers all have access to. It would make it clear what would run on your PC, it would provide reliable demographic information for developers to select their targets, and it would make the whole thing a lot more transparent.

The Windows Experience Index isn't perfect, but I think it's quite a neat idea.