bit-tech.net

No new Nvidia GPU this year

AMD was first to launch a 40nm graphics card - it looks set to do the same for 28nm

Nvidia has confirmed that it will not be shipping its next-generation GPU this year.

Things have been very quiet on the next-gen GPU front recently, as we mentioned in our recent podcast. However, the news, which is sure to disappoint Nvidia fans hoping to ask Father Christmas for some shiny new silicon before the year is out, was confirmed by Nvidia spokesman Ken Brown over on Techspot.

Kepler - the codename for the successor to Nvidia's current GPU architecture, Fermi, and the company's attempt at nailing the 28nm manufacturing process - will not be landing on shelves until 2012.

Mr Brown stated: 'Although we will have early silicon this year, Kepler-based products are actually scheduled to go into production in 2012. We wanted to clarify this so people wouldn't expect product to be available this year.'

Meanwhile, the rumour mill hasn't exactly been working overtime regarding AMD's Southern Islands HD 7000-series GPUs. We reported earlier this year that production would begin this summer and that the new GPUs will include PCI-E 3, but there's little to go on to confirm a 2011 launch for the red team either.

Are you waiting for the next generation of graphics cards, or are you happy with your current hardware? Do you think this will change with the release of games such as Battlefield 3? Let us know in the forum.

56 Comments

wuyanxu 8th August 2011, 11:30 Quote
Quote:

even if AMD can push out new generation, it'll not be highest end due to manufacturing limitations. likewise, if AMD does manage high end, expect a very fast refresh cycle for that GPU, and wasted investment.
damien c 8th August 2011, 11:47 Quote
I am upgrading when Sandy Bridge E comes out and if they don't have PCI-E 3.0 card's out by then, but the board's can support it then fine I don't mind.

I want better card's but really the only game that will tax them is going to be Battlefield 3 because all the rest are just simply put Console trash so even now, my current pc will cope with Battlefield 3 with no problem's but nothing out there at the moment even stresses it.

I mean Crysis 2 with the DX11 patch etc at 1920x1080 with maximum detail setting's I am still hitting well over 100fps.

Basically put it doesn't matter whether they release it this year, next year or the year after because 99.99% of games are developed for the consoles and the only thing possibly that could use it better is none gaming software like Badaboom and F@H etc and even just for them it's not that important.
thom804 8th August 2011, 11:54 Quote
Quote:
Originally Posted by damien c


I mean Crysis 2 with the DX11 patch etc at 1920x1080 with maximum detail setting's I am still hitting well over 100fps.

What's powering your system? Sellerfield?!

I'm of course going to assume that with the texture pack as well yes?
damien c 8th August 2011, 12:04 Quote
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.
Madness_3d 8th August 2011, 12:16 Quote
You think they would build the delay time into their roadmap. At least that way if it came out with little delay it would take AMD by surprise. I'll just have to leave my cash in the bank a little longer then I guess!
Blarte 8th August 2011, 12:18 Quote
Sellafield isnt a generator it reprocesses ...I'll get mi coat ..
KiNETiK 8th August 2011, 12:20 Quote
Quote:
Originally Posted by damien c
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.

2 Points:

Your moaning because nothing taxes your computer, yet your computer represents 0.00001% of the market since it is pretty much the highest spec you can currently get.. Why do you think game developers don't produce games/spend large amounts of money implementing features that would bring this setup to its knees :)?

Secondly, why are you only gaming at 1080p if you have that setup!? Surely you should invest in a higher resolution monitor if you have that sort of money to spend on hardware :) You might find it taxes it a little more then :)
tom_hargreaves 8th August 2011, 12:55 Quote
Quote:
Originally Posted by KiNETiK
Quote:
Originally Posted by damien c
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.

Secondly, why are you only gaming at 1080p if you have that setup!? Surely you should invest in a higher resolution monitor if you have that sort of money to spend on hardware :) You might find it taxes it a little more then :)

Hit the nail on the head really. I have Radeon 5870's in Crossfire with an i7 930 setup gaming at 1920 x 1080, and nothing really taxes it either tbh. Although I haven't played Crysis 2 - Can't imagine there'd be a problem.
Goty 8th August 2011, 13:14 Quote
Quote:
Originally Posted by wuyanxu
Quote:

even if AMD can push out new generation, it'll not be highest end due to manufacturing limitations. likewise, if AMD does manage high end, expect a very fast refresh cycle for that GPU, and wasted investment.

AMD's performance should not suffer to any noticeable degree by making their next gen GPUs on the first-available process from TSMC. The only real limitation should be the switching speeds of the transistors, which should not affect AMD unless they move to a similar design to NVIDIA's where the shaders are run at twice the clock of the rest of the chip.
Woodspoon 8th August 2011, 13:16 Quote
I blame consoles
Absolutely no basis to it what so ever, I just do.
Thank you, goodbye.
Bede 8th August 2011, 13:25 Quote
Interesting that they say the new series will only be going into production in 2012 - looks like we won't be seeing new cards for at least 6-8 months.
Action_Parsnip 8th August 2011, 13:40 Quote
Quote:
Originally Posted by wuyanxu
even if AMD can push out new generation, it'll not be highest end due to manufacturing limitations. likewise, if AMD does manage high end, expect a very fast refresh cycle for that GPU, and wasted investment.

Not really wasted if the refresh is minor, say speed bump, memory clock bump.
Quote:
Originally Posted by thom804
What's powering your system? Sellerfield?!

They used to call it Windscale :)
Quote:
Originally Posted by Woodspoon
I blame consoles
Absolutely no basis to it what so ever, I just do.
Thank you, goodbye.

Chucklez
Xir 8th August 2011, 13:56 Quote
Hoped Keppler would put some pressure on the prices
hmmm, think I'll settle for a GTX560Ti as a "filler card" then.
Was going for the 570, but the price is just too high for not enough performance gain. (same as the 6970)
Kúsař 8th August 2011, 14:04 Quote
AMD actually taped out first sample of Southern Isles GPU(which features new SIMD architecture) back in february and they're currently polishing its design. And they have plenty of time to do that, nVidia taped out first sample of Keppler just a few weeks ago. If those rumours are correct than HD7000 series should be ready for mass production in Q3 or Q4 this year and it'll be manufactured by both - TSMC and GloFo on 28nm. We'll definitely see new AMD gpus this year while nVidia will come at first half of 2012.

I'm definitely looking forward to replace my good ol' 4770 with some new 28nm power efficient mid-range GPU...
[USRF]Obiwan 8th August 2011, 14:28 Quote
The fact is that in the last two years or so the consolification of games, porting to PC and the lag of having any innovative new games that bring current hardware to a crawling state.

Add the fact that the whole gaming business is changed the last couple of years, not to mention the greedy bank managers causing a worldwide crysis that had and still has a great impact on consumers worldwide.

Also add, that we do not have the 'need' for a more powerful graphics card. For example; I can play all current games with my quadcore AMD and GTX460 in full HD with all options high. (for example crysis2 with dx11 and HQ texture patches) and I do not think this will change in the near future.

So it probably is a wise decisions to delay new cards.
Lazy_Amp 8th August 2011, 14:32 Quote
Quote:
Originally Posted by Kúsař
AMD actually taped out first sample of Southern Isles GPU(which features new SIMD architecture) back in february and they're currently polishing its design. And they have plenty of time to do that, nVidia taped out first sample of Keppler just a few weeks ago. If those rumours are correct than HD7000 series should be ready for mass production in Q3 or Q4 this year and it'll be manufactured by both - TSMC and GloFo on 28nm. We'll definitely see new AMD gpus this year while nVidia will come at first half of 2012.

I'm definitely looking forward to replace my good ol' 4770 with some new 28nm power efficient mid-range GPU...
Most of this is right, but I'm going to point out again that it's pretty much impossible to spin the same designs at both TSMC and GF due to the drastically different processes/design rules, the standard cell libraries are completely different. GF will run CPUs and all future Fusion parts, while TSMC will only run GPUs (and the Brazos line on 40nm)

I'm pretty optimistic about Tahiti, the high end Southern Island chip. It should be a nice Christmas present. Dunno when the midrange cards will be out.
DbD 8th August 2011, 14:51 Quote
If the hold up is TMSC, which is likely as these are the first 28nm parts which is a huge shift from 40nm, then it'll be a problem for AMD as much as nvidia. Traditionally AMD would be better off as they have smaller chips which are easier to make, but if they are trying to add gpu compute like nvidia did for fermi then who knows?
Bede 8th August 2011, 15:02 Quote
Quote:
Originally Posted by [USRF]Obiwan
Add the fact that the whole gaming business is changed the last couple of years, not to mention the greedy bank managers causing a worldwide crysis that had and still has a great impact on consumers worldwide.

I know we like games here, but 'crysis' is a game and 'crisis' is a moment in time where an important decision needs to be made :P

Pushing graphical fidelity will only make games more and more expensive to make, and frankly I think most people on this planet would like not to have to upgrade their system once a year just to play games at max settings. Better gameplay, not better graphics is what we should be looking for.
MjFrosty 8th August 2011, 15:19 Quote
Peoples opinion on this will differ entirely on what their current situation is.

For example if you've skipped Fermi all together I could understand why you'd be disappointed.

My current setup is a 965 Extreme (bloomfield) and 580GTXs in SLi. More then enough power for all the titles scheduled to be released this year (including running in 3D at 1080)

As for the comments above, if you wanted something to push your so-called super computer ..see how your Sandybridge middle of the road e-peen new toy copes with Witcher 2 and uber sampling at a decent resolution. ;)
Lazy_Amp 8th August 2011, 16:12 Quote
Quote:
Originally Posted by DbD
If the hold up is TMSC, which is likely as these are the first 28nm parts which is a huge shift from 40nm, then it'll be a problem for AMD as much as nvidia. Traditionally AMD would be better off as they have smaller chips which are easier to make, but if they are trying to add gpu compute like nvidia did for fermi then who knows?

Lots of factors can cause this difference in the timelines.
1. Most obvious is the timescale. If AMD taped out it's design in February, then 6 months later there's definitely pre-production silicon being tested right now. If Nvidia taped just recently, then yeah, add about 9 months from now to get full production started on this new process.
2. Design: If AMD's design is more manufacturable than Nvidia's, like how Nvidia's shaders need twice as fast a clock as AMD, then sure, AMD silicon can be brought up faster.
3. Politics: Neither AMD nor NVidia own TSMC. Is it so unreasonable that AMD has scratched TSMC's back the right way, and they're getting priority to bring up 28nm?

I doubt TSMC is the problem, especially with #1 there. AMD got there first, TSMC won't want to spin much NVidia silicon if they're making good process debug with Southern Islands.

EDIT: http://www.xbitlabs.com/news/graphics/display/20110726103803_AMD_Expects_to_Be_First_GPU_Designer_with_28nm_Graphics_Chips.html If you doubt AMD has Southern Island silicon
tonyd223 8th August 2011, 16:14 Quote
Crysis or crisis? isn't life becoming a bit of a game? The other day, I shot this guy from 40 feet and all I heard was "Headshot!"

I'll get my coat...
Lehmann 8th August 2011, 16:24 Quote
I would be more than happy if current cards just had a die shrink. lower power less heat.
docodine 8th August 2011, 17:25 Quote
Quote:
Originally Posted by Lehmann
I would be more than happy if current cards just had a die shrink. lower power less heat.

Agreed, it's not like we need much more performance until the next round of consoles anyway..
true_gamer 8th August 2011, 17:35 Quote
This is good news to me, as it means my GTX 580's will still be worth some money when I sell them at the end of the year. (Nice cheap xmas present for someone).
Chicken76 8th August 2011, 20:27 Quote
I believe the name is Kepler, not Keplar. (third paragraph)
Quote:
...and Nvidia's effort at nailing the 28nm manufacturing process...
I believe it's TSMC, not Nvidia, that has to do most of the nailing, isn't it?
Jimbob 8th August 2011, 21:32 Quote
I think a bit longer is OK, i'm running a 6990 with 3 22" x eyefinity setup and even that runs most games at full graphics. It does struggle occasionally but currently I don't see a need for a major jump in graphics power until ultra high res becomes moer popular or the next gen PS4 xBox comes out.

PC graphics cards are so far ahead of consoles it's ridiculous.
slothy89 9th August 2011, 04:46 Quote
I think it's a good idea for nVidia to take their time. Get it right the first time, no need to repeat the gtx480 ;)

As for me, the gtx600 series will be my next upgrade, if it's gonna be mid 2012 then good, more time to save, and less waste for my current system.

Plus I can't see SLI 460s struggling even with BF3
impar 9th August 2011, 09:42 Quote
Greetings!

There really isnt a reason to upgrade GPUs anymore.
You always needed more GPU power to climb from 640x480 to 800x600 to 1024x768 to 1280x1024 to 1680x1050 and finally to 1920x1080. For most, there is no reason to go beyond 1920x1080. Some can go the dual monitor route but they will be a minority.
And the consolification hasnt helped.
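[Editor's note: impar's resolution ladder is easy to quantify. A quick sketch of the raw pixel counts involved - plain arithmetic, nothing vendor-specific - shows why each step historically demanded a GPU upgrade:]

```python
# Pixel counts for each resolution step mentioned above, plus how much
# more work each step asks of the GPU relative to the previous one.
resolutions = [(640, 480), (800, 600), (1024, 768),
               (1280, 1024), (1680, 1050), (1920, 1080)]

prev = None
for w, h in resolutions:
    pixels = w * h
    note = "" if prev is None else f" ({pixels / prev:.2f}x the previous step)"
    print(f"{w}x{h}: {pixels:,} pixels{note}")
    prev = pixels

# Overall: 1920x1080 pushes ~6.75x the pixels of 640x480,
# which is the climb the post above describes.
```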
Parge 9th August 2011, 10:47 Quote
Man, if AMD can push out the 7000 series in time for 28th October they are going to make a wad of cash.
damien c 9th August 2011, 13:21 Quote
Quote:
Originally Posted by KiNETiK
Quote:
Originally Posted by damien c
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.

2 Points:

Your moaning because nothing taxes your computer, yet your computer represents 0.00001% of the market since it is pretty much the highest spec you can currently get.. Why do you think game developers don't produce games/spend large amounts of money implementing features that would bring this setup to its knees :)?

Secondly, why are you only gaming at 1080p if you have that setup!? Surely you should invest in a higher resolution monitor if you have that sort of money to spend on hardware :) You might find it taxes it a little more then :)

Well all I will say about nothing taxing my system is that developers should be utilising the hardware available using, the engines that are currently out there and not just tweaking existing old engines to try and make them look better, and then people with pc's like mine won't be so annoyed about it.

As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.
MjFrosty 9th August 2011, 15:59 Quote
Quote:
Originally Posted by damien c
Quote:
Originally Posted by KiNETiK
Quote:
Originally Posted by damien c
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.

2 Points:

Your moaning because nothing taxes your computer, yet your computer represents 0.00001% of the market since it is pretty much the highest spec you can currently get.. Why do you think game developers don't produce games/spend large amounts of money implementing features that would bring this setup to its knees :)?

Secondly, why are you only gaming at 1080p if you have that setup!? Surely you should invest in a higher resolution monitor if you have that sort of money to spend on hardware :) You might find it taxes it a little more then :)

Well all I will say about nothing taxing my system is that developers should be utilising the hardware available using, the engines that are currently out there and not just tweaking existing old engines to try and make them look better, and then people with pc's like mine won't be so annoyed about it.

As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.



I can tell you this instance theres already a few things that will "tax your system". Try 3D for a start (don't call it a fad either, that's not an acceptable excuse) ;)
maverik-sg1 9th August 2011, 16:52 Quote
Quote:
Originally Posted by Kúsař
AMD actually taped out first sample of Southern Isles GPU(which features new SIMD architecture) back in february and they're currently polishing its design. And they have plenty of time to do that, nVidia taped out first sample of Keppler just a few weeks ago. If those rumours are correct than HD7000 series should be ready for mass production in Q3 or Q4 this year and it'll be manufactured by both - TSMC and GloFo on 28nm. We'll definitely see new AMD gpus this year while nVidia will come at first half of 2012.

I raise your speculation and add

The partial truths:

1) AMD will have 28nm product this year - should they decide to release it of course.
2) Nvidia's 28nm will be a Q1 2012 release.

on a lighter note - I hope these guys can integrate tri-gate manufacturing to their silicon ASAP as even with a 28nm gpu, the top end products will still be hungry and toasty.
wuyanxu 9th August 2011, 16:55 Quote
trigate won't solve anything. it'll only increase heat density, meaning harder to cool.

those type of transistors have existed for ages, trigate is only an intel marketing term.
thehippoz 9th August 2011, 16:57 Quote
too bad they can't use intels skip 28 all together =p
maverik-sg1 9th August 2011, 16:59 Quote
Quote:
Originally Posted by damien c
As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.


If only they developed some sort of 3-1 HDMI Port adaptor which would allow you to connect 3 devices to a box that would allow easy switching between devices to your monitor......... behold I bring you the future....today!!


http://cgi.ebay.com/Mini-3X1-HDMI-Switcher-Box-Switch-Adapter-Square-V1-3-/190549361606?pt=LH_DefaultDomain_0&hash=item2c5da083c6


Google is your friend
maverik-sg1 9th August 2011, 17:02 Quote
Quote:
Originally Posted by wuyanxu
trigate won't solve anything. it'll only increase heat density, meaning harder to cool.

I read they they would be easier to cool and generate less heat anyway right? - but I accept that I don't have an in depth knowledge on this subject to comment beyond what I read elsewhere.
wuyanxu 9th August 2011, 17:07 Quote
Quote:
Originally Posted by maverik-sg1
I read they they would be easier to cool and generate less heat anyway right? - but I accept that I don't have an in depth knowledge on this subject to comment beyond what I read elsewhere.
going down in nanometers will always decrease switching power, increase leakage power. meaning unless power gated, idle power usage will not decrease; active power usage will always decrease on each shrink.

trigate is only a marketing jargon used by Intel. nothing special at all.
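[Editor's note: the scaling argument above can be illustrated with the standard first-order model of CMOS dynamic power, P_dyn ≈ α·C·V²·f. The scaling factors below are made-up illustrative numbers for a 40nm-to-28nm shrink, not measured values; only the quadratic dependence on voltage is the point:]

```python
# First-order CMOS dynamic (switching) power: P_dyn ≈ alpha * C * V^2 * f.
# Leakage power is NOT modelled here - as noted above, it tends to rise
# with each shrink unless the chip is power-gated.
def dynamic_power(alpha, capacitance, voltage, frequency):
    return alpha * capacitance * voltage ** 2 * frequency

p_40nm = dynamic_power(alpha=0.2, capacitance=1.0, voltage=1.0, frequency=1.0)

# Hypothetical shrink: switched capacitance down ~30%, voltage down ~10%,
# same clock frequency.
p_28nm = dynamic_power(alpha=0.2, capacitance=0.7, voltage=0.9, frequency=1.0)

print(f"relative dynamic power after shrink: {p_28nm / p_40nm:.2f}")  # prints 0.57
```

Even a modest 10% voltage drop contributes quadratically (0.9² ≈ 0.81), which is why active power reliably falls with each process generation while idle power does not.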
OCJunkie 9th August 2011, 17:47 Quote
Reading between the lines what Nvidia's actually saying is "we're not releasing anything until HD 7000 comes out" ... personally I don't think there's any rush for either camp, illustrated by the previous posts crying that nothing taxes their 580, and the fact that my 6950 doesn't sweat even in Eyefinity. Do we REALLY need new cards right now? For gadget-pr0n prurposes only tbh.
maverik-sg1 9th August 2011, 17:53 Quote
Agreed OCJ - we don;t actually need anything more than we have available today, in fact we probably need higher resolution 24-28" monitors to even maximise what we have now.
LedHed 9th August 2011, 17:56 Quote
I love how everyone acts like 1080p is "full HD" when talking about gaming, I've been playing at 1200p for over 4 years now. Also there is nothing wrong with the GTX 480 as long as you aren't worried about power consumption and you are using after market cooling. The card was not a "mistake," they just couldn't use a single SM unit to increase yield, big deal. A heavily OC'd 480 can break 6000P in 3DMark11, something a stock 580 can not on the same CPU (this should show the little difference between the two.) Of course the 580 is the better card with a vapor chamber and better power consumption plus the single missing SM unit. However I'm playing with a Q9550@4ghz + 8Gb DDR2 1066 + GTX 480 850/4000 on a 24" BenQ FP241W (1920x1200p) and I have yet to find a game that would even make me think about new hardware. If a game does run poorly and you check the forums, guys with 580s and i5/i7 are having the exact same problems.
Elton 9th August 2011, 18:57 Quote
I'm running a HD6850. Once the new generation is out, I think I'll actually go crossfire for once. It's about time, and plus it looks facking sexy.
Sloth 9th August 2011, 19:16 Quote
Quote:
Originally Posted by Elton
I'm running a HD6850. Once the new generation is out, I think I'll actually go crossfire for once. It's about time, and plus it looks facking sexy.
Running on a 5870, probably going to buy at the same price point with the next gen. Crossfire may look sexy, but the 7970 has an awesome looking and sounding name.

Because as any true PC enthusiast knows, performance is one thing, appeal is another. :D
thehippoz 9th August 2011, 19:30 Quote
Quote:
Originally Posted by LedHed
I love how everyone acts like 1080p is "full HD" when talking about gaming, I've been playing at 1200p for over 4 years now. Also there is nothing wrong with the GTX 480 as long as you aren't worried about power consumption and you are using after market cooling. The card was not a "mistake," they just couldn't use a single SM unit to increase yield, big deal. A heavily OC'd 480 can break 6000P in 3DMark11, something a stock 580 can not on the same CPU (this should show the little difference between the two.) Of course the 580 is the better card with a vapor chamber and better power consumption plus the single missing SM unit. However I'm playing with a Q9550@4ghz + 8Gb DDR2 1066 + GTX 480 850/4000 on a 24" BenQ FP241W (1920x1200p) and I have yet to find a game that would even make me think about new hardware. If a game does run poorly and you check the forums, guys with 580s and i5/i7 are having the exact same problems.

you could have sold the 480 awhile back.. the cards are way too power hungry for what they do
SolidShot 9th August 2011, 19:50 Quote
whats the point of rushing? theres nothing that can challenge THIS gen graphics cards and SB chipsets!

Its like saying i want a twin turbo v12 in my car to go down to the shops to get some milk. WHY?
Bloody_Pete 9th August 2011, 20:12 Quote
Quote:
Originally Posted by SolidShot
whats the point of rushing? theres nothing that can challenge THIS gen graphics cards and SB chipsets!

Its like saying i want a twin turbo v12 in my car to go down to the shops to get some milk. WHY?

BF3 may change things :P
LedHed 10th August 2011, 04:38 Quote
Quote:
Originally Posted by thehippoz
Quote:
Originally Posted by LedHed
I love how everyone acts like 1080p is "full HD" when talking about gaming, I've been playing at 1200p for over 4 years now. Also there is nothing wrong with the GTX 480 as long as you aren't worried about power consumption and you are using after market cooling. The card was not a "mistake," they just couldn't use a single SM unit to increase yield, big deal. A heavily OC'd 480 can break 6000P in 3DMark11, something a stock 580 can not on the same CPU (this should show the little difference between the two.) Of course the 580 is the better card with a vapor chamber and better power consumption plus the single missing SM unit. However I'm playing with a Q9550@4ghz + 8Gb DDR2 1066 + GTX 480 850/4000 on a 24" BenQ FP241W (1920x1200p) and I have yet to find a game that would even make me think about new hardware. If a game does run poorly and you check the forums, guys with 580s and i5/i7 are having the exact same problems.

you could have sold the 480 awhile back.. the cards are way too power hungry for what they do

I actually got the GTX 480 from a GTX 295 RMA, so once I bought a Zalman VF3000F, I decided to stick with the card for a while. The GTX 295 was very power hungry for what it did and only in certain games with profiles made. Now I have one of the top 5 single GPU cards available and I have pushed the card over 950mhz on core; the card is a cherry and I have no need to sell it just to save a little power. If I wanted to save power I would upgrade my Mobo/CPU/RAM.
LedHed 10th August 2011, 04:39 Quote
oh and at least the 480s are solid, unlike the GTX 570 which have a high failure rate.
VeNoM JaCKaL 10th August 2011, 09:19 Quote
Dam it GTX260 is beginning to really showing its age and was hoping for an upgrade. Guess I’ll have to do with GTX580 but I’ll wait for price drop ^__^'
damien c 10th August 2011, 13:28 Quote
Quote:
Originally Posted by maverik-sg1
Quote:
Originally Posted by damien c
As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.


If only they developed some sort of 3-1 HDMI Port adaptor which would allow you to connect 3 devices to a box that would allow easy switching between devices to your monitor......... behold I bring you the future....today!!


http://cgi.ebay.com/Mini-3X1-HDMI-Switcher-Box-Switch-Adapter-Square-V1-3-/190549361606?pt=LH_DefaultDomain_0&hash=item2c5da083c6


Google is your friend

Already know about those thing's but that still leaves me unable to use my DVD Recorder and Laptop on it, without buying TV card's etc to do it all.
damien c 10th August 2011, 13:32 Quote
Quote:
Originally Posted by MjFrosty
Quote:
Originally Posted by damien c
Quote:
Originally Posted by KiNETiK
Quote:
Originally Posted by damien c
Yes it is lol

Yeah I have the texture pack as well.

I am running a Core I7 2600k @ 4.8ghz and 2x GTX 580's Overclocked aswell to 950mhz for the core, can't remember the mem speed of the card's etc are as I am at work.

But yeah it doesn't struggle with any game out there, but I want the Sandy Bridge E because of video converting etc otherwise I wouldn't bother as it won't make much difference for gaming.

2 Points:

Your moaning because nothing taxes your computer, yet your computer represents 0.00001% of the market since it is pretty much the highest spec you can currently get.. Why do you think game developers don't produce games/spend large amounts of money implementing features that would bring this setup to its knees :)?

Secondly, why are you only gaming at 1080p if you have that setup!? Surely you should invest in a higher resolution monitor if you have that sort of money to spend on hardware :) You might find it taxes it a little more then :)

Well all I will say about nothing taxing my system is that developers should be utilising the hardware available using, the engines that are currently out there and not just tweaking existing old engines to try and make them look better, and then people with pc's like mine won't be so annoyed about it.

As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.



I can tell you this instance theres already a few things that will "tax your system". Try 3D for a start (don't call it a fad either, that's not an acceptable excuse) ;)

3D is a fad and guess what 4D is coming out now as well.

3D was brought out simply because the tv manufacturers were not making enough money so they brought it in to your home to make, everyone suddenly think "oh my shiny new HD tv is old and useless must buy 3D"

I have tried 3D and didn't like it, simply because I ended up with a headache after using the Nvidia 3D glasses etc, and to be honest it's fine at the cinemas watching film's as I don't get a headache but it's add's nothing to the film and I actually think it's makes films and games worse using than not, because you see the 2 images overlapping each other constantly as in tearing around people and object's.

3D is no where near ready yet for home use or at the cinemas in my oppinion and until it is I will not spend the money on it, as I might as well just set it alight and watch it burn.
wuyanxu 10th August 2011, 13:38 Quote
Quote:
Originally Posted by damien c

As for the monitor comment, I only run at 1920x1080 because I am yet to find something that offer's a higher resolution with atleast 3 HDMI connection's and VGA connection and also a Scart connection, so that I can use my PC and Xbox as well as Tivo box, then also use my laptop on it when I just want to watch something and decide it's not worth turning my pc on for it, and also a scart for my DVD Recorder.

Dell U2711:
1 HDMI + 2 DVI for PC and laptop (or HDMI to DVI cable) + VGA connection + Scart (well, the video yellow port is there) + Displayport for future proof + component

you are just not looking hard enough.

get 3 U2711's for nvidia surround. and i'd like to see how your system do, at 3x2560x1440.
damien c 10th August 2011, 14:21 Quote
Not gonna spend £1800 on 3 monitors when I only have room for anything from a single 32" to a single 42", I will wait for 4K Monitors
Baz 10th August 2011, 15:48 Quote
Quote:
Originally Posted by damien c
Not gonna spend £1800 on 3 monitors when I only have room for anything from a single 32" to a single 42", I will wait for 4K Monitors

This. 3D is a horrid fad-tastic tech. 4K will be the next big step; shame it'll take 10 years to see free to air in 4k broadcasts!
Elton 10th August 2011, 20:56 Quote
'Hmm, well if I can get rid of the HD6850, then I'll get another card. I'm still wary of driver support however good it may be.
MjFrosty 11th August 2011, 15:18 Quote
Quote:
Originally Posted by damien c
3D is a fad and guess what 4D is coming out now as well.

3D was brought out simply because the tv manufacturers were not making enough money so they brought it in to your home to make, everyone suddenly think "oh my shiny new HD tv is old and useless must buy 3D"

I have tried 3D and didn't like it, simply because I ended up with a headache after using the Nvidia 3D glasses etc, and to be honest it's fine at the cinemas watching film's as I don't get a headache but it's add's nothing to the film and I actually think it's makes films and games worse using than not, because you see the 2 images overlapping each other constantly as in tearing around people and object's.

3D is no where near ready yet for home use or at the cinemas in my oppinion and until it is I will not spend the money on it, as I might as well just set it alight and watch it burn.

Doesn't give me a headache, maybe you need glasses.;)

The fact you actually mentioned 4D in relation to Nvidia 3D Vision automatically made me stop reading what you were saying.
MjFrosty 11th August 2011, 15:21 Quote
Quote:
Originally Posted by Baz
This. 3D is a horrid fad-tastic tech. 4K will be the next big step; shame it'll take 10 years to see free to air in 4k broadcasts!

Would you like me to tuck you in under that blanket statement?