bit-tech.net

Nvidia announces GeForce GTX 690 4GB

Nvidia's GeForce GTX 690 4GB boasts two Kepler 28nm graphics cores, but comes at a serious cost.

Nvidia has taken the wraps off its latest top-end graphics card design, the dual-GPU GeForce GTX 690 4GB, due for release in 'limited quantities' from the 3rd of May with a full retail release on the 7th.

Announced by Nvidia co-founder and chief executive officer Jen-Hsun Huang at the Nvidia Game Festival in Shanghai, the board is based on a pair of 28nm Kepler GPUs - the same cores as found on the GeForce GTX 680 2GB. Nvidia claims the GeForce GTX 690 4GB is the world's fastest graphics card, functionally equivalent to two GeForce GTX 680s running in SLI.

Nvidia's own performance stats - currently the only stats available for the as-yet unreleased board - show a rough doubling of performance in some games, including Crysis 2 and Metro 2033, while other titles such as Batman: Arkham City and Civilization IV show smaller - but still impressive - performance boosts.

The reference design boasts 3,072 CUDA cores, a base clock of 915MHz - boosted to 1019MHz in selected scenarios - 4GB of GDDR5 split 2GB per GPU, each on its own 256-bit bus, dual eight-pin power connectors and a thermal design power (TDP) of 300W. Three dual-link DVI ports and a single Mini-DisplayPort 1.2 provide outputs, while the board connects to the host system via PCI Express 3.0.
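Those numbers put the card's theoretical limits within reach of a back-of-the-envelope calculation. The sketch below assumes Kepler's standard two FLOPs per CUDA core per clock (one fused multiply-add) and the GTX 690's 6008MHz effective memory data rate - a figure from Nvidia's spec sheet rather than this article:

```python
# Rough peak figures for the GTX 690, from the specs above.
# Assumptions: 2 FLOPs per CUDA core per clock (Kepler FMA) and a
# 6008MHz effective GDDR5 data rate on each GPU's 256-bit bus.

cuda_cores = 3072          # total across both GPUs
base_clock_ghz = 0.915     # 915MHz base
boost_clock_ghz = 1.019    # 1019MHz boost
flops_per_core = 2         # one fused multiply-add per core per clock

peak_base_tflops = cuda_cores * base_clock_ghz * flops_per_core / 1000
peak_boost_tflops = cuda_cores * boost_clock_ghz * flops_per_core / 1000

mem_data_rate_mhz = 6008   # effective GDDR5 rate (assumed, not in article)
bus_width_bits = 256       # per GPU
bandwidth_gbs = mem_data_rate_mhz * bus_width_bits / 8 / 1000  # per GPU

print(f"Peak single precision (base):  {peak_base_tflops:.2f} TFLOPS")
print(f"Peak single precision (boost): {peak_boost_tflops:.2f} TFLOPS")
print(f"Memory bandwidth per GPU:      {bandwidth_gbs:.1f} GB/s")
# ~5.62 / ~6.26 TFLOPS and ~192.3 GB/s per GPU
```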

Nvidia's design includes a new cooling system using dual vapour chamber heat-sinks - one over each GPU - and a central axial fan with optimised fin pitch and angle to reduce noise. 'Inside each vapour chamber is a small amount of purified water,' explains Nvidia's James Wang. 'As the GPU heats up, the water evaporates, carrying away heat in the process. Once the vapour reaches the top of the fin stack, it cools, condenses, and the process repeats itself. It's similar to a miniature form of water cooling but, because the liquid is entirely self contained, there's no need for tubing and no chance of leaks.'
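Water's high latent heat of vaporisation is why so little liquid is needed. A rough sketch, assuming each GPU dissipates half of the card's 300W TDP (the real split isn't published):

```python
# How much water must evaporate per second to carry one GPU's heat?
# Assumption: each GPU dissipates half the card's 300W TDP.

LATENT_HEAT_J_PER_G = 2257   # water's latent heat of vaporisation
heat_per_gpu_w = 300 / 2     # watts, i.e. joules per second (assumed split)

grams_per_second = heat_per_gpu_w / LATENT_HEAT_J_PER_G
print(f"Water evaporated: {grams_per_second * 1000:.0f} mg/s")  # ~66 mg/s
# The same few drops cycle endlessly between vapour and liquid, which is
# why the chamber needs no pump, tubing or refills.
```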

If that wasn't enough, the reference design boasts a thixomolded magnesium alloy fan housing for improved heat dissipation and vibration damping, plus a ten-phase heavy-duty power supply connected to a ten-layer, two-ounce copper PCB.

'The GTX 690 is truly a work of art - gorgeous on the outside with amazing performance on the inside,' claimed Brian Kelleher, senior vice president of GPU engineering at Nvidia, of the board. 'Gamers will love playing on multiple screens at high resolutions with all the eye candy turned on. And they'll relish showing their friends how beautiful the cards look inside their systems.'

With an exterior appearance designed to evoke a Formula 1 engine, complete with a brushed metal finish and polycarbonate windows onto the cooling fins, there's no denying that the GeForce GTX 690 4GB is an attractive card. Sadly, its high-end appearance hints at something else high-end: the price.

While UK pricing has yet to be confirmed, Nvidia has stated that its initial batch of boards will be priced by its hardware partners at around $999 (around £608, excluding tax). If that wasn't eye-watering enough, Nvidia is hoping to convince customers to splash out on a pair of boards for four-GPU SLI gaming, boasting that such a setup is able to break 120 frames per second in the notoriously demanding Unigine Heaven benchmark.

Huang also announced a new software tool dubbed GeForce Experience, which queries a system's components to provide highly customised game settings for maximum image quality without sacrificing performance. Aimed at what Nvidia claims is the 80 per cent of gamers who leave games' settings at their defaults, the tool is intended to make PC gaming as easy as console gaming while ensuring that gamers get the very best out of their hardware.
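Conceptually, that boils down to a hardware-keyed lookup of pre-tuned settings. The sketch below is purely illustrative - the table, function and setting names are invented, not Nvidia's actual API:

```python
# Hypothetical sketch of the GeForce Experience idea: detect the GPU,
# then look up settings Nvidia has pre-tuned for that game on that
# hardware. None of these names correspond to Nvidia's real API.

OPTIMAL_SETTINGS = {
    # (game, gpu) -> settings pre-computed on Nvidia's test systems
    ("Batman: Arkham City", "GeForce GTX 690"): {
        "resolution": "2560x1600", "anti_aliasing": "8xMSAA",
        "tessellation": "high", "physx": "on",
    },
    ("Batman: Arkham City", "GeForce GTX 560"): {
        "resolution": "1920x1080", "anti_aliasing": "4xMSAA",
        "tessellation": "normal", "physx": "off",
    },
}

def recommend(game: str, gpu: str) -> dict:
    """Return tuned settings for this combination, or safe defaults."""
    return OPTIMAL_SETTINGS.get((game, gpu), {"preset": "default"})

print(recommend("Batman: Arkham City", "GeForce GTX 690"))
```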

36 Comments

guvnar 30th April 2012, 10:11 Quote
OK, so start saving then......
greypilgers 30th April 2012, 10:17 Quote
Wow. I went through their website announcement yesterday and, although I'd like to see some independent tests for a proper final opinion, this does indeed look like an immensely capable piece of hardware. I'd love to have one hooked up to a 30-inch screen!
WarrenJ 30th April 2012, 10:24 Quote
Wish I had the time to play games to warrant a new graphics card.

Looks like a nice piece of hardware, though, wonder what the PPD would be on that?
Vo0Ds 30th April 2012, 10:41 Quote
SLI :-0
Spreadie 30th April 2012, 10:51 Quote
Eeesh! How much?
maverik-sg1 30th April 2012, 11:15 Quote
Is it fair to say that unless the bridging chipset and or the drivers are sorted and properly supported - this thing will micro stutter like a biatch either now or when driver support is no longer important to nvidia?

Got to admit, it does look very nice - far too expensive for me. It seems to me that the rebadged 660 (remember, the real 680 is based on GK110, due out late summer I think) is a high-margin cash cow for Nvidia, mostly due to AMD's epic failure to bring a decent high-end GPU to market... I am skipping this farcical series of GPUs from both red and green camps and hope this is not the shape of things to come.
VipersGratitude 30th April 2012, 11:39 Quote
Expect to pay at least £800 for it.
The cheapest 680 over here was £400 compared to $500 in the states.
Mizugetsu 30th April 2012, 11:53 Quote
After 4 years of neutering my PC down to a shell of its former self, this will be added to my dream system build in September - but I think only one card initially.
rollo 30th April 2012, 12:03 Quote
People are still assuming GK110 will see a consumer release for gaming; I'm expecting it to be Quadro-only and never made into a gaming GPU.

And since when did AMD or Nvidia release two GPU updates in one year? 480 to 580 took close to 14 months, and gave as much performance gain as going from 580 to 680 - but the new chip saves a tonne of power. bit-tech has an Ivy Bridge + SLI 680 system in at 450-ish watts under full load, where the last 580 SLI system was closer to 800; that's a huge reduction in power and therefore heat.

The days when we saw 40-50% more performance are gone, I feel. Look at the CPU market: the i7-950 I bought 3-4 years back is still not holding games back, and even Ivy Bridge is only a small minimum-frame-rate upgrade over it (not worth the £400 it would cost).

This won't see the light of day for £600 over here - closer to £800, I feel. Still, it could be a useful investment for certain people.
Blarte 30th April 2012, 12:04 Quote
Sticking to the 580s setup for now - for a grand I'd want my tea made as well (and the ironing done).
runadumb 30th April 2012, 12:13 Quote
Quote:
Originally Posted by maverik-sg1
Is it fair to say that unless the bridging chipset and or the drivers are sorted and properly supported - this thing will micro stutter like a biatch either now or when driver support is no longer important to nvidia?

In the year and a half I have been using SLI (two GTX 570s) I have never seen micro stutter. Not once, and that's with a good mixture of old and new games.

I have actually asked bit-tech a few times why they hate on SLI so much, but never got a response. It has given me zero issues, yet reading them talk about it makes you think it only works once in a blue moon. I have far more issues caused by running three displays, which can be a real chore... but it's mostly worth it in the end.
Gareth Halfacree 30th April 2012, 12:19 Quote
Quote:
Originally Posted by VipersGratitude
Expect to pay at least £800 for it.
The cheapest 680 over here was £400 compared to $500 in the states.
£600 plus import tax and VAT (which adds £120 to the price by itself) means £800 is about right, give or take a few grubby tenners.
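Spelling out that arithmetic as a quick sketch (the exchange rate here is an assumed spring-2012 figure; import duty and retailer margin come on top):

```python
# UK price estimate for the GTX 690, following the reasoning above.
# Assumption: roughly $1.64 to the pound in spring 2012.

usd_price = 999
usd_per_gbp = 1.64                 # assumed exchange rate
gbp_ex_tax = usd_price / usd_per_gbp

vat = gbp_ex_tax * 0.20            # UK VAT at 20% - the ~£120 mentioned above
with_vat = gbp_ex_tax + vat

print(f"Ex-tax:   £{gbp_ex_tax:.0f}")   # ~£609
print(f"With VAT: £{with_vat:.0f}")     # ~£731, before duty and margin
```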
sakzzz 30th April 2012, 12:22 Quote
Quote:
Originally Posted by guvnar
OK, so start saving then......
More like..start starving then.. :D
GoodBytes 30th April 2012, 12:25 Quote
Quote:
Originally Posted by rollo
People are still assuming GK110 will see a consumer release for gaming; I'm expecting it to be Quadro-only and never made into a gaming GPU.

And since when did AMD or Nvidia release two GPU updates in one year? 480 to 580 took close to 14 months, and gave as much performance gain as going from 580 to 680 - but the new chip saves a tonne of power. bit-tech has an Ivy Bridge + SLI 680 system in at 450-ish watts under full load, where the last 580 SLI system was closer to 800; that's a huge reduction in power and therefore heat.

The days when we saw 40-50% more performance are gone, I feel. Look at the CPU market: the i7-950 I bought 3-4 years back is still not holding games back, and even Ivy Bridge is only a small minimum-frame-rate upgrade over it (not worth the £400 it would cost).

This won't see the light of day for £600 over here - closer to £800, I feel. Still, it could be a useful investment for certain people.

GK110 is most likely reserved for a model between now and Maxwell. A new architecture takes a long time to develop, and Nvidia needs a card to fill that space or it will be late to the game, as it was before.
GK104 has been shown to beat AMD's best single-GPU offering by a nice enough margin, and now, possibly (we have to see reviews), its dual-GPU one too. So why release GK110 when the slower chip does the trick? If that all holds, Nvidia will keep GK110 back for a model between now and Maxwell, in case AMD releases a better GPU in the meantime (and they will). Nvidia can tweak and overclock GK110 if it's not fast enough, then start producing the chip.
Action_Parsnip 30th April 2012, 12:46 Quote
Quote:
Originally Posted by runadumb
Quote:
Originally Posted by maverik-sg1
Is it fair to say that unless the bridging chipset and or the drivers are sorted and properly supported - this thing will micro stutter like a biatch either now or when driver support is no longer important to nvidia?

In the year and a half I have been using SLI (two GTX 570s) I have never seen micro stutter. Not once, and that's with a good mixture of old and new games.

I have actually asked bit-tech a few times why they hate on SLI so much, but never got a response. It has given me zero issues, yet reading them talk about it makes you think it only works once in a blue moon. I have far more issues caused by running three displays, which can be a real chore... but it's mostly worth it in the end.

Valid enough question though. I heard life is not *too* rosy for GTX 295 owners, with SLI issues for some games - especially older titles - increasing as time goes by and new ForceWare drivers come through. And that's only GT200, which isn't all *that* old.
Dwarfer 30th April 2012, 13:25 Quote
WOW, that's a lot of cash - I can't wait to see some reviews :)
Marvin-HHGTTG 30th April 2012, 13:52 Quote
Yay - 2GB VRAM per GPU - that's going to go down a treat at 2560x1600 or above. Anyone who purchases such a card for 1920x1200 is ill-informed, and would do better to spend that on some nice monitors instead...
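For a sense of scale, the render targets themselves are only a modest slice of that 2GB; it's textures and other assets that pile up at high resolutions. A rough sketch (the buffer formats and counts are simplifying assumptions):

```python
# Rough VRAM cost of the render targets alone at various resolutions.
# Assumptions: 4-byte RGBA colour plus 4-byte depth/stencil per pixel,
# double buffered, scaled by MSAA sample count. Textures come on top.

def render_target_mb(width, height, msaa_samples=1, buffers=2):
    bytes_per_pixel = 4 + 4   # RGBA8 colour + D24S8 depth/stencil
    total = width * height * bytes_per_pixel * msaa_samples * buffers
    return total / (1024 ** 2)

for w, h in [(1920, 1200), (2560, 1600), (5760, 1200)]:
    print(f"{w}x{h} with 4xMSAA: {render_target_mb(w, h, 4):.0f} MB")
# Even 5760x1200 with 4xMSAA stays in the hundreds of megabytes - the
# rest of the 2GB per GPU goes on textures, geometry and driver overhead.
```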
Jezcentral 30th April 2012, 14:55 Quote
Nice to see such great hardware.

Now all we need is some software to run on it.....
Fizzban 30th April 2012, 15:13 Quote
Shouldn't it have 6GB of VRAM on it? That would make more sense on a card like this. Other than that, it looks gorgeous. A waste of money, as two 680s in SLI would be better, but beautiful nonetheless.
Gareth Halfacree 30th April 2012, 15:17 Quote
Quote:
Originally Posted by Fizzban
Shouldn't it have 6GB of VRAM on it? That would make more sense on a card like this.
It's already nearly $1,000, and you want 'em to stick another 2GB of GDDR5 on there? Blimey.
Quote:
Originally Posted by Fizzban
A waste of money, as two 680s in SLI would be better, but beautiful nonetheless.
Nvidia reckons it's exactly the same as two 680s, performance-wise, but we've no real figures yet to back that up.
roosauce 30th April 2012, 18:53 Quote
It should perform just a bit under 2x680s, due to the slightly lower clock speeds.

These dual GPU cards are good if you only have one x16 PCIe slot and you need the extra power, but I really dislike that half of the heat will be pushed straight into your case ...
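The clock-speed gap is easy to quantify. A quick sketch, taking the GTX 680's 1006MHz base and 1058MHz boost clocks from Nvidia's published specs (they aren't quoted in this article):

```python
# How far behind two GTX 680s should the GTX 690 be, on clocks alone?
# GTX 680 clocks (1006/1058MHz) are from Nvidia's specs, not this article.

gtx690_base, gtx690_boost = 915, 1019   # MHz
gtx680_base, gtx680_boost = 1006, 1058  # MHz

print(f"Base clock ratio:  {gtx690_base / gtx680_base:.1%}")    # ~91.0%
print(f"Boost clock ratio: {gtx690_boost / gtx680_boost:.1%}")  # ~96.3%
# Real-world results also depend on how well each game scales with SLI.
```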
Makaveli 1st May 2012, 02:54 Quote
Looks like a beast of a card but $1000....

umm ya no!
The_Beast 1st May 2012, 04:59 Quote
Can you tri-SLI it?

edit: Also, will it blend?
CAT-THE-FIFTH 1st May 2012, 10:06 Quote
OcUK confirmed the card as being around £800 IIRC.
Adnoctum 1st May 2012, 12:10 Quote
What a monumental waste of money.
There isn't a game that would need it.
runadumb 1st May 2012, 12:50 Quote
Quote:
Originally Posted by Adnoctum
What a monumental waste of money.
There isn't a game that would need it.

People running 3D or multiple displays need all the power they can muster. The market is small, I'm sure, but don't dismiss it because you don't have a use for it.
rollo 1st May 2012, 16:06 Quote
People on this forum could easily make use of two 680s in SLI or one 690.
erratum1 2nd May 2012, 16:08 Quote
Quote:
Originally Posted by Adnoctum
What a monumental waste of money.
There isn't a game that would need it.

No one needs a Ferrari or a Zonda........would be nice though, eh?
GoodBytes 2nd May 2012, 21:40 Quote
Announcement video:
http://kotaku.com/5907108/the-spectacular-live-debut-of-nvidias-999-dual+gpu-video-card

- The graphics card was designed to be overclockable
- In fact, there are windows on the heatsink for better airflow
- Designed to be very quiet (3,000RPM fan - I assume max speed)
- Illuminated GeForce GTX logo
GoodBytes 2nd May 2012, 21:50 Quote
Full keynote (30min)
-> Geforce GTX 690 announcement
-> Geforce Experience Service (free)
-> Hawken trailer
-> new Cryengine 3 demo
http://www.geforce.com/whats-new/articles/watch-live-streaming/
Adnoctum 3rd May 2012, 11:17 Quote
Quote:
Originally Posted by runadumb
People running 3D or multiple displays need all the power they can muster. The market is small, I'm sure, but don't dismiss it because you don't have a use for it.
Quote:
Originally Posted by erratum1
No one needs a Ferrari or a Zonda........would be nice though, eh?

I didn't dismiss it, I just said it was a monumental waste of money, just like the Zonda.
A Zonda is like a puppy at Christmas: Fun until you need to feed it, walk it, clean its anal gland and pick up its...leavings.
Give me a BMW M3 and a back seat I can have sex in (sans dog)...or at least swing an elbow in. Forever Alone :'(

There will always be those who are willing to burn cash turning a 5760x1200 3D prosthetic penis into a 2900 Btu/h space heater in pursuit of a few extra frames/units/???
They don't need a GTX690, they want it. Which is perfectly valid. It's still a monumental waste of money.

I'm an enthusiast gamer and system tinkerer - I could use every frame my system can give me. But there are limits to which I'm willing to indulge my upgrade itch.
GoodBytes 3rd May 2012, 14:34 Quote
Review:
http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/19

The GPU does deliver, and is just about as fast as dual GTX 680s (well, very, very close). It's a great overclocker, with no real compromises like previous dual-GPU-on-one-card designs. Even if a game doesn't like SLI, it's still a very capable card, performing like a single GTX 680.

Here are some charts from the review:
http://images.anandtech.com/graphs/graph5805/46283.png
http://images.anandtech.com/graphs/graph5805/46207.png
http://images.anandtech.com/graphs/graph5805/46209.png
http://images.anandtech.com/graphs/graph5805/46181.png
http://images.anandtech.com/graphs/graph5805/46182.png
http://images.anandtech.com/graphs/graph5805/46183.png
http://images.anandtech.com/graphs/graph5805/46176.png

There are more; check the link.
Mizugetsu 3rd May 2012, 21:43 Quote
GoodBytes 3rd May 2012, 22:02 Quote
Yea, but it's $1,000 for just the graphics card... you have to add the box and everything else :p
It's all part of the Nvidia Advantage Plan (TM).
(just pulling a stupid joke)

In all seriousness, it's because the GPU is brand new; wait a week or two and it should be appropriately priced. Then again, Nvidia quoted the U.S. "MSRP" price. Some models are always more expensive than others, and some are below Nvidia's price.
TehKrak3n 8th May 2012, 10:53 Quote
Spending £800 on a GPU and only getting 2GB per core seems silly