bit-tech.net

Nvidia G-Sync Review

Comments 1 to 21 of 21

Corky42 23rd December 2013, 14:47 Quote
Just think if the big three could actually work together and share some technological advancements. I can't help thinking all this proprietary stuff like Mantle, G-Sync, etc. is holding back widespread adoption of some advancements, to the detriment of us end users.
ZeDestructor 23rd December 2013, 15:01 Quote
Quote:
Originally Posted by Corky42
Just think if the big three could actually work together and share some technological advancements. I can't help thinking all this proprietary stuff like Mantle, G-Sync, etc. is holding back widespread adoption of some advancements, to the detriment of us end users.

AMD claims Mantle is open, but I have yet to see any docs, so as far as I'm concerned it's proprietary. Plus it will be very heavily optimised for AMD GPUs, in its first incarnation at least, so it will take time for Nvidia to catch up. Add to that the fact that Nvidia is very, very confident in its OpenGL implementations, and you can see why they have been generally apathetic so far.

G-Sync, while proprietary to Nvidia right now, should be fairly easy for AMD to implement as well should Nvidia open the spec: it uses a standard DisplayPort interface together with an off-the-shelf FPGA (running admittedly very custom logic) in the monitor. I do believe Nvidia will license it fairly soon, much like it eventually licensed SLI to non-Nvidia chipsets back in the day. If that fails, AMD will implement their own version, and then, after a generation or two of feet-dragging, they'll make it a damn standard and we'll all move on.

All in all, there's nothing so proprietary that it can't be opened up right now. Just give them time to make money off exclusivity and then watch the dust settle.
AlienwareAndy 23rd December 2013, 15:30 Quote
Quote:
Originally Posted by ZeDestructor
AMD claims Mantle is open, but I have yet to see any docs, so as far as I'm concerned it's proprietary. Plus it will be very heavily optimised for AMD GPUs, in its first incarnation at least, so it will take time for Nvidia to catch up. Add to that the fact that Nvidia is very, very confident in its OpenGL implementations, and you can see why they have been generally apathetic so far.

G-Sync, while proprietary to Nvidia right now, should be fairly easy for AMD to implement as well should Nvidia open the spec: it uses a standard DisplayPort interface together with an off-the-shelf FPGA (running admittedly very custom logic) in the monitor. I do believe Nvidia will license it fairly soon, much like it eventually licensed SLI to non-Nvidia chipsets back in the day. If that fails, AMD will implement their own version, and then, after a generation or two of feet-dragging, they'll make it a damn standard and we'll all move on.

All in all, there's nothing so proprietary that it can't be opened up right now. Just give them time to make money off exclusivity and then watch the dust settle.

Nvidia will never open up anything. They claw onto these little tech tidbits with their corporate grip.

They could have opened PhysX ages ago, but no. It's a shame, because PhysX could have been so much more than just a game every year or so. AMD opened up TressFX; heck, it even ran better on my GTX 670s in SLI than on my 7990.

They're also assholes when it comes to 3D Vision. I bought a TV recently that does Active 3D. Wired it up to my 670 SLI rig all excited, then found out they want $30 for the privilege of using it on my bloody telly.

So I downloaded the demo and it was locked to 26 FPS (even the full version is). So instead of opening up 3D Vision and letting companies like, for example, Toshiba (who make my set) implement it for the full 60Hz, they instead charge $30 for it and it's derped.

The only thing Nvidia open up is Jen-Hsun Huang's asshole so he can fart once in a while.

It's a shame really, because if they were less focused on greed, these silly little tech tidbits they come out with once in a while could actually be something good, instead of something you have to pay for (hint - you don't bother).
Deders 23rd December 2013, 16:04 Quote
What I want to know is whether it's going to be any smoother than simply enabling triple buffering (for free via D3DOverrider, which I realise only seems to work on 32-bit exes), and whether the difference would be worth the £100 plus the cost of a new monitor that may have worse attributes than the one I already own.

I monitor the frametimes, amongst other things, in game with RivaTuner Statistics Server, and from what I can see D3DOverrider does the same job.

I can see how this may benefit online gamers with a twitch style of playing when compared to triple buffering, if it is just as responsive as not having V-sync enabled.
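
For a rough sense of the difference being asked about here, a toy model (nothing from the review; the function names, the 60Hz panel, the 144Hz cap and the sample render times are all assumptions picked for illustration). With any V-synced mode, double or triple buffered, a finished frame still waits for the next fixed refresh tick; a variable-refresh display scans it out as soon as it's ready. Triple buffering's real benefit is that the GPU never stalls, so the frame shown at each tick is the freshest one completed, but the tick itself doesn't move.

import math

REFRESH_MS = 1000 / 60       # fixed refresh interval at 60Hz (~16.7ms)
MIN_REFRESH_MS = 1000 / 144  # fastest the panel can refresh (~6.9ms)

def vsynced(render_ms):
    # Any V-synced mode: the finished frame waits for the next refresh boundary.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def variable_refresh(render_ms):
    # G-Sync-style: the panel refreshes as soon as the frame is ready,
    # limited only by its maximum refresh rate.
    return max(render_ms, MIN_REFRESH_MS)

for render_ms in (10, 20, 25, 40):
    print(f"{render_ms:>2}ms frame -> V-sync/triple buffer: {vsynced(render_ms):5.1f}ms on screen, "
          f"variable refresh: {variable_refresh(render_ms):5.1f}ms on screen")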
law99 23rd December 2013, 16:06 Quote
I guess this is something we'll need to use to really appreciate. Mostly because I look at the V-sync-on video and think it looks good enough. Same as lots of things, I suppose. Thanks for the write-up.

Really would be a shame if they can't let AMD in on this one. Unless there is going to be a pluggable adapter for the relevant cards in an easy-to-access port on supporting monitors. Who knows. Companies want to make money... it is the point. :s
AlienwareAndy 23rd December 2013, 16:09 Quote
Quote:
Originally Posted by law99
I guess this is something we'll need to pay out the arse for to really appreciate.

FTFY
Maki role 23rd December 2013, 16:52 Quote
To be honest, at only 1080p this was never going to do much. Whilst it's currently very much the mainstream resolution, buyers in that budget sector certainly aren't the target for this. This is especially the case given how strong current cards are when running at 1080p; it's fast becoming an old standard at the high end of the market.

However, this would be so useful for 4K and other very high resolution setups. There, even top-end cards can be crippled by newer games, especially in multi-GPU setups where the frame rate can be very variable. Adding another £100 to a 4K display is neither here nor there if the experience improves as much as described (given how 30-60fps will be far more common, and how random dips can occur with SLI/CrossFire).

Adding £100 to the monitor is much cheaper than adding a third 780/Titan/780Ti or 290/290x, plus it'll remain relevant hopefully for quite a while. Not to mention I would imagine the technology will become much cheaper with greater adoption down the line.
edzieba 23rd December 2013, 18:22 Quote
The hard part about G-Sync isn't the protocol (it fiddles with the VBLANK interval, whoopee), but creating the display panel controller hardware. As long as AMD aren't totally inflexible with their display driver design (and by this I mean 'the chips that drive the physical display outputs', not a software driver), they shouldn't have too much trouble implementing G-Sync in their next cards. Creating monitors that can actually do anything with the signal? That's the hard part, and where Nvidia have put their research.

Once the controller is mated with one of those 2560x1440 IPS panel bare-bones monitors I'll be snapping one up. My main beef with it is that the current iteration is designed solely to drive LVDS panels, which rules out its use in the first consumer iteration of the Oculus Rift, which will have to use a MIPI DSI panel because nobody makes small high-DPI LVDS panels.
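
For anyone wondering what "fiddles with the VBLANK interval" amounts to in practice, here's a very rough sketch in Python rather than real driver or controller logic. Everything in it is an illustrative assumption: a panel that must be refreshed at least every 33.3ms (a 30Hz floor), can't refresh more often than every 6.9ms (a 144Hz ceiling), and three caller-supplied callbacks standing in for the actual scanout hardware. The point is simply that the refresh becomes a consequence of frame delivery rather than of a fixed clock, and coping with that on the panel side is the hard part.

import time

MIN_INTERVAL = 1 / 144   # seconds: can't refresh faster than the panel allows
MAX_INTERVAL = 1 / 30    # seconds: can't hold the blanking interval open longer than this

def scanout_loop(frame_ready, send_frame, repeat_last_frame):
    # frame_ready(), send_frame() and repeat_last_frame() are placeholders for
    # whatever the real controller does; this loop only models the timing.
    last_refresh = time.monotonic()
    while True:
        now = time.monotonic()
        if frame_ready() and now - last_refresh >= MIN_INTERVAL:
            send_frame()            # end VBLANK early and scan out the new frame
            last_refresh = time.monotonic()
        elif now - last_refresh >= MAX_INTERVAL:
            repeat_last_frame()     # panel has to be refreshed: re-send the old frame
            last_refresh = time.monotonic()
        else:
            time.sleep(0.0005)      # otherwise keep stretching the blanking interval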
ch424 23rd December 2013, 19:00 Quote
Quote:
Although this monitor is capable of running at 144Hz, to keep our testing more general and realistic we did most of our testing at 60Hz.

I'd have thought it's more interesting to compare G-Sync to a 144Hz standard monitor with V-sync enabled, no? In both cases you have to spend a lot of money, so it would be interesting to see if G-Sync makes gameplay significantly better than a regular 144Hz monitor. Did you guys play with that at all?
Meanmotion 23rd December 2013, 19:25 Quote
Quote:
Originally Posted by ch424
Quote:
Although this monitor is capable of running at 144Hz, to keep our testing more general and realistic we did most of our testing at 60Hz.

I'd have thought it's more interesting to compare G-Sync to a 144Hz standard monitor with V-sync enabled, no? In both cases you have to spend a lot of money, so it would be interesting to see if G-Sync makes gameplay significantly better than a regular 144Hz monitor. Did you guys play with that at all?

We've admittedly done only limited testing at 144Hz and will be adding a few more thoughts on that scenario shortly. But we chose 60Hz as the main point of comparison because the vast majority of monitors run at that frequency, and particularly because all IPS monitors do. This simply gives a better overall impression of what the technology will do for most users. Spending most of our time looking at the niche of 144Hz monitors would give a very skewed sense of the value of the technology for most people.
ch424 24th December 2013, 00:46 Quote
Sorry, I didn't quite make my point clear - if "most people" have a standard 60Hz screen now, should they pay the little extra to upgrade to 144Hz, or should they pay even more to get G-Sync? Is it worth the difference?
Meanmotion 24th December 2013, 12:01 Quote
Quote:
Originally Posted by ch424
Sorry, I didn't quite make my point clear - if "most people" have a standard 60Hz screen now, should they pay the little extra to upgrade to 144Hz, or should they pay even more to get G-Sync? Is it worth the difference?

Ah, I see what you're getting at. The logic there was that 144Hz (and other high refresh rate) monitors have been around for some time, so although it's not something bit-tech has covered extensively, the arguments for and against have long since been made. What are those arguments? Well, TN gives you high refresh rates and fast response times, which are beneficial in competitive gaming, but poorer image quality compared to IPS panels. We're generally of the opinion that going with quality is better overall.
GregTheRotter 24th December 2013, 12:18 Quote
Anyone know where the kits will be sold and for how much?
Yslen 24th December 2013, 13:09 Quote
I think this review covers most of the points nicely: an expensive monitor, potentially an expensive GPU upgrade to see results, and you give up the benefits you'd get from a similarly priced or cheaper monitor, be it 2560x1440, IPS or whatever.

I'm interested in G-sync, but give it a few years, I think.
Bede 27th December 2013, 12:19 Quote
Forgive my ignorance on screen tech, but I find it astonishing that VBLANK is still a thing for non-CRT screens. Anyone know why it still exists?
Cthippo 27th December 2013, 21:07 Quote
Quote:
Originally Posted by Bede
Forgive my ignorance on screen tech, but I find it astonishing that VBLANK is still a thing for non-CRT screens. Anyone know why it still exists?

Probably because so many monitors still have VGA ports and once something gets into a standard it's impossible to get rid of it.

In this case, it turned out to be usable for something else, so win!
iggy 31st December 2013, 20:24 Quote
Not interested in anything less than 120Hz. Just figured out how to get the projector running at full whack; now it's just like the old CRT days: nice lag-free, non-jittery 60fps+ gaming. Let's consign the 60Hz nonsense to the consoles where it belongs, eh?
alialias 3rd January 2014, 11:11 Quote
Seems like a logical move to me; the vast majority of things in computers are synchronised by clocks, so why not the monitor too?
ZeDestructor 3rd January 2014, 11:24 Quote
Quote:
Originally Posted by alialias
Seems like a logical move to me; the vast majority of things in computers are synchronised by clocks, so why not the monitor too?

This is actually the opposite: rather than the monitor running on its own fixed clock, Nvidia is varying the refresh rate (clock) of the panel to follow the GPU's frame delivery!
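
To put rough numbers on that: at a fixed 60Hz the panel refreshes every 16.7ms, so a frame that finishes rendering 20ms after the last refresh sits in the buffer until the 33.3ms tick (or tears if V-sync is off). With the refresh slaved to frame completion it can be scanned out at roughly the 20ms mark instead, as long as that falls inside the panel's supported range (roughly 30Hz to 144Hz on this first G-Sync monitor, if I recall the spec correctly).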
Corky42 7th January 2014, 11:43 Quote
Is this AMD's answer to G-Sync?
http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
Quote:
AMD has been relatively silent on the topic of NVIDIA’s variable refresh rate G-Sync technology since its announcement last year. At this year’s CES however, AMD gave me a short demo of its version of the technology.
ZeDestructor 8th January 2014, 07:55 Quote
Quote:
Originally Posted by Corky42
Is this AMD's answer to G-Sync?
I have a sneaking suspicion it may not work very well for games (increased latency)... I mean, why else would they NOT use something that's been in the spec for a few years?