bit-tech.net

DisplayPort gets G-Sync-like Adaptive-Sync tech

The VESA group has announced the inclusion of Adaptive-Sync into the DisplayPort 1.2a standard, offering a rival to Nvidia's G-Sync technology.

The Video Electronics Standards Association (VESA) has confirmed that Adaptive-Sync is now an official feature of the DisplayPort 1.2a video interface standard for external monitors, promising that products including the feature will be appearing on the market soon.

Adaptive-Sync, added to the Embedded DisplayPort (eDP) standard in 2009 but missing from the full-fat DisplayPort standard until now, is equivalent to Nvidia's G-Sync technology: the refresh rate of the connected display is dynamically altered to match the content being displayed. For gamers, it means smooth motion with no tearing or stuttering; for those viewing more static content such as web pages or documents, it means reduced power draw - hence its inclusion in eDP for mobile devices to use. It's related to AMD's FreeSync, which was donated to the VESA group several years before the technology was made public.
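
The benefit for gamers is easy to picture with a toy simulation (a made-up model for illustration, not any real display or driver API; the frame times and names are assumptions): frames finish rendering at irregular times, and on a fixed-refresh panel with vsync each finished frame must wait for the next scan-out boundary, while an adaptive-refresh panel scans out as soon as the frame is ready.

```python
import math
import random

random.seed(1)

REFRESH_HZ = 60
VSYNC = 1.0 / REFRESH_HZ  # fixed-refresh scan-out interval, in seconds

# Hypothetical game rendering at roughly 45 fps with jitter:
# frames finish at irregular times.
t = 0.0
frame_ready = []
for _ in range(1000):
    t += random.uniform(0.018, 0.026)
    frame_ready.append(t)

# Fixed refresh with vsync: a finished frame waits for the next
# scan-out boundary, adding latency and uneven pacing (stutter).
fixed_wait = [math.ceil(r / VSYNC) * VSYNC - r for r in frame_ready]

# Adaptive refresh: the display scans out when the frame is ready
# (within the panel's supported range), so the added wait is zero.
adaptive_wait = [0.0] * len(frame_ready)

print(f"avg added wait, fixed 60 Hz: {sum(fixed_wait) / len(fixed_wait) * 1000:.2f} ms")
print(f"avg added wait, adaptive   : {sum(adaptive_wait) / len(adaptive_wait) * 1000:.2f} ms")
```

The same zero-wait behaviour is what lets a mostly static desktop drop to a very low refresh rate and save power, which is why the technique appeared in eDP first.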

'DisplayPort Adaptive-Sync enables a new approach in display refresh technology,' claimed Syed Athar Hussain, Display Domain Architect at Nvidia rival AMD and VESA board vice chair. 'Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user’s content, enabling power efficient transport over the display link and a fluid, low-latency visual experience.'

Being a formal part of the standard, DisplayPort Adaptive-Sync - as the technology is now known - will come with a full suite of compliance testing and a certification system to guarantee compatibility between DisplayPort Adaptive-Sync graphics chips and external displays. The technology is also being offered to all VESA members - including Nvidia - free of charge, potentially helping to boost adoption of the technology above and beyond Nvidia's proprietary equivalent.

Although VESA has indicated that DisplayPort Adaptive-Sync is ready to deploy, neither the group itself nor its member companies have yet indicated when we'll be seeing the first official products hitting shop shelves.

19 Comments

Corky42 13th May 2014, 13:17 Quote
Forgive the ignorance, but is Adaptive-Sync identical to Nvidia's G-Sync?
If so, I guess that means both the GPU and monitor need to support 1.2a.
Reading some more, it seems it will work with AMD FreeSync cards; do all AMD cards support FreeSync?
[USRF]Obiwan 13th May 2014, 13:56 Quote
To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.

The first discrete GPUs compatible with Project FreeSync are the AMD Radeon™ R9 290X, R9 290, R7 260X and R7 260 graphics cards. Project FreeSync is also compatible with AMD APUs codenamed “Kabini,” “Temash,” “Beema,” and “Mullins.” All compatible products must be connected via DisplayPort™ to a display that supports DisplayPort™ Adaptive-Sync.

AMD has undertaken every necessary effort to enable Project FreeSync in the display ecosystem. Monitor vendors are now integrating the DisplayPort™ Adaptive-Sync specification and productizing compatible displays. AMD is working closely with these vendors to bring products to market, and we expect compatible monitors within 6-12 months.

Taken from the forbes.com website.
DbD 13th May 2014, 14:51 Quote
Quote:
Originally Posted by [USRF]Obiwan
Marketing blah...

Taken from the forbes.com website.

Remember this is just to enable the variable frame syncing stuff over DisplayPort; it already works on plenty of laptop displays. If FreeSync is good to go and just requires compatible monitors, then why don't all those laptop users get to use it? It was telling that the only demo we've ever seen of this was a video basically running at a fixed fps (if a non-standard one), not real games.

The Nvidia solution has an add-in chip with a big memory buffer in the monitor. It may well be that, firstly, enabling this basic variable-sync support doesn't equal the sort of control the Nvidia chip gets, as it replaces the controlling chip in the monitor itself; and secondly, that the buffering and processing the Nvidia chip does can't just be done by any standard GPU - you might need a new range of GPUs with a little sync-control chip in them.
theshadow2001 13th May 2014, 15:35 Quote
My question is: will Nvidia adopt this, or will they insist on you using a G-Sync-enabled monitor?

Can this do away with G-Sync entirely, or, as was pointed out above, does G-Sync offer more?
ch424 14th May 2014, 00:50 Quote
This is exactly what I thought would happen.

G-Sync is massively over-engineered. It looks like Nvidia are fully aware of this, because their G-Sync board is based on a (very expensive!) FPGA and they've not moved to a custom ASIC or worked with TI/NXP to produce it for them.

Building a flexible refresh rate into the DisplayPort spec will hopefully mean we all get it soon enough - on AMD and Intel's integrated graphics too, even if Nvidia snub it!
Corky42 14th May 2014, 07:26 Quote
Quote:
Originally Posted by ch424
Building a flexible refresh rate into the DisplayPort spec will hopefully mean we all get it soon enough - on AMD and Intel's integrated graphics too, even if Nvidia snub it!

It's very unlikely that Nvidia will snub it, seeing as they are one of the members of VESA, who are responsible for the DisplayPort standard, and IIRC Nvidia said that one of the reasons for G-Sync was to push the adoption of a variable refresh rate standard.

Display vendors need something to kick them into actually making better displays.
http://techreport.com/news/26451/freesync-approved-adaptive-sync-added-to-displayport-spec
Quote:
The makers of display scaler and control chips still have work to do, and those companies have been rather sluggish in developing ASICs capable of supporting 4K resolutions at 60Hz. (That's why a lot of the early 4K displays use dual tiles, which is way less than optimal.)
Also of note in the TechReport article: they expect it will be 6-12 months before we can get our grubby little mitts on the new Adaptive-Sync monitors >:(
jb0 15th May 2014, 07:50 Quote
Quote:
Originally Posted by DbD

The nvidia solution has an add in chip with a big memory buffer in the monitor. It may well be that firstly the enabling of this basic variable syncing support doesn't equal the sort of control that the nvidia chip gets as it replaces controlling chip in the monitor itself, and secondly the buffer and the processing that nvidia chip does can't just be done by any standard gpu - you might need a new range of gpu's with a little sync control chip in them.
Actually, thinking about it, it makes SENSE that Nvidia needed a whole bunch of RAM, since they WERE replacing the entire existing control circuit.
I doubt they really needed as much as they had, but... it was likely easier to get an FPGA with the capabilities they needed and a gig of RAM than one with the capabilities they needed and exactly the right amount of RAM.
(And why are they still using an FPGA? Haven't they had more than enough time to lay out a masked ASIC for a few of the more common panels by now?)


But once Adaptive-Sync is in the marketplace, you SHOULD see exactly the kind of control Nvidia offers with G-Sync, and without sacrificing all the existing functionality of your monitor.
I don't believe there was ever a good reason to push a proprietary technology to do something there was already a VESA spec for, other than maximizing profits, and I'm glad that AMD and VESA are shutting them down on that front, especially with how long G-Sync has taken to come to market.
Corky42 15th May 2014, 11:28 Quote
Quote:
Originally Posted by jb0
I don't believe there was ever a good reason to push a proprietary technology to do something there was already a VESA spec for other than maximizing profits, and I'm glad that AMD and VESA are shutting them down on that front, especially with how long GSync has taken to come to market.

I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions.

Reading between the lines, it seems to me that Nvidia used G-Sync to show the companies responsible for developing ASICs how it can, or should, be done on the hardware front. At the same time, AMD showed the other board members of VESA how to implement it in a new display standard, and made the software freely available for everyone to use.

Together they have shown the other board members of VESA, and the companies responsible for developing the ASICs, that there is demand for variable frame rates beyond just using them to save battery life in portable devices.
Saivert 22nd July 2014, 19:37 Quote
G-Sync was basically the proof of concept that the world needed.
theshadow2001 22nd July 2014, 23:21 Quote
Quote:
Originally Posted by Corky42
I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions.

Reading between the lines, it seems to me that Nvidia used G-Sync to show the companies responsible for developing ASICs how it can, or should, be done on the hardware front. At the same time, AMD showed the other board members of VESA how to implement it in a new display standard, and made the software freely available for everyone to use.

Together they have shown the other board members of VESA, and the companies responsible for developing the ASICs, that there is demand for variable frame rates beyond just using them to save battery life in portable devices.

If you were a business, why would you not want your device in the back of every monitor?

Nvidia didn't show anyone anything. It's not like they licensed the G-Sync module or the FPGA source code for a third party to build (in an ARM-like fashion).

Nvidia were the first to create a fuss over variable frame rates, and AMD followed suit by realising they could modify existing tech to do something similar.

If either company could go it alone, they would. I mean, come on. They're businesses. Their sole purpose is to make money. Any benefits to anyone else are merely coincidental.

Just like PhysX and Mantle, one side is not interested in the other's technology beyond its competitive threat.

There's no way this was some sort of team effort to introduce variable frame rate graphics.
Corky42 23rd July 2014, 08:49 Quote
I wouldn't necessarily want my device in the back of every monitor if I could get the companies that already make those devices to adopt a new technology, as I have been asking them to do for the last 5-10 years.

Nvidia showed that there's demand for variable frame rates beyond just using them to save battery life. They demonstrated to the current manufacturers of LCD controller boards that they need to actually produce new products, and not just use the same ASICs that they have been using for the last 10-15 years, expecting everyone to make do with what they are given - fixed refresh rates, no true 4K, relying on upscaling instead.

Both AMD and Nvidia had been working on variable frame rates long before we got to hear about it; they had been telling the manufacturers of LCD controller boards for years that it was something they wanted, but the makers of the ASICs wouldn't play ball. Much in the same way, developers had been telling Microsoft for years that they wanted changes in DirectX but were ignored, so AMD developed and released their own API that provided what they had been asking for.

You are correct that both companies' sole purpose is to make money, but making money is very difficult when other companies you rely on refuse to innovate, instead churning out the same old tech because it costs money to develop new ASICs, or to develop a new DirectX.

Why would anyone want to buy your latest and greatest graphics card if what it plugs into isn't able to show it off to its full potential? Why would anyone buy your graphics cards if they are being crippled by a bad API?

EDIT: I found this article from The Tech Report an interesting read on the subject, and if you have the time they also wrote another article from the other side of the fence.
theshadow2001 26th July 2014, 15:31 Quote
Right, so instead of selling/licensing something useful (and easy to integrate with your own components) to companies who can't, or as you suggest won't, develop it, you would rather they develop something similar themselves, which would force you to work to their standard rather than the other way around (which is more difficult for you), and you make absolutely nothing from it? Sounds like an excellent business plan ;)

Bottom line, it's going to be less work and more profitable for you to create something for all monitor manufacturers to put in their monitors than it is to have 10 different monitor manufacturers come up with 10 different ways of doing the same thing and have to comply with all of them.

Who's been asking for variable refresh rates for the last 10 years?

How do you know either side has been asking display manufacturers for variable refresh rates for years? How do you know the monitor companies refused to innovate? Which monitor companies specifically are you talking about?
Corky42 26th July 2014, 16:32 Quote
Quote:
Originally Posted by theshadow2001
Right, so instead of selling/licensing something useful (and easy to integrate with your own components) to companies who can't or as you suggest won't develop it. You would rather they develop something similar themselves. Which would force you to work to their standard rather than the other way around (which is more difficult for you) and you make absolutely nothing from it? Sounds like an excellent business plan ;)

Maybe I missed something, but has Nvidia said it's not going to license G-Sync at any time in the future?
Quote:
Originally Posted by theshadow2001
Bottom line its going to be less work and more profitable for you to create something for all monitor manufacturers to put in their monitors, than it is to have 10 different monitor manufactures come up with 10 different ways of doing the same thing and you have to comply with all of them.

Good job on explaining what G-Sync is: a module that monitor manufacturers put in their monitors and pay Nvidia for the privilege of doing so. ;)
Quote:
Originally Posted by theshadow2001
Who's been asking for variable refresh rate for the last 10 years?
Well, first off, I said 5-10 years, although granted that was a guesstimate on my behalf.
I based that guesstimate on the fact that LCDs have been using fixed refresh rates that are a throwback to the days of CRTs while laptops have not, and on the fact that it takes time to develop something like a G-Sync module.

Take my guesstimate or leave it; either way, these things don't happen overnight and probably wouldn't be undertaken unless you had no other choice (and had been asking for a good length of time).
Quote:
Originally Posted by theshadow2001
How do you know either side have been asking display manufacturers for variable refresh rates for years?

I don't know for sure, but do you think Nvidia and AMD would invest the money and time in developing something if they didn't have to?
Quote:
Originally Posted by theshadow2001
How do you know the monitor companies refused to innovate?

I'm guessing by your questions that you didn't read the links I provided.
Quote:
The makers of display scaler and control chips still have work to do, and those companies have been rather sluggish in developing ASICs capable of supporting 4K resolutions at 60Hz. (That's why a lot of the early 4K displays use dual tiles, which is way less than optimal.)
Quote:
Originally Posted by theshadow2001
Which monitor companies specifically are you talking about?

Monitor companies don't make everything inside the monitor, you know; look towards the companies that make the scaler and ASIC chips.
theshadow2001 26th July 2014, 17:06 Quote
Quote:
Originally Posted by Corky42
Maybe i missed something but has NVidia said it's not going to license G-Sync at anytime in the future ?
Whoever puts a G-Sync chip in the monitor is going to have to pay Nvidia, either for the chip or some sort of license fee. There's a reason those monitors are more expensive.

Quote:
Originally Posted by Corky42

Good job on explaining what G-Sync is. A module that monitor manufacturers put in their monitors and pay NVidia for the privilege of doing so. ;)
I was explaining why your idea of getting ASIC or monitor companies to develop variable refresh rate technology for you, instead of doing it yourself, is not necessarily a good idea. It's less profitable and makes life more complicated for you. Which all goes back to your point that neither company intended to go it alone, when it makes sense to go it alone because it can be more profitable and less complex.
Quote:
Originally Posted by Corky42

Well, first off, I said 5-10 years, although granted that was a guesstimate on my behalf.
I based that guesstimate on the fact that LCDs have been using fixed refresh rates that are a throwback to the days of CRTs while laptops have not, and on the fact that it takes time to develop something like a G-Sync module.

Take my guesstimate or leave it; either way, these things don't happen overnight and probably wouldn't be undertaken unless you had no other choice (and had been asking for a good length of time).
Grand, I'll leave it.

Quote:
Originally Posted by Corky42

I don't know for sure, but do you think Nvidia and AMD would invest the money and time in developing something if they didn't have to?
No one would invest time and money if they didn't envisage a payback from it.
Quote:
Originally Posted by Corky42

I'm guessing by your questions that you didn't read the links I provided.
I read both of them when they were released. I view The Tech Report daily. Neither article was pertinent to your points. There are now 4K 60Hz panels, so someone somewhere must be innovating.


Quote:
Originally Posted by Corky42

Monitor companies don't make everything inside the monitor, you know; look towards the companies that make the scaler and ASIC chips.
No problem, just replace my question with monitor companies and their suppliers.
Corky42 26th July 2014, 17:35 Quote
Quote:
Originally Posted by theshadow2001
Whoever puts a G-Sync chip in the monitor is going to have to pay Nvidia, either for the chip or some sort of license fee. There's a reason those monitors are more expensive.

Sorry, but wasn't it you that said what kind of business model is that?
Quote:
Originally Posted by theshadow2001
I was explaining why your idea of getting ASIC or monitor companies to develop variable refresh rate technology for you, instead of doing it yourself, is not necessarily a good idea. It's less profitable and makes life more complicated for you. Which all goes back to your point that neither company intended to go it alone, when it makes sense to go it alone because it can be more profitable and less complex.

IIRC, I didn't say either company intended to go it alone; in fact, I think I said "I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions." :?
Quote:
Originally Posted by theshadow2001
Grand, I'll leave it.

Your choice; I was merely putting forward my theory. You are welcome to come up with your own and share it with us.
Quote:
Originally Posted by theshadow2001
No one would invest time and money if they didn't envisage a payback from it.

But wasn't it you that said "If you were a business, why would you not want your device in the back of every monitor?" Ohh yea you did, right here.
Quote:
Originally Posted by theshadow2001
I read both of them when they were released. I view The Tech Report daily. Neither article was pertinent to your points. There are now 4K 60Hz panels, so someone somewhere must be innovating.

You mean the point that I made - that the makers of display scaler and control chips have been rather sluggish in developing ASICs - as I quoted from the article?
Quote:
Originally Posted by theshadow2001
No problem, just replace my question with monitor companies and their suppliers.

Then maybe you need to e-mail Scott Wasson and ask him which companies he was referring to when he said...

"The makers of display scaler and control chips still have work to do, and those companies have been rather sluggish in developing ASICs capable of supporting 4K resolutions at 60Hz. (That's why a lot of the early 4K displays use dual tiles, which is way less than optimal.)"
theshadow2001 26th July 2014, 18:33 Quote
Quote:
Originally Posted by Corky42
Sorry, but wasn't it you that said what kind of business model is that?


You said you wouldn't want your chip in the back of every monitor. Everything I have written is based on why I think that notion is bad for business. And because having your chip in every machine is good for business, it also makes sense that you would want to go it alone as a company and not be part of some AMD/Nvidia super-team to introduce variable refresh rates.
Quote:
Originally Posted by theshadow2001

I was explaining why your idea of getting ASICs or monitor companies to develop a variable refresh rate technologies for you instead of doing it yourself is not necessarily a good idea. Its less profitable and makes life more complicated for you. Which all goes back to your point on neither company intended to go it alone. When it makes sense to go it alone because it can be more profitable and less complex.
Quote:
Originally Posted by Corky42

IIRC, I didn't say either company intended to go it alone; in fact, I think I said "I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions." :?


Yes, that is what you said. The quotation of me above is pointing to the fact that this is what you said. :(
Quote:
Originally Posted by Corky42

But wasn't it you that said "If you were a business, why would you not want your device in the back of every monitor?" Ohh yea you did, right here.
Yes I did, because it's a good idea if they can pull it off. They invested that money because they expect to see a payback from it.
Quote:
Originally Posted by Corky42

You mean the point that I made - that the makers of display scaler and control chips have been rather sluggish in developing ASICs - as I quoted from the article?
You said they refused to innovate, and in the context of variable refresh rates as well. Nothing we are talking about is relevant to 4K; I don't even know why you are bringing details of 4K development into a discussion about variable frame rates. :? Being slow to develop is not refusing to innovate, by the way.
Corky42 26th July 2014, 19:14 Quote
Quote:
Originally Posted by theshadow2001
You said you wouldn't want your chip in the back of every monitor. Everything I have written is based on why I think that notion is bad for business. And because having your chip in every machine is good for business, it also makes sense that you would want to go it alone as a company and not be part of some AMD/Nvidia super-team to introduce variable refresh rates.

I believe what I said was "I wouldn't necessarily want my device in the back of every monitor if I could get the companies that already make those devices to adopt a new technology." As in, I wouldn't necessarily want to make them myself; I would want to license the tech to the companies that do make them, once I had shown them there is a market for it, or for them to develop it themselves.
Quote:
Originally Posted by theshadow2001
Yes, that is what you said. The quotation of me above is pointing to the fact that this is what you said. :(
If it makes sense to go it alone because it can be more profitable and less complex, how do you explain AMD providing FreeSync for free? How do you explain that it would have cost a lot of money to develop both G-Sync and FreeSync? How do you explain that G-Sync is arguably a complex solution?
Quote:
Originally Posted by theshadow2001
Yes I did, because its a good idea if they can pull it off. They invested that money because they will expect to see a pay back from it.

But what is better is either to get the companies that make the chips to invest in developing the tech themselves (no risk involved), or to develop it yourself and license it to said companies (moderate risk).
The highest risk (imho) is to not only develop it yourself but also make it yourself, when you have little experience and would be competing against companies with years of experience, established contracts, infrastructure, etc.
Quote:
Originally Posted by theshadow2001
You said they refused to innovate and in the context of variable refresh rates as well. Nothing we are talking about is relevant to 4k. I don't even know why you are bringing details of 4k development into a discussion about variable frame rates. :? Being slow to develop is not refusing to innovate by the way.

So being rather sluggish in developing ASICs for 4K doesn't indicate to you that they would be even slower to develop variable refresh rates? The former has a target audience of billions (every display in the world - monitors, TVs); the latter has a target audience of at most a few million gamers.
theshadow2001 27th July 2014, 17:38 Quote
Quote:
Originally Posted by Corky42
I believe what I said was "I wouldn't necessarily want my device in the back of every monitor if I could get the companies that already make those devices to adopt a new technology." As in, I wouldn't necessarily want to make them myself; I would want to license the tech to the companies that do make them, once I had shown them there is a market for it, or for them to develop it themselves.
The first statement appears to me to say you want someone else to develop everything; the second appears to say only that you don't want to manufacture anything.
Quote:
Originally Posted by Corky42

If it makes sense to go it alone because it can be more profitable and less complex, how do you explain AMD providing FreeSync for free? How do you explain that it would have cost a lot of money to develop both G-Sync and FreeSync? How do you explain that G-Sync is arguably a complex solution?

All easily explained. AMD had already developed FreeSync, just not for a gaming application; it was for power saving in laptops. The pay-off for its development was the inclusion of their graphics processors in laptops because of the reduced power consumption it offered. It utilised a component of the embedded VESA standard. Nvidia came out with a model that, in their eyes, is the best technical solution to providing variable frame rates. Nvidia were first out of the gate here with variable frame rates applied to gaming.

The quickest way for AMD to react to that was to modify the laptop processors to do the same thing. Given how quickly they had their laptop demo available after the G-Sync unveil, it probably wasn't the most challenging of modifications. They simply asked VESA to add the embedded feature to the next full DisplayPort iteration, which is probably fairly straightforward to do. It's too late for AMD to do anything else; they are too far behind Nvidia to create a similar type of solution, and Nvidia already have large monitor brands on board.

AMD have taken the less profitable route here, but it was their only choice. They have to react quickly or they will start losing customers, especially on something as important as this.

I take it you meant to ask how G-Sync is an arguably less complex solution? For any given system it's always easier to build it yourself from the ground up, rather than trying to work with other vendors, agreeing standards, making compromises you may not want to make, etc. Only when you don't have the resources, either technical or physical, to complete an element of a system is it worth involving a third party. Nvidia have the technical and physical resources to complete G-Sync on their own.

If you have made the full ecosystem from start to finish, in the long run it will make future development easier. You want to do something that won't play well with the current G-Sync version? No problem: just do a firmware update for the G-Sync module and carry on regardless. You want to do the same with a VESA standard? Tough luck. You have to wait for VESA to make a move. Maybe they won't make the move you want. Maybe they will, but it's going to take a year or two for other stuff to be agreed. Maybe they sort of make the move you want, but it requires serious redevelopment or redoing of the work you have completed. It's always better to design, own and control the full ecosystem.

You discover a bug, and it lies with a third party's system component. They check and say it's not a problem on their side. And so begins the "problem tennis match", where the problem is constantly whacked back and forth between two parties and neither one is actually getting it solved. If you own everything, it's easier to get the bugs worked out.
Quote:
Originally Posted by Corky42

But what is better is to either get the companies that make the chips to invest in developing the tech themselves (no risk involved) or to develop it your self and license it to said companies (moderate risk)
The highest risk (imho) is to not only develop it yourself, but to also make it yourself, when you have little experience and would be competing against companies with years of experience, established contracts, infrastructure, etc, etc.
There is risk in getting a third party involved in creating what you want, as highlighted above; it's certainly not "no risk".

Getting someone else to make something you have designed only removes the manufacturing process. Nvidia and AMD are constantly making electronic devices (granted, it's more than likely contracted out), but there isn't really any benefit in giving out the manufacturing to someone else if you already do it yourself or have trusted partners to do it for you.

Doing everything yourself from the ground up is only high risk if you don't have the technical and/or physical resources to do it yourself. Nvidia clearly have all of that.
Corky42 28th July 2014, 09:30 Quote
Quote:
Originally Posted by theshadow2001
The first statement appears to me to say you want someone else to develop everything; the second appears to say only that you don't want to manufacture anything.

Sorry for the confusion; I meant both, really.
To me it seems like a sliding scale of risk vs reward: at one end, someone else introduces something that benefits you without you having to do anything other than nudge them in the direction you want; at the other end, you have to do all the work because no one else will.

It's what caused AMD to release Mantle. Developers had been asking for an API that allowed them to get closer to PC hardware for years, and Microsoft was dragging its feet, so AMD decided to take a risk and release Mantle; arguably DX12 may not have seen the light of day, or allowed for reduced draw calls, if it wasn't for Mantle.
Quote:
Originally Posted by theshadow2001
All easily explained. AMD had already developed FreeSync, just not for a gaming application; it was for power saving in laptops. The pay-off for its development was the inclusion of their graphics processors in laptops because of the reduced power consumption it offered. It utilised a component of the embedded VESA standard. Nvidia came out with a model that, in their eyes, is the best technical solution to providing variable frame rates. Nvidia were first out of the gate here with variable frame rates applied to gaming.


I'm not sure how long either of them had been working on variable frame rates, but eDP (Embedded DisplayPort) has been a VESA standard since 2008, so AMD isn't wholly responsible, as the VESA board is made up of many companies from across the industry. That's not to say they didn't propose or develop it; it's just to say eDP wasn't introduced just so one company could sell more GPUs.
As for which is the better technical solution, we will have to wait and see; so far we have only seen FreeSync work on laptops with eDP, and we should get a clearer picture when we get DP 1.2a-capable monitors.
Quote:
Originally Posted by theshadow2001
The quickest way for AMD to react to that is to modify the laptop processors to do the same thing. Given how quickly they had their laptop demo available after the g-sync unveil, it probably wasn't the most challenging of modifications. They simply ask VESA to add the embedded feature to the next full display port iteration. Which is probably fairly straight forward to do. Its too late for AMD to do anything else. They are too far behind Nvidia to create a similar type of solution. NVidia already have large monitor brands on board.

That is assuming eDP can be easily modified from what I guess is a simple on-or-off situation to a more linear shift in refresh rates. Yes, AMD demonstrated that eDP can do variable frame rates, but (afaik) they didn't provide many details of how it was done, what if any modifications they had to make to the hardware, or any other details.
Quote:
Originally Posted by theshadow2001
AMD have taken the less profitable route here but it was their only choice. They have to react quickly or they will start losing customers. Especially on something as important as this.

Personally I think it's less to do with money and more to do with pushing for a new standard; sure, AMD could have licensed FreeSync and made money from it, but it probably wouldn't have received widespread adoption.
Quote:
Originally Posted by theshadow2001
I take it you meant to ask how is G-Sync an arguably less complex solution? For any given system its always easier to build it yourself from the ground up, rather than trying to work with other vendors, agreeing standards, making compromises you may not want to make etc. etc. Only when you don't have the resources either technical or physical to complete an element of a system is it worth involving a third party. Nvidia have the technical and physical resources to complete g-sync on their own.

Well no, I didn't mean to ask how G-Sync is an arguably less complex solution, sorry.
I mean G-Sync is a complex solution to what should be a simple problem. Come on, an FPGA add-on board that sits between other parts in a display? I think they could only make it more complex if it had upgradable RAM modules ;)

To me G-Sync seems more like NVidia showing the ASIC and scaler manufacturers how it should be done. I doubt there is anything proprietary in the G-Sync module that would prevent them from copying what it does; DP 1.2a monitors should give us that answer.
Quote:
Originally Posted by theshadow2001
If you have made the full eco-sphere from start to finish in the long run it will make future development easier. You want to do something that won't play well with the current g-sync version? No problem just do a firmware update for the g-sync module and carry on regardless. You want to do the same with a VESA standard? tough luck. You have to wait for VESA to make a move. Maybe they won't make the move you want. Maybe it will but its going to take a year or two for other stuff to be agreed. Maybe they sort of make a move you want but it requires serious redevelopment/redoing of the work you have completed. Its always better to design, own and control the full eco-system.

Yes, it will make future development easier, but it will also lead to slow or low adoption rates. What is more interesting to AMD and NVidia is getting widespread adoption of a new standard that benefits their main business: variable refresh rates fix so many problems and bring such noticeable advances in graphics technology that they will enable both companies to make cheaper GPUs and sell more of them.
Quote:
Originally Posted by theshadow2001
You discover a bug, it lies with a third party's system component. They check and say it's not a problem on their side. And so begins the "problem tennis match" where the problem is constantly whacked back and forth between two parties and neither one is actually getting it solved. If you own everything, it's easier to get the bugs worked out.

And a whole lot more expensive.
Quote:
Originally Posted by theshadow2001
There is risk in getting a third party involved in creating what you want, as highlighted above; it's certainly not "no risk".

When compared to doing it all yourself I would say it's no risk, or at least the minimal risk out of all your options.
Quote:
Originally Posted by theshadow2001
Getting someone else to make something you have designed is only removing the manufacturing process. NVidia and AMD are constantly making electronic devices (granted its more than likely contracted out) But there isn't really any benefit in giving out the manufacturing to someone if you already do it yourself or you have trusted partners to do it for you.

Unless you want to concentrate on your main business. Sure, companies branch out into other fields, but when that takes focus away from your main business it may not be something you want to do.
Quote:
Originally Posted by theshadow2001
Doing everything yourself from the ground up is only high risk if you don't have the technical and/or physical resources to do it yourself. NVidia clearly have all of that.

Or if, like Microsoft have done (imho), doing so means you have taken your eye off the ball and ended up harming your main business.