AMD FreeSync 2 for Xbox One S and Xbox One X

Next week, in the Xbox One alpha release ring, Microsoft will enable AMD FreeSync 2 for the Xbox One S and the Xbox One X. This allows compatible displays, ones that accept FreeSync variable refresh rate signals over HDMI, to time their refresh rate to the console’s rendering rate, removing the micro-stutter caused by the mismatch between a fixed refresh rate and a variable rendering rate.
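To make that mismatch concrete, here is a small back-of-the-envelope sketch (illustrative only; the frame times and function are hypothetical, not anything from Microsoft or AMD). It shows why a steady 50 fps render rate looks juddery on a fixed 60 Hz display but smooth on a variable-refresh one:

```python
# Illustrative sketch (hypothetical numbers): why a fixed 60 Hz refresh
# shows micro-stutter when frames render at, say, 50 fps, and why a
# variable-refresh display does not.

def present_times(frame_ms, refresh_hz=None):
    """Return the times (ms) at which each frame actually appears on screen.

    With a fixed refresh rate, a finished frame waits for the next vsync
    boundary; with VRR (refresh_hz=None) it is scanned out as soon as it
    is done.
    """
    times, t = [], 0.0
    for ms in frame_ms:
        t += ms                      # frame finishes rendering at time t
        if refresh_hz is None:
            times.append(t)          # VRR: display refreshes immediately
        else:
            period = 1000.0 / refresh_hz
            ticks = -(-t // period)  # ceiling division: next vsync at/after t
            times.append(ticks * period)
    return times

frames = [20.0] * 6                  # steady 50 fps render rate

def gaps(ts):
    """Frame-to-frame intervals as seen on screen."""
    return [round(b - a, 2) for a, b in zip(ts, ts[1:])]

print(gaps(present_times(frames, refresh_hz=60)))
# -> [16.67, 16.67, 16.67, 16.67, 33.33]  (a 33 ms hitch: judder)
print(gaps(present_times(frames)))
# -> [20.0, 20.0, 20.0, 20.0, 20.0]       (VRR: perfectly even pacing)
```

The periodic double-length gap in the fixed-refresh case is exactly the micro-stutter the article describes; with VRR the display simply waits for each frame, so the on-screen cadence matches the render cadence.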

Because it is FreeSync 2, it will also work with HDR content.

As stated, FreeSync over HDMI will be required to use this feature, which has two caveats. The first is that DisplayPort will not work, so that’s something to be careful about if you’re planning to buy something (either a display or an Xbox itself) for this feature. The second is that, as far as I know, not a single TV currently supports FreeSync – but that could change. There is now a major console manufacturer pushing the standard, which is a stronger use case than “maybe someone with an AMD (or potentially Intel someday) GPU will plug their PC into this TV”.

The menu to enable FreeSync on the Xbox One

The Xbox Insider Program Alpha Preview ring is invite only. It will then trickle to Beta, Delta, and Omega, before being released to the general public.

This is a great thing for PC gamers. The sooner Nvidia drops their G-Sync shit and starts supporting open standards, the better. Of course, this is Nvidia we're talking about; didn't they just release a G-Sync TV that can't be used by the largest market for gaming TVs?

The problem is that VESA should have made AMD give up the FreeSync branding as a condition of adopting it into VESA's DisplayPort 1.2a standard, and only allowed makers to use the VESA DisplayPort Adaptive-Sync branding. We all know about Nvidia's pathological aversion to any competitor's branding lately, and that's nothing new as far as Nvidia's ego complex is concerned.

VESA, like other industry standards organizations, needs neutral branding on any technology associated with an industry-wide standard created for an entire industry to make use of. AMD and Nvidia have no business putting their proprietary marketing/branding on any technology that has been given over to a standards body such as VESA.

Industry standards bodies like VESA are non-profits anyway, and as far as standards go, VESA has a legal responsibility to ensure compliance with its standards' neutral branding (DisplayPort Adaptive-Sync), and that's part of the problem.

FreeSync/FreeSync 2 are trademarked by AMD, so how could VESA have allowed that sort of branding confusion to continue once the first generation of FreeSync was included in VESA's DisplayPort Adaptive-Sync (1.2a) standard?

Oh waaaaa, you fucking baby. God forbid AMD has its name on something. Intel and Nvidia get their names plastered on everything. AMD was the one pushing for a FREE option, not Nvidia; their name should be on it.

No, not a baby; just here to point out that VESA is a non-profit standards organization, and that tax break requires that it not be mixed up too directly with any one company's branding. Ditto for Nvidia's abusive, monopolistic tactics in messing with third-party AIBs' and PC/laptop OEMs' gaming branding, trying to exclude AMD from those third parties' long-established gaming brands in an unfair way.

Jeffrey Hramika has little knowledge of antitrust law, antitrust history, or the fair and equitable market play codified by the antitrust laws on the books for over a century, or of the rules and standards that non-profit industry standards bodies like VESA must be held to. VESA is not supposed to be seen as involved with any one of its many members' marketing divisions; that's a big no-no for VESA as an impartial standards organization.

AMD, as a member of VESA, took VESA's eDP standard and created what became the VESA DisplayPort Adaptive-Sync standard, adopted under VESA's DP 1.2a. Nvidia, Intel, and every compute company and their compute dog are members of VESA, so they all have some say in the matter as dues-paying members. VESA is being very un-standards-organization-like by letting AMD's trademarked FreeSync branding get mixed in where it does not belong.

You are trying to make an issue where there isn’t one. You might as well get mad at Samsung for calling their quantum-dot implementation “QLED” or Sony for calling theirs “TRILUMINOS”. Those are just trademarked names that others cannot use, little else. Adaptive sync is just an optional part of the standard; AMD and anyone else can use whatever name they want for their implementation of it, or of any other feature. There aren't many restrictions on use of the FreeSync branding: a display could implement a tiny adaptive-sync range (like 40 to 60 Hz) and still claim to be FreeSync compatible.

For FreeSync 2, AMD places restrictions on what features need to be included to claim FreeSync 2 compatibility, which is a good thing. It doesn’t just have to support some minimal adaptive-sync range; it has to support everything AMD specifies to use the branding. That has no bearing on VESA Adaptive-Sync: manufacturers can still support adaptive sync, they just cannot claim FreeSync 2 compatibility without supporting the specific feature levels AMD requires. Such a branding name for a set of features is really quite useful. If a display just claimed support for VESA adaptive sync, that wouldn’t tell you much of anything by itself. A FreeSync 2 display must implement a minimum set of features, like no minimum frame rate (frame doubling) and support for certain HDR features.
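The "no minimum frame rate (frame doubling)" point can be sketched numerically. The logic and ranges below are illustrative, not AMD's actual low-framerate-compensation algorithm; they just show why a narrow window like 40-60 Hz cannot support frame doubling while a wide one can:

```python
# Rough sketch (illustrative logic, not AMD's actual LFC algorithm) of the
# frame-doubling idea: if the render rate drops below the display's VRR
# window, repeat each frame an integer number of times to stay in range.

def effective_refresh(fps, vrr_min, vrr_max):
    """Pick a refresh rate inside the display's VRR window for a given
    render rate; returns (refresh_hz, repeats) or None if impossible."""
    if fps > vrr_max:
        return (vrr_max, 1)          # capped at the top of the window
    repeats = 1
    while fps * repeats < vrr_min:   # frame doubling, tripling, ...
        repeats += 1
    rate = fps * repeats
    if rate > vrr_max:
        return None                  # window too narrow: compensation fails
    return (rate, repeats)

# A narrow 40-60 Hz window (like the minimal panel described above)
# cannot compensate: 35 fps doubled is 70 Hz, outside the window.
print(effective_refresh(35, 40, 60))    # -> None
# A wide 30-144 Hz window handles 25 fps by showing each frame twice.
print(effective_refresh(25, 30, 144))   # -> (50, 2)
```

This is why the maximum of the VRR window needs to be comfortably above twice the minimum for frame doubling to work, and why a bare "FreeSync compatible" label on a 40-60 Hz panel tells you so little.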

Nvidia is completely different though. They tried to make a proprietary solution without changing the standard, which is completely bogus. We do not need vendor lock-in between displays and video cards; display standards exist exactly for this purpose. The right way to do it was to add it to the standard, which is exactly what AMD and VESA did. It is still optional, so everyone can decide whether to add it. Nvidia chooses not to, sticking to its more expensive proprietary solution, so Nvidia customers will continue to pay extra for what is essentially a proprietary Nvidia TCON. Nvidia isn’t the most consumer-friendly company if you actually follow what is going on. AMD makes very forward-looking video cards that are often usable for years; Nvidia makes video cards that are the latest and greatest for about six months to a year, and then they are outdated. I am running an old Intel/Nvidia system now, but my next system will probably be a Ryzen 2. Not sure what to do about the video card yet.

The issue here is folks complaining about Nvidia not supporting FreeSync, when everybody and their dog knows that Nvidia cares more about its revenue stream than about any standard the whole industry supports. The need to push marketing out of technology standards naming comes from marketing's deleterious reputation for nefariously obfuscating technology to the consumer.

Marketing and branding from a single company's proprietary/trademarked portfolio has no place taking over a standards body's labeling, when that body's job and tax exemption require it to refrain from favoritism toward any single member company. The surest way to keep other companies from supporting a new standard is to brand it with the proprietary name of one of their competitors. VESA's main job is to get its standards widely adopted in the display industry and to refrain from commercial tie-ins and conflicts of interest.

Both Nvidia and AMD can have their own proprietary standards, but AMD built first-generation "FreeSync" on VESA's eDP standard, and VESA should have required that the branding resulting from that work stay under the VESA DisplayPort 1.2a standard, which became VESA DisplayPort Adaptive-Sync, not FreeSync. AMD did that engineering work as a member of VESA, with its engineers acting in their VESA standards-committee roles.

AMD is free to label its GPUs as using FreeSync technology, but on the monitor side the labeling should be restricted to the VESA DisplayPort Adaptive-Sync branding. The other members of VESA have their own proprietary/trademarked branding that must likewise be kept separate from VESA's labeling.

VESA needs to be very careful about alienating any of its member companies, or about being seen as favoring any company's marketing department, where VESA as a non-profit has no business being involved. Let VESA's member companies do that in a manner unrelated to the standards organization.

No one likes Nvidia's vendor lock-in with G-Sync or CUDA, but that's how Nvidia will roll as long as its market share isn't damaged too much by not supporting open display standards or OpenCL-based AI and other libraries. Too many gamers judge a GPU solely on its FPS metrics, and gamers appear a little too interested in high FPS above all else, and in GPUs as status symbols.

Really, Nvidia has more money than AMD to invest in GPUs, and AMD currently has such a small share of the discrete gaming GPU market that AMD can't be hurt as badly, revenue-wise, if gaming GPU sales are affected by mining/compute. Nvidia still gets the majority of its revenue from discrete consumer desktop/mobile GPU sales. AMD will see more profit and revenue growth from its Epyc CPU sales, in a market Nvidia has no access to because Nvidia cannot obtain an x86 license. AMD also has its Raven Ridge APUs, which will gain integrated market share for Vega graphics and attract even more games/gaming-engine development attention, much to Nvidia's chagrin.

Nvidia damn well knows that AMD's first Vega 10 base die tape-out (AMD only had the limited funds for Vega 10) was intentionally tuned for compute, and that the Vega 10 design was frozen long before Nvidia even needed to think about releasing the GP102-based GTX 1080 Ti. Vega 10 had to do dual duty as a professional compute/AI tape-out and as a gaming tape-out competing with the GTX 1080 (GP104-based). Nvidia also knows that AMD, at the right time, can afford to re-spin a new Vega base die with more than 64 ROPs (88 or more) to compete with the GTX 1080 Ti again. AMD can, and has in the past, created dual-GPU-die single-card variants that have taken the flagship crown, and any dual-Vega-die gaming SKU could make use of the Infinity Fabric, since Vega supports that IP as well. So: dual Vega dies wired up via Infinity Fabric rather than PCIe, and JHH over at Nvidia knows that can happen.

Gamers are making a big mistake confusing the Vega GPU microarchitecture with that first Vega 10 base die tape-out, and using that intentionally compute/AI-focused, shader-heavy design to judge the efficiency of the Vega microarchitecture. Just look at Raven Ridge's integrated Vega graphics (which uses system DRAM as VRAM): even without all that HBM2 bandwidth available, it isn't too power hungry and does well in games with the 11 nCUs on the Ryzen 5 2400G desktop APU. Wait until discrete mobile Vega arrives before judging the Vega microarchitecture's efficiency. And JHH over at Nvidia is not stupid; he sees the implications of AMD's Infinity Fabric, and how part of what will make Navi such a threat to Nvidia is already there inside Vega. The main reason Vega 64 cannot beat the GTX 1080 Ti comes down mostly to ROP counts, with the Ti having 88 and Vega 64 (based on the Vega 10 die) only 64. All AMD has to do is spin up a dual-GPU variant or a new Vega-based die tape-out with 88 ROPs, and maybe that will happen soon.

AMD’s getting Vega into Raven Ridge APUs, and that's getting more attention from games and game developers than all of AMD's flagship, or even mainstream, GPUs could hope for currently. AMD's next generation of consoles will most likely be Zen/Vega based, and AMD's console APUs get loads of game-developer attention. So games/gaming-engine developers will already want to target Raven Ridge/Vega to get a head start on developing for 16-bit math in games, and for Vega's explicit primitive shaders that will eventually be used in the next generation of console APUs.

BS
AMD is giving the FreeSync logo to monitors that have passed its testing. Anyone who doesn't wish to put an AMD logo on a monitor can use the Adaptive-Sync branding. Nvidia could also support Adaptive-Sync, but in that case G-Sync monitors would stay on the shelves until, well... forever, probably.

Not really BS when you look at VESA's tax-exempt status and VESA's responsibility to all of its dues-paying members, AMD and Nvidia included. Let's get things under the standards body's branding, and keep the commercial branding of AMD, Nvidia, Intel, and the other parts suppliers out and away from the standards; even OEM branding should take precedence over theirs.

AMD, Nvidia, Intel, and the others, who are really only parts suppliers to the AIBs and PC/laptop OEMs, need to keep their branding secondary and subordinate to the OEMs' own branding.
So I'm for regulations and antitrust-law extensions that specifically prohibit mere parts suppliers from supplanting or influencing, to any degree, an OEM's chosen branding for that OEM's products. Who is doing the wagging here, the dog or its tail?

John GR is another good example of someone who lacks personal integrity among that gaming-addict demographic, and of a technology market that needs to be reined in for its abusive monopoly-interest antitrust violations, with antitrust laws that have not been enforced to the letter for the past 40-odd years.

Let's put all parts suppliers to OEMs/AIBs and other third parties back in their proper place, and put the free market back into technology markets that are in fact worse than the old oil market of the 19th and early 20th centuries, before the antitrust laws were created out of necessity to enforce fair and free markets. These technology trusts need to be seriously reined in.

No, you clueless bumpkin. AMD asked VESA for, and got, approval to take the variable-refresh feature of VESA's eDP laptop standard and bring it out for external monitor usage; AMD used eDP to make its so-called "FreeSync", and VESA then incorporated that into its DisplayPort Adaptive-Sync (1.2a) standard. At the very point of being accepted into the VESA DP 1.2a standard, the standards body's (VESA's) branding takes precedence, in all fairness to Intel, Nvidia, and the others who are members of the same standards body.

It is imperative that VESA not use external marketing branding from any of its many members, as VESA is only there to support industry-wide adoption of the various VESA standards. VESA gets its non-profit status for a reason: to impartially promote industry standardization and to make PCs and other devices with graphics capabilities interoperate, in hardware, software, and firmware, with all monitors that adhere to VESA's display standards.

I do not care about AMD's branding war with Nvidia outside of VESA, FreeSync versus G-Sync; that needs to remain outside of any association with VESA's DisplayPort Adaptive-Sync. Unless AMD brands its own proprietary standard outside of VESA, VESA's branding takes precedence.

That abusive, monopolistic Nvidia branding land grab among the various third-party AIBs and PC/laptop OEMs, with Nvidia trying (illegally) to lord over those independent third parties' respective branding, is a matter for the US Justice Department's Antitrust Division and the US courts, which together are responsible for enforcing antitrust law.

But really, the marketing folks at AMD are out of their collective minds when it comes to their "profession's" obsession with all that branding crap and with the obfuscation (re-branding and confusing nomenclature) that seeks to confuse the consumer; ditto for Nvidia's and others' branding, re-branding, and other marketing-driven nefariousness. AMD's, Nvidia's, and others' white papers have been dumbed down to a cretinous level to placate the various companies' marketing departments, much to the detriment of a proper understanding of their products' real computing potential.

It’s hayseeds like yourself, in that brand madness of insane company partisanship (red and green), that have brought computing and the computing sciences down to a lowest-common-denominator level of misunderstanding and confusion.

Screw those damn corporations (the MBAs and marketing monkeys), but laud their engineers; computing needs to be taken back from the gamers and their mind-numbing stupidity.

All props to AMD’s engineers, and if AMD’s marketing morons can be kept out of the way, things will continue to improve for AMD. But hey, Joe Sixpack: GPU/CPU makers are not football teams, and high technology is way beyond your collective pay grades. It’s a good thing these gamer folks weren't around when the Constitution was being framed, or leaders would most certainly be wearing wigs and crowns in North America today.

It’s about time to seriously rein in the technology trusts and get free and fair markets back, where proper competition will lead to great advancements. Really, marketing folks are not that bright by engineering, science, or any academic standard; just look at how the gaming bumpkins fall for all that marketing snake-oil symbolism.

So Nvidia should spend MILLIONS of dollars telling engineers to go back to the drawing board, create a MODULE, work with monitor manufacturers, etc., and then give all that information over to the public?

Remember, the "open standard" that FreeSync is based on is not the ideal solution. The ideal solution requires a physical module, for reasons too long to discuss here, but they include proper overdrive values to prevent color shift when the frame time is variable. That's even more critical once HDR enters the picture.

I don't like being walled into an Nvidia-plus-Nvidia garden either, but frankly AMD is arguably no better, as an Nvidia GPU can't work with FreeSync monitors.

So how come you don't give AMD shit for not allowing Nvidia GPUs to work on their FreeSync monitors?

The good news is that when HDTVs start using FreeSync 2 for these game consoles, there may be pressure on Nvidia to work out some sort of deal.

Remember it's all BUSINESS decisions. Don't expect either company to just do something out of the goodness of their hearts.

Not when Nvidia is making money off of G-Sync and more vendor lock-in for profit. Nvidia will only come around if being G-Sync only begins to reduce Nvidia's revenues and market share.

That integrated Vega graphics that comes along with the Zen CPU cores on Raven Ridge APUs is what's going to get AMD its most immediate market-share gains against Intel. And that's a big deal for AMD and for the number of game/gaming-engine developers that target integrated graphics, because most PCs and laptops already come with integrated graphics regardless of what discrete GPU the user may add.

So laptop makers will be able to support external DP Adaptive-Sync monitors, via VESA's DP Adaptive-Sync and the Adaptive-Sync-over-HDMI standards, alongside the laptop's internal display's eDP variable-sync support, which has been around for ages; that eDP feature is what DP Adaptive-Sync is based on anyway.

I don't care about freesync branding. Any company is free to use the variable refresh standard, which is what freesync is. Nvidia could support variable refresh and call it Nvidia max tech 2000 for all I care as long as I can plug the monitor in and have it work just the same.

And I never said nvidia should open up gsync, I simply want them to support variable refresh as well. If they're confident people will still prefer gsync then why shouldn't they?

And yeah, I completely agree that we shouldn't expect companies to do anything nice, ever. I expect them to employ slavery, murder, anything to make a dollar more. Anything at all they can get away with, they will. Punching babies to death in front of their mothers in the delivery room included, if they could make a penny more.

That's my point. If they can make a module and so forth for G-Sync, and prove that the value it adds over what's possible with VESA adaptive refresh is worth it, then that's great. Support the standard, and provide a premium option for users to choose.

If the only value it adds, however, is that they're not going to let you use the open standard, then that's crap. This is especially true when there's a reason to get a non-G-Sync display, such as price, a lack of model with a specific feature set, etc. You're just hurting your users at that point.

AMD isn't preventing NVIDIA from using Adaptive Refresh. It's a VESA standard. NVIDIA could make their own implementation of the vendor-specific parts. Hell, they could call it "G-Sync Lite".

And, no, I never said that NVIDIA needs to open-source their G-Sync technology. I don't know where you got that idea from.

Nvidia GPUs could have worked with FreeSync monitors from day one. But then, who would pay $200 more for the same monitor with the G-Sync board? By the way, that cost difference was more like $100-$150 when Nvidia had competition at the high end. Guess what: AMD didn't increase the price of Nvidia's G-Sync module. I know you were thinking that even in that case it was AMD's fault, but the truth is that Nvidia's greed is the common denominator here.

It doesn’t require a proprietary physical module. It just requires a TCON (timing controller) capable of adjusting overdrive, and whatever other control values are available, for a variable frame time. Displays already had timing controllers; they just were not designed for adaptive sync. What you are missing is that the G-Sync module is just a proprietary TCON. If you had a compatible panel, you could probably pull a standard TCON out and replace it with a G-Sync module programmed for that panel: a G-Sync module is a physical module that replaces another physical module. There isn’t much reason for it to be an overpriced, Nvidia-branded one. It used an FPGA, which is an expensive solution compared to a high-volume ASIC like most standard TCON chips. They had to add a bunch of extra stuff to make it work without changing the standard, while the proper solution was to just change the standard.
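The overdrive-versus-frame-time point can be sketched in a few lines. The numbers below are entirely made up for illustration (real panels have per-transition lookup tables tuned by the manufacturer); the idea is just that a VRR-capable TCON must pick its overdrive strength from the current frame time rather than assume a fixed refresh:

```python
# Illustrative sketch (hypothetical values): a VRR-capable TCON interpolating
# its overdrive lookup table by frame time, instead of assuming a fixed
# 16.7 ms refresh as older fixed-rate TCONs could.

OVERDRIVE_LUT = {          # frame time (ms) -> overdrive gain (made-up values)
    6.9: 1.35,             # ~144 Hz: strong kick, little time to settle
    16.7: 1.15,            # ~60 Hz
    33.3: 1.00,            # ~30 Hz: almost no overdrive needed
}

def overdrive_gain(frame_ms):
    """Linearly interpolate the overdrive gain for an arbitrary frame time."""
    pts = sorted(OVERDRIVE_LUT.items())
    if frame_ms <= pts[0][0]:
        return pts[0][1]            # clamp below the table
    if frame_ms >= pts[-1][0]:
        return pts[-1][1]           # clamp above the table
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= frame_ms <= x1:
            t = (frame_ms - x0) / (x1 - x0)
            return round(y0 + t * (y1 - y0), 3)

print(overdrive_gain(16.7))   # fixed 60 Hz case
print(overdrive_gain(25.0))   # a frame time between the 60 Hz and 30 Hz rows
```

A fixed-rate TCON effectively bakes in one row of this table; applying that row when the frame time swings between the extremes is what causes the color-shift/ghosting complaints about early adaptive-sync panels, and updating the TCON to interpolate is a firmware/ASIC design problem, not something that inherently needs an FPGA module.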

Initial FreeSync displays probably weren’t very good at adjusting overdrive and such, but it has been a few years, and most TCON makers have probably updated their designs. The ideal solution is to do it in a proper, standards-conforming manner, such that there is no unnecessary vendor lock-in between displays and video cards. We have been able to plug just about any display into any computer for a long time. Apple had a proprietary display connector for a little while back in the late '90s to early '00s, but even Apple just uses standard connectors these days. Nvidia G-Sync needs to just die and be replaced by the standard.

So Nvidia is going to tell customers who want to fully utilize their existing hardware combinations that, because it doesn't meet "their" standards, they choose not to? I don't see that going over very well.

It would only take a driver update for Nvidia cards to support adaptive sync. The only explanation with a ring of truth is that they don't want to do it because they want to force customers to purchase the displays THEY deem acceptable... and to collect the extra fee for the module.

All this BS about the differences between G-Sync and Adaptive-Sync/FreeSync technologies is irrelevant if you bought a TV that has one of them (to be a TV) and find that it won't work with a feature of another device you own because of a corporation's hubris.

As for Freesync versus Gsync, Gsync is the superior implementation, even if it does cost more.

But at this point, it's clear neither FreeSync nor G-Sync is going to take off. With VRR being put into the HDMI 2.2 specification, that will likely be the de facto winner, as it will get VRR into mainstream TV units.

Every day the TV market further embarrasses the computer monitor market. We are still waiting for someone to make affordable displays with multi-zone backlighting or OLED, plus HDR, 4K, and VRR. The truth is that they can't, because no one buys small displays anymore. If the 2019 LG OLED TVs have VRR, 4K at 120 fps, and REAL HDR (not this bullshit 400 cd/m² brightness level on computer monitors), what computer monitor compares to that? Even Nvidia bought a cheap multi-zone backlit TV panel, slapped a G-Sync unit onto it, and now 65 inches is the new "computer monitor". If you want a small display you're fucked; economy of scale on TVs is going to make small displays a thing of the past.

My 55-inch 4K LG OLED looks so incredible in HDR games like Shadow of War, Destiny 2, and Final Fantasy 15. Best PC purchase I've ever made. The 1080 Ti is more or less required for these games though.

I don’t know if we will be getting OLED computer displays anytime soon. While they say burn-in isn’t an issue, I still find stories about possible burn-in on OLED TVs, and I assume there is a reason why we don’t really have OLED computer displays. Computer displays show static images all the time; they are usually the opposite of a TV. Since the pixels do age, it makes sense that if you leave one color up for a long time, those pixels may age faster than their neighbors, leaving them dimmer after some period of time. I have specifically seen mention of the red in the MSNBC logo eventually causing burn-in-like issues. Since I have someone in my household who leaves news channels on all day, I have been planning on just going with a good-quality LCD. It will not match the OLED in some respects, but it can go a lot brighter, which might be useful where my television is. It will be nice if they can get micro-LED technology to a manufacturable (for a reasonable price) state, since that could be better than OLED. It will be difficult to scale down to smaller, high-pixel-density displays though. The Samsung Wall was 146 inches for a reason.

MicroLED is, and always has been, the way forward. Unless someone decides to revive SED/FED, that is. Sony demonstrated a working 55" unit like 6 years ago and a massive mLED wall display a year before Samsung's Wall appeared. Regular consumers won't be buying any until 2022, most likely.

As a protocol, G-Sync isn't superior to VESA AdaptiveSync, AMD FreeSync, or HDMI 2.1 VRR (HDMI 2.2 is not a thing, you're confusing it with HDCP 2.2, the DRM). That said, G-Sync displays tend to have better quality control (ranging from panel uniformity to calibration tools) - and significantly higher cost to go with it.

HDMI 2.1 Game Mode with Variable Refresh Rate will no doubt be a massive win for everyone, but it will take time to spread. So far, none of the announced 2018 HDTVs support it (that could change) and the only device on the market (so far) that claims to is the Xbox One X, but their support is contingent upon full ratification of the HDMI 2.1 spec.

NVIDIA is showing no fear of competition; in fact, they intend to move into the living room, as evidenced by their new BFGDs (Big Format Gaming Displays).

Feels like G-Sync is dying to me. I keep up with every official driver-release thread on Nvidia's forums, and lately I see several complaints from people with G-Sync monitors. Nvidia seems either unwilling or unable to make G-Sync work well with Windows 10, or is just slowly dropping support ahead of HDMI VRR taking over.