We met with AMD and, among other things, one item they wanted to show us was essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were each running on different hardware: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and the hardware powering them is the fact that FreeSync worked just as you'd expect. AMD had several demos running, including a tearing test with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed anywhere from 40 to 55Hz in 5Hz increments, or have it vary (sweep) over time between 40Hz and 55Hz. Meanwhile the Samsung display was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
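To make the sweep concrete, here's a minimal sketch, entirely our own illustration rather than AMD's demo code, of why sweeping the render rate is such a useful test: with classic V-SYNC on a fixed 60Hz panel, every frame time in the demo's 40-55 fps range gets quantized up to the same vblank boundary, while an adaptive display simply matches its refresh interval to the frame time.

```python
import math

def vsync_display_interval(frame_time_ms, refresh_ms=1000 / 60):
    """Frame is held until the next fixed vblank boundary (V-SYNC on, 60Hz)."""
    return math.ceil(frame_time_ms / refresh_ms) * refresh_ms

def adaptive_display_interval(frame_time_ms, min_hz=40, max_hz=55):
    """Refresh interval tracks the frame time, clamped to the panel's range."""
    return min(max(frame_time_ms, 1000 / max_hz), 1000 / min_hz)

for fps in range(40, 56, 5):  # the demo's 40-55Hz sweep in 5Hz steps
    ft = 1000 / fps
    print(f"{fps} fps: V-SYNC displays every {vsync_display_interval(ft):.1f} ms, "
          f"adaptive-sync every {adaptive_display_interval(ft):.1f} ms")
```

Running this shows every rate in the sweep collapsing to the same 33.3ms (30Hz) cadence under V-SYNC, which is exactly the judder the demo makes visible next to a smoothly tracking FreeSync panel.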

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD's FreeSync solution is ready, delivering on all of AMD's feature goals, and should be available in the next few months. Meanwhile, AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly the panel's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (a 33.3ms maximum frame time), while others with quicker-decaying pixels will have a 40Hz (25ms) minimum.
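The arithmetic behind those floors is simple enough to sketch; the function below is our own illustration of the relationship (the name is made up), not anything from AMD:

```python
def min_refresh_hz(pixel_hold_ms):
    """Lowest refresh rate a panel can sustain before its pixels visibly fade."""
    return 1000 / pixel_hold_ms

print(f"{min_refresh_hz(33.3):.0f}Hz")  # ~30Hz floor: pixels hold for ~33.3ms
print(f"{min_refresh_hz(25.0):.0f}Hz")  # 40Hz floor: pixels fade after 25ms
```

In other words, the minimum refresh rate is just the panel's pixel-decay time expressed as a rate: the vblank interval can't be stretched past the point where pixels begin to fade.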

On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn't mean there are no additional costs involved in creating a display that works with FreeSync. Better panels and other components are needed, which will obviously increase the BoM (Bill of Materials), and that cost will be passed on to consumers.

Perhaps the bigger question will be how much FreeSync displays end up costing compared to their G-SYNC equivalents, along with whether Intel and others will support the standard. And if FreeSync does gain traction, it will be interesting to see whether NVIDIA begins supporting it or remains committed to G-SYNC. In any case, we should start to see shipping hardware in the near future, and we'll get answers to many of the remaining questions over the coming year.

Comments

  • chizow - Thursday, January 8, 2015

    Interesting, so is AMD once again perpetuating the myth that FreeSync has no additional cost? LMAO. Sticker shock incoming for AMD fanboys, but yes, I am looking forward to further dissection of their half-baked, half-assed solution.
  • Intel999 - Friday, January 9, 2015

    The one company that is offering both G-Sync and FreeSync monitors has priced the FreeSync model $150 less. So even if FreeSync isn't free, it is the more reasonable of the two.
  • TheJian - Friday, January 9, 2015

    You do understand that the "crew" has not seen it running a GAME yet, correct? The "crew" saw the same things AnandTech did: windmill demos, etc. Get back to us when we all see them running GAMES and NOBODY can tell the difference at THAT point in time. Currently we know specific demos set up to show the effects work, but we have ZERO idea how it works in games, because for some reason (umm, inferior tech?) they shied away from showing it even at CES, supposedly 3 months before these monitors hit. Hmmf... Not much confidence in the gaming part, right? Or why wouldn't you have shown a dozen games working? How hard is it to have games running?

    FreeSync is looking pretty shaky to me or they'd be showing games running. AnandTech mentions the costs (and forgets that testing for certification also costs money; panel makers must pay this, and testing equipment costs as well), so it isn't free. Free for AMD maybe, but that's it, and they have to pay to R&D the cards to comply too, or all their cards would work (Nvidia's also). There is a reason NV won't support it for at least another generation (cards not compatible, as AMD even suggests), and also a reason OLD AMD cards won't work either. Compliance from all sides is NOT free. Scaler tech had to be modified (so that required some "SPECIALIZED HARDWARE", correct?), monitors need to be VESA certified to wear that label, and AMD/NV have to mod their cards. I could go on but you should get the point. NONE of that is free.

    "FreeSync is royalty free, but that doesn’t mean that there are not additional costs involved with creating a display that works with FreeSync. There’s a need for better panels and other components which will obviously increase the BoM (Bill of Materials), which will be passed on to the consumers."

    Did you miss that part of the article? The standard is free, but that is it. The rest COSTS money and WE will be paying it. The only question is HOW MUCH?
  • chizow - Thursday, January 8, 2015

    @Creig: why are you still quoting dated spec sheets that clearly do not reflect reality, now that we have seen ACTUAL FreeSync monitors whose specs show those numbers are inaccurate, if not purposefully misleading? AMD can say, 6 months ago, that FreeSync *CAN* support anywhere from 9 to 240Hz, but if the actual scalers that go into production only support FreeSync in the 40-60Hz band, or the 30-144Hz band (on the BenQ), is that 9 to 240Hz statement accurate? Of course not; we can just file it under more nonsense AMD said about FreeSync prior to actually, y'know, doing the work and producing an actual working product.

    And no, I am not wrong about AMD claiming monitors might essentially get a free upgrade to FreeSync, because Raja Koduri was telling anyone who would listen this time last year that there might be existing monitors on the market that could do this with just a firmware upgrade. But I know, just more FUD and misinformation from AMD as they scrambled to throw together a competing solution when Nvidia completely caught them with their pants down by introducing an awesome, innovative new feature in G-Sync.

    http://techreport.com/news/25867/amd-could-counter...
    "The lack of adoption is evidently due to a lack of momentum or demand for the feature, which was originally pitched as a power-saving measure. Adding support in a monitor should be essentially "free" and perhaps possible via a firmware update. The only challenge is that each display must know how long its panel can sustain the proper color intensity before it begins to fade. The vblank interval can't be extended beyond this limit without affecting color fidelity."

    What's even more interesting is that TechReport subsequently edited their story to exclude mention of Koduri by name. Wonder why? I guess he probably asked them to redact his name as he tried to distance himself from comments that were obvious misinformation at a time when AMD had no solution and was just grasping at straws. But revisionist history aside, you and every other AMD fanboy were screaming from the rooftops a year ago saying FreeSync would be better because it would be "Free" while Nvidia's G-Sync was charging an overpriced $200 premium. Now it's a $50-100 premium, maybe, yet still better. Interesting what a year and an actual product will do to change the situation.

    And yes, FreeSync locks you in to AMD solutions only, and only newer AMD solutions at that. Who else supports FreeSync other than AMD right now? What makes you think Intel has any interest in FreeSync, or that their display controllers can even support it, given that even many of AMD's own cannot? What's even funnier is that Intel has shown more interest in Mantle than FreeSync, and yet AMD was happy to deny them there, while FreeSync, which they are freely giving away, draws no interest from Intel whatsoever. So yes, if you buy a FreeSync monitor today, you lock yourself into AMD and should have no expectation whatsoever that any other GPU vendor will support it, simple as that.

    And where was I going with support? Where I was going is simple: you can't naively assume Nvidia, Intel, Qualcomm, or anyone else can simply support FreeSync when it is a spec that AMD designed to work with their hardware, and only works with SPECIFIC AMD hardware at that. Again, you assume many things but as usual you are wrong; the only question is whether you are attempting to deceive purposefully or just speaking ignorantly on the topic. Tahiti and other GCN 1.0 cards are NOT supported for 3D games with FreeSync, per AMD's own FAQ; only a handful of newer GCN 1.1+ ASICs (Hawaii, Tonga, Bonaire) and a few newer APUs are. But this is par for the course for AMD and their half-baked, half-supported solutions, like TrueAudio, Virtual Super Resolution, CF Frame Pacing, etc.: only SOME of their cards in any given timeframe or generation are supported in all modes, and there is no clean cut-off. Nvidia on the other hand is easy: anything Kepler and newer. The way of the world is, Nvidia is much better at supporting new features on legacy hardware. AMD does a much worse job, and to say anyone can just support FreeSync if they want to is a bit laughable given AMD can't even support FreeSync on all of their still-relevant cards, and I am SURE they want to. :D

    Of course you take the position that AMD FreeSync has a better position in the market, but unfortunately, reality says otherwise. Nvidia holds a commanding lead, even bigger in the last few months when these relevant SKUs were sold, in the only TAM that matters for the use-cases these displays will be sold and deployed in: gaming. That means mobile and dGPU. Overall graphics market share means absolutely nothing here because over 50% of that is Intel, which might as well not exist in this discussion. That brings us back to over 70% of the market for Nvidia (Kepler + Maxwell) vs. 30% or less for AMD (3 ASICs + a few APUs in the same last 3 years). Oh, and 0 monitors on the market, while Nvidia has already brought at least 1 model to market from 7 different partners, with even more on the horizon. But yes, some idiot will sit here and tell you that something you can't even buy yet, that is inferior to the established G-Sync solution, and that is only supported by the market underdog, is somehow better suited to succeed! Amazing.

    You haven't read any reviews because you choose ignorance; that's all it comes down to. Honestly, is your reading comprehension so poor that you didn't read my reference to PCPer's assessment, based on THEIR DISCUSSIONS WITH AMD, live from CES? Ryan Shrout said plenty to indicate FreeSync is already inferior to G-Sync (worse minimums, tearing above refresh or Vsync), and that's not even addressing the remaining question of whether FreeSync even improves latency over Vsync.

    Here you go, so you can't continue to feign ignorance; you can also update your frame of reference regarding nonsensical claims of 9-240Hz FreeSync support while you are at it.
    https://www.youtube.com/watch?v=8rY0ZJJJf1A#t=2m20...
  • MTRougeau - Thursday, January 8, 2015

    Mark Rejhon over at blurbusters.com got a look at some FreeSync monitors at CES today, and his initial impression is that AMD's implementation is on par with G-sync. "Now, my initial impressions of FreeSync is that it's on an equal footing to GSYNC in motion quality. At least by first impression, without looking closely at them "under a microscope". FreeSync certainly eliminated stutters and tearing, just like GSYNC does, even if the methods/technologies work somewhat differently. A future article will probably compare GSYNC and FreeSync. Many sources have reported various pros and cons of GSYNC and FreeSync, but a major one that sticks out: Lower cost of implementing FreeSync." http://forums.blurbusters.com/viewtopic.php?f=16&a...

    Of course, we will have to wait for more detailed analysis, but early impressions are encouraging.
  • MTRougeau - Thursday, January 8, 2015

    A bit more from his post: "...I played around with the options of the windmill. It had an option to sweep the framerate. The sweep was seamless, seeing framerate bounce around from 40fps through 60fps without stutter. It looked every bit as good looking at G-SYNC at the same rates (40-60fps in Pendulum Demo). Disable FreeSync and VSYNC brought about ugly tearing and stutters, so it was certainly cool to see FreeSync doing its job."
  • chizow - Friday, January 9, 2015

    Thanks for the link, but again, this picture should be a huge red flag and reason for concern for anyone interested in FreeSync, or G-Sync for that matter:

    http://www.blurbusters.com/wp-content/uploads/2015...

    Why is Vsync on at all? Why does the associated quote confirm VSync is enabled when it says "Disable FreeSync and Vsync brought about ugly tearing...."

    To me, it sounds like they are still relying on triple-buffered Vsync to achieve the incremental framerates between 40-60 (rather than only the whole-number divisors of the max refresh), and then using the Adaptive-Sync/VBlank signal to change the refresh rate on the monitor's scaler as the framerate changes (most likely with 1 or 2 frames of latency here too).

    But the key is, all that Vsync lag is still going to be present, even if the tearing and stuttering associated with no VSync is gone. That was only half the problem and compromise re: Vsync On/Off, but I guess we will need to observe it under a microscope to see if FreeSync addresses latency at all over Vsync On as well as G-Sync does. My bet is, it does not.

    I am also interested to see what granularity the scalers in these monitors are capable of. Is it 1Hz frequency differences, or is it larger chunks?
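    To put that in concrete terms, here's a rough sketch of the mechanism being hypothesized above; it's purely speculative, with made-up names, and not a description of any shipping scaler:

    ```python
    def scaler_refresh_hz(frame_time_ms, min_hz=40, max_hz=60, step_hz=1):
        """Refresh rate a (hypothetical) scaler would program for this frame."""
        requested_hz = 1000 / frame_time_ms
        clamped_hz = min(max(requested_hz, min_hz), max_hz)
        # Quantize DOWN to the scaler's step size so the frame never has to wait.
        return max(min_hz, int(clamped_hz / step_hz) * step_hz)

    ft = 21.5  # a ~46.5 fps frame time in ms
    print(scaler_refresh_hz(ft, step_hz=1))  # fine-grained scaler: 46Hz
    print(scaler_refresh_hz(ft, step_hz=5))  # coarse scaler: 45Hz
    ```

    With a 1Hz step the panel tracks the framerate almost exactly; with 5Hz chunks that same frame gets rounded down to a 45Hz refresh.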
  • TheJian - Friday, January 9, 2015

    "I played around with the options of the windmill."

    Let us know when someone plays around with GAMES. Windmills mean nothing IMHO. We don't play windmills ;) They saw the same crap demos as everyone else at CES. NO GAMES. Until we see that running on many sites (which will mean we'll see many games as they don't all use the same games), we really don't know how good the tech is.

    I challenge anyone to explain why AMD would NOT run games at CES if everything was working as advertised?
  • FlushedBubblyJock - Tuesday, February 24, 2015

    Uhh, AMD didn't use games because (accepting YOUR challenge) at CES, and everywhere else, thousands of drooling idiot AMD fans don't care! So long as they can scream that nVidia must NOT ever have proprietary technology no matter what, that nVidia "IS RUINING GAMING FOR EVERYONE!", and that "nVidia will cost more and everyone will be unfairly locked in!", they couldn't care less if FreeSync actually works as well, or works at all. No matter how limited (40-60 fps only, for instance) or crappy it is, the AMD fanboys will scream deliriously, forever, that it is perfect and just as good and better than nVidia's "greed driven proprietary locked in payware"!
    Since even that level of astounding partisanship is not enough for the drooling AMD fans, they will go one step further, and AMD knows it: if it doesn't work at all, they will all scream in unison, "I don't want it and no one can tell at those framerates anyway!"
    As if that still weren't enough, it could go on for literally YEARS not working correctly while anyone who objected was told they were full of it, as happened with CrossFire and its massive frametime stuttering, dropped and runt frames, and false framerates in all those game tests at all the official websites, including here, which amounted to totally unfair, unwitting LIES pumping AMD up beyond its true FPS values. And when those years pass and it's FINALLY EXPOSED as broken and fraudulent, it could get fixed... and all those AMD fans who hate nVidia with a passion couldn't care less; they will be so HAPPY that AMD finally fixed the broken junk peddled as perfect for YEARS that they will instantly renew their fan fervor for AMD and launch it to its highest personal emotional peak ever, while simultaneously complaining that nVidia caused all the trouble with its money-grubbing G-Sync crap.
    So, there's why AMD hasn't a worry in the world.
  • tuxRoller - Friday, January 9, 2015

    I know this won't matter to you as you've made your position clear, but there's a difference between a specification and an implementation. The problem seems to be that most desktop monitors don't work below 30Hz due to short pixel memory (IGZO can be better, but that's hardly standard).
