FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was a claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, or basically not something you would notice without capturing frame rates. AMD did some testing, however, and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario compared to a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most games – even those with built-in benchmarks that tend to be very consistent – will have minor differences between benchmark runs. So we picked three games with deterministic benchmarks and ran with and without G-SYNC/FreeSync three times. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from three runs:

[Chart: Gaming Performance Comparison – Average FPS]

[Chart: Gaming Performance Comparison – Minimum FPS]

Except for a glitch when testing Alien Isolation at a custom resolution, our results don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
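As an aside, for anyone who wants to run the same sanity check on their own benchmark numbers, the arithmetic is simple. Here’s a minimal sketch in Python; the FPS figures in it are made-up placeholders, not our measured results:

```python
# Sketch: percent delta between sync-on and sync-off benchmark runs.
# The FPS values below are illustrative placeholders, not measured data.

def average(runs):
    return sum(runs) / len(runs)

def pct_delta(on_runs, off_runs):
    """Percent change of the sync-on average relative to the sync-off average."""
    return (average(on_runs) - average(off_runs)) / average(off_runs) * 100.0

gsync_on  = [101.2, 100.8, 101.5]   # three runs with G-SYNC enabled (hypothetical)
gsync_off = [ 98.6,  98.9,  98.3]   # three runs with G-SYNC disabled (hypothetical)

print(f"G-SYNC average delta: {pct_delta(gsync_on, gsync_off):+.1f}%")

# The run-to-run spread gives a rough margin of error to judge the delta against:
# if the delta is smaller than the spread, it's noise, not a real difference.
spread = (max(gsync_off) - min(gsync_off)) / average(gsync_off) * 100.0
print(f"Run-to-run spread without G-SYNC: {spread:.1f}%")
```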

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but the problem went away when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz); we were never able to get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD setting, performance was virtually identical, so this only seems to affect custom resolutions and it is also game specific.

For those interested in a more detailed look at the frame rates of the three runs (six total per game and setting, three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the top line is visible, but that just proves the point: there’s little difference other than the usual minor variations between benchmark runs. And in one of the games, Tomb Raider, even the same settings show a fair amount of variation between runs, though the average FPS is quite consistent.
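If you’d like to generate similar frame-rate-over-time plots from your own logs, a short script suffices. This is a hedged sketch assuming a simple log format with one frame time in milliseconds per line; the file names are hypothetical, and real frametime logs (e.g. from FRAPS) may have headers you’d need to skip:

```python
# Sketch: plot FPS over time from frame-time logs (one ms-per-frame value per line).
# File names and log format are assumptions for illustration.
import matplotlib.pyplot as plt

def fps_over_time(path):
    """Convert per-frame times in ms into (elapsed seconds, instantaneous FPS)."""
    elapsed = 0.0
    xs, ys = [], []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            ms = float(line)
            elapsed += ms / 1000.0       # running timestamp in seconds
            xs.append(elapsed)
            ys.append(1000.0 / ms)       # instantaneous frame rate
    return xs, ys

for label, path in [("G-SYNC on", "tombraider_gsync_on_run1.txt"),
                    ("G-SYNC off", "tombraider_gsync_off_run1.txt")]:
    x, y = fps_over_time(path)
    plt.plot(x, y, label=label, linewidth=0.8)

plt.xlabel("Time (seconds)")
plt.ylabel("FPS")
plt.legend()
plt.show()
```

Overlaying all runs per game this way makes the degree of overlap (or divergence) between sync-on and sync-off immediately visible.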

Comments

  • eanazag - Thursday, March 19, 2015 - link

    The AMD and Nvidia haters all come out of the woodwork for these types of articles.

    Intel needs to chime in. I suspect they will go the FreeSync route since it is part of the spec and there are no costs.

    I understand Nvidia has some investment here. I fully expect them to support adaptive sync within 5 years at the latest. They really need to do something about PhysX. As a customer I see it as irrelevant. I know it isn't their style to open up their tech.
  • eddman - Thursday, March 19, 2015 - link

    Not to go off-topic too much, but PhysX as a CPU physics engine, like Havok, etc., is quite popular. There are hundreds of titles out there using it, and more are coming.

    As for GPU PhysX, which is what you had in mind, yes, it'll never become widely adopted unless Nvidia opens it up, and that will probably not happen unless someone else comes up with another, open GPU-accelerated physics engine.
  • mczak - Thursday, March 19, 2015 - link

    Minor nitpick: Intel's solution won't be called FreeSync - that name is reserved for AMD certified solutions. Pretty sure it's going to be technically the same, though, just using the adaptive sync feature of DP 1.2a.
    (My guess is that at some point in the future Nvidia will follow suit, first with notebooks, because G-SYNC is more or less impossible there. Even then it will initially be restricted to notebooks that drive the display from the Nvidia GPU, which aren't many; everything else will require Intel to support it first. I'm quite confident they'll do this with desktop GPUs too, though I suspect they'd continue to call it G-SYNC. Let's face it: requiring a specific Nvidia G-SYNC module in the monitor just isn't going to fly outside the high-end gaming market, whereas adaptive sync should trickle down to a lot more markets, so IMHO there's no way Nvidia's position on this doesn't have to change.)
  • anubis44 - Tuesday, March 24, 2015 - link

    @eanazag: nVidia will be supporting FreeSync about 20 minutes after the first hacked nVidia driver to support FreeSync makes it onto the web, whether they like it or not.
  • chizow - Tuesday, March 24, 2015 - link

    Cool, I welcome it, one less reason to buy anything AMD related.
  • chizow - Thursday, March 19, 2015 - link

    There's no need to be disappointed, honestly; Jarred just copy/pasted half of AMD's slide deck and then posted a Newegg review. Nothing wrong with that; Newegg reviews have their place in the world. It's just unfortunate that people will take his conclusions and actually believe FreeSync and G-Sync are equivalents, when there are already clear indications this is not the case.

    - 40 to 48 FPS minimums are simply unacceptably low thresholds before things start falling apart, especially given many of these panels are higher than 1080p. A 40 FPS minimum at 4K, for example, is DAMN hard to accomplish; in fact, the recently launched Titan X can't even do it in most games. CrossFireX isn't going to be an option either until AMD fixes FreeSync + CF, if ever.

    - The tearing/ghosting/blurring issues at low frame rates are significant. AMD mentioned issues with pixel decay causing problems at low refresh rates, but honestly, this alone shows us G-Sync is worth the premium because it is simply better. http://www.pcper.com/files/imagecache/article_max_...
    Jarred has mused multiple times that these panels may use the same panel as the Swift, so why are the FreeSync panels failing so badly at low refresh rates? Maybe that G-Sync module is actually doing something, like actively syncing with the monitor to force overdrive without breaking the kind of guesswork frame sync FreeSync is using?

    - Input lag? We can show AMD's slide and take their word for it without even bothering to test? High-speed camera, a USB input doubler attached to a mouse, scroll and see which one responds faster. FreeSync certainly seems to work within its supported frequency bands in preventing tearing, but that was only half of the problem related to Vsync on/off. The other trade-off with Vsync ON was how much input lag it introduced.

    - A better explanation of Vsync On/Off and tearing? Is this something the driver handles automatically? Is Vsync being turned on and off by the driver dynamically, similar to Nvidia's Adaptive Vsync? When it is on, does it introduce input lag?

    In any case, AnandTech's Newegg review of FreeSync is certainly a nice preview and proof of concept, but I wouldn't take it as more than that. I'd wait for actual reviews that cover the aspects of display technology that actually matter, like input lag, blurring, and image retention, which can only really be captured and quantified with equipment like high-speed cameras and a sound testing methodology.
  • at80eighty - Thursday, March 19, 2015 - link

    Waaa
  • chizow - Thursday, March 19, 2015 - link

    Another disappointed AMD user, I see. I agree, FreeSync certainly isn't as good as one might have hoped.
  • at80eighty - Friday, March 20, 2015 - link

    I've had more Nvidia cards than AMD, so keep trying.
  • chizow - Friday, March 20, 2015 - link

    Doubt it, but keep trying.
