FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was a claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has acknowledged as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is basically not something you would notice without capturing frame rates. AMD nonetheless did some testing and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario compared to a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – will show minor differences between runs. So we picked three games with deterministic benchmarks and ran each three times with and without G-SYNC/FreeSync enabled. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from three runs:

[Charts: Gaming Performance Comparison – average and minimum frame rates for each game, with and without G-SYNC/FreeSync]

Except for a glitch when testing Alien Isolation at a custom resolution, our results basically don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
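
For readers who want to reproduce this kind of comparison, here’s a minimal sketch of the underlying arithmetic in Python. The FPS values are hypothetical placeholders, not our actual benchmark numbers.

```python
# Sketch: averaging benchmark runs and computing the percentage delta
# between sync-off and sync-on results. All FPS values are hypothetical
# placeholders, not numbers from our testing.

def summarize(runs):
    """Return (mean of per-run average FPS, overall minimum FPS)."""
    run_avgs = [sum(r) / len(r) for r in runs]
    return sum(run_avgs) / len(run_avgs), min(min(r) for r in runs)

# Three runs per setting; each run is a short list of FPS samples.
sync_off = [[88.1, 90.3, 87.6], [89.0, 88.4, 90.1], [87.9, 89.5, 88.8]]
sync_on  = [[87.5, 89.9, 88.0], [88.6, 88.1, 89.7], [88.2, 89.0, 88.4]]

avg_off, min_off = summarize(sync_off)
avg_on, min_on = summarize(sync_on)

delta_pct = (avg_on - avg_off) / avg_off * 100
print(f"Average FPS: {avg_off:.1f} off vs {avg_on:.1f} on ({delta_pct:+.1f}%)")
print(f"Minimum FPS: {min_off:.1f} off vs {min_on:.1f} on")
```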

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – the frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we were never able to get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD setting the performance was virtually identical, so this only seems to affect performance with custom resolutions, and it also appears to be game specific.
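
As a rough illustration of why high-refresh custom resolutions are demanding for the display link, the raw pixel rate of a mode can be estimated from its geometry. The sketch below ignores blanking overhead entirely, so treat it as a lower bound rather than a real timing calculation.

```python
# Sketch: lower-bound pixel clock for a display mode, ignoring the
# horizontal/vertical blanking that real timings add on top of the
# active pixels. Actual pixel clocks are therefore somewhat higher.

def min_pixel_clock_mhz(width, height, refresh_hz):
    """Active-pixel rate in MHz for a given mode."""
    return width * height * refresh_hz / 1e6

# The custom refresh rates we tried for 2560x1080:
for hz in (60, 85, 100, 120, 144):
    print(f"2560x1080 @ {hz:>3} Hz: >= {min_pixel_clock_mhz(2560, 1080, hz):.1f} MHz")
```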

For those interested in a more detailed look at the frame rates of the individual runs (six total per game and setting: three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variation between benchmark runs. In one of the games, Tomb Raider, even identical settings produce a fair amount of variation between runs, though the average FPS remains quite consistent.
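
For anyone wanting to generate similar overlap plots, a minimal matplotlib sketch along these lines would work. The file names and the one-sample-per-line CSV layout are assumptions for illustration, not the format of our actual capture data.

```python
# Sketch: overlaying FPS-over-time traces from multiple benchmark runs.
# File names and the one-sample-per-line layout are assumed for
# illustration; adapt them to whatever your capture tool writes.
import matplotlib.pyplot as plt

runs = {
    "G-SYNC off, run 1": "gsync_off_1.csv",
    "G-SYNC off, run 2": "gsync_off_2.csv",
    "G-SYNC off, run 3": "gsync_off_3.csv",
    "G-SYNC on, run 1": "gsync_on_1.csv",
    "G-SYNC on, run 2": "gsync_on_2.csv",
    "G-SYNC on, run 3": "gsync_on_3.csv",
}

for label, path in runs.items():
    with open(path) as f:
        fps = [float(line) for line in f if line.strip()]
    plt.plot(range(len(fps)), fps, label=label, linewidth=0.8)

plt.xlabel("Benchmark time (s)")
plt.ylabel("FPS")
plt.title("Frame rates over time")  # hypothetical title
plt.legend()
plt.savefig("fps_over_time.png", dpi=150)
```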

Comments

  • Midwayman - Thursday, March 19, 2015 - link

    If you have a display with backlight strobing (newest LightBoost, BenQ blur reduction, etc.) the difference is readily apparent. Motion clarity is way, way better than without it. The issue is it's like a CRT, and strobing is annoying at low rates. 75Hz is about the absolute minimum, but 90Hz and above are better. I doubt any of the displays support strobing and adaptive sync at the same time currently, but when you can push the frames, it's totally worth it. The new BenQ mentioned in the article will do both, for example (maybe not at the same time). That way you can have adaptive sync for games with low FPS and strobing for games with high FPS.
  • darkfalz - Thursday, March 19, 2015 - link

    Games at 100+ FPS look much smoother. Think of it like perfect motion blur. If you can keep your game between 72-144 Hz it's gaming nirvana.
  • eddman - Thursday, March 19, 2015 - link

    I'm not a fan of closed, expensive solutions, but this hate towards g-sync that some here are showing is unwarranted.

    nvidia created g-sync at a time when no alternative existed, so they built it themselves, and it works. No one was/is forced to buy it.

    It was the only option and those who had a bit too much money or simply wanted the best no matter what, bought it. It was a niche market and nvidia knew it.

    IMO, their mistake was to make it a closed, proprietary solution.

    Those consumers who were patient can now enjoy a cheaper and, in certain aspects, better alternative.

    Now that DP adaptive-sync exists, nvidia will surely drop the g-sync hardware and introduce a DP-compatible software g-sync. I don't see anyone buying a hardware g-sync monitor anymore.
  • Murloc - Thursday, March 19, 2015 - link

    you don't understand the hate because you think nvidia will drop g-sync immediately.
    It's likely you're right but it's not a given.
    Maybe it will be a while before the market forces nvidia to support adaptive sync.
  • MikeMurphy - Thursday, March 19, 2015 - link

    nVidia will protect manufacturers that invested resources into G-Sync. They will continue supporting G-Sync and later add support for FreeSync.
  • ddarko - Thursday, March 19, 2015 - link

    The fact that only AMD cards work with Freesync now is not because Freesync is closed but because Nvidia refuses to support it. It takes a perverse kind of Alice in Wonderland logic to use the refusal of a certain company to support an open standard in its hardware as proof that the open standard is in fact "closed."

    Freesync is open because it is part of the "open" Displayport standard: any display or GPU maker can take advantage of it by supporting the relevant Displayport standard, the use of which is free. Nvidia's Gsync is "closed" because Nvidia decides who gets to support it and on what terms.

    Whatever the respective technical merits of Freesync and Gsync, please stop trying to muddy the waters with sophistry about open and closed. Nvidia GPUs could work with Freesync monitors tomorrow if Nvidia wanted it - enabling Freesync support wouldn't cost Nvidia a dime in licensing fees or require the permission of AMD or anyone else. The fact that they choose not to support it is irrelevant to the definition of Displayport 1.2a (of which Freesync is a part) as an open standard.
  • mrcaffeinex - Thursday, March 19, 2015 - link

    Are NVIDIA's partners able to modify their cards' BIOS and/or provide customized drivers to support FreeSync, or do they have to rely on NVIDIA to adopt the feature? I know different manufacturers have made custom cards in the past with different port layouts and such, but I never investigated whether those required a custom driver from the manufacturer. Is it possible that this is an obstacle that an EVGA, ASUS, MSI, etc. could overcome on their own?
  • JarredWalton - Thursday, March 19, 2015 - link

    It would at the very least require driver level modifications, which the card manufacturers wouldn't be able to provide.
  • chizow - Thursday, March 19, 2015 - link

    How is this even remotely a fact when AMD themselves have said Nvidia can't support FreeSync, and even many of AMD's own cards in relevant generations can't support it? Certainly Nvidia has said they have no intention of supporting it, but there's also the possibility AMD is right and Nvidia can't support it.

    So in the end, you have two effectively closed and proprietary systems, one designed by AMD, one designed by Nvidia.
  • iniudan - Thursday, March 19, 2015 - link

    Nvidia cannot use FreeSync, as it is AMD's implementation of VESA's Adaptive-Sync; they would have to come up with their own implementation of the specification.
