FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that G-SYNC incurs a performance hit while FreeSync does not. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, which is basically not something you would notice without capturing frame rates. AMD did some testing, however, and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario compared to a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most games – even those with built-in benchmarks that tend to be very consistent – will show minor differences between benchmark runs. So we picked three games with deterministic benchmarks – Alien Isolation, The Talos Principle, and Tomb Raider – and ran each three times with and three times without G-SYNC/FreeSync enabled. Here are the average and minimum frame rates from those runs:

[Chart: Gaming Performance Comparison – average FPS]

[Chart: Gaming Performance Comparison – minimum FPS]
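
If you want to reproduce this kind of aggregation yourself, here's a minimal sketch, assuming each benchmark run has been captured as a list of per-frame render times in milliseconds (the helper names are ours, and "minimum FPS" here is simply the slowest single frame – exact definitions vary between tools):

```python
# Aggregate average and minimum FPS across repeated benchmark runs.
# Assumes each run was logged (e.g. by a frame capture tool) as
# per-frame render times in milliseconds.

def run_fps_stats(frame_times_ms):
    """Return (average FPS, minimum FPS) for one benchmark run."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)  # slowest frame -> lowest instantaneous FPS
    return avg_fps, min_fps

def average_over_runs(runs):
    """Average per-run stats over three (or more) runs of one configuration."""
    stats = [run_fps_stats(r) for r in runs]
    avg = sum(s[0] for s in stats) / len(stats)
    mn = sum(s[1] for s in stats) / len(stats)
    return avg, mn
```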

Except for a glitch with testing Alien Isolation at a custom resolution, our results basically don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While NVIDIA showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
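
To make the margin-of-error point concrete, here's a quick sketch of the comparison we're describing – the FPS numbers below are placeholders, not our measured results:

```python
import statistics

def percent_delta(a, b):
    """Percent change going from configuration a to configuration b."""
    return (b - a) / a * 100.0

# Hypothetical per-run average FPS for one game (not our measured data).
gsync_off = [71.2, 70.8, 71.5]
gsync_on  = [72.9, 72.6, 73.1]

delta = percent_delta(statistics.mean(gsync_off), statistics.mean(gsync_on))
noise = statistics.stdev(gsync_off) / statistics.mean(gsync_off) * 100.0

print(f"G-SYNC delta: {delta:+.1f}% (run-to-run noise ~{noise:.1f}%)")
# If the delta is within a couple of standard deviations of the
# run-to-run noise, it's not something you'd ever notice in practice.
```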

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run at the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled, for reasons we can’t explain. Tomb Raider had a similar issue at first, but it went away once we created additional custom resolutions at multiple refresh rates (60/85/100/120/144 Hz); we could never get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the panel’s native WQHD resolution, performance was virtually identical, so the problem appears to be specific to custom resolutions and to certain games.

For those interested in a more detailed look at the frame rates of the three runs (six total per game and setting: three with and three without G-SYNC/FreeSync), we’ve created a gallery of frame rates over time. There’s so much overlap that mostly the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variation between benchmark runs. And in one of the games, Tomb Raider, even identical settings show a fair amount of variation between runs, though the average FPS is quite consistent.
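
If you want to generate the same kind of overlay yourself, here's a minimal matplotlib sketch, assuming six per-run FPS traces sampled once per second (the file layout and names are our own invention):

```python
import matplotlib.pyplot as plt

# Hypothetical layout: one FPS-per-second trace per benchmark run,
# three runs with variable refresh on and three with it off.
runs = {
    "G-SYNC on":  ["on_1.csv", "on_2.csv", "on_3.csv"],
    "G-SYNC off": ["off_1.csv", "off_2.csv", "off_3.csv"],
}

for label, files in runs.items():
    for i, path in enumerate(files):
        with open(path) as f:
            fps = [float(line) for line in f]
        # Label only the first run per config to keep the legend clean;
        # the heavy overlap between traces is the point of the plot.
        plt.plot(fps, alpha=0.5, label=label if i == 0 else None)

plt.xlabel("Time (s)")
plt.ylabel("FPS")
plt.legend()
plt.show()
```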

Comments

  • YukaKun - Thursday, March 19, 2015 - link

    Until we have video showing the two of them running in parallel, we can't declare a winner. There might be a lot of metrics for measuring "tearing", but this is not about "hard metrics"; it's about how the bloody frame sequences look on your screen. Smooth or not.

    Cheers!
  • eddman - Thursday, March 19, 2015 - link

    The difference cannot be shown on video. How can a medium like video, which has a limited, constant frame rate, be used to demonstrate a dynamic, variable frame rate technology?

    This is one of those scenarios where you can experience it only on a real monitor.
  • Murloc - Thursday, March 19, 2015 - link

    putting it on video makes the comparison kinda useless.
  • invinciblegod - Thursday, March 19, 2015 - link

    I am one of those who switch every time I upgrade my GPU (which is every few years). Sometimes AMD is on top, while other times Nvidia is better. Now I must be locked into one forever or buy 6 monitors (3 for Eyefinity and 3 for Nvidia Surround)!
  • jackstar7 - Thursday, March 19, 2015 - link

    If they can put out a confirmed 1440p 21:9 w/Freesync they will get my money. The rumors around the Acer Predator are still just rumors. Please... someone... give me the goods!
  • Black Obsidian - Thursday, March 19, 2015 - link

    It's pretty likely that LG will do just that. They already make two 1440p 21:9 monitors, and since it sounds like FreeSync will be part of new scalers going forward, you can probably count on the next LG 1440p 21:9 picking up that ability.
  • xthetenth - Thursday, March 19, 2015 - link

    I'm right there with you. I'm already preparing to get the update on the LG 1440 21:9 and a 390X, because if the rumors for the latter are anything like what the card is, it's going to be fantastic, and after getting a 21:9 for work I can't make myself use any other resolution.
  • Black Obsidian - Thursday, March 19, 2015 - link

    Same deal here. If nVidia supported FreeSync and priced the Titan X (or impending 980 Ti) in a more sane manner I'd consider going that way because I have no great love for either company.

    But so long as they expect to limit my monitor choices to their price-inflated special options and pretend that $1K is a reasonable price for a flagship video card, they've lost my business to someone with neither of those hangups.
  • kickpuncher - Thursday, March 19, 2015 - link

    I have no experience with 144Hz screens. I've been waiting for FreeSync to come, but you're saying the difference is negligible with a static 144Hz monitor? Is that at any FPS, or does the FPS also have to be very high? (In regard to the 4th paragraph on the last page.) Thanks
  • JarredWalton - Thursday, March 19, 2015 - link

    I'd have to do more testing, but 144Hz redraws the display every 6.9ms compared to 60Hz redrawing every 16.7ms. With pixel response times often being around 5ms in the real world (not the marketing claims of 1ms), the "blur" between frames will hide some of the tearing. And then there's the fact that things won't change as much between frames that are 7ms apart compared to frames that are 17ms apart.

    Basically, at 144Hz tearing can still be present, but it ends up being far less visible to the naked eye. Or at least that's my subjective experience using my 41-year-old eyes. :-)
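
    For anyone who wants to sanity-check those numbers, a quick sketch – it's just 1000 ms divided by the refresh rate:

    ```python
    # Redraw interval in milliseconds for common refresh rates.
    for hz in (60, 85, 100, 120, 144):
        print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per redraw")
    # 60 Hz -> 16.7 ms ... 144 Hz -> 6.9 ms
    ```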
