Even though CES proper doesn’t kick off for one more day, NVIDIA and ASUS are getting a bit ahead of the curve with the announcement of a new monitor. The two companies have been working on a high-performance 1080p display aimed at competitive gaming that offers a blistering 360Hz maximum refresh rate, 50% faster than the current crop of 240Hz market leaders. The new monitor will be sold by ASUS as the ROG Swift 360, and is set to be available later this year.

Like other G-Sync projects, NVIDIA’s latest display endeavor looks to be an effort by the company to differentiate itself and its technology in a crowded market, this time by offering an LCD monitor with a rather absurd refresh rate.

The ROG Swift 360 itself is based on a 24.5-inch panel supplied by AU Optronics, which is capable of running at up to 360Hz. This is 120Hz faster than the current generation of high-end gaming monitors, which top out at 240Hz, and is an uncommonly large jump to make in a single generation. Unfortunately, NVIDIA isn’t disclosing what type of panel is being used here. However, all of the 24.5-inch 240Hz G-Sync monitors released to date have been TN, so that’s a safe bet, especially with the kind of refresh rate NVIDIA and ASUS are looking to hit. The bigger question is whether this is a new panel from AUO, or a further overclocked version of their popular M250HTN01 panel.

NVIDIA and ASUS have not released much else in the way of technical details at this time. The monitor isn’t set to ship until later this year, which, given that we’re at the very start of the year, likely means it’s still several months off, so the companies undoubtedly still have some technical details to hammer out. Though as most of the work is in the panel and controller, it shouldn’t be too different from ASUS’s 240Hz ROG Swift PG258Q.


[Image: ASUS’s 360Hz monitor on the left, a 240Hz monitor on the right. It is a TN panel.]

One thing that NVIDIA has confirmed, however, is that this will be a true G-Sync monitor, with NVIDIA supplying the monitor’s display controller. This is an important distinction not only for compatibility purposes, but because it means the monitor supports full variable overdrive functionality, which becomes increasingly important with such a wide refresh rate range. Even fast TN monitors still need to overdrive their pixels to hit high refresh rates, so the overdrive logic is practically as important as the panel itself in minimizing ghosting. And with NVIDIA aiming to beat current-generation monitors by 50%, it’ll be interesting to see just how well 360Hz works in practice, and whether the monitor really is fast enough to make a 360Hz refresh rate useful.
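
For a rough sense of what variable overdrive involves, consider the sketch below. NVIDIA hasn’t published details of its implementation, so the calibration values and interpolation here are purely illustrative: the idea is simply that the controller picks an overdrive strength that tracks the frame time actually being displayed, rather than using a single fixed setting.

```python
# Illustrative sketch only: how a display controller might pick an overdrive
# strength that follows the actual frame time under variable refresh.
# The calibration table and interpolation are assumptions for illustration,
# not NVIDIA's G-Sync implementation.
from bisect import bisect_left

# Hypothetical calibration points: frame time in ms -> overdrive gain
# (higher gain = stronger pixel overshoot, so transitions settle faster).
CALIBRATION = [(2.8, 1.00), (4.2, 0.85), (8.3, 0.60), (16.7, 0.40)]

def overdrive_gain(frame_time_ms: float) -> float:
    """Linearly interpolate an overdrive gain for the observed frame time."""
    times = [t for t, _ in CALIBRATION]
    i = bisect_left(times, frame_time_ms)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]
    (t0, g0), (t1, g1) = CALIBRATION[i - 1], CALIBRATION[i]
    frac = (frame_time_ms - t0) / (t1 - t0)
    return g0 + frac * (g1 - g0)

print(overdrive_gain(2.78))   # ~360Hz frame time -> strongest overdrive
print(overdrive_gain(6.0))    # mid-range VRR frame time -> blended value
```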

The target market for the ROG Swift 360, in turn, is competitive gaming, as well as anyone who wants a competition-grade monitor. The benefits of higher refresh rates are pretty well known at this point: they make for smoother experiences and reduce rendering latency. And NVIDIA thinks there’s still enough benefit to justify a 360Hz monitor. That said, with 240Hz monitors already offering a tiny 4.17ms frame time, the absolute benefit of 360Hz is small: that 50% jump in refresh rate only shaves off a further ~1.4ms at the monitor level (for a total of 2.78ms), which isn’t nearly as great as the jump from 60Hz to 120Hz, for example.
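
For reference, those frame-time figures fall straight out of dividing one second by the refresh rate. A quick back-of-the-envelope sketch (plain arithmetic, nothing monitor-specific) shows how the absolute savings shrink with each successive jump:

```python
# Frame time at a given refresh rate, and the absolute time saved by each
# successive jump. This is pure arithmetic at the monitor level; it ignores
# the rest of the input-to-display pipeline.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for low, high in [(60, 120), (120, 240), (240, 360)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low}Hz -> {high}Hz: {frame_time_ms(low):.2f}ms -> "
          f"{frame_time_ms(high):.2f}ms per frame ({saved:.2f}ms saved)")
```

Running it shows the 60Hz-to-120Hz jump saves over 8ms per frame, while 240Hz to 360Hz saves only about 1.4ms, which is the diminishing-returns point the article makes.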

Still, NVIDIA has done their own research around the new monitor and player performance, finding that even with the diminishing returns, a 360Hz monitor did improve flick shot performance in Overwatch. They also measured a 14.5ms reduction in total (end-to-end) latency, going from 34.5ms at 240Hz to 20ms at 360Hz.

NVIDIA and ASUS will have the ROG Swift 360 on display this week at CES. So while the monitor won’t be on sale for a while yet, we should be able to get a first-hand look at how well NVIDIA’s future gaming monitor is set to perform.

Source: NVIDIA

Comments

  • willis936 - Monday, January 6, 2020

    Physics engines in game engines can have variable frame rates too. There’s no reason those couldn’t also be cranked into the hundreds of Hz, other than needing a CPU capable of doing the work.
  • HideOut - Sunday, January 5, 2020

    The human eye can only see to about 70fps; if you’re above 100Hz then you are capped out. This is pretty much a marketing gimmick.
  • Yojimbo - Sunday, January 5, 2020

    The human eye probably can barely tell a steady 30 from a steady 60. But this is not about how the human eye accepts motion from still images, this is about latency. When you issue a command, it takes longer on average to register on the screen when the game is running at a lower fps. And when whatever the eye is tracking on the screen changes its motion, that change will probably be visible sooner at a high fps than at a lower one. So the player with the higher fps is given a small boost to his reaction times, and that will affect the probability of who kills whom in the match.
  • willis936 - Monday, January 6, 2020

    >The human eye probably can barely tell a steady 30 from a steady 60.

    Oh come on. I thought we had moved past people spouting easily falsifiable nonsense opinions.

    https://www.ncbi.nlm.nih.gov/books/NBK11559/#!po=1...
  • mdrejhon - Tuesday, January 7, 2020

    You clearly haven't seen the latest research.

    https://www.blurbusters.com/1000hz-journey

    The diminishing curve of returns doesn’t disappear until well beyond 1000Hz. There are stroboscopic and sample-and-hold effects to consider, which are different effects than those measured by other tests (including the fighter pilot 1/250sec tests).
  • SaberKOG91 - Sunday, January 5, 2020

    Not as much of a gimmick as you might think. I remember reading that a fighter pilot could detect subtle changes in their environment as quickly as 1/250th of a second. This indicates to me that there’s also a degree of training and experience that goes into how easily you notice the changes. You would expect similar results from competitive gamers. The intensity of what you are seeing matters a lot too. The brain responds to small changes in intensity faster than large changes because of persistence of vision. So yeah, black to white and back might cap out at a pretty low frame rate, but small variations might be perceived at much higher frame rates. There’s also a big difference between knowing exactly what changed and perceiving a change.
  • Tomatotech - Monday, January 6, 2020

    Agree. Most people can’t detect flicker at over 60Hz. This isn’t aimed at them. Many fluorescent lights / poor-quality LEDs noticeably flicker to me, but not to other people, so I’m willing to believe a very small number of people, especially twitch gamers, are even more sensitive than me.
  • SaberKOG91 - Monday, January 6, 2020

    Yup and it's even more noticeable when your head is moving relative to the stationary light.
  • mdrejhon - Tuesday, January 7, 2020

    The thresholds are actually MUCH higher than that test suggests; the diminishing curve of returns does not end until well beyond 1000Hz.

    The fighter pilot test does not take into account the Talbot-Plateau Law. It was a brief-flash object identification test, and it did not flash twice as bright during half-brevity flashes.

    There are other effects of ultra-high-Hz gaming monitors that benefit users, which are not tested in that fighter pilot study:
    -- Latency effects
    -- Motion blur effects (sample-and-hold), https://www.blurbusters.com/1000hz-journey
    -- Stroboscopic effects (phantom arraying), https://www.blurbusters.com/stroboscopics
  • cosmotic - Monday, January 6, 2020

    Citation needed.
