NVIDIA Updates on G-Sync HDR: 4Kp144 Monitors On Sale at End of May, Other Models Coming Later This Year
by Nate Oh on May 16, 2018 3:00 PM EST
While NVIDIA's upcoming ultra-premium G-Sync HDR monitors have been in the public eye for some time now, the schedule slips have become something of a sticking point, prompting the company in March to state that 27” 4K 144 Hz models would be shipping and on the market in April. Needless to say, those displays are yet to launch, though preorders for the Acer Predator X27 and ASUS ROG Swift PG27UQ were listed in Europe in mid-April.
Putting some amount of speculation to rest, NVIDIA has indicated the end of May for shipping and e-tail availability of the Acer Predator X27 and ASUS ROG Swift PG27UQ, though ultimately this decision is in the hands of Acer and ASUS. On that note, Acer stated that they had no updates on availability at this time. Both models were first showcased as reference prototypes during CES 2017, and as part of the larger G-Sync HDR lineup, the Predator X27 and PG27UQ will be the first monitors on the market.
NVIDIA G-Sync HDR 2018 Monitor Lineup

| | ROG Swift PG27UQ | ROG Swift PG35VQ | ROG Swift PG65 | OMEN X 65 BFGD |
|---|---|---|---|---|
| Panel | 27" IPS-type (AHVA) | 35" VA | 65" VA | 65" VA |
| Resolution | 3840 × 2160 | 3440 × 1440 (21:9) | 3840 × 2160 | 3840 × 2160 |
| Pixel Density | 163 PPI | 103 PPI | 68 PPI | 68 PPI |
| Max Refresh Rate | 144 Hz OC (4:2:2 chroma subsampling) / 98 Hz (4:4:4 chroma) | 200 Hz | 120 Hz | 120 Hz |
| Variable Refresh | NVIDIA G-Sync HDR Scaler/Module | NVIDIA G-Sync HDR Scaler/Module | NVIDIA G-Sync HDR Scaler/Module | NVIDIA G-Sync HDR Scaler/Module |
| Response Time | 4 ms | 4 ms? | Unknown | Unknown |
| Brightness | 1000 cd/m² | 1000 cd/m² | 1000 cd/m² | 1000 cd/m² |
| Backlighting | FALD (384 zones) | FALD (512 zones) | FALD | FALD |
| HDR Standard | HDR10 Support | HDR10 Support | HDR10 Support | HDR10 Support |
| Inputs | 2 × DisplayPort 1.4, 1 × HDMI 2.0 | Unknown | Unknown | Unknown |
| Availability | May/June 2018 | Q4 2018? | Summer 2018? | Fall 2018 |
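As a side note, the pixel-density figures follow directly from resolution and diagonal size. A quick sketch (the `ppi` helper is our own, not from any particular library):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 27)))  # 163 -> the 27" 4K models
print(round(ppi(3840, 2160, 65)))  # 68  -> the 65" BFGDs
```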
The comparable 27" and 35" AOC models (AGON AG273UG, AGON AG353UCG) were not mentioned, but are presumably operating on a similar release timeline. There was also no mention of Acer's Predator XB272-HDR.
While the 27” 4K 144 Hz models were originally slated for a late 2017 launch, Acer and ASUS made a surprising announcement last August, delaying their G-Sync HDR flagships to Q1 2018. Even with NVIDIA’s current end-of-May assessment, ASUS remarked offhand that a June launch was more likely, as firmware and other development work was still ongoing. Those were not the only G-Sync HDR displays announced: at Computex 2017, Acer and ASUS unveiled 35” curved ultrawide 200 Hz G-Sync HDR displays, originally intended for a Q4 2017 release. Meanwhile, at CES 2018 NVIDIA had already pushed forward with revealing 65-inch smart-TV-esque G-Sync HDR monitors with integrated SHIELDs, dubbing them “Big Format Gaming Displays” (BFGDs). No update was provided on the schedule for the 35” models and the BFGDs, only that they were due to come later this year.
Despite the launch on the horizon, full specifications and pricing have yet to be published. The features and specifications remain as previously known: built on AU Optronics’ M270QAN02.2 AHVA panel, the 27” G-Sync HDR monitors offer a 3840×2160 resolution at up to a 144 Hz refresh rate (at half chroma), a quantum dot film enabling DCI-P3 color gamut coverage, HDR10 support, a peak brightness of 1000 nits, and full array local dimming (FALD) via a 384-zone direct LED backlighting system. On top of that, Acer and ASUS include their own monitor features and OSDs; for the actively cooled ROG Swift PG27UQ, this includes Tobii eye-tracking and ultra-low motion blur (ULMB).
As for the refresh rate asterisks, the bandwidth needed for 10-bit HDR at 4Kp144 with full 4:4:4 chroma exceeds DisplayPort 1.4's capability, so setting the refresh rate above 98 Hz has the monitor either drop to 4:2:2 chroma subsampling or fall back to 8-bit color with dithering via frame rate control (FRC). This bandwidth bottleneck is largely out of ASUS', Acer's, and NVIDIA’s hands, as DisplayPort 1.5 and HDMI 2.1 are very much too new to be incorporated into these models.
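Those limits can be sanity-checked with a back-of-the-envelope calculation. The sketch below counts only active pixels and ignores blanking intervals, so real link budgets are slightly tighter than this suggests:

```python
# DisplayPort 1.4 over HBR3: 4 lanes x 8.1 Gbps = 32.4 Gbps raw,
# minus 8b/10b encoding overhead -> ~25.92 Gbps of usable payload.
DP14_PAYLOAD_GBPS = 25.92

def required_gbps(width, height, hz, bits_per_pixel):
    """Approximate link bandwidth for the active pixels of one frame stream."""
    return width * height * hz * bits_per_pixel / 1e9

# 10-bit RGB (4:4:4) = 30 bpp; 10-bit 4:2:2 averages 20 bpp.
print(required_gbps(3840, 2160, 144, 30))  # ~35.8 Gbps: exceeds DP 1.4
print(required_gbps(3840, 2160, 144, 20))  # ~23.9 Gbps: fits with 4:2:2
print(required_gbps(3840, 2160, 98, 30))   # ~24.4 Gbps: fits, hence the 98 Hz cap
```

This is why 98 Hz is roughly where full-chroma 10-bit 4K stops fitting within DP 1.4's payload.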
That being said, a straight conversion of European preorder prices sans VAT puts the price range at $2500 to $3000, though this does not directly translate to US pricing. As G-Sync HDR flagship gaming monitors, they have essentially every feature of ultra-high-end consumer/gaming monitors. Additionally, the AU Optronics AHVA panel is pricier because it can only be purchased as a combined LCD and backlight unit, and together with the cost of the upgraded G-Sync HDR scaler and module, this adds a premium that is passed on to consumers.
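For readers unfamiliar with the exercise, the conversion works by stripping VAT (which EU list prices include and US prices don't) before applying the exchange rate. The figures below are purely illustrative, not the actual preorder prices:

```python
# Hypothetical back-of-the-envelope conversion of an EU preorder price
# to a rough US-equivalent. All inputs here are illustrative assumptions.
eur_preorder = 2999.0   # hypothetical EU preorder price, VAT included
vat_rate = 0.19         # e.g. Germany's 19% VAT
usd_per_eur = 1.18      # approximate spot rate circa May 2018

ex_vat = eur_preorder / (1 + vat_rate)   # remove VAT
usd_equiv = ex_vat * usd_per_eur         # convert to USD
print(round(ex_vat), round(usd_equiv))
```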
Ultimately, the HDR situation itself is also a little murky. Outside of HDR10 support, NVIDIA’s G-Sync HDR certification requirements are kept between the company and the manufacturers, so the specific dynamic range that G-Sync HDR guarantees isn’t clear. Presumably there could be quantified public guidelines covering minimum peak brightness, local dimming capability, minimum DCI-P3 gamut coverage, and the like; in other words, something like VESA’s open DisplayHDR 400, 600, and 1000 standards, which are directly tied to numerical performance metrics and can be independently verified by consumers themselves with VESA’s open test tools and benchmarks.
At the time, NVIDIA did not express any standardized HDR specifications outside of their G-Sync HDR certification process, reiterating that G-Sync HDR represented a premium gaming experience and that they expected OEMs and monitor manufacturers to list any HDR specs relevant to their models. In any case, these 27” monitors adhere to the UHD Alliance’s 4K-specific Ultra HD Premium specification and thus carry its logo. For the publicly-announced G-Sync HDR displays, it appears that only three types of AU Optronics panels are involved, so capabilities and featuresets will naturally align closely. NVIDIA also stated that G-Sync HDR had no particular focus on Windows HDR support.
As HDR monitors are a burgeoning market segment, NVIDIA brought up the need for consumer education on the wide spectrum of HDR performance. For consumers looking for a high-end variable refresh rate monitor with HDR, an HDR brand not strictly attached to performance parameters isn’t quite as elucidating as "4K", "IPS", "144 Hz refresh rate", or "4 ms response time". As new panel technologies develop and mature, it will be interesting to see how those changes are conveyed through the “G-Sync HDR” brand.
Comments
flashbacck - Wednesday, May 16, 2018
Nvidia cards are all technically capable of freesync too, but no way in hell are they ever going to enable it *sigh*
Dr. Swag - Thursday, May 17, 2018
Not true... You need hardware support for it. That's why cards like the 280 didn't support it while the 290 and 285 did.
foxtrot1_1 - Wednesday, May 16, 2018
Yes, there is a benefit. G-Sync still has a wider refresh rate range and, because of Nvidia’s control over the scaler, will deliver more consistent performance than a simple certification. Is it worth the premium? Probably not, but nobody buying a $3,000 monitor is going to quibble. You want the very best of everything at that price point, and the fastest GPUs available today are all G-Sync compatible.
imaheadcase - Thursday, May 17, 2018
How is it a money grab when you literally have the card that works with it? rofl
Ikefu - Thursday, May 17, 2018
They make the G-Sync add-in card that goes in the monitor, not the video card. It's around a $100 premium for the add-in card that gets added to the cost of the monitor. FreeSync is a free spec that anyone can implement without paying royalties.
Yojimbo - Thursday, May 17, 2018
Yeah, I've been disappointed that NVIDIA hasn't yet brought out anything more with G-Sync that justifies their method of implementing it.
I don't have experience comparing Freesync and G-Sync. I have heard that G-Sync does some things slightly better, but it doesn't seem to be enough to make the difference in price worth it. NVIDIA says that the prices are higher because monitor manufacturers are able to charge a premium for G-Sync monitors, and there may be some truth to that, but I think the manufacturers' component costs for a G-Sync monitor are likely higher than for a Freesync monitor.
On the other hand, I think there wouldn't be any Freesync without G-Sync, and NVIDIA probably wouldn't have made G-Sync, either, if it meant implementing it the way Freesync is implemented. Without G-sync already existing, there just isn't much incentive for anyone to develop something like Freesync. Freesync was a reactive scramble by AMD to neutralize any competitive advantage NVIDIA might get with G-Sync.
Manch - Friday, May 18, 2018
FreeSync was a reaction to G-Sync, but it is just the implementation of a VESA standard that had already been in place long before G-Sync ever came out. Not a lot of work on AMD's part at all. They just implemented support for it in their GPUs (again, not a lot of work). There was even talk that the then-current monitors could technically support FreeSync, but good luck getting manufacturers to upgrade the monitors' firmware to support it. Then not long after, "new" monitors with FREESYNC!!! G-Sync is just vendor lock-in.
Yojimbo - Sunday, May 20, 2018
Listing facts that are all incorrect is highly counterproductive.
From everything I remember, the VESA standard was worked on and created because of G-Sync. There's no such thing as "just an implementation of a standard" as if a standard is some magic ether that is free and effortless. Designing Freesync cost money and implementing it similarly costs money. Making sure that implementations are up to standard also costs money. I am under the impression that part of the extra cost of G-Sync is the stricter and more costly validation program that NVIDIA makes monitor makers participate in as part of G-Sync when compared with AMD's Freesync designation.
Monitors need more than just firmware to support Freesync (or the VESA standard that Freesync implements). Freesync monitors became available significantly after G-Sync monitors were available.
Manch - Monday, May 21, 2018
No, the VESA standard known as Adaptive Sync was established before but rarely if ever implemented until Nvidia came out with G-Sync. As a counter AMD released Free-Sync, which is VESA's Adaptive Sync implemented to work on their cards aka(not a lot of work). AMD has even admitted that FreeSync is basically Adaptive Sync. Never said it was free or magic but AMD didn't put a lot of work into it as the standard was already fleshed out(that's why it's called a std ;) ). AMD simply asked how far can we push this with current tech without additional HW like G-SYNC. The answer? 48hz-75 hz which is why you got what you got. Now FreeSync 2 with LFC or whatever is an extension of that standard and that requires changes to HW. As far as Monitors go, there were discussions about whether or not some current at the time panels could utilize FreeSync but alas it would require upgrading the firmware as there was not a HW restriction preventing them from implementing the standard. Some have even attempted with limited success to enable it on various nonfreesync monitors.
Yojimbo - Wednesday, May 23, 2018
"No, the VESA standard known as Adaptive Sync was established before but rarely if ever implemented until Nvidia came out with G-Sync. As a counter AMD released Free-Sync, which is VESA's Adaptive Sync implemented to work on their cards aka(not a lot of work)."
Companies either do work and submit it to a standards group to vote on inclusion, or various companies send people into a SIG to hammer out a standard. In this case, AMD did the work on Freesync in response to NVIDIA's G-Sync. They then submitted it to VESA and VESA accepted it and named it Adaptive Sync, as part of an extension to the DisplayPort standard. So, yes, Freesync is basically Adaptive Sync. But AMD did do the work for it. A simple web search will lead you to multiple sources talking about this. Here's one:
The difference between Adaptive Sync and Freesync is that "Freesync" includes an AMD certification procedure, although apparently that certification procedure is not as strict as NVIDIA's certification procedure for G-Sync.
The original Freesync does require changes to hardware; it requires a special scaler chip. From https://www.kitguru.net/components/graphic-cards/a...
"To build monitors supporting Adaptive Sync and FreeSync technologies special display scalers are required."
There was a lot of confusion about Freesync when it first came out. There was a lot of misinformation being bandied about. That confusion was caused or at least exacerbated by AMD's PR. But what I am telling you in my posts is what seems to be the reality of the situation. Go research it for yourself to find out if you don't believe me.
"Some have even attempted with limited success to enable it on various nonfreesync monitors. "
Some have attached lawnmower engines to bicycles but that doesn't make them Harley-Davidsons.