Acer XB280HK: Introduction, Design and Specs

When it comes to gaming, 4K displays present a conundrum (beyond "4K" being used loosely for 3840x2160, but I'll still use it). On the one hand, all the extra pixels allow for far more detail. On the other hand, 3840x2160 is roughly 8.3 million pixels per frame, four times as many as 1080p, and that is a lot of pixels for a GPU to push. Even with the best GPUs out there, you might – okay, will – have to disable certain features and accept some aliasing and other artifacts. One solution is G-SYNC, which keeps gaming looking smooth even when running below 60 FPS, and that's what we're looking at today.

G-SYNC, only available with NVIDIA video cards, allows frame rates below the normal optimal speed of 60 FPS to still look very smooth. The Acer XB280HK is the first G-SYNC display to also feature a 3840x2160 resolution. Unlike some other G-SYNC displays, the Acer only runs at 60Hz and below, though I don't believe running faster than 60Hz at 4K resolutions will be much of an issue right now. Anand previously reviewed G-SYNC and described the details of how it works.
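
To make the smoothness argument concrete, here is a minimal C++ sketch of my own (a simplified model, not code from NVIDIA or Acer; the 45 FPS figure is just an example). On a fixed 60Hz panel, every finished frame has to wait for the next refresh, so on-screen frame times bounce between roughly 16.7ms and 33.3ms; a variable refresh panel simply refreshes when each frame is ready, giving a steady ~22.2ms cadence.

```cpp
// Simplified frame-pacing model: 45 FPS rendering on a fixed 60 Hz display
// versus a variable-refresh (G-SYNC-style) display. This deliberately ignores
// buffering and render back-pressure; it only illustrates the refresh
// quantization that causes judder below the panel's fixed refresh rate.
#include <cmath>
#include <cstdio>

int main() {
    const double render_ms  = 1000.0 / 45.0;  // GPU finishes a frame every ~22.2 ms
    const double refresh_ms = 1000.0 / 60.0;  // fixed 60 Hz refresh every ~16.7 ms

    double prev_fixed = 0.0, prev_vrr = 0.0;
    for (int frame = 1; frame <= 8; ++frame) {
        double ready = frame * render_ms;
        // Fixed refresh: the finished frame waits for the next refresh boundary.
        double fixed_scanout = std::ceil(ready / refresh_ms - 1e-9) * refresh_ms;
        // Variable refresh: the panel refreshes as soon as the frame is ready.
        double vrr_scanout = ready;
        std::printf("frame %d: fixed 60 Hz interval %5.1f ms | variable refresh interval %5.1f ms\n",
                    frame, fixed_scanout - prev_fixed, vrr_scanout - prev_vrr);
        prev_fixed = fixed_scanout;
        prev_vrr = vrr_scanout;
    }
    return 0;
}
```

The fixed-refresh column bounces between 16.7ms and 33.3ms frame times, which is exactly the uneven pacing G-SYNC avoids by refreshing on the GPU's schedule rather than the panel's.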

Like all currently shipping G-SYNC displays (with the exception of the IPS Acer display announced at CES 2015), the Acer uses a TN panel. For 120Hz or 144Hz G-SYNC panels you often need to use TN, but 60Hz would allow for IPS. The likely culprit here is cost, as the Acer currently sells for under $800. Other 4K 28" IPS displays cost at least as much and lack G-SYNC, making them a much worse choice for gaming than the Acer. Since I am not a gamer myself, all the gaming comments for this review will be handled by Jarred Walton; aside from some Wii U or Magic Online, my gaming days are well behind me (or ahead of me).

Like most G-SYNC displays, the Acer has but a single DisplayPort input. G-SYNC only works with DisplayPort, and if you didn’t care about G-SYNC you would have bought a different monitor. It also has a USB 3.0 hub with two ports on the rear-bottom and two on the side. There are no headphone connections or speakers, so it is fairly bare-bones as far as connections and extra features go.

The included stand is very good overall. Built-in adjustments for height, tilt, swivel, and pivot make it a very flexible option, and though running a TN panel in portrait mode is problematic at best, the ability to pivot does provide easier access to the bottom ports when connecting peripherals. It also has 100mm VESA mounting holes if you want to use another stand or even wall mount it. The outer bezel is a shiny plastic, which is not my favorite as it shows fingerprints and smudges very easily. An $800 monitor should have a nice stand, and while many displays choose form over function, Acer gets it right here; I really see no reason to replace the stand they provide.

The OSD works well, with a row of buttons on the bottom of the screen and icons directly above them indicating what they do. There's no guessing which button does what, and no touch-sensitive buttons that don't work well. Acer provides basic, simple, effective controls that everyone should be happy with. There are a decent number of controls available, including gamma and color temperature. There is also an optional frame rate indicator on the left side of the screen. This gives you a quick readout of your actual frame rate, which is useful because G-SYNC should remain smooth even when it drops below 60 FPS, making drops harder to notice by feel.

From a user interface perspective, the Acer XB280HK hits all the right notes. The stand is very adjustable and the controls are easy to use. The only real change I would make is a matte finish on the bezel instead of glossy, both to avoid fingerprints and because I think it simply looks better.

Looking just at the specs and the exterior design, the Acer XB280HK has a lot going for it. The big questions are how well it performs when gaming at 4K with G-SYNC, and how it fares in our objective bench tests.

Acer XB280HK G-SYNC Specifications
Video Inputs: 1x DisplayPort 1.2
Panel Type: TN
Pixel Pitch: 0.1614mm
Colors: 16.7 million
Brightness: 300 cd/m²
Contrast Ratio: 1000:1
Response Time: 1ms GtG
Viewable Size: 28"
Resolution: 3840x2160
Viewing Angle (H/V): 170° / 160°
Backlight: LED
Power Consumption (operation): 42.5W
Power Consumption (standby): 0.5W
Screen Treatment: Anti-glare
Height-Adjustable: Yes
Tilt: Yes, -5 to 35 degrees
Pivot: Yes
Swivel: Yes, 45 degrees
VESA Wall Mounting: Yes, 100mm
Dimensions w/ Base (WxHxD): 25.9" x 22" x 9.6"
Weight: 17.2 lbs.
Additional Features: 4x USB 3.0, G-SYNC
Limited Warranty: 3 years
Accessories: DisplayPort cable, USB 3.0 cable
Online Price: $785

Comments

  • JarredWalton - Thursday, January 29, 2015

    This remains to be seen. Adaptive-Sync is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that it almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make Adaptive-Sync work.
    4) A better panel to handle the needs of Adaptive-Sync.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that last one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, Freesync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that Freesync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about Freesync in an IGP rounds similarly too, of course.
  • andrewaggb - Thursday, January 29, 2015

    Freesync in an IGP for laptops and tablets would be a big deal I think.
  • nos024 - Friday, January 30, 2015

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming, nVidia or AMD. I'd understand the vendor-lock argument if there were a third or fourth player, but if nVidia doesn't support FreeSync, you are basically locked into AMD GPUs for FreeSync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things, e.g. for a motherboard to support SLI, the maker needs to license it, and it only comes on enthusiast chipsets (Z77/Z87/Z97). Xfire comes free with all Intel chipsets - yet SLI is still pretty popular... just saying.
  • anubis44 - Tuesday, February 3, 2015

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015

    Somebody'll hack the nVidia drivers to make nVidia cards work with FreeSync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could use Virtu to have the Intel IGP (with FreeSync) output the signal from their cards. Non-gamers would finally get stutter-free video and G-SYNC would be dead.

    Whether or not Intel ever takes this route, they could, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch, it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
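
For reference, the arithmetic behind this question is straightforward. Here is a standalone C++ sketch (an editorial illustration, not code from either commenter; it assumes the standard definition of a point as 1/72 inch):

```cpp
// Why "12pt is 12pt" on paper but not on screen without OS scaling:
// a point is 1/72 inch, so the number of pixels needed for the same physical
// size depends on the display's DPI. Print drivers do this conversion
// automatically; pixel-based UI code does not.
#include <cstdio>

double points_to_pixels(double points, double dpi) {
    return points / 72.0 * dpi;  // 1 point = 1/72 inch
}

int main() {
    // ~96 DPI is a typical older desktop panel; the XB280HK's 0.1614 mm pixel
    // pitch works out to roughly 157 DPI; 300 DPI is a modest printer.
    for (double dpi : {96.0, 157.0, 300.0}) {
        std::printf("12pt at %3.0f DPI needs %5.1f px\n", dpi, points_to_pixels(12.0, dpi));
    }
    return 0;
}
```

The same 12pt character that needs 16 pixels at 96 DPI needs about 26 on this panel, and that is the conversion the whole graphics stack has to apply consistently for "an inch is an inch" to hold on screen.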
  • DanNeely - Thursday, January 29, 2015

    It's not the drivers; it's the applications. The basic win32 APIs (like all mainstream UI APIs from the era) are raster based and use pixels as the standard item size and spacing unit. This was done because, on the slower hardware of the era, the overhead of trying to do everything in inches or cm was an unacceptable performance hit, and the range of DPIs that applications needed to work on wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you're doing layout/size calculations you'd need to multiply the numbers you're computing for sizes and positions by the scaling factor. I suspect if you wanted to avoid occasional bits of low-level jerkiness when resizing you'd probably need to add a bunch of twiddles to manage the remainders you get when scaling doesn't give integral sizes (e.g. 13 x 1.25 = 16.25). If you have any custom controls that you're drawing yourself you'd need to redo their paint methods as well. It didn't help that prior to Windows 8 you had to log out and back in to change the DPI scaling level, which made debugging very painful for anyone who tried to make it work.

    Newer interface libraries are pixel-independent and do all the messy work for you, but switching to one is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .NET only. You can mix C/C++ and .NET in a single application, but it's going to be messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning the need to maintain two different UIs) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .NET using the non-DPI-aware Windows Forms UI library. Barring internal systems being upgraded to Win 8 or higher (presumably Win10) and high-DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
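
To make the retrofit work DanNeely describes a little more concrete, here is a minimal, hypothetical sketch in plain C++ (not WinForms or Win32 code) of scaling a run of pixel-based control widths by a 125% DPI factor while carrying the fractional remainders so the rounded sizes still add up:

```cpp
// Illustrative remainder bookkeeping when scaling pixel-based layout sizes by
// a non-integral DPI factor such as 1.25. Each width is multiplied by the
// scale factor, rounded, and the leftover fraction is carried to the next
// control so that adjacent controls still line up with the scaled total.
#include <cmath>
#include <cstdio>
#include <vector>

std::vector<int> scale_run(const std::vector<int>& widths, double scale) {
    std::vector<int> out;
    double carry = 0.0;  // accumulated fractional remainder
    for (int w : widths) {
        double exact = w * scale + carry;
        int rounded = static_cast<int>(std::lround(exact));
        carry = exact - rounded;  // leftover fraction goes to the next control
        out.push_back(rounded);
    }
    return out;
}

int main() {
    // Four adjacent 13 px controls scaled by 1.25 (13 * 1.25 = 16.25 each).
    std::printf("scaled widths:");
    for (int w : scale_run({13, 13, 13, 13}, 1.25)) {
        std::printf(" %d", w);  // prints 16 17 16 16 (total 65 = 52 * 1.25)
    }
    std::printf("\n");
    return 0;
}
```

Multiply that bookkeeping across every size, position, margin, and owner-drawn control in a large legacy application and the scale of the rewrite described above becomes clear.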
