Acer XB280HK: Introduction, Design and Specs

When it comes to gaming, 4K displays present a conundrum (never mind that "4K" is technically a misnomer for 3840x2160; I'll still use the term). On the one hand, all the extra pixels allow for far more detail. On the other, that is a lot of pixels for a GPU to push. Even with the best GPUs out there, you might – okay, will – have to disable certain features and start to introduce aliasing and other artifacts. One potential solution is G-SYNC, which enables gaming that looks smooth even when frame rates drop below 60 FPS, and that's what we're looking at today.

G-SYNC, available only with NVIDIA video cards, allows frame rates below the normal optimal speed of 60 FPS to still look very smooth. The Acer XB280HK is the first G-SYNC display to also feature a 3840x2160 resolution. Unlike some other G-SYNC displays, the Acer only runs at 60Hz and below, though I don't believe running faster than 60Hz at 4K will be much of an issue right now. Anand previously reviewed G-SYNC and described the details of how it works.
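
To illustrate why this matters, here's a minimal sketch in Python (my own illustration with invented frame times, not NVIDIA's implementation) of the difference between a fixed 60Hz refresh and a G-SYNC-style variable refresh when a frame takes longer than 16.7ms to render:

```python
import math

# Illustrative sketch only: frame times are invented, and real v-sync
# behavior also depends on how many buffers the game uses.
REFRESH_INTERVAL_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def vsync_present_ms(render_ms):
    """With v-sync on a fixed 60 Hz display, a finished frame waits for
    the next refresh boundary, so a 20 ms frame occupies two refresh
    intervals (~33 ms) and is perceived as a stutter."""
    return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def variable_refresh_present_ms(render_ms):
    """With a G-SYNC-style display, the panel refreshes when the frame
    is ready; the only floor is the panel's 60 Hz maximum refresh rate."""
    return max(render_ms, REFRESH_INTERVAL_MS)

for render_ms in (14.0, 20.0, 25.0):
    print(f"{render_ms:4.1f} ms render -> "
          f"v-sync shows it for {vsync_present_ms(render_ms):4.1f} ms, "
          f"variable refresh for {variable_refresh_present_ms(render_ms):4.1f} ms")
```

The point is that v-sync quantizes presentation times to multiples of 16.7ms, which is what produces visible stutter below 60 FPS, while variable refresh simply tracks the GPU.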

Like all currently shipping G-SYNC displays (with the exception of the IPS Acer display announced at CES 2015), the Acer uses a TN panel. Reaching 120Hz or 144Hz with G-SYNC effectively requires TN, but a 60Hz panel could be IPS. The likely culprit here is cost, as the Acer currently sells for under $800; other 28" 4K IPS displays cost at least as much while lacking G-SYNC, which makes them a much worse choice for gaming. Since I am not a gamer myself, all the gaming comments for this review will be provided by Jarred Walton; aside from some Wii U or Magic Online, my gaming days are well behind me (or ahead of me).

Like most G-SYNC displays, the Acer has but a single DisplayPort input; G-SYNC only works over DisplayPort, and if you didn't care about G-SYNC you would presumably have bought a different monitor. It also has a USB 3.0 hub with two ports on the rear-bottom and two on the side. There are no speakers or headphone connections, so it is fairly bare-bones as far as connectivity and extra features go.
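
A single DisplayPort 1.2 input is also all this panel needs. As a quick back-of-the-envelope check (my own numbers, ignoring blanking overhead), 4K at 60Hz with 24-bit color fits comfortably within DP 1.2's HBR2 data rate:

```python
# Back-of-the-envelope check, ignoring blanking overhead: raw pixel data
# for 3840x2160 at 60 Hz with 24-bit color versus DP 1.2's HBR2 rate.
width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

pixel_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hbr2_gbps = 4 * 5.4 * 0.8  # 4 lanes x 5.4 Gbps, less 8b/10b encoding overhead

print(f"Pixel data: {pixel_gbps:.2f} Gbps of {hbr2_gbps:.2f} Gbps available")
# -> ~11.94 Gbps of 17.28 Gbps, so 4K60 fits on a single DP 1.2 link
```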

The included stand is very good overall. Built-in adjustments for height, tilt, swivel, and pivot make it a very flexible option, and though running a TN panel in portrait mode can be problematic at best, the ability to pivot does provide easier access to the bottom ports when connecting peripherals. There are also 100mm VESA mounting holes if you want to use a different stand or wall mount the display. The outer bezel is a shiny plastic, which is not my favorite as it shows fingerprints and smudges very easily. An $800 monitor should have a nice stand, and while many displays choose form over function, Acer gets it right here; I see no reason to replace the stand it provides.

The OSD works well, with a row of buttons on the bottom of the screen and icons directly above them indicating what they do. There's no guessing which button is which, and no touch-sensitive controls that don't work well. Acer provides basic, simple, effective controls that everyone should be happy with. A decent number of settings are available, including gamma and color temperature, along with an optional frame rate indicator on the left side of the screen. This gives a quick readout of your actual frame rate, which is handy since G-SYNC should remain smooth even when it drops below 60 FPS.

From a user interface perspective, the Acer XB280HK hits all the right notes. The stand is very adjustable while the controls are easy to use. The only real thing I would change is to make the bezel a matte finish instead of glossy to avoid fingerprints, and because I think it just looks better.

Looking just at the specs and the exterior design, the Acer XB280HK has a lot going for it. The big questions are how well it performs when gaming at 4K with G-SYNC, and how it fares in our objective bench tests.

Acer XB280HK G-SYNC Specifications
Video Inputs: 1x DisplayPort 1.2
Panel Type: TN
Pixel Pitch: 0.1614mm
Colors: 16.7 million
Brightness: 300 cd/m2
Contrast Ratio: 1000:1
Response Time: 1ms GtG
Viewable Size: 28"
Resolution: 3840x2160
Viewing Angle (H/V): 170 / 160 degrees
Backlight: LED
Power Consumption (operation): 42.5W
Power Consumption (standby): 0.5W
Screen Treatment: Anti-glare
Height-Adjustable: Yes
Tilt: Yes, -5 to 35 degrees
Pivot: Yes
Swivel: Yes, 45 degrees
VESA Wall Mounting: Yes, 100mm
Dimensions w/ Base (WxHxD): 25.9" x 22" x 9.6"
Weight: 17.2 lbs.
Additional Features: 4x USB 3.0, G-SYNC
Limited Warranty: 3 years
Accessories: DisplayPort cable, USB 3.0 cable
Online Price: $785
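
As a quick sanity check on the table above (my own arithmetic, assuming square pixels), the quoted 0.1614mm pixel pitch does work out to the quoted 28" diagonal, at roughly 157 PPI:

```python
import math

# Consistency check of the spec table, assuming square pixels.
pitch_mm = 0.1614                      # quoted pixel pitch
diag_px = math.hypot(3840, 2160)       # ~4406 pixels along the diagonal
diag_in = diag_px * pitch_mm / 25.4    # mm -> inches

print(f"Diagonal: {diag_in:.1f}\"")            # ~28.0", matching the spec
print(f"Density:  {25.4 / pitch_mm:.0f} PPI")  # ~157 PPI
```
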
Comments

  • MrSpadge - Thursday, January 29, 2015

    Jarred, please test his claims and modded drivers! He surely comes across as dubious, but if he's correct that's a real bomb waiting to explode.
  • SkyBill40 - Thursday, January 29, 2015

    There's a huge thing he's doing that makes his claim patently false: he's running that game WINDOWED, and G-Sync only works in full screen. Period. So, in essence, while he does make an interesting point... he's full of shit.
  • JarredWalton - Thursday, January 29, 2015

    The current (updated) video is running the pendulum fullscreen, but again... his claims are dubious at best. "Look, it has an Altera FPGA. The only thing that's good for is security!" Ummm... does he even know what FPGA means? Field Programmable Gate Array, as in, you can program it to do pretty much anything you want within the confines of the number of gates available. Also, the suggestion that G-SYNC (which was released before AMD ever even talked about FreeSync) is the same as FreeSync is ludicrous.

    FWIW, I've seen laptop displays that can run at 50Hz before, so with this demo apparently running at a static 50 FPS, it's not really that crazy for "modded drivers" to work. Sure, the drivers apparently let you turn G-SYNC on or off, but he could just as easily mod the drivers to toggle triple buffering on/off, and I doubt most of us could tell the difference via an Internet video.

    He needs to show it running a game with a variable FPS (with a FRAPS counter), and he needs to zoom out enough that we can see the full laptop and not just a portion of the screen. Take a high speed video of that -- with the camera mounted on a tripod and not in his hands -- and someone could actually try stepping through the frames to see how long each frame is on screen. It would be a pain in the butt for certain, but it would at least make his claims plausible.

    My take is that if G-SYNC is basically hacked, it would have come to light a long time ago. Oh, wait -- the random guy on the Internet with his modded drivers (anyone have time to do a "diff" and see what has changed?) is smarter than all of the engineers at AMD, the display companies, etc.
  • SkyBill40 - Thursday, January 29, 2015

    I agree with you and appreciate your more in-depth commentary on it. Like you, I still find his claim(s) to be quite dubious and likely pure crap.
  • JarredWalton - Saturday, January 31, 2015

    Turns out this is NOT someone doing a hack to enable G-SYNC; it's an alpha leak of NVIDIA's drivers where they're trying to make G-SYNC work with laptops. PCPer did a more in-depth look at the drivers here:
    http://www.pcper.com/reviews/Graphics-Cards/Mobile...

    So, not too surprisingly, it might be possible to get most of the G-SYNC functionality with drivers alone, but it still requires more work. It also requires a laptop without Optimus (for now), and a better-than-average display to drive it.
  • Will Robinson - Thursday, January 29, 2015

    Thanx for reposting that link Pork. I posted it yesterday, but it seems some people would rather accuse him of being a conspiracy theorist, or somehow not of sound mind, than evaluate his conclusions with an open mind.
    I wondered if we would get an official response from AT.
  • nos024 - Thursday, January 29, 2015

    Technically, if AMD is the only one supporting "FreeSync," you'll still be so-called "vendor-locked," no?

    As a PC gamer you only have two choices for high-performance gaming video cards, so I don't understand this so-called vendor-lock debate with G-Sync and FreeSync. Just because G-Sync comes in the form of a chip and FreeSync comes with the new version of DisplayPort, it's the same deal.
  • SkyBill40 - Thursday, January 29, 2015

    No, it's not the same thing. G-Sync is wholly proprietary, and its effects will *not* work without a G-Sync capable video card; on the contrary, FreeSync is just that: free to whatever card you have, no matter the vendor. It's an open standard, so there are no proprietary chips in the design. It just works. Period.
  • nos024 - Thursday, January 29, 2015

    What do you mean it just works? If Nvidia decides not to support it, AMD becomes the only one to support it, which means vendor-lock anyways.

    So you are saying that if I decide to use Intel's IGP (given it comes with the correct DisplayPort version), I need no additional driver support from Intel and FreeSync will just work? I don't think it's THAT easy. Bottom line: you will be locked to AMD graphics cards IF AMD is the only one supporting it. It doesn't matter how it is implemented in the hardware - it's all about support.

    The only thing it has going for it is that there's no royalty paid to AMD to adopt the technology from a monitor manufacturing point of view.
  • Black Obsidian - Thursday, January 29, 2015

    And no additional (monitor) hardware required.
    And it's part of the DisplayPort spec.
    And any GPU manufacturer that wants to support it is free to do so.

    The only thing that Freesync has going *against* it is that nVidia might decide to be a bag of dicks and refuse to support an open standard in favor of their added-cost (read: added-profit) proprietary solution.
