LG 34UM67 sRGB Data and Bench Tests

For color accuracy, we test before and after calibration. For calibration, we use SpectraCal CalMAN with our own custom workflow. We target 200 cd/m2 of light output with a gamma of 2.2 and the sRGB color gamut, which corresponds to a general real-world use case. We use an i1 Pro provided by X-Rite. All measurements use APL 50% patterns except for uniformity testing, which uses full field.
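
For readers curious how the gamma figures below are derived, display gamma at each grayscale step follows from the measured luminance via L = L_white · V^gamma. Here's a minimal sketch of that calculation using hypothetical luminance values (not our actual measurement data):

```python
import math

# Display gamma at a grayscale step follows from L = L_white * V^gamma,
# so gamma = ln(L / L_white) / ln(V) for a normalized input level V in (0, 1).
def step_gammas(levels, luminances, white_luminance):
    return [math.log(lum / white_luminance) / math.log(v)
            for v, lum in zip(levels, luminances)
            if 0 < v < 1 and lum > 0]

# Hypothetical 25/50/75% gray readings on a 200 cd/m2 display (illustrative only).
gammas = step_gammas([0.25, 0.50, 0.75], [9.5, 43.0, 106.0], 200.0)
print([round(g, 2) for g in gammas], "average:", round(sum(gammas) / len(gammas), 2))
```

An "Average" gamma like the one reported in the table below is simply the mean of these per-step values, which is why a curve that starts high and ends low can still average out near 2.2.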

LG 34UM67 Pre/Post Calibration

                          Pre-Calibration,    Post-Calibration,    Post-Calibration,
                          200 cd/m2           200 cd/m2            80 cd/m2
White Level (cd/m2)       201                 198.7                79.3
Black Level (cd/m2)       0.2056              0.2153               0.0977
Contrast Ratio            978:1               923:1                811:1
Gamma (Average)           2.18                2.21                 2.21
Color Temperature         6558K               6548K                6482K
Grayscale dE2000          2.94                0.38                 0.99
Color Checker dE2000      2.49                1.24                 1.39
Saturations dE2000        2.14                1.07                 1.17
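
As a quick sanity check on the table, the contrast ratio rows are just the white level divided by the black level; any small differences come from rounding in the published figures:

```python
# Contrast ratio = white level / black level, using the table's values.
for label, white, black in [("pre-cal, 200 cd/m2", 201.0, 0.2056),
                            ("post-cal, 200 cd/m2", 198.7, 0.2153),
                            ("post-cal, 80 cd/m2", 79.3, 0.0977)]:
    print(f"{label}: {white / black:.0f}:1")
```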

Before calibration, the LG 34UM67 has a slight blue tint to the grayscale, but nothing too noticeable, especially for gaming purposes. Tweaking the OSD RGB settings to 53/50/47 gets reasonably close to the ideal 6504K color target. The grayscale errors are all under 4.0 dE2000, which is potentially visible but not overly so, with an average error of 2.9 dE2000. The gamma curve isn't great, starting high and ending low, but the 2.18 average is close to our 2.2 target, so there is clearly room for calibration to improve things. Moving to colors, there are a few larger errors of nearly 5.0, mostly in the yellows and oranges. Some of these are due to the native gamut being slightly larger than sRGB, leading to some oversaturation of green and red.

Post-calibration, the gamma and RGB balance are almost perfect. The average grayscale dE2000 falls to well below 1.0, which is invisible to the naked eye. ColorChecker and saturation accuracy improve as well, though there are still colors with errors in the 4.0 range. Again, it's mostly shades of yellow, orange, and some greens that cause problems, which unfortunately tend to be the worst colors to get wrong for imaging professionals. Overall it's a good monitor, and the target audience clearly isn't imaging professionals, so with or without calibration it will do well for gaming, movie watching, and other general tasks.
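
For anyone who wants to reproduce the error math, dE2000 is computed from pairs of CIE Lab values. Here's a minimal sketch, assuming the third-party colour-science Python package (not part of our CalMAN workflow) and hypothetical patch values:

```python
import numpy as np
import colour  # third-party colour-science package

# dE2000 between a target color and a measured color, both in CIE Lab.
# Rule of thumb: below ~1.0 is invisible; ~3.0 and up becomes noticeable.
target = np.array([50.0, 25.0, -10.0])   # hypothetical reference patch
measured = np.array([50.5, 26.5, -8.5])  # hypothetical panel measurement
print(round(float(colour.delta_E(target, measured, method='CIE 2000')), 2))
```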

Changing to 80 cd/m2, the calibration results remain pretty consistent. The dE2000 numbers are slightly higher, but if the small change in accuracy is a concern then potential buyers would have already passed on this display. Only the most finicky of regular consumers might find something to complain about.

It’s also worth quickly discussing some of the other color modes, if only because certain ones are so far off that it’s a wonder anyone would consider using them. LG offers four picture modes (Photo, Cinema, Reader 1, and Reader 2). Photo has a strong blue tint with an average grayscale dE2000 of 6.4 and many values nearing 10.0, though colors aren’t quite as bad, averaging closer to 5.0. The Cinema mode is pretty close to the Custom setting: while it’s tinted blue, the grayscale dE2000 is 2.3 and the colors average close to 4.0, with skin tones often falling into the 6.0+ range. Reader 1 and 2 are supposed to be more like print, but the results are heavily red-biased with limited blue, and the minimum black levels are much higher (2.5 cd/m2). The resulting grayscale dE2000 of 10.8/8.7 and average color errors of 7.5/6.0 make these modes not particularly useful.

And that sums up why NVIDIA didn’t bother supporting specialized color modes on their G-SYNC module: doing one color mode properly is generally more useful than supporting multiple incorrect modes. While some people might appreciate the ability to quickly switch between various color modes, most users just set up a display for everyday use and leave it be. Most named presets other than “standard” or “custom” end up as marketing bullet points more than anything useful.

Comments

  • dragonsqrrl - Wednesday, April 1, 2015

    "FreeSync actually has a far wider range than G-Sync so when a monitor comes out that can take advantage of it it will probably be awesome."

    That's completely false. Neither G-Sync nor the Adaptive-Sync spec has an inherent limitation on frequency range. Frequency ranges are imposed by panel-specific limitations, which vary from one panel to another.
  • bizude - Thursday, April 2, 2015

    Price Premium?! It's 50$ cheaper than its predecessor, the 34UM65, for crying out loud, and has a higher refresh rate as well.
  • AnnonymousCoward - Friday, April 3, 2015

    The $ goes on the left of the number.
  • gatygun - Tuesday, June 30, 2015

    1) The 27Hz range isn't an issue; you just have to make sure your game runs at 48+ fps at all times. That means dropping settings until you hit 60+ fps on average in less action-packed games, and 75 fps on average in fast-paced action games, which have a wider gap between average and minimum fps.

    The 75Hz upper limit isn't an issue either, as you can simply use MSI Afterburner to lock the framerate to 75 fps.

    The 48Hz lower bound should really have been 35 or 30; that would make things easier on the 290/290X for sure, and you could push better visuals. But the screen is a 75Hz screen, and that's what you should be aiming for.

    This screen will work perfectly in games like Diablo 3, Path of Exile, and MMOs, which are undemanding on the GPU and will push 75 fps without an issue.

    For newer games like The Witcher 3, yes, you need to trade off a lot of settings to hold that 48 fps minimum, but at the same time you can just enable V-Sync and deal with the additional, controlled lag from the few drops you get in stressful situations. You can see those as your GPU not being up to par; CrossFire support will happen at some point.

    2) Extra features will cost extra money, as vendors have to write additional firmware and software functions, etc. It's never truly free; what's free is that AMD GPUs handle the hardware side of things instead of monitor makers having to buy licenses and hardware and build them into the screens. So technically, especially in comparison to NVIDIA, it can be seen as free.

    On top of that, the 29UM67 is currently the cheapest FreeSync monitor. It's the little brother of this screen, and for the price and what it brings, it's very sharply priced for sure.

    I'm also wondering why nobody has reviewed that screen, though; the 34-inch isn't great PPI-wise, while the 29-inch is perfect for that resolution. But oh well.

    3) In my opinion the 34 isn't worth it; the 29UM67 is what people should be looking at. With a price tag of $330 at the moment, it's basically 2x cheaper, if not 3x, than the Swift. There is no competition.

    I agree that low input lag is really needed for gaming monitors, and it's a shame they didn't pay more attention to it.

    All in all, the 29UM67 is a solid screen for what you get. The 48Hz minimum is indeed not practical, but if you want your games hitting high framerates before anything else, this will surely work.
  • twtech - Wednesday, April 1, 2015

    It seems like the critical difference between FreeSync and GSync is that FreeSync will likely be available on a wide range of monitors at varying price points, whereas GSync is limited to very high-end monitors with high max refresh rates, and they even limit the monitors to a single input for the sake of minimizing pixel lag.

    I like AMD's approach here, because most people realistically aren't going to want to spend what it costs for a GSync-capable monitor, and even if the FreeSync experience isn't perfect with the relatively narrow refresh rate range that most ordinary monitors will support, it's better than nothing.

    If somebody who currently has an nVidia card buys a monitor like this one just because they want a 34" ultrawide, maybe they will be tempted to go AMD for their next graphics upgrade, because it supports adaptive refresh rate with the display that they already have.

    I think ultimately that's why nVidia will have to give in and support FreeSync. If they don't, they risk effectively losing adaptive sync as a feature to AMD for all but the extreme high end users.
  • Ubercake - Thursday, April 2, 2015

    Right now you can get a G-sync monitor anywhere between $400 and $800.

    AMD originally claimed adding FreeSync tech to a monitor wouldn't add to the cost, but somehow it seems to.
  • Ubercake - Thursday, April 2, 2015

    Additionally, it's obvious from the frequency range limitation of this monitor that the initial implementation of FreeSync monitors is not quite up to par. If this technology is so capable, why limit it out of the gate?
  • Black Obsidian - Thursday, April 2, 2015

    LG appears to have taken the existing 34UM65, updated the scaler (maybe a new module, maybe just a firmware update), figured out what refresh rates the existing panel would tolerate, and kicked the 34UM67 out the door at the same initial MSRP as its predecessor.

    And that's not necessarily a BAD approach, per se, just one that doesn't fit everybody's needs. If they'd done the same thing with the 34UM95 as the basis (3440x1440), I'd have cheerfully bought one.
  • bizude - Thursday, April 2, 2015

    Actually, the MSRP is $50 cheaper than the UM65.
  • gatygun - Tuesday, June 30, 2015

    Good luck getting 48 fps minimums at 3440x1440 on a single 290X, as CrossFire isn't working with FreeSync.
