Color IQ

As I mentioned earlier, the technology at the heart of the Philips 276E6 monitor is QD Vision's Color IQ quantum dot technology. To better understand how it differs from other quantum dot implementations currently on the market, I spoke with QD Vision's Chief Marketing Officer, John Volkmann.

Quantum dots are a type of semiconducting nanocrystal, typically made of cadmium selenide or indium phosphide, and when used in displays they have a diameter of less than ten nanometers. What makes them interesting is that they exhibit photoluminescence: they emit light after absorbing photons.

In LCD displays this property is highly desirable, as it means you can place an array of quantum dots between the backlight and the color filters to down-convert the light emitted by the blue backlighting to lower frequencies. By altering the diameter of the quantum dots you can control the wavelength of the emitted light, which allows specific red and green wavelengths to be produced at the intensity required to cover your target color gamut. Smaller quantum dots on the scale of one or two nanometers emit wavelengths of light in the blue part of the visible spectrum, while larger quantum dots with a diameter of six or seven nanometers emit red light.
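The size-to-color relationship comes from quantum confinement: shrinking the dot widens its effective band gap, which raises the photon energy and shortens the emitted wavelength. As a rough illustration, here is a toy version of the Brus equation for CdSe dots. The Coulomb correction term is ignored and the material parameters are assumed textbook values, so the absolute numbers skew blue; only the trend matters.

```python
# Physical constants (SI units)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
M0 = 9.109e-31  # electron rest mass, kg
EV = 1.602e-19  # joules per electron-volt

# Approximate textbook parameters for CdSe (assumed values)
E_BULK_EV = 1.74  # bulk band gap of CdSe, eV
ME = 0.13 * M0    # effective electron mass in CdSe
MH = 0.45 * M0    # effective hole mass in CdSe

def emission_wavelength_nm(diameter_nm):
    """Estimate a CdSe dot's emission wavelength from its diameter using
    the confinement term of the Brus equation (the Coulomb correction is
    ignored, so absolute numbers come out somewhat short/blue)."""
    r = diameter_nm * 1e-9 / 2  # radius in metres
    confinement_j = (H ** 2 / (8 * r ** 2)) * (1 / ME + 1 / MH)
    gap_j = E_BULK_EV * EV + confinement_j
    return H * C / gap_j * 1e9  # photon wavelength in nm

# Smaller dots emit shorter (bluer) wavelengths, larger dots redder ones
for d in (3.0, 4.0, 5.0, 6.0, 7.0):
    print(f"{d:.0f} nm dot -> ~{emission_wavelength_nm(d):.0f} nm")
```

The key point the sketch captures is that the band gap, and hence the emitted color, scales with the inverse square of the dot's radius, which is why manufacturing dots to a tight size tolerance directly translates into narrow, well-placed emission peaks.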

A question you may have is why this is actually necessary. I mentioned above that quantum dots are typically used to convert blue light into red and green light, and the use of blue LEDs for backlighting is not unique to quantum dot displays. Almost all modern LCD displays use LED backlighting, and the majority of them use what is commonly referred to as WLED backlighting. In truth, these "white" LEDs are really blue LEDs paired with a yellow phosphor, and through this process wavelengths of blue, green, and red light are produced. Unfortunately, there is still a very significant blue bias in the final output, and the intensity of the desired red and green wavelengths is relatively low. Because of this, these displays are limited in the range of green and red colors they can reproduce, and to date most monitors of this type have been limited to roughly 99% of the sRGB color gamut.

To produce a wider color gamut with LED backlights alone, vendors have employed different technologies. The most prominent is GB-r backlighting, which pairs green and blue LEDs with a red phosphor to produce green and red light of greater intensity. Unfortunately, such designs have proven to be quite expensive, which has kept wide gamut displays priced well outside what the average consumer can afford. An even smaller group of displays has employed full RGB backlighting, but due to cost it never saw much adoption by display vendors.

The cost of RGB and GB-r backlighting is the problem QD Vision hopes to solve with their Color IQ technology. Color IQ's appeal is that it works with standard edge-lit displays, taking advantage of the blue LED backlighting those displays already employ. Most quantum dot technologies require expensive full-array backlighting because they use a thin film with quantum dots embedded throughout it, which sits between the backlighting array and the color filter layer. In contrast, Color IQ uses small glass cylinders that sit in front of the blue LEDs at the edge of the display. According to QD Vision, a film-based solution for a display around the size of a 50" television can cost around $100, while their quantum dot solution for edge-lit displays will only cost around $20.

With QD Vision's current technology, the cylinders containing the quantum dots sit between the blue LEDs and the light guide plate that distributes the light across the panel. With such an implementation one can expect displays that closely cover the Adobe RGB or DCI-P3 color gamuts, depending on exactly how the quantum dots are tuned. According to QD Vision, quantum dot technologies perform best when the quantum dot array is as close to the backlight as possible. Within the next few years they hope to deliver a "chiplet" solution, which consists of a quantum dot matrix mounted in a bead of glass right atop the LEDs. The step beyond that is integrating the quantum dot matrix into the LEDs themselves. Right now such solutions are infeasible due to heat degradation, but they will be necessary as we move toward full coverage of the Rec. 2020 color gamut.
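To put those gamut targets in perspective, a small sketch can compare the area of each standard's primary triangle in CIE 1931 xy space, using the primary chromaticities published in the sRGB, DCI-P3, and Rec. 2020 specifications. Area in xy space is not perceptually uniform, so treat the percentages as rough, but they show why Rec. 2020 is such a stretch.

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle in CIE xy space."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy chromaticity of each standard's red, green, and blue primaries
GAMUTS = {
    "sRGB":      [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

rec2020 = triangle_area(*GAMUTS["Rec. 2020"])
for name, primaries in GAMUTS.items():
    share = triangle_area(*primaries) / rec2020
    print(f"{name}: {share:.0%} of Rec. 2020's xy area")
```

Running this shows sRGB covering only around half of Rec. 2020's chromaticity area and DCI-P3 covering roughly seven tenths of it, which is why reaching full Rec. 2020 coverage requires much purer (narrower-band) primaries than today's phosphor or edge-lit quantum dot solutions deliver.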

Philips 276E6: The First Color IQ Desktop Monitor

51 Comments

  • jlabelle - Friday, April 29, 2016 - link

    - In the second corner we have Android. Not clear to me how much better off they are. They have handled DPI a lot better, which is a good start -

    If you are speaking of Android, you should compare it against Windows apps from the Windows Store. For those, scaling is just perfect: it handles ANY screen size, resolution, and orientation perfectly. The only scaling issues are with Win32 programs that do not use the hiDPI APIs released back with Windows 7 (at a time when Android was barely a thing).

    - As far as I know there is still no color correction built into Android -

    Android is the worst on this front, because you have virtually zero color management.

    - In the third corner we have Apple which seems perfectly positioned for all this (meaning that they will likely drive it). -

    Again, this is misleading. For instance, iOS's way of handling color management (see the tests on the iPad Pro) makes a wide gamut screen virtually useless for now, as there is no way for a developer to take advantage of it. What it seems to do is simply apply an ICC profile to target the sRGB color space. Scaling is not really a question there, since resolutions are pretty much hard coded, but again, Windows apps scale perfectly.

    OS X has some "native" applications that are color managed (like Safari), but the same issue occurs: a program needs to be color managed itself, otherwise you have the same problem. For scaling, it is exactly like Windows, with hiDPI APIs that have existed forever and that developers just need to use. Maybe more applications actually use them, but that's it. From an OS point of view, OS X has no real inherent advantage over Windows in color management or hiDPI support.

    - they're now pushing color accuracy both on the camera side (TrueTone flash, high dynamic range sensors) -

    Actually, Apple uses a 1/3" camera sensor, one of the smallest in the industry (otherwise found only in low-end phones like the Lumia 640 XL...), so its dynamic range is more limited than the competition's, since dynamic range is mainly tied directly to sensor size.

    - and the screen side -

    Nothing here is exclusive to Apple. Speaking of Windows, the Surface and the Lumia 950 both have more color-accurate screens than the various iPads and iPhones (albeit all are VERY good in color accuracy).

    - "Our colors look good, and look correct, across ALL our devices --- photos on your iPhone look exactly the same on your iMac. Good luck getting that consistency with photos from your Android phone on your Windows screen." -

    It is not luck. Just pick the right products. Pick a Surface and a Lumia 950, for instance, and you will have the same great experience, and a Samsung S6/S7 or another color-accurate Android phone will give you the same.

    It seems advertising is indeed working if people believe Apple has an inherent advantage here.

    - the relevance and interest of QD technology is whether it allows larger gamut to move to iPhone this year or at least soon. -

    Until developers can take advantage of it, it has no advantage for the end user. So as good as the iPad Pro's color gamut is, it is useless from an end user's point of view.
  • Brandon Chester - Friday, April 29, 2016 - link

    I've already addressed why your understanding of the situation on the iPad is incorrect in my article specifically about it. Please do not spread serious misinformation in the comments, or I will have to remove it; this is already an issue that confuses many people.
  • theduckofdeath - Friday, April 29, 2016 - link

    I don't get what bigger picture I'm missing here. Yes, LCD tech has evolved a lot over the years, but this is just the faux marketing these manufacturers always stoop to in order to give the impression that they're selling something better than LCD. A few years ago it was LED, now it's quantum dots, both insinuating that the backlight isn't the usual old flawed edge-lit design.
  • alphasquadron - Thursday, April 28, 2016 - link

    As a Windows user (not by choice, but because it supports a lot of software and games), it is tiring to see the slow pace at which Microsoft fixes problems. When are they going to get 4K scaling done correctly? And I remember getting my new computer and going through the same confusing ICC sub-menus to get to the actual settings.

    Also, what were Philips or QD Vision thinking when they sent a reviewer at a tech site that tests monitors for color accuracy a unit with a fake sRGB mode? He had just mentioned that there was no sRGB mode on the monitor, so what did they think the first thing he would test on the new unit would be? I'm still confused about whether the mode actually changed something, or whether they are just that dumb (or think reviewers are that dumb).
  • Murloc - Thursday, April 28, 2016 - link

    Maybe they messed up while doing a quick fix. I hope.
  • Brandon Chester - Thursday, April 28, 2016 - link

    For the record, I spent a long time trying to prove to myself that it did do something. Unfortunately, if it truly were constraining the gamut it would be so completely obvious upon toggling it that you wouldn't even need to make measurements. I did measure anyway, and it truly didn't change the output at all.
  • Guspaz - Thursday, April 28, 2016 - link

    All this talk of colour management... It all works so easily on my MacBook (load the profile Anand made, and everything looks correct), but on my main PC, it's a mess...

    I've got a Dell U2711 running Windows 10. That's a wide-gamut display, and I do have an ICC profile for it. The display was also factory-calibrated (it shipped with a printed report on the results).

    If I want the most trouble-free setup where most stuff looks correct, which of these is the correct approach:

    1) Set monitor to default profile and set Windows to ICC profile
    2) Set monitor to sRGB profile and set Windows to ICC profile
    3) Set monitor to default profile and set Windows to sRGB profile
    4) Set monitor to sRGB profile and set Windows to sRGB profile

    I'm guessing option 1 is correct for wide-gamut use, but the crappy Windows colour management would mess everything up. So if I want to just go for sRGB, it seems to me that option 4 is probably correct? Or is option 2 what I want?

    This is all so confusing. On my Mac I just set the ICC profile and everything works immediately and perfectly.
  • Murloc - Thursday, April 28, 2016 - link

    Yeah, macOS has this down, unlike Windows.

    I wonder how amenable Linux is in this regard.
  • tuxRoller - Thursday, April 28, 2016 - link

    Pretty much as good as on a Mac, actually.
    Check out my comments on the recent 9.7" iPad review (the one that dealt with color management).
  • jlabelle - Friday, April 29, 2016 - link

    See my answer on page 2. I was in your EXACT same situation.

    1) I assume you have an ICC profile either because you calibrated the screen yourself with a probe, or because you took a generic profile from a Dell review (which ignores unit-to-unit variation and drift over time)? This is the theoretically ideal way to take advantage of a wide gamut screen... except I advise against it, for the reasons described below.
    2) Hassle-free solution: the same as above, but you constrain yourself to the sRGB color space. You will have good color accuracy in color-managed applications. And even in non-color-managed applications, and even if your ICC profile is not very good, you will have no oversaturation or washed-out colors.
    3) Makes no sense at all! It amounts to declaring that the Dell is perfectly accurate to the sRGB color space and gamut. That could not be further from the truth, so you will end up with all your colors oversaturated, EVEN in color-managed applications. No, no, NO!
    4) This is the equivalent of what the article advises for the Philips: you put the screen in sRGB mode. You have no ICC display profile (because you lack the calibration equipment), so you assume the screen is correctly calibrated and tell the OS that your display is perfect sRGB. This is actually the default, so you do not need to do anything to be in this situation.

    The preferred solution is by far number 2.

    To understand why, let's reverse the discussion and ask why people think they benefit from a wide gamut screen:
    • To surf the web? No, because websites target sRGB anyway.
    • To view pictures received by email or taken yourself? In most cases no, because mobile phones, compact cameras, and even most DSLRs are set up to shoot sRGB.
    • To watch films? That is slightly more complicated, but (to keep things simple) there is no wide gamut content, and no consumer video software would manage it anyway, so you would end up with permanently oversaturated colors. Unless that's your thing...

    So in which case would you see any benefit? IF you have your own DSLR or mirrorless camera, and IF you set it to Adobe RGB mode, and IF you always make sRGB duplicates of every picture you want to share, display on the web, or send to printing.

    And even if all those "IFs" are fulfilled, you will still end up with oversaturated colors in most of your applications, when surfing the web, when viewing other people's pictures... All that just to maybe see, in your own pictures, a tiny difference in a side-by-side comparison in 0.001% of cases (I am not making that number up; it is the proportion of my pictures where I could spot a difference).

    Long story short: a wide gamut screen currently makes NO sense. There is a reason it is said to only make sense for professionals with very specific applications, and those people do not come here to ask whether it makes sense, because they are already aware of all this.

    Bottom line: choose option 2.
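    The oversaturation being described can be sketched numerically. When a non-color-managed app sends sRGB pixel values straight to an Adobe RGB panel, a fully saturated color drives the panel's wider native primaries at full strength; a color-managed app would first re-encode the pixel so it looks the same. A minimal Python sketch of that re-encoding, using the published sRGB and Adobe RGB (1998) D65 matrices, with gamma simplified to a flat 2.2 curve on both sides:

```python
# Linear-light conversion matrices from the sRGB and
# Adobe RGB (1998) specifications (both D65 white point)
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
XYZ_TO_ADOBE = [
    [ 2.04159, -0.56501, -0.34473],
    [-0.96924,  1.87597,  0.04156],
    [ 0.01344, -0.11836,  1.01517],
]

def _mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_adobe(rgb8):
    """Re-encode an 8-bit sRGB pixel for an Adobe RGB panel, the way a
    color-managed app would. Gamma is simplified to a flat 2.2 curve on
    both sides (the real sRGB curve is piecewise), which is close
    enough to show the effect."""
    linear = [(c / 255) ** 2.2 for c in rgb8]
    adobe = _mat_vec(XYZ_TO_ADOBE, _mat_vec(SRGB_TO_XYZ, linear))
    return [round(255 * min(1.0, max(0.0, c)) ** (1 / 2.2)) for c in adobe]

# A non-managed app sends pure sRGB green unchanged: (0, 255, 0), driving
# the panel's more saturated native green primary at 100%. A managed app
# sends a less extreme mixture that reproduces the intended color:
print(srgb_to_adobe([0, 255, 0]))      # roughly (144, 255, 60)
print(srgb_to_adobe([128, 128, 128]))  # neutral gray is left untouched
```

    The gap between (0, 255, 0) and the managed encoding is exactly the extra saturation a non-managed app paints across the screen, which is why option 2 (constraining the panel to sRGB) is the hassle-free choice.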
