Why all HDR on TVs isn't the same
More and more TVs are capable of displaying high dynamic range video. But just because they claim to be HDR-compatible, it doesn't mean they can make it shine.
With the amount of high dynamic range video you can watch growing by the day, and more and more people recognizing the "HDR" abbreviation as something new (and potentially cool), TV manufacturers are jumping at the chance to show they're on top of this latest trend. For several years HDR was only available in expensive high-end TVs, but now it's making its way to midrange and even budget models.
Just because a TV claims to be HDR-compatible, however, doesn't mean you'll be able to see high dynamic range content as it's meant to be seen.
Here's why.
What is high dynamic range?
If you've heard the term and want an explanation of what it's all about -- and why it's cool -- check out How HDR works. Related to HDR is WCG, or wide color gamut, which is also worth learning about.
It's important to note that the HDR in TVs is very different from the HDR in your phone or camera. It's technically separate from 4K resolution, but almost all HDR TVs happen to be 4K TVs too.
The short version? An HDR TV, showing HDR content, can show brighter highlights and, usually, a wider range of colors than a "normal" TV showing normal content. It will potentially look more realistic, with deeper and richer colors and more visual "punch."
But that's only if the TV has the technology to show HDR as it's meant to be shown.
HDR and 'HDR'
The two aspects of HDR and WCG, namely brighter highlights and wider colors, aren't something just any TV can do. That's another key difference between HDR and 4K. All TVs with 4K resolution, no matter how cheap, can typically show the full detail of 4K sources because they all have the same number of physical pixels.
The technologies that allow a TV to produce images with higher contrast (brighter whites and darker blacks) and wider color involve more than pixel count. And you can't make a TV brighter or more colorful with just a software change, at least not in the precise way we're talking about.
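If you like numbers, here's a quick, purely illustrative Python sketch (mine, not CNET's) of that difference: every 4K panel has the same pixel count, but the contrast HDR trades on depends on hardware. The brightness and black-level figures below are made-up examples, not measurements of any real TV.

```python
# Illustrative only: pixel count is fixed by the 4K spec, but contrast depends
# on the panel and backlight hardware. The brightness/black-level numbers are
# hypothetical examples, not measurements of any specific TV.

UHD_WIDTH, UHD_HEIGHT = 3840, 2160
pixel_count = UHD_WIDTH * UHD_HEIGHT  # about 8.3 million, identical on every 4K TV

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Simple full-on/full-off contrast: peak brightness divided by black level."""
    return peak_nits / black_nits

basic_lcd = contrast_ratio(peak_nits=300, black_nits=0.10)             # ~3,000:1 (hypothetical)
full_array_dimming = contrast_ratio(peak_nits=1000, black_nits=0.05)   # ~20,000:1 (hypothetical)

print(f"Pixels on any 4K TV: {pixel_count:,}")
print(f"Hypothetical basic LCD contrast: {basic_lcd:,.0f}:1")
print(f"Hypothetical full-array local dimming contrast: {full_array_dimming:,.0f}:1")
```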
To produce the fine highlights that make HDR images pop, physical display technologies like local dimming or OLED (more on the latter in a moment) really help. Local dimming allows a TV to make certain areas of the screen brighter than other parts, crucial for accentuating the bright highlights in HDR content.
There are two basic types of local dimming: edge-lit, where the LEDs are arranged along the edges of the TV screen, and full-array, where they're behind the screen. Full-array almost always performs better than edge-lit, but there are exceptions such as Sony's superb edge-lit XBR-X930D, one of the best HDR TVs we tested last year. For more about local dimming, check out LED LCD local dimming explained.
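To show what local dimming actually does, here's a toy Python sketch. It's an illustration of the general idea, not any manufacturer's algorithm: the backlight behind each zone is driven to the brightest thing that zone has to show, so a small highlight can be pushed bright while the rest of the screen stays dim.

```python
# Toy model of full-array local dimming (illustrative only; real TVs use far
# more sophisticated processing). Each backlight zone is set to the brightest
# pixel it has to display.

from typing import List

def zone_backlight_levels(frame: List[List[float]], zones_x: int, zones_y: int) -> List[List[float]]:
    """Split a frame of pixel brightness values (0.0-1.0) into a grid of zones
    and return the backlight level (max pixel brightness) for each zone."""
    height, width = len(frame), len(frame[0])
    zone_h, zone_w = height // zones_y, width // zones_x
    levels = []
    for zy in range(zones_y):
        row = []
        for zx in range(zones_x):
            zone_pixels = [
                frame[y][x]
                for y in range(zy * zone_h, (zy + 1) * zone_h)
                for x in range(zx * zone_w, (zx + 1) * zone_w)
            ]
            row.append(max(zone_pixels))
        levels.append(row)
    return levels

# A mostly dark frame with one bright highlight in the top-right corner.
frame = [
    [0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.95, 0.95],
    [0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.95, 0.95],
    [0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05],
    [0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05],
]
print(zone_backlight_levels(frame, zones_x=4, zones_y=2))
# Only the zone containing the highlight goes bright: [[0.05, 0.05, 0.05, 0.95], [0.05, 0.05, 0.05, 0.05]]
```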
To produce the wider color range in WCG content, a TV needs some way to create those wider colors. Some do it by using quantum dots, or "QLED" as Samsung calls it this year. Others, such as Sony, Vizio and LG, use different LED backlight technologies. For more about how color works in TVs, check out Ultra HD 4K TV color, part I: Red, green, blue and beyond and part II: The (near) future. The short version for color: the TV needs to be able to accurately produce much richer colors than TVs have been capable of until now.
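For a rough sense of how much "wider" wide color gamut really is, here's a small illustrative Python sketch comparing the published primaries of the old HDTV gamut (Rec. 709) with DCI-P3 and Rec. 2020, the gamuts WCG content targets. Triangle area in chromaticity space is a crude, perceptually non-uniform yardstick, but it makes the gap obvious.

```python
# Rough comparison of standard color gamuts using their red/green/blue primaries
# in CIE 1931 xy chromaticity space. A crude yardstick, but it shows why WCG
# panels need physically different backlights or emitters: the triangle they
# must cover is much larger than the old HDTV (Rec. 709) one.

PRIMARIES = {
    "Rec. 709 (HDTV)": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":          [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020 (UHD)": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    """Shoelace formula for the area of the triangle formed by three primaries."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

baseline = triangle_area(PRIMARIES["Rec. 709 (HDTV)"])
for name, points in PRIMARIES.items():
    area = triangle_area(points)
    print(f"{name}: area {area:.3f} ({area / baseline:.2f}x Rec. 709)")
```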
To repeat an analogy I used in my Do you need 4K for HDR? article: an HDR-compatible TV that doesn't have the tech to show HDR is like having a 200-mph speedometer in a Toyota Prius. It might be neat that it's there, but you can't really use it.
There are still technical hurdles even if your TV has local dimming (as many high-end LCDs have had over the last few years). In most cases you need HDMI 2.0 or 2.1 to get the extra data to make HDR content work. So it's unlikely a simple firmware update will get HDR to work on older TVs, even if they have local dimming.
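Here's some back-of-the-envelope arithmetic (my illustration, not from any HDMI spec sheet) for why those deeper 10-bit HDR signals need the bandwidth of a newer HDMI connection. It counts only active pixels and ignores blanking and metadata, so treat it as a rough sketch.

```python
# Back-of-the-envelope arithmetic (illustrative only) for why 10-bit HDR signals
# push up against HDMI bandwidth limits. This counts only active pixels -- real
# links also carry blanking intervals -- and raw bandwidth is only part of the
# story, since HDR metadata signaling itself arrived with newer HDMI versions.
# The ~14.4 Gbps figure is HDMI 2.0's approximate usable throughput
# (18 Gbps raw minus 8b/10b encoding overhead).

HDMI_2_0_EFFECTIVE_GBPS = 14.4  # approximate

def data_rate_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Raw active-pixel data rate in Gbps (no blanking, no protocol overhead)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

signals = {
    "4K/60,  8-bit, 4:4:4 (SDR)":              data_rate_gbps(3840, 2160, 60, 8, 3),
    "4K/60, 10-bit, 4:4:4 (HDR, full chroma)": data_rate_gbps(3840, 2160, 60, 10, 3),
    "4K/60, 10-bit, 4:2:0 (HDR, subsampled)":  data_rate_gbps(3840, 2160, 60, 10, 1.5),
}

for name, rate in signals.items():
    verdict = "fits" if rate <= HDMI_2_0_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict} in HDMI 2.0's ~{HDMI_2_0_EFFECTIVE_GBPS} Gbps")
```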
And remember, you need HDR content to tell the TV how to produce the picture, otherwise it's just a fast car in slow traffic.
And as for OLED, while all OLED TVs inherently have a greater contrast ratio than LCD, only HDR-compatible models have the extra brightness capabilities to do HDR. While disappointing for early OLED adopters, it does make things easier for current shoppers.
Basically, if an OLED TV claims to be HDR-compatible, it is -- and in CNET's tests OLED delivers exceedingly good HDR, and as wide a color gamut as quantum dots. Different years (and in some cases, different models) might look better than others, but they're actually HDR if they're labeled as such.
What to look for
This type of marketing "gift for fiction" annoys me tremendously because it takes something that's good (HDR) and misleads people into thinking they're getting it. Worse, people who buy a TV thinking it's HDR, then see no difference with HDR content, could proclaim to the world that HDR is worthless.
Right now there's no simple way to determine whether a TV is "real" HDR or "fake" HDR. Price is one clue, but given the wide range of TV prices, it's not always a safe determinant.
If a TV has local dimming or, especially, OLED display technology, it will likely have a better HDR (and non-HDR) image. Though not a requirement, quantum dots and OLED are likely indicators that a TV can do WCG. The best indicator is, of course, checking reviews here at CNET.
Got a question for Geoff? First, check out all the other articles he's written on topics like why all HDMI cables are the same, TV resolutions explained, LED LCD vs. OLED, and more. Still have a question? Tweet at him @TechWriterGeoff then check out his travel photography on Instagram. He also thinks you should check out his best-selling sci-fi novel and its sequel.