Looking back a few years, I found an article that is so on the nose it’s scary. Titled “HDR needs to take a step back,” it was written at a time when HDR TVs were all the rage at tech shows but still rare in homes. Even before people started seeing HDR, I saw the problems. And boy, have there been problems.
What constitutes an HDR TV
There is really not a lot of agreement about what an “HDR” TV is. There are four competing standards: HDR10, HDR10+, HLG, and Dolby Vision. A TV can call itself HDR without actually supporting any of these standards, as long as it can show darker blacks and brighter highlights than normal TVs. A lot of lower-priced TVs don’t really deliver on the HDR promise, but it doesn’t really matter since, as I said way back then, people don’t care about HDR as much as manufacturers say they do.
I’m not saying “don’t get an HDR TV.” They are cheap enough that you may as well. I’m saying that even in the ’20s, you shouldn’t expect an incredible range in a TV costing under $1,500. There are truly gorgeous-looking TVs out there, but they’re costly and require professional calibration to look their best.
The first problem
Like I said four years ago, the biggest problem is “just because you can, doesn’t mean you should.” Today’s cameras capture a lot of detail in very dark rooms. Shooting a scene in near-darkness gives a very realistic look compared to shooting it with blue light or “day-for-night” where the whole scene is shot in daytime and then darkened in the editing room.
The problem is that not everyone has a perfect TV and a perfect viewing area. Most people just see dark, muddy images that aren’t very entertaining. That’s what I complained about in 2016, and here I am four years later, still complaining about it.
The biggest example in the last 12 months was the Game of Thrones episode, “The Long Night.” You may know it as the battle of Winterfell, or “the fricken dark one.” The episode was probably shot and produced in 4K HDR, but HBO doesn’t broadcast in 4K or HDR. If your TV wasn’t perfectly calibrated, or if you were watching on a tablet in a bright room, you didn’t see a lot.
In a lot of ways, standard-definition TV really helped studios create images that people could connect with. When the largest image was 27″ diagonal, when everything darker than a certain level was just muddy, studios had to work harder. New technologies have made it easier for artists to express their vision but they’ve also made it harder for people to see it.
And now the second problem…
What we’re really beginning to realize lately is that a lot of so-called HDR programming… really isn’t. Studios are labeling shows as HDR when there isn’t anything dark or bright enough to make that label worth it. They’re calling the signal HDR but filling it with SDR (standard dynamic range) content. It’s the same story as a decade ago, when shows called themselves HD but what you actually saw was upsampled SD. People can tell the difference.
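To make the idea concrete, here’s a rough sketch (not a production tool, and not how any broadcaster actually audits content) of how you could spot SDR-in-an-HDR-wrapper: if a stream labeled HDR never uses brightness above the SDR reference level, it’s effectively SDR in an HDR container. The threshold here, 10-bit PQ code value ~519 (roughly 100 nits), the cutoff fraction, and the frame data are all illustrative assumptions.

```python
def looks_like_real_hdr(code_values, sdr_ceiling=519, min_fraction=0.001):
    """Return True if a meaningful share of 10-bit PQ samples exceed
    the assumed SDR-equivalent brightness level (~100 nits)."""
    above = sum(1 for v in code_values if v > sdr_ceiling)
    return above / len(code_values) >= min_fraction

# Hypothetical "HDR-labeled but SDR-graded" frame: every sample sits
# below the assumed SDR ceiling (max code value here is 499).
sdr_in_hdr = [100 + (i % 400) for i in range(10_000)]

# Hypothetical frame with genuine specular highlights reaching into
# the HDR range (code values 700-799).
real_hdr = sdr_in_hdr[:9900] + [700 + (i % 300) for i in range(100)]

print(looks_like_real_hdr(sdr_in_hdr))  # False
print(looks_like_real_hdr(real_hdr))    # True
```

A real checker would decode actual frames and account for the transfer function signaled in the stream’s metadata; the point is just that “labeled HDR” and “uses the HDR range” are separately verifiable things.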
I’m not sure if it’s better or worse, but there are also studios, notably Disney, that take SDR content and reprocess it to be HDR. Again, there isn’t really any more detail. You’re just stretching the signal so that the darks are dark and the lights are light. You’d think this would be a good thing. In reality, it has the side effect of making most things darker than they would have been otherwise. Without a reference-quality viewing environment, that’s a problem.
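Here’s a toy version of that “stretch,” not any studio’s actual pipeline: take normalized SDR luminance and expand it with an exponent so blacks go deeper relative to the highlights. The exponent value is an illustrative assumption.

```python
def naive_expand(sdr, exponent=1.5):
    """Naively expand normalized SDR luminance (0.0-1.0) toward a wider
    range by raising it to a power > 1; midtones and shadows drop."""
    return sdr ** exponent

for level in (0.1, 0.5, 0.9):
    print(f"SDR {level:.1f} -> expanded {naive_expand(level):.3f}")
```

Notice that every value below 1.0 moves downward (0.5 becomes about 0.354), which is exactly the darkening described above: the highlights get headroom, but the bulk of the picture sinks.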
What can you do?
Truth is, not much. Proper calibration of your TV is going to help for sure, as is a properly lit viewing environment. But beyond that, it’s up to studios to produce real HDR content. It’s up to pay-TV and streaming companies to send out real HDR content. More than anything, it’s up to the directors and cinematographers to learn how to use the technology they have to the best effect. Sadly, that’s the part that will probably take longer than anything else.