As I sit down to write this, I realize it’s been about 15 years since the HD revolution started. Back in 2002, high definition was a pipe dream. A few TV productions were smart enough to shoot in HD, and those shows have aged well. Other shows from the period are nearly unwatchable, because they look like a blurry mess on today’s 60″ and larger TVs. And we ask ourselves whether we even care about HD anymore, as we lurch ever so slowly toward 4K.
It’s time to take a serious look at that question.
In the 2000s, 4K wasn’t even a thought. The real question about television was whether the 720p or the 1080i standard would be the one to take off. The answer, all these years later, is that both standards coexist and, really, no one seems to care.
720p… the sports choice
720p broadcasting uses a resolution of 1280×720, roughly 2.7 times the pixels of the standard-definition 720×480. That’s a real step up, though it’s the smaller of the two HD formats. Viewed from an ’00s perspective, though, 720p brought one thing to the table that everyone wanted: it showed 60 full frames every second, unlike the half-frames of old-school TV. That meant crisper freeze-frames and smoother motion, especially in sports programming.
1080i… the movie-lover’s choice
The other competing standard at the time was 1080i. This brings you a resolution of 1920×1080, roughly six times the pixels of standard definition. However, it carries over the legacy of broadcast TV: 60 half-frames (fields) per second, yielding only 30 full frames per second. That can make fast motion look blurrier. It’s not a problem for a slow-moving character piece, but it might take away from the fun of a fast-paced thriller.
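The pixel math behind those comparisons is easy to check for yourself:

```python
# Pixel counts per full frame for the three resolutions discussed above.
sd = 720 * 480         # standard definition: 345,600 pixels
hd_720p = 1280 * 720   # 720p: 921,600 pixels
hd_1080 = 1920 * 1080  # 1080i/1080p: 2,073,600 pixels

print(f"720p vs SD:   {hd_720p / sd:.2f}x the pixels")     # 2.67x
print(f"1080 vs SD:   {hd_1080 / sd:.2f}x the pixels")     # 6.00x
print(f"1080 vs 720p: {hd_1080 / hd_720p:.2f}x the pixels")  # 2.25x
```

So 1080i really does carry more than twice the detail of 720p, even if it delivers it as half-frames.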
Why was this a problem?
Twenty years ago it just wasn’t possible to produce a reasonably priced HDTV that could display 1920×1080 at 60 frames per second. In fact, many of the first HDTVs couldn’t display that resolution at all. Many in the HD world pushed for an achievable standard like 720p, thinking there would never be an affordable TV with 1080 lines of resolution. Obviously, those people were wrong.
Flash forward to 2018.
Today’s flat-panel televisions can easily handle the demands of 1920×1080 resolution, but there are still broadcasters who put out a 720p signal. ABC, Disney, and ESPN, all part of the same company, made a big investment in 720p equipment and still use it today. Look at ESPN on a big TV and the picture is noticeably soft and blocky. That’s the 720p broadcast at work. Even if your cable or satellite provider (or your television) resamples the image up to 1080 or even 4K resolution, you’re still starting from lower quality, and there isn’t much you can do about that.
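To see why upscaling can’t rescue a 720p source, here’s a toy sketch of nearest-neighbor resampling on a single scan line. The function and the tiny example values are mine, purely for illustration; real scalers use far more sophisticated filters, but the principle is the same: the new pixels are derived from the old ones, never from detail that was never broadcast.

```python
def upscale_line(line, new_width):
    """Stretch a row of pixels to new_width using nearest-neighbor sampling."""
    old_width = len(line)
    return [line[i * old_width // new_width] for i in range(new_width)]

line_720 = [10, 20, 30, 40]            # 4 "pixels" standing in for 1280
line_1080 = upscale_line(line_720, 6)  # stretched to 6, standing in for 1920
print(line_1080)  # [10, 10, 20, 30, 30, 40] -- repeats, no new information
```

However fancy the filter, upscaling only spreads the existing detail across more pixels.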
On the other hand, frame rates just aren’t an issue.
Today’s TVs give you a silky smooth picture whether the original image was 720p or 1080i. Technologies like reverse pulldown and deinterlacing reconstruct the picture, interpolating the missing lines and frames, so whether the original source was 60 full frames or 60 half-frames, you get almost the same result. Of course, a real full-rate source will always beat a computer’s estimate, but when it comes to frame rates, those estimates are pretty darn good.
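As a rough illustration of what a deinterlacer does, here’s a minimal sketch of “bob” deinterlacing, one of the simpler techniques. The function name and the list-of-rows frame representation are my own, not any real TV’s firmware: each half-height field is expanded into a full frame by averaging the scan lines above and below each missing line, so 60 fields per second become 60 estimated full frames per second.

```python
def bob_deinterlace(field, top_field=True):
    """Expand one interlaced field (half the frame's lines) into a full frame.

    `field` is a list of rows of pixel brightness values."""
    height = len(field) * 2
    frame = [None] * height
    # Place the real lines on even rows (top field) or odd rows (bottom field).
    offset = 0 if top_field else 1
    for i, line in enumerate(field):
        frame[2 * i + offset] = line
    # Estimate each missing line by averaging its neighbors
    # (at the frame edges, just reuse the nearest real line).
    for y in range(height):
        if frame[y] is None:
            above = frame[y - 1] if y > 0 else frame[y + 1]
            below = frame[y + 1] if y < height - 1 else frame[y - 1]
            frame[y] = [(a + b) / 2 for a, b in zip(above, below)]
    return frame

field = [[10, 10], [30, 30]]          # a tiny 2-line field
print(bob_deinterlace(field))         # 4-line frame with interpolated rows
```

The interpolated rows are only guesses, which is why a true 60-full-frame source still has a slight edge over a deinterlaced one.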
So who won? The answer is, “who cares.”
High definition just isn’t the technophile’s dream it once was. If you’re a home theater fan, you’ve already moved on to 4K. Everyone else probably finds that HD looks pretty good no matter what. When it doesn’t, it’s usually not the source’s resolution at fault but other factors, like overcompression. HD signals are still pretty huge, and compressing them to fit the bandwidth you have is still an art. Compress too much and you lose detail. That’s the only part of HD we still haven’t completely mastered.