What to pay attention to when buying a TV (monitor) for HDR
June 18, 2023
Bottom Line: Just because a TV or monitor lists HDR in its specs does not automatically mean you will get the most out of this technology. If the TV (monitor) cannot deliver sufficient brightness or color depth, you will hardly notice the difference between HDR and SDR.
HDR looks different from one manufacturer to another
HDR comes not only in several formats, but also in very different implementations.
TVs and monitors from different manufacturers differ in the number of supported colors and peak screen brightness.
Moreover, even if a TV's actual characteristics fall short of full HDR playback, nothing prevents the manufacturer from declaring HDR support, because, first and foremost, that declaration means support for the metadata that comes with the video and is meant to control the brightness of the picture.
In other words, a TV that cannot actually reproduce sufficient brightness or color depth can still formally "support" HDR.
For this reason, many users are not impressed with HDR on budget TVs.
Moreover, in particularly unfortunate cases, HDR content can look worse than SDR.
But TV manufacturers are not alone in their quest to capitalize on the new technology. Movie producers also make active use of the HDR label, including on releases that are not much different from regular SDR films. This has even given rise to the term "fake HDR": the HDR metadata is present, so formally the movie can be claimed to use the technology, but in practice its actual brightness differs little from SDR.
What color depth of the TV (monitor) is needed for HDR
A TV or monitor that claims HDR support may have either 8-bit color depth (about 16.7 million colors) or 10-bit color depth (about 1.07 billion colors).
However, one of the advantages of HDR is precisely its support for 10-bit color. That is, on a monitor with 8-bit color depth you will not get any benefit from the extra variety of HDR shades.
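As a quick check of those figures, here is a minimal Python sketch of the per-channel arithmetic (just the math behind the marketing numbers, not any vendor API):

```python
# Colors available at a given bit depth: each of the three RGB
# channels has 2**bits levels, so the total is (2**bits) ** 3.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {color_count(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit: {color_count(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```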
At the same time, it should be noted that 10-bit color depth and HDR are not equivalent concepts, as some users believe: a TV can have 8-bit color depth, and its manufacturer is still not prohibited from claiming HDR support.
Moreover, even HDR content (video) is not always encoded with 10-bit color depth.
So 10-bit color is an optional characteristic of HDR, but quite an important one.
It is recommended to choose a TV that supports 10-bit color depth.
Look at the following screenshot examples; they are taken from HDR content.
In this video the girl's dress has vertical stripes, but the screenshot's color depth is 8-bit, so some of the colors and stripes are not visible. Disabling HDR and switching to 8-bit removes the stripes in the video itself as well, just as in the screenshot.
To the left of the weather presenter's head there is a circle marking green and yellow areas. With HDR on and 10-bit color depth, individual details are visible in the video; with HDR off and 8-bit color depth, only a solid green color is visible.
In this video fragment, the red house also has clearly visible horizontal stripes, which, with HDR turned off and 8-bit color depth, practically merge with the background.
That is, even with HDR support but without 10-bit color depth, you will not see these shades: 10-bit color depth is required.
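The stripes in these screenshots are classic banding from quantization. Here is a minimal Python sketch (a pure illustration, not a real video pipeline) of why 8 bits produce visible steps in a subtle gradient where 10 bits do not:

```python
import numpy as np

# A smooth horizontal gradient across a narrow brightness range,
# like the subtle shading on the dress or the house wall.
gradient = np.linspace(0.40, 0.45, 1920)  # 5% luminance span across the screen

def quantize(signal, bits):
    """Round the signal to the nearest representable level at this bit depth."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

for bits in (8, 10):
    steps = len(np.unique(quantize(gradient, bits)))
    print(f"{bits}-bit: {steps} distinct levels across the gradient")

# About 14 levels at 8-bit (wide, visible bands) versus about 52
# at 10-bit (steps too fine to notice).
```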
What brightness of the TV (monitor) is needed for HDR
Content created for HDR10 is mastered at a peak brightness of up to 1,000 nits.
Content created in HDR10+ is currently mastered at a peak brightness of up to 4,000 nits.
In practice, even mid-priced TVs and monitors with HDR support often reach only 300-350 nits of peak brightness. As you might guess, this is not enough.
An example of a test video for checking a monitor's HDR brightness displays numbers, each shown at the brightness, in nits, that it names.
If your TV stays below 1,000 nits, then the numbers 1000, 800, and even 600 may look equally bright on your screen, when in fact, on a high-brightness monitor, each higher number is noticeably brighter.
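To see why those patches merge, here is a minimal Python sketch of the PQ (SMPTE ST 2084) curve that HDR10 uses to map code values to absolute nits. The 350-nit hard clip at the end is a simplifying assumption for illustration; real TVs tone-map rather than clip abruptly:

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

DISPLAY_PEAK = 350  # nits: a typical mid-range "HDR" panel (assumed hard clip)

for code in (0.58, 0.65, 0.70, 0.75):
    nits = pq_to_nits(code)
    shown = min(nits, DISPLAY_PEAK)
    print(f"PQ code {code:.2f} -> mastered {nits:7.1f} nits, displayed {shown:6.1f} nits")

# Everything above the panel's peak collapses to the same brightness,
# which is why 600-, 800- and 1000-nit patches can look identical.
```

On such a panel, roughly everything from 390 nits upward lands at the same displayed brightness, which is exactly the "equally bright numbers" effect described above.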
Why HDR content may look no better than SDR
Not all users who have tried HDR are satisfied with the technology, and not all of them see a difference between good, bright SDR content and HDR.
The problem is not HDR. Most likely, your TV or monitor simply does not meet all the requirements to fully support all HDR capabilities. Another possible reason is that the content itself does not take full advantage of HDR.