What is HDR? What's different between HDR formats?


HDR has been one of the most discussed display technologies on the web in the past few years. Whether it's about TVs and movies, or computer monitors and games, HDR is gaining a foothold on many of our devices. Do you want to know what HDR is, and what the different types, standards, and certifications of HDR are? If you want to understand why some screens are better than others when it comes to HDR, and to see what terms such as HDR10, HDR10+, Dolby Vision, DisplayHDR 400, or DisplayHDR 1000 are all about, read this article:


What is HDR?

HDR is an acronym for High Dynamic Range, and it is a technology designed to make images resemble the real world as closely as possible. HDR is a term you encounter in photography, as well as in anything related to screens.

In order to make images as authentic as possible, devices with HDR use wider color ranges, brighter highlights, and deeper blacks for shadows. All these, together with more balanced contrast ratios, make images look more realistic and accurate, closer to what the human eye sees in the real world.

A computer monitor with HDR

When it comes to digital images displayed on a monitor, TV, or any other similar device, HDR is especially noticeable in pictures or videos that have complex combinations of colors and of light and dark areas. Examples include sunsets and sunrises, bright skies, snowy landscapes, and so on.

What are the different HDR formats: HDR10, HDR10+, Dolby Vision, and HLG?

In terms of screens, there are three major HDR formats, or profiles if you prefer: HDR10, HDR10+, and Dolby Vision. All these media profiles apply both to how video content is recorded or rendered, and to how that content is displayed by devices with HDR screens. Although they all aim for the same thing - to display more realistic images - they have different requirements, specs, and properties.


The essential criteria that define the different HDR profiles are related to image quality. See the table below for a comparison:

Comparison between HDR10, HDR10+, and Dolby Vision

Let's cover these criteria one by one:

Bit depth. Usually, monitors, laptop screens, TVs, and most other screens, including those on smartphones, use 8-bit color. That allows them to show 16.7 million colors. HDR screens have a 10-bit or 12-bit depth, which allows them to display 1.07 billion or 68.7 billion colors, respectively. HDR10 and HDR10+ use 10-bit color, while Dolby Vision supports a bit depth of 12. All are impressive, huge numbers. However, you should know that, at least for the time being, there are only 10-bit screens on the market (HDR10 and HDR10+), so even if Dolby Vision's 12-bit support sounds fantastic, you don't get to enjoy it on any consumer screens for the moment.
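
If you're curious where those figures come from, here's a quick back-of-the-envelope calculation in Python: with three color channels (red, green, and blue), a screen can show (2^bits)³ different combinations.

```python
# Each pixel has three color channels (red, green, blue), and each channel
# can take 2**bits different values, so the total number of colors is
# (2**bits) ** 3.
for bits in (8, 10, 12):
    shades_per_channel = 2 ** bits          # 256, 1024, or 4096
    total_colors = shades_per_channel ** 3  # all R/G/B combinations
    print(f"{bits}-bit: {total_colors:,} colors")
# Prints roughly 16.7 million, 1.07 billion, and 68.7 billion colors.
```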

Peak brightness. This refers to the highest luminance a screen can output. For screens to be able to display HDR images, they need higher brightness levels than regular SDR (Standard Dynamic Range) screens. Peak brightness is measured in cd/m² (also known as nits) and usually has to be at least 400 cd/m². Read the next section of this article to see the different HDR standards based on peak brightness.

Maximum black level brightness. As you know, HDR screens aim to display images that are as close to reality as possible. To do that, besides a high peak luminance for bright image areas, they must also be able to render dark areas using very deep blacks. That's where the maximum black level luminance comes into play. Typical values for this attribute are less than 0.4 cd/m², although the HDR formats themselves impose no requirement. However, the VESA DisplayHDR standards do specify maximum black level luminance values, as you can see in the next section of this article. Any screen that can show blacks at a brightness of less than 0.0005 cd/m² is considered to be True Black.

A TV that supports HDR

Tone mapping. Content that was created with HDR, such as movies or games, can have much higher brightness values than what an HDR screen can actually display. For instance, some sequences in a movie might have brightness levels of over 1000 cd/m², while the HDR screen on which you're watching it has a peak brightness of 400 cd/m². What happens then? You might be tempted to think that any parts of the image that are brighter than 400 cd/m² are lost. They're not, at least not entirely. HDR screens perform something called tone mapping, using algorithms to compress the brightness of the source images so that it doesn't exceed the screen's peak brightness. Sure, some information is lost this way, and contrast can actually look worse than on an SDR (Standard Dynamic Range) screen. However, images still retain more detail than on SDR screens.
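
To get a feel for how this works, here is a deliberately simplified sketch of a tone-mapping curve in Python. It uses a basic Reinhard-style roll-off purely for illustration; real TVs and monitors use far more sophisticated, vendor-specific algorithms.

```python
def tone_map(scene_nits: float, display_peak: float = 400.0) -> float:
    """Compress scene brightness so it never exceeds the display's peak.

    This is one of the simplest possible curves (a Reinhard-style roll-off),
    shown only to illustrate the idea of tone mapping.
    """
    return display_peak * scene_nits / (scene_nits + display_peak)

# A 1000 cd/m² highlight on a 400 cd/m² screen is rolled off to ~286 cd/m²,
# and a 4000 cd/m² highlight to ~364 cd/m² -- bright details keep their
# relative order instead of being clipped away.
for nits in (100, 500, 1000, 4000):
    print(nits, "->", round(tone_map(nits), 1))
```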

Metadata. In order for an HDR screen to be able to display HDR content, regardless of whether it is a movie or a game, that content must be created with HDR. You can't just film a movie in SDR (Standard Dynamic Range) and expect it to be displayed in HDR on a TV, for example. Content that's created with HDR stores information called metadata about how it should be displayed. That information is then used by the devices on which you play the content to decode it correctly and use just the right amount of brightness, for example. The problem is that not all HDR formats use the same kind of metadata. HDR10 uses static metadata, which means that the settings applied to how the content is displayed stay the same from the beginning to the end. HDR10+ and Dolby Vision, on the other hand, use dynamic metadata, which means that the displayed images can be adjusted on the fly. In other words, HDR content can use different brightness ranges for different scenes, or even for each frame of a video.
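
As a rough illustration of the difference between static and dynamic metadata, here is a simplified, hypothetical Python sketch. It is loosely inspired by the MaxCLL and MaxFALL values that HDR10 content carries, but the field names and structure are made up for clarity and don't follow any real specification.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticMetadata:
    """HDR10-style: one set of values that applies to the whole video."""
    max_content_light_level: int        # brightest pixel in the entire movie (cd/m²)
    max_frame_average_light_level: int  # brightest average frame (cd/m²)

@dataclass
class DynamicMetadata:
    """HDR10+/Dolby Vision-style: values that can change per scene or frame."""
    per_scene_peak_brightness: List[int]  # one target value for each scene (cd/m²)

# The whole movie is described by two numbers...
movie_static = StaticMetadata(max_content_light_level=1000,
                              max_frame_average_light_level=400)
# ...versus a separate target for every scene (or even every frame).
movie_dynamic = DynamicMetadata(per_scene_peak_brightness=[250, 900, 120, 1000])
```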


You might have noticed that we didn't mention anything about HLG yet. HLG comes from Hybrid Log-Gamma and is an HDR standard that allows content distributors, such as television companies, to broadcast TV content that's both SDR (Standard Dynamic Range) and HDR (High Dynamic Range) using a single stream. When that stream reaches your TV, the content is displayed either in SDR or in HDR, depending on what your TV is capable of.
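
The trick that makes this possible is HLG's transfer curve: the lower part of the signal behaves much like the gamma curve SDR TVs already expect, while the upper part switches to a logarithmic curve that carries the extra highlight detail for HDR TVs. Here's a small Python sketch of that curve, based on the formula standardized in ITU-R BT.2100.

```python
import math

# Constants from the HLG opto-electrical transfer function (ITU-R BT.2100).
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(scene_light: float) -> float:
    """Map normalized scene light (0..1) to a normalized signal value (0..1).

    Below 1/12 of the reference level, the curve is a simple square root,
    very similar to the gamma behavior SDR screens expect, so an SDR TV can
    display this part of the signal as usual. Above that, the curve becomes
    logarithmic, packing HDR highlights into the top of the signal, which
    only HDR TVs make full use of.
    """
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return A * math.log(12 * scene_light - B) + C
```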

What is DisplayHDR?

Furthermore, to make things a bit more complicated, besides the HDR formats, there's also an HDR performance specification called DisplayHDR. Devices that bear a DisplayHDR certification meet a series of standards that ensure they can display HDR images at a certain quality level. If you've searched the internet or electronics shops for a new TV or monitor to buy, you might have stumbled across terms such as DisplayHDR 400, DisplayHDR 600, or DisplayHDR 1000. What do they mean?

VESA Certified DisplayHDR logo

VESA (Video Electronics Standards Association), an international association of more than 200 member companies, creates and maintains technical standards for all kinds of video displays, including TVs and computer monitors. One of the fields in which it has established such standards is HDR. Its standards for HDR displays are called DisplayHDR, and they all apply to screens that support at least HDR10. To be DisplayHDR-certified, a TV, monitor, or any other device with an HDR display must meet the following brightness standards, among other more technical specifications:

Brightness requirements for VESA DisplayHDR certifications
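
As a rule of thumb, the number in each tier's name is the minimum peak brightness, in cd/m², that a certified screen must reach. The short Python sketch below illustrates that idea for the most common tiers; it deliberately ignores the black level, color gamut, and other requirements that the full VESA specification also covers.

```python
# Minimum peak brightness (in cd/m²) implied by each common DisplayHDR tier.
# This is a simplified sketch: real certification involves many more tests.
DISPLAYHDR_TIERS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def best_tier(measured_peak_nits: float) -> str:
    """Return the highest tier whose peak-brightness requirement is met."""
    met = [name for name, minimum in DISPLAYHDR_TIERS.items()
           if measured_peak_nits >= minimum]
    return met[-1] if met else "not DisplayHDR certified"

print(best_tier(650))   # DisplayHDR 600
```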

How do I enable HDR on my Windows 10 PC?

As a final note, we'd like to tell you a bit about HDR and Windows 10. If you are using a Windows 10 computer or device, you should know that this operating system supports only HDR10.

HDR in Windows 10

Furthermore, you can enjoy HDR content in games or movies on your PC only if you're using an Nvidia GeForce GTX 900 series, GTX 10 series, GTX 16 series, or RTX 20 series graphics card. If you're an AMD user, you need an AMD Radeon R9 380 or R9 390 series graphics card, or a Radeon RX 460 or newer. If your PC meets these criteria and you also have an HDR monitor, you might want to learn how to enable HDR on it: How do I turn on HDR on my Windows 10 computer?

Is there anything else you would like to know about HDR?

We hope that we've managed to shed a bit of light on what HDR is and why you would want it on your screens. Do you intend to buy a monitor or a TV that offers HDR support? Do you already have one or more? Share your opinions and questions, if you have some, in the comments section below, and we'll do our best to help.
