
What is HDR? High dynamic range explained

In the already extensive list of TV specifications, you will now almost invariably find HDR. High Dynamic Range is a new image technology that allows for more intense, more lifelike images. We explain how it works, why we want it and what you can expect from it.


What is HDR?

The dynamic range of our televisions is closely related to a term that we have been using for a long time: contrast. We define contrast as the ratio between the darkest black and the brightest white that a television can display. Today no manufacturer publishes contrast figures any more, because the claims reached ridiculous proportions a few years ago. The reason for this is very simple: to measure the black level of your screen, you can, for example, turn off the backlight of the television completely. The measured black level is then so close to zero that the contrast tends to infinity. In practice, however, with normal images, you cannot achieve such contrasts.
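That definition is easy to illustrate with a few lines of Python. The nit values below are purely illustrative assumptions, not measurements of any particular television:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast: the ratio between the brightest white and the darkest black."""
    return white_nits / black_nits

# Illustrative numbers: a 350-nit panel with a black level of 0.05 nits.
print(contrast_ratio(350, 0.05))    # roughly 7,000:1, a plausible real figure

# Switch the backlight off and the measured black creeps toward zero,
# so the computed "contrast" balloons -- hence the inflated spec-sheet numbers.
print(contrast_ratio(350, 0.0005))  # roughly 700,000:1
```

The formula itself is trivial; the point is that the denominator decides everything, which is exactly why a near-zero measured black produces absurd marketing figures.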

Dynamic range differs in a subtle way. First, you should not look at the darkest black that the screen can produce, but at the darkest black you can see on the screen. The latter also depends on ambient light reflecting off your screen. A perfect black level and deep shadow detail have no impact on the dynamic range if the ambient light reflected in the screen drowns out the black detail. Second, the image must also be visible without clear image artifacts. Here we are mainly talking about ‘banding’ or ‘posterization’, which are very easy to see in the darkest regions of the image.

In short, dynamic range is the ratio between the brightest white and the darkest black you can see on the screen, without significant image artifacts. High Dynamic Range (HDR) stands for images with a wider dynamic range than the content we know today (e.g. TV broadcasts, DVD or Blu-ray), which we therefore now often call SDR content (Standard Dynamic Range). HDR is made possible by modern screens that can deliver a much higher brightness and show much more nuanced detail in dark scenes.

Finally, HDR is always mentioned in the same breath as a wider color gamut (Wide Color Gamut). In theory the two are not connected, but in practice all HDR content also uses a wider color gamut.

Why do we want brighter images?

The world around us is one of light, a lot of light. We express light intensity or luminance in cd/m² (or nit, another name for the same unit). To give you an idea: a summer sky easily reaches a million nits. Indoor lighting quickly reaches 10,000 nits, and even the reflection of the sun on a pitch-black asphalt road reaches 2,000 nits.


That is in stark contrast to our televisions, which have on average a maximum luminance of around 300 nits. That explains why specular highlights, such as the sun glinting on water or on a chrome bumper, never have the same impact on television as in real life. Or why fireworks sometimes look pale. In addition, Blu-rays are mastered on a screen with a maximum white level of 100 nits. The dynamic range of the camera is therefore reduced very strongly during that step (and then raised again very slightly by our televisions).

If we want to enjoy HDR, our televisions must have a much higher maximum luminance, and content must be mastered in a different way, taking into account a higher luminance.

More detail in the darkness

HDR is about more than just high brightness. The black level of the screen is also important. It partly determines the dynamic range and has a strong influence on our perception of the images. A better black level gives the image a natural depth and makes colors stand out better. But that is not everything. As we said (in the definition of dynamic range), the images must be displayed without artifacts. And the current image standards very easily cause banding effects in dark scenes: instead of subtle gradations you see clear bands of dark (sometimes unnatural) colors. And that while our eyes are particularly sensitive to very fine gradations in dark scenes.

To make the best possible use of a deeper black level, video images must be encoded in such a way that more detail can be displayed and the brightness can be adjusted in smaller steps (at least in the darkest parts of the image). In the current standards, the range of 0 to 100 nits is bridged in approximately 100 to 230 steps. In the new HDR formats, more than 500 steps are available for the same range. Those images can therefore show much more detail, which also benefits the color rendering in dark images. But you always have to take the influence of ambient light into account: HDR images that show very dark scenes can be washed out by ambient light just as easily as an SDR image, so that none of that detail remains visible.
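As a rough sketch of where those extra steps come from, the snippet below counts how many 10-bit code values the SMPTE ST 2084 (PQ) curve, the transfer function used by HDR10 and Dolby Vision, spends on the 0–100 nit range alone. Using the full 0–1023 code range is a simplification of real video levels (broadcast signals reserve codes at both ends), so treat the result as an order-of-magnitude illustration:

```python
# SMPTE ST 2084 (PQ) transfer-function constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(v: float) -> float:
    """Map a normalized PQ code value (0..1) to luminance in nits (0..10,000)."""
    p = v ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Number of 10-bit PQ codes that fall at or below 100 nits,
# i.e. within the entire brightness range of SDR content.
hdr_steps = sum(1 for code in range(1024) if pq_to_nits(code / 1023) <= 100.0)
print(hdr_steps)  # slightly over 500 -- versus at most 256 steps in 8-bit SDR
```

Because the PQ curve allocates its codes roughly along the eye's sensitivity, about half of all 10-bit codes end up in the darkest hundredth of the 10,000-nit range, which is exactly where banding would otherwise be most visible.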

What dynamic range is needed?

The human eye has a dynamic range of approximately 10,000:1. In practice we can see a much larger range, but that is because our eye adapts to the lighting conditions. That is why we can see sunlight (more than 1 million nits) but also starlight that is barely 0.0001 nits. We just cannot see both at the same time.
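In photographic terms those ratios translate into 'stops' (factors of two). A quick back-of-the-envelope check using the figures above:

```python
import math

# Static dynamic range of the eye within a single scene: roughly 10,000:1.
print(round(math.log2(10_000), 1))              # about 13.3 stops

# With adaptation: from sunlight (~1,000,000 nits) to starlight (~0.0001 nits).
print(round(math.log2(1_000_000 / 0.0001), 1))  # about 33.2 stops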

See also   Our experience: Impressed by the World Cup in 4K UHD and HDR (HLG)

We can speak of HDR images if the dynamic range within one image is about the same or larger than that of the human eye. A much higher dynamic range is not useful, our eyes would filter out some of the information (just like in real life). The current theory specifies that for such images at least a 10 bit color depth (providing 1,024 steps) is needed to avoid image artifacts. The 8 bit signals from our current SDR content are not sufficient.

What is required for HDR playback?

If we put all that knowledge together, what is needed to enjoy HDR? HDR is only possible with new content. Film cameras already have a much higher dynamic range, but it is the mastering process that needs to be adapted and where new standards (on which more are needed) are needed. They have to take into account a higher luminance, higher color depth , larger color range  and the desired detail in dark images.

And televisions must also evolve considerably. These new HDR standards take into account a maximum brightness of 10,000 nits, but our televisions are far from ready. A solid target for the near future seems to be 1,000 to 2,000 nits, which are figures that will only reach the best televisions in 2017. A conventional television is limited to about 350 nits. Televisions that have less than 500 nits peak luminance can not, in our opinion, guarantee true HDR reproduction. Yes, they may be equipped with the necessary standards to show the images, but they will miss the impact of real HDR. The color range of the television must also increase. The current standard for SDR content (Rec.709) is perfectly reproduced by almost any television, but contains only 69% of all visible colors. Colors such as eggplant purple and fire truck red can not be displayed correctly on such a television. The new standard we are aiming for is Rec.2020 which contains 99.99% of all visible colors. But televisions are not nearly ready for that either.

The new standards

As we mentioned, new standards are required for HDR reproduction. In concrete terms, this concerns the way in which we store and display visual material. The latter is a factor that should not be underestimated, because televisions have very different capacities today. An HDR image is mastered in the studio for a certain color range and a specific maximum brightness. But in one living room there is a television with a peak luminance of 1,500 nits while someone else has a model with a maximum of 400 nits. Yet they both want to watch the same HDR content. Same for color range. How should this be addressed?

There are currently a number of different standards in the game. More information about the differences and advantages and disadvantages is provided in a later article. HDR10 is the most widespread standard, it is already used today for streaming content and Ultra HD Blu-ray.

Dolby Vision is a competitive standard that is better equipped to deal with the many different television capacities. It is used in streaming, and can also be used on Ultra HD Blu-ray. HLG (Hybrid Log Gamma) is a standard that is mainly intended for broadcasting (live TV). The same signal contains both HDR and SDR playback, which is a big advantage for TV channels. HDR10+  is a successor to HDR10. He takes over some of the features of Dolby Vision so that the display can be optimized on different screens.


Where Ultra HD resolution brought us more pixels, High Dynamic Range in a sense creates ‘better’ pixels. HDR delivers images that can show much more intense light accents, on screens that can provide considerably more light than our old televisions. HDR can also show more detail in shadows, without annoying image artefacts such as banding. By combining both, we get greater contrasts in the image, so that we perceive detail considerably more sharply. And because HDR also uses a larger color range, the images are more intense.

But we can also make reservations about HDR. There are many new standards on the market, and it remains to be seen how (and or) these will ultimately all survive. And televisions complement their HDR capabilities in many different ways. That means that even for a middle-class price you can already find an HDR-ready device. But it is not realistic to expect the same performance from that device as from a top model. For HDR at any rate, the rule applies that more expensive models perform better.