Tips and advice

Background: Can I make HDR images less intense?

This article explains in detail which options you have if you want to tone down HDR playback.

Every now and then we receive questions from people who are looking for the best picture settings for HDR, or who find the HDR image too bright and want to adjust it. It’s tempting to tinker with the maximum brightness, but is that the right choice? After all, HDR and SDR are fundamentally different from each other.

Introducing HDR

HDR content is growing fast. Where it was initially available mainly on Ultra HD Blu-ray discs, we now see more and more streaming content and console games in HDR format. That is an evolution we can only encourage. HDR offers particularly beautiful images, and only HDR images really show the full potential of top and mid-range TVs.

Before we go any further, a few notes:

  • We are talking about HDR10 content in this article. It is the most widely distributed and available on all televisions. To a lesser extent, what we mention here also applies to Dolby Vision and HDR10+.
  • We leave HLG aside for now; it is hardly available yet and it also works differently from HDR10.

Anyone who has just exchanged their ten-year-old television for a recent (top) model is in for a big surprise. That ten-year-old model didn’t support HDR, and the maximum brightness it could deliver was presumably no more than 200-300 nits.

Your new TV can almost certainly deliver 500 nits, or even 700-900 nits (OLED) and 1,000-1,500 nits (LCD top model). That is quite a bit more, and it is not surprising that viewers fear that the image will be far too bright to be pleasant. However, that fear is unfounded.

After all, HDR content uses that maximum brightness quite sparingly. In almost all cases it only concerns a small part of the image, such as bright highlights. A “dazzling” completely white screen at 1,500 nits is not something you have to worry about. And we use those quotes deliberately: 1,500 nits is not dazzling at all. You look at much brighter objects every day. Even on a partly cloudy day, an average flower in the garden measures 3,000 to 5,000 nits, as nicely illustrated in this video.

So first look at some different HDR content (preferably use the Film image mode) for a few days to get used to the new footage. That’s advice we often give to people who have their TV calibrated. Sometimes you’re so used to the image of your old TV that everything else seems “wrong”. That feeling will go away after a few days.

Can nothing go wrong, then?

With ordinary SDR content, a recent device in the correct picture mode (Film or similar) still delivers a maximum of 300-500 nits. That is more than before, but rarely to the extent that you will find it really too intense. The image will look a lot better with a lot of ambient light, and in a darkened room the light sensor intervenes so that the image does not become too bright.

Real problems with images that are too bright in HDR are rare, but there are two things that we do notice every now and then. In our regions we all use subtitles, and these can sometimes be quite intense, especially if the image is predominantly dark. Fortunately, we rarely see that; TV manufacturers seem to take this into account. Another culprit is the user interface. If for some reason it is displayed in HDR, it can be a bit harsh on the eyes.

But what if you think HDR content really is too bright? The temptation is then great to tinker with the settings, and more specifically with the brightness of the image. That setting is almost always at the top of the list, but under different, confusing names: backlight, OLED pixel brightness, panel brightness, or even just brightness. That setting is also invariably set to the maximum in HDR, so it seems logical to set it a little lower. However, we strongly advise against that. Why?

The difference between HDR and SDR

One of the differences between HDR and SDR has to do with the EOTF (electro optical transfer function). The EOTF determines how the incoming signal is converted to luminance. With SDR this is the gamma curve, with HDR10 it is the PQ curve. The main difference between the two is that the gamma curve is a relative curve and the PQ curve is an absolute curve. In other words:

  • In SDR, each signal represents a certain fraction of the maximum luminance the screen can provide. For example, a 75% signal is translated by the gamma curve to roughly 50% of the maximum luminance of the screen. If the maximum luminance is set to 200 nits, 75% corresponds to 100 nits; if the maximum is set to 400 nits, 75% corresponds to 200 nits.
  • In HDR, however, each signal means a very specific luminance, regardless of the screen. A 50% signal always means roughly 100 nits, and a 75% signal always means roughly 1,000 nits (the sketch below shows this in numbers).
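
That sketch, in minimal Python, might look like this. It assumes a simple 2.2 power-law gamma for SDR and the standard SMPTE ST 2084 (PQ) constants for HDR10; real TVs may deviate slightly from both.

    # Relative gamma (SDR) versus absolute PQ (HDR10) luminance.
    # Assumptions: a pure 2.2 power-law gamma for SDR; PQ constants per SMPTE ST 2084.

    def sdr_luminance(signal, max_nits, gamma=2.2):
        """SDR is relative: the result scales with the screen's configured maximum."""
        return max_nits * signal ** gamma

    def pq_luminance(signal):
        """HDR10/PQ is absolute: the same signal always maps to the same nit value."""
        m1, m2 = 2610 / 16384, 2523 / 4096 * 128
        c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
        e = signal ** (1 / m2)
        return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

    for s in (0.50, 0.75):
        print(f"{s:.0%} signal: SDR@200 = {sdr_luminance(s, 200):4.0f} nits, "
              f"SDR@400 = {sdr_luminance(s, 400):4.0f} nits, "
              f"PQ = {pq_luminance(s):4.0f} nits")
    # 50%: SDR scales with the configured maximum, PQ stays at roughly 90-100 nits.
    # 75%: SDR again scales, PQ stays at roughly 1,000 nits.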

With the differences between HDR and SDR in mind, it becomes clear that tinkering with the settings does not theoretically have the same effect for both types of content.

The effect of the backlight/OLED brightness in SDR

Since SDR works relatively, you would expect that when we dim the maximum luminance, the entire luminance curve drops proportionally. To check that, we measured two devices that we had available: a Samsung QE65Q70A and a Panasonic TX-55GZ950. On both models, we set the backlight (LCD) and OLED brightness (OLED) to a maximum of 400 nits and measured a full luminance gradient. We then adjusted the maximum to 200 nits on both devices and measured the full gradient again.

On the x-axis you see the input signal (in %), on the y-axis the resulting luminance (in nits). In both cases we see that the curve drops across the entire range. Because the difference in the lower regions is so small, we make this more visible by displaying the graphs on a logarithmic scale.

We then see very clearly that the decline is a constant factor across the board. That is exactly the intention. The only difference is of course that on OLED black remains absolute. We therefore also have to omit it from the log/log curves (log(0) is undefined; the limit goes to minus infinity).
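
As a back-of-the-envelope check of what such a constant factor looks like (assuming an idealised display with a pure 2.2 gamma and no measurement noise):

    # Halving the SDR maximum scales every point by the same factor,
    # which shows up as a constant vertical offset on a logarithmic scale.
    import math

    def sdr_luminance(signal, max_nits, gamma=2.2):
        return max_nits * signal ** gamma

    for s in (0.10, 0.25, 0.50, 0.75, 1.00):
        l400, l200 = sdr_luminance(s, 400), sdr_luminance(s, 200)
        print(f"signal {s:4.0%}: {l400:7.2f} vs {l200:7.2f} nits, "
              f"ratio {l400 / l200:.2f}, log offset {math.log10(l400 / l200):.3f}")
    # The ratio is 2 and the log offset is 0.301 at every measurement point.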

Photos of both devices also show that effect very clearly.

LCD SDR 200 vs LCD SDR 400

On the LCD screen (above) you can see that, even though the contrast ratio is preserved, the brighter image makes a somewhat washed-out impression because the black level rises proportionally. The OLED image (below) also becomes much brighter but retains better depth because black stays black.

OLED SDR 200 vs OLED SDR 400

Something about tone mapping

Before we discuss the effect on HDR, we need to touch on an important aspect of HDR: tone mapping. HDR10’s PQ curve is absolute, but what happens if the input signal demands 1,000 nits and the TV can only deliver a maximum of 700? (For the record, the maximum HDR input signal corresponds to 10,000 nits, and it will be a long time before we have that kind of TV in the house.) To solve that problem, every TV uses tone mapping. From a certain value, the luminance curve leaves the ideal curve and slopes gently towards the maximum that the screen can display. You can also clearly see this on the graphs that we always provide with a TV review.

Take the LG OLED55A1, for example. At the bottom right you see the luminance curve; the yellow line is the reference. Above that is the EOTF, where yellow is again the reference. If the TV were to follow the standard exactly, you would lose all white detail above about 480 nits. Instead, the TV rolls off the brightness curve a bit so that it reaches approximately the maximum brightness of the TV around 75% stimulus (which equates to 1,000 nits). The roll-off starts around 50% stimulus (100 nits). The TV thus maps the input range from 100 to 1,000 nits to an actual displayed range of 100 to 480 nits.
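
Manufacturers do not publish their exact tone mapping algorithms, but the principle can be sketched in a few lines of Python. The knee at 100 nits, the display maximum of 480 nits and the 1,000-nit content maximum are taken from the OLED55A1 example above; the Reinhard-style roll-off shape itself is purely an illustrative assumption.

    def tone_map(target_nits, knee=100.0, content_max=1000.0, display_max=480.0):
        """Illustrative roll-off: follow the PQ target exactly up to the knee,
        then compress so that content_max lands exactly on display_max."""
        if target_nits <= knee:
            return target_nits
        x = target_nits - knee           # how far the content sits above the knee
        head_in = content_max - knee     # input headroom above the knee (900 nits)
        head_out = display_max - knee    # headroom the display actually has (380 nits)
        a = head_in * head_out / (head_in - head_out)
        return knee + x / (1 + x / a)    # slope 1 at the knee, flattening toward display_max

    for nits in (50, 100, 300, 600, 1000):
        print(f"content {nits:4d} nits -> displayed {tone_map(nits):5.1f} nits")
    # 50 and 100 nits pass through unchanged; 1,000 nits ends up exactly at 480 nits.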

The effect of the backlight/OLED brightness in HDR

HDR works in absolute terms, so adjusting the maximum luminance setting should in principle have little effect. Suppose the television can show a maximum of 700 nits. Between 0 and 500 nits, it can accurately trace the PQ curve, and it uses tone mapping to fit the input signal between 500 and 1,000 nits into its range of 500 to 700 nits.

What if you now limit the maximum luminance to, for example, 400 nits? We would then expect the TV to start tone mapping sooner, for example for input signals above 200 nits. But in the range between 0 and 200 nits, everything could remain unchanged.
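
Neither of the curves below is what the TVs actually do internally (that processing is not public), but the expectation just described, versus what the measurements further on will show, can be sketched like this. The 200-nit knee and the 700-to-400-nit reduction are the example values from the text; the roll-off shape is the same illustrative one as in the earlier sketch.

    def roll_off(target_nits, knee, content_max, display_max):
        """Illustrative tone map: follow the PQ target up to the knee, then compress
        so that content_max lands exactly on display_max."""
        if target_nits <= knee:
            return target_nits
        x, head_in, head_out = target_nits - knee, content_max - knee, display_max - knee
        a = head_in * head_out / (head_in - head_out)
        return knee + x / (1 + x / a)

    def expected(target_nits):
        """Expected: tone mapping simply starts earlier (knee at 200 nits);
        everything below the knee stays untouched."""
        return roll_off(target_nits, knee=200.0, content_max=1000.0, display_max=400.0)

    def measured(target_nits):
        """What the measurements roughly show: the full-brightness curve
        (knee at 500 nits, maximum 700 nits) simply drops by a constant factor."""
        return (400.0 / 700.0) * roll_off(target_nits, knee=500.0, content_max=1000.0, display_max=700.0)

    for nits in (5, 20, 100, 400, 1000):
        print(f"content {nits:4d} nits: expected {expected(nits):6.1f} nits, "
              f"measured ~{measured(nits):6.1f} nits")
    # The gap in the dark region (5-100 nits) is exactly where black detail gets lost.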

We measured this again with the same two test televisions, this time in four versions: one at maximum brightness, and one each at 80%, 50% and finally 20% of the maximum brightness. And because we know by now that a linear scale doesn’t show enough, we go straight to the log/log graphs. On the x-axis you see the input signal (in %), on the y-axis the resulting luminance (in nits).

LCD HDR LOG LOG vs OLED HDR LOG LOG

And what stands out? With both OLED and LCD we again see that, apart from some measurement inaccuracies, the curve drops across the board, just like with SDR. That is not what we expected. Even in the darkest regions (for example, an input signal of 20 or less) the decline continues.

And that can be a serious problem, because HDR10 content counts on absolute reproduction: each scene is mastered to show exactly a certain brightness. Especially in dark scenes, that even darker rendering can cause a significant loss of black detail.

We also took photos comparing maximum brightness with half brightness. What stands out in any case is that dark shades sink completely into black. Do not compare OLED with LCD here; those photos were taken with different camera settings.

LCD HDR 100 vs. LCD HDR 50

OLED HDR 100 vs OLED HDR 50

Conclusion: use the light sensor!

In any case, our conclusion is clear: stay away from the backlight/OLED brightness setting in HDR. You run the risk of making the whole image significantly darker. You only wanted that for the bright shades, but our tests indicate that dark shades are pulled down as well, even though we didn’t expect that. Dark scenes can end up looking very ugly and you lose a lot of shadow nuance. We don’t know whether this happens on all televisions, but it is better to avoid the problem altogether.

There is a solution: the light sensor. In recent years much attention has been paid to the light sensor on televisions, which adjusts the image based on the amount of ambient light. Where in the past this was done by simply adjusting the maximum luminance, modern implementations are much smarter: they also take the EOTF and color rendering into account.

For HDR images there are even specific implementations, such as HDR10+ Adaptive and Dolby Vision IQ. If you find the image too intense, use the light sensor. It also works on all models with standard HDR10 images and with SDR images. Of course you then give up some control over the picture, which real image purists won’t like. But those who prefer not to switch between different settings for daytime and evening viewing can adjust the picture in this way.