With High Dynamic Range (HDR), the image industry has taken a solid step forward in image quality. The advantages of HDR are clearly visible, but you do need a new television, a compatible player, and content in the right format. In this background piece we walk you through the list of HDR formats, which unfortunately keeps getting longer.
HDR briefly explained
What was High Dynamic Range again? Briefly: a high peak luminance, better shadow detail, and a wider color palette. Under ideal circumstances this culminates in higher contrast, so that images gain depth and we perceive more detail.
Why do we want a higher peak luminance? Because in reality some scenes are many times brighter than what a conventional television can show. It is a misconception, by the way, that HDR makes every image brighter: the average picture brightness often remains unchanged, as the opposite would lead to an unpleasant viewing experience. The higher peak luminance is mainly used for bright accents. Likewise, the extended color range is needed because the current standards cannot reproduce all natural colors. And we want more shadow detail because our eyes are most sensitive to very fine gradations of light in dark images. That brings us straight to one of the first standards.
PQ: Perceptual Quantizer
PQ is a new EOTF. A what? An EOTF, or Electro-Optical Transfer Function, determines the relationship between the incoming (electrical) image signal and the emitted (optical) light intensity of the screen. You already know one EOTF today, namely the gamma curve, which defines that relationship on all our current televisions. The gamma curve is not linear: if an image signal X corresponds to a light intensity Y, then a signal 2X does not result in an intensity of 2Y. The gamma curve is a power function: Y = X^gamma (where gamma is the gamma value; typical values are 2.2 to 2.4). This curve ensures that the signal gets more steps in the dark parts of the image, exactly where our eye is most sensitive to nuances.
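The gamma relationship can be sketched in a few lines of Python. This is a minimal illustration of the relative power-law curve described above, not any manufacturer's actual implementation:

```python
# Relative gamma EOTF: maps a normalized video signal (0..1) to a
# relative light output (0..1). Typical TV gamma values are 2.2-2.4.
def gamma_eotf(signal: float, gamma: float = 2.4) -> float:
    return signal ** gamma

# The curve is non-linear: a half-scale signal yields far less than
# half the light, so more code values are packed into the dark region
# where our eyes are most sensitive.
print(gamma_eotf(0.5))   # 0.5 ** 2.4, roughly 0.19 rather than 0.5
print(gamma_eotf(1.0))   # full signal maps to full (relative) output: 1.0
```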
The gamma curve sufficed as long as we watched SDR images. But with the new maximum brightness levels (the long-term target is 10,000 nits), it is no longer adequate to display the image without visible banding, unless we raise the bit depth very substantially. You can see that in the graph below. On the X-axis is the luminance (brightness), on the Y-axis the contrast steps. The gray dotted line is the ‘Barten’ curve: it indicates the smallest contrast step you can make at a given light intensity without creating visible banding. Everything above the curve (the red area) creates visible errors. An EOTF must remain within the green part over its entire course.
If we stick with the gamma curve (blue), a bit depth of 15 bits is needed (almost double today's 8 bits) to obtain the desired result. However, it is also clear that a large part of that precision is wasted in the bright parts. A logarithmic curve with 13-bit depth, in turn, wastes a lot of precision in the dark parts.
Dolby Labs therefore created a new EOTF that closely follows the Barten curve and goes up to 10,000 nits. It was named PQ (Perceptual Quantizer). With 12 bits, the PQ curve can represent the entire signal range without visible errors. In practice, a 12-bit signal is only strictly required for animation and graphics, because such images contain no noise. For regular camera footage, which always contains some noise, a 10-bit version suffices.
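The PQ curve is standardized as SMPTE ST 2084, and its EOTF is a closed formula that can be transcribed directly. The sketch below maps a normalized signal to absolute luminance in nits, using the constants from the standard:

```python
# PQ EOTF (SMPTE ST 2084): maps a normalized 0..1 signal value to an
# absolute luminance in nits (cd/m^2), up to the 10,000-nit ceiling.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# A full-scale signal reaches exactly 10,000 nits; most of the code
# values sit far below that, following the Barten-based allocation.
print(round(pq_eotf(1.0)))  # 10000
print(pq_eotf(0.0))         # 0.0
```

Note that the output is an absolute luminance, which is exactly the "absolute EOTF" property discussed below.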
PQ also differs fundamentally from gamma because it is an absolute EOTF: a given video signal corresponds to an absolute brightness value. The gamma curve, by contrast, is a relative curve: a given video signal indicates the brightness relative to the maximum value of the display.
PQ forms the basis of the two main HDR formats.
HDR10
The most universally supported and used standard for high dynamic range is HDR10. This standard is mandatory on Ultra HD Blu-ray discs and is also offered by Netflix, iTunes and Amazon Video. Every TV manufacturer supports it, and the latest game consoles from Sony and Microsoft handle the format as well.
HDR10 uses the PQ curve as its EOTF with a bit depth of 10 bits, and Rec.2020 as its color space. The most important addition is the use of static metadata. Besides part of the color calibration info of the mastering display, an HDR10 signal carries two values at the start of the stream: MaxFALL and MaxCLL. MaxFALL (Maximum Frame Average Light Level) indicates the highest average brightness of any single frame. MaxCLL (Maximum Content Light Level) gives the luminance of the brightest pixel in the content. A television can then use those two values to adapt the image to its own capabilities. That is necessary because almost all HDR content is mastered at a maximum of at least 1,000 nits, and some at 2,000 or even 4,000 nits. Unfortunately, most televisions cannot reach those levels, so they adjust the PQ curve.
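How those two static values follow from the content can be sketched as below. This is a toy illustration in which each "frame" is simply a list of per-pixel luminances in nits; real mastering tools of course work on full decoded video:

```python
# Illustrative derivation of HDR10's static metadata from per-pixel
# luminance values (in nits). Each frame is a list of pixel luminances.
def hdr10_static_metadata(frames):
    # MaxFALL: the highest frame-average light level across all frames.
    max_fall = max(sum(frame) / len(frame) for frame in frames)
    # MaxCLL: the single brightest pixel anywhere in the content.
    max_cll = max(max(frame) for frame in frames)
    return max_fall, max_cll

# Two toy frames: a dim one and one containing a 1,000-nit highlight.
frames = [[10, 20, 30, 40], [50, 100, 150, 1000]]
print(hdr10_static_metadata(frames))  # (325.0, 1000)
```

A TV with, say, a 700-nit panel can read these values once at the start of playback and pick a single tone-mapping strategy for the whole film; that "once for everything" limitation is exactly what the dynamic-metadata formats below address.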
An HDR10 signal contains a single video stream and is never backwards compatible. The TV must therefore support the format.
Dolby Vision
Dolby Labs not only worked out the PQ curve, but also developed a complete standard for HDR images. Dolby Vision is, however, less widely supported. It is an optional format on Ultra HD Blu-ray, but streaming services such as Netflix, iTunes and Amazon Video do use it. Among TV manufacturers, LG, Sony, Loewe, TCL and Hisense support it, but only on their better or even top-end models.
Dolby Vision uses the PQ EOTF and the Rec.2020 color space, and can use up to 12-bit depth. The main difference with HDR10 is the use of dynamic metadata. Instead of providing information only at the start of the video, that information can be adapted from scene to scene or even from frame to frame. This way, the TV can adjust dark and bright scenes individually to stay within its capabilities.
Another difference with HDR10 is that Dolby Vision can be implemented in different ways. It can be carried in a single video stream, in which case it is not backwards compatible. But it can also use a base layer plus an enhancement layer; the base layer can then be compatible with HDR10, or with SDR TVs (Rec.709). In that respect Dolby Vision is much more flexible.
HLG: Hybrid Log Gamma
Hybrid Log Gamma, or HLG, is an HDR standard developed by the BBC and NHK (the Japanese public broadcaster). The standard is primarily intended for TV broadcasts. Why? It was developed so that a single video stream can be transmitted that both SDR and HDR TVs can display. That is an important property, because bandwidth is a precious commodity in the world of TV broadcasting. Transmitting two signals (HDR and SDR) is not an attractive option; a single signal that works on both types of display is. Of course, the visible result is a compromise in both cases.
HLG is broadly supported by TV manufacturers (LG, Sony, Samsung, Philips, Panasonic), but on the content side it remains very quiet for the time being. YouTube offers HLG content, but regular TV broadcasts are not there yet (aside from a few test broadcasts).
HLG uses a transfer curve that follows a gamma function in the dark part (the first half of the signal range) and switches to a logarithmic curve in the bright part (the second half), hence the name. Like the gamma curve, it is a relative function, which adapts to the maximum value of the display and therefore requires no metadata. HLG uses a bit depth of 10 bits, but can also use 12 bits. An important detail is that HLG, too, uses the Rec.2020 color space. This means the signal can only be displayed correctly on screens that support that color space. Concretely: a conventional SDR TV from a few years ago will not be able to display an HLG signal correctly. The compatibility with ‘SDR TVs’ really only applies to recent models that lack the peak luminance needed for HDR.
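That hybrid curve is defined in ITU-R BT.2100. Below is a direct Python transcription of the HLG OETF (the camera-side curve; the display-side EOTF is its inverse combined with a system gamma), showing the square-root segment handing over to the logarithmic segment at exactly half signal level:

```python
import math

# HLG OETF (ITU-R BT.2100): square-root (gamma-like) for the darker
# half of the signal range, logarithmic for the brighter half.
A = 0.17883277
B = 1 - 4 * A               # 0.28466892
C = 0.5 - A * math.log(4 * A)  # ~0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light e (0..1) to a normalized signal (0..1)."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # square-root segment
    return A * math.log(12 * e - B) + C   # logarithmic segment

# The two segments join smoothly at half signal level,
# and full scene light maps to full signal.
print(round(hlg_oetf(1 / 12), 3))  # 0.5
print(round(hlg_oetf(1.0), 3))     # 1.0
```

Because the output is relative (0..1) rather than absolute nits, a display simply scales it to its own maximum brightness, which is why no metadata is needed.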
HDR10+
To address the shortcomings of HDR10, and to provide an answer to Dolby Vision, Samsung developed HDR10+. The standard has existed for several years, but was put in the spotlight at IFA 2017 thanks to a collaboration between Samsung, Panasonic and 20th Century Fox. HDR10+ carries no license fees (unlike Dolby Vision), an asset with which the partners hope for quick adoption in the market. For now there is little news about HDR10+ support. Besides Panasonic and Samsung, only Philips has announced that its 2018 models will support the standard; several 2017 models should also receive it via a firmware update. On the content side there is, besides 20th Century Fox, only an announcement from Amazon Video. Samsung and Panasonic are working to get the standard included in the Ultra HD Blu-ray specification. For more HDR10+ news, we will mainly have to wait for CES 2018.
HDR10+ builds on HDR10, as the name suggests. Concretely, HDR10+ delivers the HDR10 base, supplemented with dynamic metadata (just like Dolby Vision), and also provides suggestions on how a television should use that metadata to adapt the image to its capabilities. Another advantage of HDR10+ is that it is backwards compatible with HDR10: a device that only knows HDR10 will simply ignore the additional metadata and show the HDR10 version.
Technicolor / Philips
Technicolor and Philips each had a separate proposal for high dynamic range, but they have joined forces. The result is SL-HDR1. The standard has the advantage that it works with a single layer (hence Single Layer, SL) that can be displayed on existing SDR TVs as well as new HDR TVs. The base layer is a classic SDR-compatible video stream (8- or 10-bit depth), accompanied by metadata. SDR TVs ignore that metadata and show the original SDR image; HDR TVs use it to reconstruct the HDR version in a post-processing step. SL-HDR1's backers claim it retains the original quality in both cases: the SDR image is perfectly comparable to current content, and the HDR version does not compromise on quality compared to, for example, an HDR10 version.
For the consumer, however, SL-HDR1 has very little to offer so far. LG has announced that it will support the standard on its 2017 models via a firmware update, but no content sources are known yet. Whether SL-HDR1 stands a chance is therefore a serious question.
The HDR battlefield is becoming increasingly complex. The strongest contenders today are HDR10 and Dolby Vision, which have already gained a foothold both in the TV market and on the content side. HLG, HDR10+ and SL-HDR1 will need a strong catch-up effort to close in on the two front runners. How this will play out in practice is anyone's guess. A format war is not inconceivable if some studios commit exclusively to one format and TV manufacturers do the same. On the other hand, it is perfectly possible that in the end several formats will continue to coexist. How long we have to wait for clarity is impossible to say. But things can move fast.