High Dynamic Range is a big step forward in image quality, not only for movies and TV series but also for gaming. However, to ensure that everyone competes on equal footing, some rules must be observed. That is why the HDR Gaming Interest Group (HGIG) was established. What can you expect from it?
Who / What is HGIG?
Another acronym to remember. The HDR Gaming Interest Group (HGIG) is a group of companies that voluntarily collaborate on guidelines to enhance the HDR gaming experience. HGIG does not create standards; for that it relies on existing standardization organizations. HGIG guidelines are a kind of "best practices" for game developers, console makers and TV manufacturers.
Such a group can of course only have an impact if there is enough support from the industry, and that certainly seems to be the case. The complete list of members can be found on the HGIG website. Let's quote the main names.
Television: LG, Panasonic, Samsung, Sharp, Vizio, TCL, TPV.
Gaming hardware: Sony, Microsoft, AMD, NVIDIA
Gaming studios: Activision, Codemasters, Eidos, Epic Games, Ubisoft
What's the problem?
Gaming in HDR can only be a good thing, right? And aren't there already standards like HDR10? Why is there still a need for further guidelines? In a nutshell: because different televisions display the HDR signal in different ways. This obviously leads to a different gaming experience, but it can also give some players an unfair advantage.
Now for the slightly longer explanation. The standard we use for HDR images is ITU-R BT.2100. Among other things, it specifies the color gamut (ITU-R BT.2020) and the luminance range (up to 10,000 nits). Gaming images can make full use of this: every image is rendered by the computer or game console, so there are no physical limitations of cameras, sensors and lenses. That's great, but … our televisions can't display all of that. The top models currently sit around a peak luminance of 2,000 nits, and the color range is around 95% of P3 (which is roughly 70% of Rec. 2020). What does a television do with images that exceed its physical capabilities?
The answer is tone mapping. With this technique, the TV converts the incoming image into something that is within its physical capabilities (which is why we also call this display mapping), while of course retaining the original character of the image as much as possible. Specifically, suppose you want to display a gray gradient that goes from 0 to 10,000 nits, but your television can only display up to 1,000 nits. The TV will then convert that gradient to 0 to 1,000 nits. You think that's easy to do? Unfortunately, there is no single standard that defines exactly how this tone mapping should be done. Do you cut off all white nuances above 1,000 nits? In that case you lose a lot of white detail. Do you map the full source range to that of the screen? Then the whole image becomes darker. Or do you take some combination of the two?
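The three approaches can be sketched in a few lines of code. These are deliberately simplified curves to show the trade-offs, not any manufacturer's actual tone-mapping algorithm:

```python
# Illustrative tone-mapping strategies for a 1,000-nit display receiving
# a signal mastered up to 10,000 nits. Simplified sketches only.

DISPLAY_PEAK = 1000.0   # nits the display can actually show
SOURCE_PEAK = 10000.0   # nits the signal may contain

def clip(nits: float) -> float:
    """Hard clip: everything above the display peak is lost (white detail clips)."""
    return min(nits, DISPLAY_PEAK)

def scale(nits: float) -> float:
    """Linear scale: all gradations preserved, but the whole image gets darker."""
    return nits * DISPLAY_PEAK / SOURCE_PEAK

def knee(nits: float, knee_point: float = 700.0) -> float:
    """Compromise: pass through up to a knee point, compress the rest."""
    if nits <= knee_point:
        return nits
    # Compress the remaining source range into the headroom above the knee.
    headroom = DISPLAY_PEAK - knee_point
    excess = (nits - knee_point) / (SOURCE_PEAK - knee_point)
    return knee_point + excess * headroom

for n in (500, 1000, 5000, 10000):
    print(n, clip(n), round(scale(n), 1), round(knee(n), 1))
```

Note how `clip` keeps midtones intact but discards everything above 1,000 nits, while `scale` keeps everything but dims a 500-nit highlight to a mere 50 nits; the knee curve splits the difference.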
In practice, tone mapping is a combination of the manufacturer's own research, so that it makes optimal use of their screen technology, and a certain intended result (a kind of signature look). And that is where the problem lies.
Complications for gamers
Tone mapping is a step that happens on all HDR televisions, with most HDR content today. And yes, that occasionally leads to problems with films and series too: HDR content that is too dark, or one TV hiding some white detail. Although these negative effects are undesirable, it is not really a problem that one viewer sees a slightly different image than another. But for gamers, that can lead to complications. If one gamer doesn't see certain details, the game becomes more difficult for them. And if you play against each other, a player who does see those details has a competitive advantage.
These shots show the problem. The first photo shows what happens when the screen decides to clip white detail. The player does not see what comes after the end of the tunnel. The second photo shows a player who does see what comes after the end of the tunnel.
What should the solution take into account?
You could solve this in all sorts of ways, but HGIG has a number of conditions that must be respected.
- Games and hardware have to take into account that the HDR capabilities of each screen differ, and they have to make maximum use of those capabilities. Gameplay must be consistent regardless of screen capabilities. This is especially important for shadow detail (an enemy hiding in a dark corner) or white detail (the tunnel example above).
- The solution must be forward compatible. When you buy a new screen with better performance, your gaming experience should be at least as good or better. So if screens ever come on the market that can handle, for example, 4,000 nits, the game should make use of that.
- The solution should be practical and simple for developers and end users.
The solution that HGIG has devised is based on recommendations for each of the three components in the game chain.
- HDR screens: an HDR screen must report its capabilities (concrete details below) to the connected game console or computer, and via it to the game, so that the game can take them into account. It must also disable tone mapping within those capabilities!
- The game console: must query the capabilities of the connected display and pass them on to the game software. The console should also offer a simple calibration process that takes precedence over the information provided by the screen, in case the player wants to adjust their gaming experience (for example, in a lot of ambient light).
- The game developer: when rendering the game, must take the capabilities of the connected screen into account so that the game stays within the specified luminance values.
The main problem was that HDR screens can use very different tone mapping. That problem is solved by disabling tone mapping in Game mode. In return, the screen (via the game console) reports to the game what its capabilities are. This happens with three values:
- MaxFFTML (MaxFullFrameToneMapLuminance): This is the maximum luminance at which the screen can still show white detail on a full screen.
- MaxTML (MaxToneMapLuminance): This is the maximum luminance at which the screen can still show white detail, in a window that fills 10% of the screen surface.
- MinTML (MinToneMapLuminance): This is the minimum luminance at which the screen can still show black detail.
Based on those three parameters, two ranges are created:
- Primary HDR Range: the range between MinTML and MaxFFTML.
- Extended HDR Range: the range between MinTML and MaxTML.
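The relationship between the three reported values and the two ranges can be sketched directly; the example numbers below are hypothetical (they happen to match Category 2):

```python
# Sketch: deriving the two HDR ranges from the three values a display reports.
# Numbers are hypothetical example values, matching Category 2 below.
min_tml = 0.1       # MinTML: minimum luminance with visible black detail
max_fftml = 600.0   # MaxFFTML: max full-frame luminance with white detail
max_tml = 1000.0    # MaxTML: max luminance in a 10% window with white detail

primary_hdr_range = (min_tml, max_fftml)   # for large, gameplay-critical areas
extended_hdr_range = (min_tml, max_tml)    # for small bright highlights
```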
Alternatively, a screen that cannot communicate precise values can report that it belongs to one of four predefined categories.
- Category 1: MinTML = 0.1 nits; MaxFFTML = 600 nits; MaxTML = 600 nits.
- Category 2: MinTML = 0.1 nits; MaxFFTML = 600 nits; MaxTML = 1000 nits.
- Category 3: MinTML = 0.1 nits; MaxFFTML = 600 nits; MaxTML = 4000 nits.
- Category 4: MinTML = 0.0 nits; MaxFFTML = 600 nits; MaxTML = 10000 nits.
A game that supports HGIG must, while rendering its images, take into account the properties of the screen that are provided via the game console. Areas critical to gameplay must remain within the Primary HDR Range if they occupy a large area of the screen, or within the Extended HDR Range if they occupy a small area of the screen.
Games may handle this flexibly. If a game only exceptionally shows a very bright object across the entire screen, the developer can decide to still render within the Extended HDR Range and accept that some displays occasionally clip away a minimum of white detail. If it regularly shows very bright objects across the entire screen, it does not always have to stay within the Primary HDR Range; it can interpolate between the two ranges, for example based on histogram information of the image.
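One possible reading of that guidance in code: a hypothetical helper that picks a luminance ceiling for a bright object based on how much of the screen it covers, interpolating between MaxTML (small areas) and MaxFFTML (full screen). This is an illustration, not official HGIG sample code:

```python
# Hypothetical helper: choose a rendering ceiling (in nits) for a bright
# object based on its screen coverage. Interpolation scheme is illustrative.

def render_ceiling(area_fraction: float, max_fftml: float, max_tml: float) -> float:
    """Small areas may use the Extended HDR Range (MaxTML);
    full-screen highlights stay within the Primary HDR Range (MaxFFTML)."""
    if area_fraction <= 0.10:        # fits within the 10% measurement window
        return max_tml
    if area_fraction >= 1.0:         # full-screen highlight
        return max_fftml
    # Linearly interpolate between the two limits for in-between sizes.
    t = (area_fraction - 0.10) / 0.90
    return max_tml + t * (max_fftml - max_tml)
```

A small explosion could thus be rendered up to MaxTML, while a sun-filled sky is kept at MaxFFTML so no display has to clip it.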
The role of the game console is primarily to pass the screen information on to the game software. To do this, it has access to a database with information about all screens (information provided by the screen manufacturers). The game console identifies the connected screen and looks it up in the database. It then extracts either the exact values for this screen or the assigned category (again, this depends on what the manufacturer reports). If it cannot identify the screen, or the screen does not appear in the database, it uses the default values of Category 2.
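That lookup logic can be sketched as follows. The database contents and display IDs here are invented for illustration; the real database format is not public. The category values are those from the list above:

```python
# Sketch of the console's capability lookup with Category 2 as the fallback.
# Display IDs and database entries are invented for illustration.

CATEGORY_VALUES = {          # (MinTML, MaxFFTML, MaxTML) in nits
    1: (0.1, 600, 600),
    2: (0.1, 600, 1000),
    3: (0.1, 600, 4000),
    4: (0.0, 600, 10000),
}

DISPLAY_DB = {
    "ACME-X1": (0.05, 750, 1400),  # manufacturer reported exact values
    "ACME-Y2": 3,                  # manufacturer reported a category instead
}

def display_capabilities(display_id: str) -> tuple:
    """Return (MinTML, MaxFFTML, MaxTML) for the identified display."""
    entry = DISPLAY_DB.get(display_id)
    if entry is None:
        return CATEGORY_VALUES[2]      # unknown display: default to Category 2
    if isinstance(entry, int):
        return CATEGORY_VALUES[entry]  # category was reported, not exact values
    return entry                       # exact values were reported
```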
The game console has another function: it can display three calibration screens that allow the end user to set MinTML, MaxFFTML and MaxTML themselves. That choice always takes precedence. This lets users customize the gaming experience based on what they actually see, which is especially useful in a brightly lit environment.
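The precedence rule itself is simple and can be sketched like this (the field names and merge shape are illustrative, not taken from any console's API):

```python
# Sketch: user calibration always overrides the values reported by the
# display or found in the database. Field names are illustrative.

def effective_capabilities(reported: dict, calibration: dict) -> dict:
    """Start from the reported values; any value the user calibrated wins."""
    merged = dict(reported)
    merged.update({k: v for k, v in calibration.items() if v is not None})
    return merged

caps = effective_capabilities(
    {"MinTML": 0.1, "MaxFFTML": 600, "MaxTML": 1000},
    {"MinTML": None, "MaxFFTML": None, "MaxTML": 800},  # user lowered MaxTML
)
```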
HGIG's recommendations seem to us an excellent way to level the HDR playing field for everyone, while also enabling progress and adaptation. All that remains is the question of who actually supports HGIG. Although the list of participating companies seems very complete, there is currently (April 2020) little clear communication about it. Only LG offers it (on 2019 and 2020 models). Information about consoles and games is currently very scarce. Hopefully we can provide more information later.