HDR calibration tool - 6.4

Hi,
Let me start by saying that I have access to proper tools to measure my display. However, I’ll admit I’m not entirely sure whether the following will negatively affect HDR content or not. I’ve only recently started paying attention to HDR, mainly because of NVIDIA—only now has it become easy to use HDR properly with recent proton-ge releases or mpv.

In SDR, my display is calibrated so that displaying a 255,255,255 patch (e.g., using KolourPaint) gives me around 100 nits. When I switch HDR on, I have to set the second “100%” slider in the HDR calibration wizard to 103 to achieve the same ~100 nits using the same 255,255,255 patch. This way, SDR and HDR match in terms of 100% SDR luminance. The goal is to leave HDR always on while maintaining the same “desktop luminance”.
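
To spell out the arithmetic I am assuming (a minimal sketch; I do not know KWin's actual pipeline, so both the gamma-2.2 decode and the white mapping are assumptions on my part):

```python
# Minimal sketch of my assumption: the compositor shows full SDR white at the
# "100%" slider value (in nits) and decodes SDR with plain gamma 2.2. Neither
# is confirmed KWin behavior; this only illustrates the arithmetic.

def sdr_pixel_to_nits(pixel: int, sdr_white_nits: float = 103.0) -> float:
    """Luminance an 8-bit grayscale SDR pixel would reach in HDR mode."""
    linear = (pixel / 255.0) ** 2.2  # gamma-2.2 decode to linear light
    return sdr_white_nits * linear

print(sdr_pixel_to_nits(255))  # 103.0 -> measures ~100 nits on my panel
print(sdr_pixel_to_nits(188))  # a dimmer gray, for comparison
```
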
Is my thinking flawed @Zamundaaa?

If it is not, then in my opinion the second calibration image could include a white square in the center where you can point your colorimeter/spectro to directly measure and set your SDR “100%” brightness value. Something like the image below, perhaps.


No, that’s exactly right.

That’s not a bad idea, but I want to include such functionality in Xaver Hugl / DisplayProfiler · GitLab instead - which will also be linked from display settings, and which could, among other things, automatically adjust the brightness sliders to match the values you’re targeting. The app doesn’t have HDR support yet, but I intend to fix that sooner rather than later :slight_smile:


@Zamundaaa The second slider for the SDR “100%” brightness value affects MPV video playback. I’m not sure whether setting the second slider to 103 to match real SDR and HDR (SDR) desktop luminance levels also reduces the overall brightness of HDR movies. This part is confusing, and without any reference, I can’t tell whether HDR movies are being reproduced correctly or not.
Care to chime in?

Thank you

There is no “correct”.

Make the screen as bright or dark as you want it to be. That’s what brightness sliders are there for.

@Zamundaaa When it comes to movie consumption in a reference environment, such as the one I have carefully built over the years, that is precisely what calibrating a display is for. Having invested considerable time ensuring my display conforms to both REC.709 and ST.2084 (BT.2020), I would prefer to bypass any brightness controls that could alter or compromise playback accuracy. If an HDR scene is intended to output 145 nits APL, I want my display to reproduce exactly 145 nits. In SDR this presents no issue, but in HDR I find myself somewhat lost at the moment. Ideally, I would like my HDR desktop to match my SDR setup. That is where the “SDR slider” comes in, but without altering movies in any way, similar to the concept of audio bitstreaming.

@Zamundaaa

Ok. I got myself a Raspberry Pi and I’ve been testing HDR output using LibreELEC/Kodi, so I finally have a reference, and I’d like to report what I’ve observed.

When using measurement equipment with standardized HDR test patterns, I found that for MPV on the KDE desktop in HDR mode to visually match what the Raspberry Pi outputs, the SDR slider in the HDR calibration tool has to be set to around 203 (from what I’ve read, this is expected).

However, this introduces a significant problem: anything else becomes extremely bright. For instance, a pure SDR white value of 255,255,255 results in about 190 nits, which is uncomfortably bright in a completely dark room. More importantly, SDR content is rendered incorrectly under these conditions.

From a technical standpoint, the SDR setting should not affect HDR output in any way. Currently, it seems impossible to have both desktop/SDR content and HDR content produce accurate luminance levels when HDR is enabled. It’s either one or the other.

Could this please be reviewed? I stumbled upon this very topic in “KDE bugs,” and many people expressed the same findings.

Thank you for your time and support.

Rec.709 is about cameras, not about displays. ST.2084 doesn’t actually define a viewing environment either… What you probably wanted to refer to is BT.1886 and BT.2408, but these are different and conflicting standards.

Indeed. With SDR, everyone agrees that the concept of displaying “absolute nits” is stupid. It is the same level of stupid with HDR.

From a technical standpoint there is a reference luminance, and that applies to all content, no matter whether you classify the content as “HDR” or “SDR”.

Quite the opposite.

Your room either matches the BT.2408 recommendation, meaning that 203 cd/m² SDR content and “unmodified” HDR content are comfortable to look at.

Or it matches BT.1886, meaning that 100 cd/m² SDR content is comfortable to look at, but “unmodified” HDR content is too bright.

To make it possible to view a range of content without having to constantly rearrange your room, all modern systems with good HDR support adjust the brightness levels of all kinds of content to your brightness setting. KWin is no exception there.
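
Roughly, the idea looks like this (a deliberately oversimplified sketch; real compositors, KWin included, also tone-map highlights the display cannot reach):

```python
# Oversimplified sketch of "adjust all content to the brightness setting":
# scale linear-light luminance so each content type's reference white lands
# on the user's chosen reference luminance.

def adjust(content_nits: float, content_ref_white: float,
           target_ref_white: float) -> float:
    return content_nits * (target_ref_white / content_ref_white)

# With a 140-nit brightness setting, SDR white (100 nits in a BT.1886 room)
# and HDR reference white (203 nits per BT.2408) land on the same level:
print(adjust(100.0, 100.0, 140.0))   # SDR white     -> 140.0
print(adjust(203.0, 203.0, 140.0))   # HDR ref white -> 140.0
print(adjust(1000.0, 203.0, 140.0))  # HDR highlight scales proportionally
```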

Rec.709 is about cameras, not about displays.

Partly true, partly misleading.

ITU-R BT.709 (Rec.709) is the HDTV system standard. It covers not just the camera encoding (OETF) but also display characteristics: color primaries, the D65 white point, and originally an assumed CRT-style EOTF. So Rec.709 isn’t just about cameras; it defines the whole HDTV ecosystem, cameras and displays.
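
For reference, the camera-side encoding Rec.709 specifies is a short formula (scene-linear light in, non-linear signal out):

```python
def bt709_oetf(L: float) -> float:
    """Rec.709 OETF: scene-linear light L (0..1) -> non-linear signal (0..1)."""
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

print(bt709_oetf(0.18))  # ~0.409: 18% gray encodes to roughly 41% signal
```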

ST.2084 doesn’t actually define a viewing environment either…

Correct, but that’s not the point.

SMPTE ST.2084 (PQ) defines how digital signal values map to absolute luminance (up to 10,000 nits). It deliberately doesn’t prescribe the viewing environment; that comes from related standards (e.g. BT.2100, BT.2390).
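
The PQ curve itself is compact enough to state directly; the constants are taken from ST.2084:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST.2084 (PQ) EOTF: non-linear signal (0..1) -> absolute cd/m²."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(1.0))     # 10000.0: PQ full scale
print(pq_eotf(0.5806))  # ~203: the BT.2408 HDR reference white (~58% PQ)
```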

Where things go wrong is when an OS or application interferes with PQ. For example, KDE’s SDR slider actually alters how HDR content is mapped, which means the signal isn’t displayed per ST.2084 anymore. That’s why users keep asking for a true HDR passthrough mode that lets the PQ signal hit the screen unmodified. If users want to dim the display, they can use the normal brightness control, but HDR content itself shouldn’t be remapped.

What you probably wanted to refer to is BT.1886 and BT.2408…

Important distinction here.

BT.1886 doesn’t define a color space; it defines the reference EOTF for Rec.709 content on modern displays. If your display has perfect blacks (like OLED), BT.1886 is essentially equivalent to Rec.709 with gamma 2.4.
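
As a sketch, the BT.1886 EOTF parameterized by the display’s white and black levels; with Lb = 0 it collapses to pure gamma 2.4:

```python
def bt1886_eotf(V: float, Lw: float = 100.0, Lb: float = 0.0) -> float:
    """BT.1886 reference EOTF: signal V (0..1) -> cd/m², for a display with
    white level Lw and black level Lb."""
    a = (Lw ** (1 / 2.4) - Lb ** (1 / 2.4)) ** 2.4
    b = Lb ** (1 / 2.4) / (Lw ** (1 / 2.4) - Lb ** (1 / 2.4))
    return a * max(V + b, 0.0) ** 2.4

print(bt1886_eotf(1.0))  # 100.0: the white level
print(bt1886_eotf(0.5))  # ~18.9: pure gamma 2.4 when Lb == 0 (OLED-like)
```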

ST.2084 (PQ) does something similar for HDR: it defines an absolute-luminance EOTF, assuming perfect blacks.

BT.2408 is different: it gives operational guidelines for handling HDR signals (PQ/HLG), e.g. how to map them to real-world displays with limited brightness.

So you don’t “choose BT.1886” as a calibration color space. You calibrate in Rec.709 primaries with BT.1886 as the EOTF.

Similarly, you don’t “choose ST.2084” as a color space either. You calibrate in Rec.2020 primaries (for HDR10 / Dolby Vision / HDR workflows) with ST.2084 (PQ) as the EOTF.

But going back to the environment, @Zamundaaa, have a look at the following link (I can’t post links):

lightillusion . com / viewing_environment.html

It does not define the display or viewing environment.

Every image encoding standard defines that. PQ is not special.

That is a meaningless statement. Just like presenting sRGB at 80 cd/m² outside of the sRGB reference viewing environment is wrong, presenting content following BT.2408 at 203 cd/m² outside of its viewing environment is just plain nonsense.

Trying to present what I wrote as “actschually technically incorrect” does not help your argument. You were complaining about luminance, not about primaries.

Linking to a website summarizing that different video standards do in fact require different viewing environments doesn’t help either.

@Zamundaaa Mate, it doesn’t matter anymore. If you still think KDE’s handling of HDR is correct, then for you it is. But for anyone else reading this: just play the same HDR file on a recent LG OLED’s built-in player, or on a Raspberry Pi 4 with LibreELEC/Kodi, using the Filmmaker Mode picture profile on the LG to keep things as simple as possible. Then do the same on KDE while adjusting the HDR “100%” slider. The fact that the “100%” brightness slider in the HDR calibration tool alters HDR content is proof enough that KDE’s HDR implementation is… “““broken”””. I understand the idea (I think) behind the “100%” control, but it’s useless until it stops affecting actual HDR content. For that, we already have the system Brightness control.

Honestly, I will keep HDR enabled on KDE for gaming only. I’m almost with you on this one: I don’t really care as much about how a game presents HDR, because it’s almost impossible to know the exact conditions or errors in the game’s HDR pipeline. Since I run my desktop with the HDR “100%” slider at 103 (as stated before, I don’t want to be blinded, and I want 100% SDR to be 100 nits), I already know my games’ HDR presentation will be darker, with much less punchy highlights. Meh.

But for movies? Absolutely not. They need to be reproduced as close to the mastering intent as possible.

And honestly, the solution is simple. Is there a straightforward way to toggle HDR automatically when a game or fullscreen application starts? For example, add an HDR on/off parameter to “Window Rules” that switches HDR on when the program is running and off when it is not. This seems like a really straightforward and elegant solution, in my opinion.
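
In the meantime, a workaround along these lines might do (a hypothetical sketch, not an existing KDE feature; it assumes a Plasma 6 session where kscreen-doctor exposes an hdr.enable/disable setting, and the output name is a placeholder):

```python
# Hypothetical wrapper: enable HDR only while a game runs. Assumes a Plasma 6
# session where kscreen-doctor supports output.<name>.hdr.enable/disable;
# check `kscreen-doctor -o` for your actual output name.
import subprocess
import sys

OUTPUT = "DP-1"  # placeholder output name

def set_hdr(enabled: bool) -> None:
    verb = "enable" if enabled else "disable"
    subprocess.run(["kscreen-doctor", f"output.{OUTPUT}.hdr.{verb}"], check=True)

if __name__ == "__main__":
    if len(sys.argv) < 2:
        sys.exit("usage: hdr-wrap.py <command> [args...]")
    set_hdr(True)
    try:
        subprocess.run(sys.argv[1:], check=False)  # the game command line
    finally:
        set_hdr(False)
```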

Chromium follows the same behavior, where SDR brightness affects HDR output brightness, under Windows: Chromium

Btw, for movie playback, the user can manually choose whether to let the brightness be altered by the SDR brightness setting when using MPV: mpv not respecting Windows 11 HDR settings? · mpv-player/mpv · Discussion #16814 · GitHub

I gave up on trying to play HDR videos on my computer, at least for now, after seeing how they’re supposed to look with a dedicated standalone player that simply passes the data through untouched.

I also prefer that HDR content not be touched. I switch off the lights when I watch HDR and want an unmodified image passed through to my TV without changing any settings. At the same time, I do not want SDR desktop applications blinding me with 203 nits. In my opinion, the 203-nit diffuse white is intended to be used within the context of HDR content, not to indiscriminately stuff blazing-white SDR applications into an HDR container. This is why Windows has the “SDR Content Brightness” slider.

Windows has different modes of operation for external monitors and built-in displays.

On external monitors, there is no global brightness slider. HDR content is always passed through untouched. When watching HDR test patterns in Edge, the content correctly clips at the peak brightness of the display. There is only one slider, “SDR Content Brightness”, which controls the white level of SDR applications within the HDR container. This lets you set SDR to a comfortable level, like 120 nits (10 on the slider), with no effect on HDR content.

On built-in displays, there is a global brightness slider. Setting the global brightness slider below 100 dims both SDR and HDR content. The “SDR Content Brightness” slider is replaced with an “HDR Content Brightness” slider. Setting the “HDR Content Brightness” slider below 100 tone-maps HDR content with no effect on SDR content. To view HDR content accurately, both sliders must be set to 100. When watching HDR test patterns in Edge, the content correctly clips at the peak brightness of the display. I would like to measure the SDR brightness at this point to see what the value is; I wonder if it’s 203 nits according to the standard, or 480 nits like when the “SDR Content Brightness” slider is set to 100 on an external monitor. If either slider is not set to 100, test patterns in Edge do not clip at the peak brightness of the display. To view SDR content comfortably, the global brightness slider must be reduced as needed.
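
For what it’s worth, the commonly reported mapping for the “SDR Content Brightness” slider is linear from the 80-nit sRGB reference; this is not officially documented, so treat it as an approximation:

```python
def windows_sdr_white_nits(slider: int) -> float:
    """Commonly reported mapping of Windows' "SDR Content Brightness" slider
    to SDR reference white; not officially documented, an approximation."""
    return 80.0 + 4.0 * slider

print(windows_sdr_white_nits(10))   # 120.0 nits, the comfortable level above
print(windows_sdr_white_nits(100))  # 480.0 nits, the external-monitor maximum
```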

On KDE, the only way I could get test patterns in Edge to clip at the peak brightness of the display was to run the HDR calibration and set both “maximum brightness” and “100% normal brightness” to the same value, i.e. the peak brightness of the display. The global brightness slider must be set to 100 for HDR to display accurately. If the global brightness slider is not set to 100, test patterns in Edge do not clip at the peak brightness of the display. To view SDR content comfortably, the global brightness slider must be reduced as needed.

This does not seem to be the case anymore. The “SDR Content Brightness” slider has no effect on HDR videos playing in Chromium on Windows. It affects only the Chromium UI around the video.

@Monstieur If Chromium has different behavior on different platforms, I think this is some kind of implementation error in Chromium, and you’d better report the issue. Previously, Chromium used an scRGB framebuffer for the whole window, so the compositor couldn’t decide which part of the content was actually HDR; whether the HDR content brightness was adjusted was completely decided by Chromium. If they want to implement the same behavior on Wayland, that is decided by their code as well.

Edit: you can also find some related discussion here: [link]

I’m sorry to say this, but I think the current calibration tool is very bad.

I much prefer the Windows 11 tool, because it is clear, with no guessing.

It shows light areas, dark areas, and peak brightness, and it shows clear comparisons with a before-and-after result.

I can’t link a video of the calibration process, but you could do something similar with a KDE logo instead, I guess.

I got myself an AMD 9070 XT and took another look at HDR. It’s still a mess. Compared to my external movie player, it feels like guesswork, and the desktop becomes excessively bright if I try to match things.