Hi,
Let me start by saying that I have proper tools to measure my display. That said, I'll admit I'm not entirely sure whether the following negatively affects HDR content. I've only recently started paying attention to HDR, mainly because of NVIDIA: only with recent proton-ge releases and mpv has it become easy to use HDR properly.
In SDR, my display is calibrated so that displaying a 255,255,255 patch (e.g., using KolourPaint) gives me around 100 nits. When I switch HDR on, I have to set the second “100%” slider in the HDR calibration wizard to 103 to achieve the same ~100 nits using the same 255,255,255 patch. This way, SDR and HDR match in terms of 100% SDR luminance. The goal is to leave HDR always on while maintaining the same “desktop luminance”.
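To make the matching step concrete, here is a minimal Python sketch of the proportional correction I'm doing by hand. It assumes the relationship between the slider value and the measured SDR white luminance is roughly linear near the target, which is my own assumption based on my measurements, not something the wizard documents:

```python
# Sketch of the slider correction, assuming (my assumption, not documented)
# that measured SDR white luminance scales roughly linearly with the slider.

def corrected_slider(target_nits: float, slider_setting: float, measured_nits: float) -> float:
    """Return the slider value that should yield `target_nits`, given that
    `slider_setting` actually produced `measured_nits` on the meter."""
    return target_nits * slider_setting / measured_nits

# On my panel: with the slider at 103, the 255,255,255 patch measures ~100 nits,
# so 103 is the value that matches my 100-nit SDR calibration.
print(corrected_slider(target_nits=100.0, slider_setting=103.0, measured_nits=100.0))  # -> 103.0

# Hypothetical example: if the slider at 100 had measured 97 nits instead:
print(corrected_slider(target_nits=100.0, slider_setting=100.0, measured_nits=97.0))  # -> ~103.1
```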
Is my thinking flawed, @Zamundaaa?
If it isn't, then in my opinion the second calibration image could include a white square in the center where you can point your colorimeter/spectrophotometer to directly measure and set your SDR "100%" brightness value. Something like the image below, perhaps.
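For anyone who wants to reproduce such a patch locally, here is a quick Pillow sketch that generates the kind of test image I have in mind. The resolution, patch size, and background gray are arbitrary placeholders of mine, and this is of course not the wizard's actual image:

```python
# Generate a test image with a pure 255,255,255 square centered on a neutral
# background, to aim the meter at. All dimensions are placeholder values.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1920, 1080   # placeholder resolution
PATCH = 300                  # side length of the white patch, in pixels

img = Image.new("RGB", (WIDTH, HEIGHT), (64, 64, 64))  # neutral dark background
draw = ImageDraw.Draw(img)
left = (WIDTH - PATCH) // 2
top = (HEIGHT - PATCH) // 2
draw.rectangle([left, top, left + PATCH, top + PATCH], fill=(255, 255, 255))
img.save("sdr_white_patch.png")
```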