Before KDE 6.1, brightness changes (via keyboard shortcut or slider) would fade smoothly between levels, which felt really nice. After upgrading, on Wayland, brightness changes happen instantly with no transition. On X11, it still fades smoothly, so it seems Wayland-specific.
The underlying reason is that on Wayland, KWin now takes control of brightness animations. This allows it to do cool stuff like turn up the display brightness for HDR content while simultaneously reducing the brightness of SDR windows to keep them looking the same. But it also means that an animated change now involves multiple invocations of the privileged (and rather slow) KAuth brightness helper process, which limits the achievable frame rate of the animation.
I need to investigate how much it would help if we switch from invoking the KAuth helper to calling the SetBrightness D-Bus method provided by systemd-logind. That would certainly increase brightness animation frame rates; we'd have to see whether it's fast enough to make it as buttery smooth as it was before the KWin brightness rework.
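For reference, a minimal sketch of what that logind call could look like from a client, assuming dbus-python is available; the intel_backlight device name is a placeholder that depends on the hardware:

```python
import dbus

bus = dbus.SystemBus()
# "auto" resolves to the caller's own session; SetBrightness needs logind >= 243
session = bus.get_object("org.freedesktop.login1",
                         "/org/freedesktop/login1/session/auto")
iface = dbus.Interface(session, dbus_interface="org.freedesktop.login1.Session")

# SetBrightness(subsystem, name, brightness) takes raw sysfs units, not
# percentages, so a real caller would scale against max_brightness first
iface.SetBrightness("backlight", "intel_backlight", dbus.UInt32(24000))
```

Since logind authorizes the call based on the caller's session, this path should avoid spawning a separate privileged helper process for every animation step.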
In the long run, the Linux kernel will at some point expose brightness as a property on KWin's DRM handle (or was it KMS? I always get it wrong), so KWin will eventually be able to set brightness directly, without the delay caused by interacting with other background services.
The funny thing is that on my HDR-enabled monitor it still fades smoothly, or at least noticeably more smoothly than on the other two (SDR).
But at least brightness control now works with all my monitors; on Plasma 5 and early Plasma 6 I could not control the brightness of my external monitors in software at all.
Brightness via DDC/CI will no longer be animated, in order to minimize the risk of shortened monitor lifespans. The chance of this actually happening is hard to verify for anyone outside the monitor industry, but we'll follow the example of monitor manufacturers' own helper applets and apply newly adjusted monitor brightness values only after a half-second delay.
Lifespan in this case refers to the NVRAM write limits, which for some monitors, especially older ones, may be quite low.
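As a rough illustration, the half-second delay described above could be implemented as a debounce, where only the last requested value is written once input settles. This is just a sketch; apply_fn stands in for whatever actually performs the DDC/CI write:

```python
import threading

class DebouncedBrightness:
    """Write only the most recent value after a quiet period,
    limiting how often the monitor's NVRAM gets written."""

    def __init__(self, apply_fn, delay=0.5):
        self._apply_fn = apply_fn  # callable performing the real DDC/CI write
        self._delay = delay        # seconds to wait after the last request
        self._timer = None

    def request(self, value):
        # Each new request cancels the pending write and restarts the countdown
        if self._timer is not None:
            self._timer.cancel()
        self._timer = threading.Timer(self._delay, self._apply_fn, args=(value,))
        self._timer.start()
```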
I was assuming that this thread is about the internal laptop display, which is not controlled by DDC/CI, but instead by firmware and the Linux kernel's brightness interfaces, which are meant to change quickly enough to allow many steps in the brightness animation.
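For what it's worth, that kernel interface is the sysfs backlight class; a minimal sketch of a many-step fade through it might look like this (the intel_backlight name is hardware-dependent, e.g. amdgpu_bl1 on AMD, and writing usually requires root or a udev rule):

```python
import time
from pathlib import Path

# Device name varies per hardware; check /sys/class/backlight/ for yours
bl = Path("/sys/class/backlight/intel_backlight")
max_raw = int((bl / "max_brightness").read_text())
cur = int((bl / "brightness").read_text())
target = max_raw // 2  # fade to 50% as an example

# Many small raw-value writes approximate a smooth fade
steps = 20
for i in range(1, steps + 1):
    value = cur + (target - cur) * i // steps
    (bl / "brightness").write_text(str(value))
    time.sleep(0.01)  # ~200 ms total
```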
The internal one, yes. And even the external SDR monitor, when set not to use DDC/CI, transitions basically instantly, while the HDR one (probably also not using DDC/CI, I would guess) does it veeeerrry smoothly in comparison.
Edit: Just checked, the HDR one also has DDC/CI unchecked; I'm pretty sure that checkbox wasn't shown while HDR was enabled before. Not sure when I last checked these settings, though. But even with DDC/CI enabled on the HDR display, it still transitions just as smoothly (I did not reboot), strange.
Perhaps that is a function of the display's firmware: the display itself might be doing the transition. If you use the display's own controls to make a major change, does it also show a gradual transition? That would be a very nice feature. If so, maybe I should consider it when I'm next in the market; what brand and model is it?
I guess you could also do a major jump using ddcutil on the command line, and see if that transitions.
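If you want to try that, a small Python wrapper around the ddcutil command line would do it; feature code 10 is the standard MCCS brightness control, and the values and delay here are just examples:

```python
import subprocess
import time

def set_brightness(value: int) -> None:
    # VCP feature code 10 (hex) is the MCCS brightness control
    subprocess.run(["ddcutil", "setvcp", "10", str(value)], check=True)

# Jump between extremes and watch whether the panel fades or snaps
set_brightness(10)
time.sleep(3)
set_brightness(90)
```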
BenQ EX2710Q (bought mainly because of its front-facing speakers; with my older Asus monitor, which I still use as a secondary, the sound is louder and clearer in the room behind it than in front of the monitor).
ddcutil and Nvidia unfortunately never liked each other in the past; not sure whether I should give that another try.
If DDC/CI is not used, as with my Asus monitor (the settings box is unchecked) and with my internal display, a smooth transition should be possible, though? That is not what is happening.
Not sure what kind of display or settings @KyleDaCow uses.
Thanks, I'll make a note to check out BenQ next time I'm in the market.
Plasma 6 normally uses libddcutil to control external monitors. libddcutil is a relative of ddcutil, with a large amount of code shared between them. If the Plasma controls work, then ddcutil is also likely to work.
I used to have to follow the Special Nvidia Driver instructions at ddcutil.com to get Nvidia's i2c implementation to cooperate, but with recent drivers (550-580) these settings appear to be unnecessary (I have an RTX 3060 and a GTX 1650 Super).
I'm not sure about the situation with internal panels. Maybe the same wear-reducing decision was also applied to them, either deliberately or by accident. Perhaps @KyleDaCow could ask in the other thread or contact the developers directly.