Hi,
Posting this here before I file a bug report in case I am being stupid.
I have two monitors, a 2k (2560x1440) primary monitor that I keep at 125% scaling (DP-1), and a 1080p monitor that I keep at 100% scaling (DP-2).
When I have “Legacy applications (X11)” set to “Apply scaling themselves”, X11 reads the correct resolution for the primary, but my 1080p screen is scaled to 2400x1350, which is exactly 1920x1080 × 1.25 (my primary’s scale factor):
> xrandr | grep connected
DP-1 connected primary 2560x1440+0+0 (normal left inverted right x axis y axis) 597mm x 336mm
DP-2 connected 2400x1350+2560+90 (normal left inverted right x axis y axis) 598mm x 336mm
This makes X11 apps look crisp on my primary monitor, but blurry on my secondary.
If I change to “Scaled by the system”, the opposite (and expected) behavior occurs: the second (unscaled) monitor appears as proper 1080p, while the scaled primary reports the auto-scaled logical resolution (2048x1152, i.e. 2560x1440 ÷ 1.25):
> xrandr | grep connected
DP-1 connected primary 2048x1152+0+0 (normal left inverted right x axis y axis) 597mm x 336mm
DP-2 connected 1920x1080+2048+72 (normal left inverted right x axis y axis) 598mm x 336mm
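For what it’s worth, the numbers in both outputs line up exactly with my primary’s 1.25 scale factor being applied or removed (quick shell sanity check; 1.25 is my configured scaling, done here in integer arithmetic as 125/100):

```shell
# "Apply scaling themselves": DP-2's native 1920x1080 multiplied by 1.25
echo "$(( 1920 * 125 / 100 ))x$(( 1080 * 125 / 100 ))"  # prints 2400x1350

# "Scaled by the system": DP-1's native 2560x1440 divided by 1.25
echo "$(( 2560 * 100 / 125 ))x$(( 1440 * 100 / 125 ))"  # prints 2048x1152
```

So in the first mode the secondary is being multiplied by the primary’s factor, and in the second mode only the primary is divided by it.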
While “Scaled by the system” seems to be working as intended, I don’t understand why “Apply scaling themselves” is rescaling my second monitor. My understanding was that “Apply scaling themselves” is supposed to pass each monitor’s native resolution through to X11 unchanged. Did I misunderstand how this is supposed to work, or is this a bug? Is there some way to fix this?
My system information:
Arch Linux 6.18.2-zen2-1-zen
KDE Plasma 6.5.4
KDE Frameworks 6.21.0
Qt 6.10.1
