While searching for a solution to the stuttering and low FPS issues I’ve been having with KDE Plasma on Wayland with an Nvidia GPU, I came across an interesting comment. The person mentioned that running OBS somehow makes the stutters disappear. To my surprise, this actually worked!
When I maximize Firefox without OBS, I see stutters, but when OBS is open, the experience becomes much smoother. There are still occasional stutters here and there, but overall the performance is much more stable.
Has anyone else experienced this, or can explain why OBS might have this effect?
Thanks in advance!
OS: Arch Linux
Kernel: 6.12.3-arch1-1
GPU: NVIDIA GeForce RTX 4070 SUPER
Display: 2560x1440 @ 170Hz
CPU: AMD Ryzen 5 3600
KDE Plasma: 6.2.4
Hi - a different spin on that question, perhaps…if you close Firefox, do you still notice stuttering? I’ve noticed fairly consistently that everything on my system is just a little bit janky when Firefox is running, for some reason.
That issue in particular hasn’t been an enormous deal for me with my system, but for folks who are tracking FPS carefully, it might stick out more?
It might also be worth double-checking whether the Linux 6.12 kernel on Arch is involved somehow - the detail about screen sharing or recording improving performance reminded me of this thread, which was eventually narrowed down to something in that kernel version: Skipping frames with external monitor plugged in - #26 by willard
Yeah, closing Firefox helps a lot, but it’s still far from perfect. I have tried the LTS kernel, but nothing changed. I also tried using KDE on Arch, Nobara, and Fedora, and they all stutter.
Yes, I tried every possible option with VRR and didn’t notice any changes. Disabling GSP also did nothing. I tried both open and closed drivers, but there were no changes.
However, when I tried KDE Neon with Nouveau drivers installed, the desktop was surprisingly very smooth.
For now, I switched to GNOME, and it works quite smoothly. However, there are still occasional stutters, though they’re rare. I’m not sure if the problem lies with KDE Plasma itself or if it’s related to the NVIDIA drivers and Wayland.
I am having similar issues. I have a laptop with Intel graphics and a desktop with an AMD CPU and a discrete NVIDIA GPU. Running OBS (or, what also works for me, just playing a YouTube video) smooths everything out so much that the experience is almost as good as on my laptop. Also, closing the YouTube video doesn’t bring the stutters back immediately; it takes a few seconds for them to reappear.
It feels like the GPU is wrongly being put into some sort of power-saving mode that limits it to 30 FPS or something like it. I could imagine that the moment you start OBS or a YouTube video, the card leaves this power state and runs faster, and thus the KDE desktop is almost stutter-free.
So yes, I think this is a driver thing. On GNOME I get the same stutters, but they aren’t as apparent as on KDE.
I am starting to feel more and more that I am onto something here.
I can now consistently reproduce this in just a few seconds, no need for videos or OBS. I can open KDE’s “Overview” to get perfect performance for a few seconds at a time.
I am on Arch with Kernel 6.12, KDE 6.2 using nvidia-open.
I can reproduce this with the following steps:
1. Open a browser and go to a long text page to test scrolling performance. You can also just watch the mouse cursor; it should be easy to see when it starts lagging.
2. Open a terminal and run nvtop.
3. Observe that, while idling, it shows PCIe GEN 1@16x.
4. Scroll the text page and note that it has horrible performance and lags.
5. Press META+W to open KDE’s Overview.
6. Observe the GPU jumping to PCIe GEN 4@16x immediately. Also observe that the transitions in and out of the Overview are buttery smooth.
7. For a few seconds, the GPU will stay in PCIe Gen 4.
8. Observe that browser scrolling is now buttery smooth, as is opening and closing the Application Launcher, the System Tray, and just about everything else in KDE.
9. After about 5 seconds, the GPU drops to PCIe Gen 2, then a second or two later back to Gen 1.
10. Now the performance is horrible again until you press META+W again.
Can you guys reproduce this?
Obviously the PCIe generation in use is just an indicator of the GPU’s power state. It directly correlates with the GPU’s reported core and memory clock speeds; the performance jump is just easier to spot that way.
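If you’d rather watch the same numbers from a plain terminal, nvidia-smi can poll them directly; this is just another way of seeing what nvtop shows (the query fields below are standard nvidia-smi fields):

```
# Poll the performance state, core/memory clocks and the current PCIe link
# generation once per second. P0 is the highest-performance state; higher
# numbers (e.g. P8) mean deeper power saving, and the PCIe generation drops
# together with them.
nvidia-smi --query-gpu=pstate,clocks.gr,clocks.mem,pcie.link.gen.current,pcie.link.width.current \
           --format=csv -l 1
```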
Hey, thanks for this post - I found it while debugging my PC with an Nvidia GPU on Arch + KDE, after seeing how smooth Plasma is on the Radeon GPU I installed in my other PC.
Caveat: I am using proprietary Nvidia drivers, not nvidia-open.
I was able to get exactly the same result as you using nvtop. The power level is also displayed in the Nvidia Settings GUI app (in the PowerMizer section), and it matches what I see in nvtop.
I was able to (at least partially) fix this by applying the kernel module option in /etc/modprobe.d/nvidia.conf. I found it on the ArchWiki in “NVIDIA/Tips and tricks”, section 8.2 (“Setting static 2D/3D clocks”) - I cannot link it here.
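For reference, and only as a sketch (take the exact registry keys and values from that wiki section, not from here), the wiki sets this kind of thing through the driver’s NVreg_RegistryDwords module parameter, so the config file ends up looking roughly like this:

```
# /etc/modprobe.d/nvidia.conf
# Illustrative sketch only: copy the exact keys/values from the ArchWiki
# section mentioned above. These registry keys pin the PowerMizer level
# instead of letting the driver clock the card down at idle.
options nvidia NVreg_RegistryDwords="PowerMizerEnable=0x1; PerfLevelSrc=0x2222; PowerMizerDefault=0x3; PowerMizerDefaultAC=0x3"
```

Note that module options only take effect after the nvidia modules are reloaded; on Arch, if the modules are loaded from the initramfs, that means regenerating it (mkinitcpio -P) and rebooting.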
Now in nvtop I always see GEN4@16x and in PowerMizer it’s level 4.
It still drops some frames and does not keep 144 FPS at all times, but it’s much better than before. I guess that’s all I can expect from Nvidia on Linux as of now.
Hope it helps.
For what it’s worth, here’s my working theory - based on 90% speculation
With all the focus on high-end performance, maybe some of these modern big rig desktop graphics cards - built and benchmarked for cranking out 9,000 FPS for hours in a row on Call of Duty - might not naturally perform as well at the basic desktop computing demands of “chill for 30 minutes, boost up quickly to render this menu, then go back to chilling.” Perhaps they need more assistance from the software and drivers in being told when and how to ramp up, and the NVIDIA drivers don’t have that worked out as well on Wayland (yet)?
The anecdote that makes me think that: across our family, I regularly use three devices that are all running up-to-date Fedora KDE 41 with a Wayland session - so the same versions of all system software:
9 year old gaming desktop - NVIDIA GTX 1060 card (old, soon-to-be unsupported)
6 year old gaming laptop - AMD+NVIDIA hybrid
New gaming desktop - NVIDIA RTX 4070 SUPER ULTRA MEGA PRO MAX (name not exact) from 1 year ago
The performance of each device at basic desktop interactions - moving windows around, menu animations, and all of that - has a pattern that surprised me:
9 year old desktop: 100% buttery smooth
6 year old laptop: 100% buttery smooth
New gaming desktop: Sometimes smooth, sometimes janky (even with GSP disabled)
That’s all speculation, but I am 99% sure that, as Nate said above, it’s all down to clock speeds. On the new gaming desktop, running the command sudo nvidia-smi --lock-gpu-clocks=1980,3105 to raise the minimum clock speed makes everything 100% buttery, just like the other less-powerful devices.
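If anyone wants to try that without it disappearing on reboot, here’s a minimal sketch of how it could be made persistent (the unit name is made up, and the two clock values are the ones I use on my 4070 SUPER - pick values that suit your card):

```
# /etc/systemd/system/nvidia-min-clock.service  (hypothetical unit name)
[Unit]
Description=Raise the minimum NVIDIA GPU clock to avoid desktop stutter

[Service]
Type=oneshot
# --lock-gpu-clocks takes <minClock,maxClock> in MHz and needs root.
ExecStart=/usr/bin/nvidia-smi --lock-gpu-clocks=1980,3105

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now nvidia-min-clock.service; running sudo nvidia-smi --reset-gpu-clocks undoes the lock and returns the card to its default clock management.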
The proposed workarounds visually fixed the issue for me, thanks!
However, FYI, as indicated by @ngraham, the idle power draw of my 4080 almost doubled. I don’t know how accurate the values in nvtop are, but a jump from 30 W to 60 W is surely something I’d keep in mind before running these workarounds long term.
Hi! Just in case, it might be worth double-checking what’s shown in nvidia-smi - for me, setting the minimum clock speed on a 4070 SUPER to 1980 MHz results in power usage during desktop computing of anywhere between 8 and 17 W. I might have a different workload for the card, though - only one monitor, and usually not a whole ton going on at once.
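In case it helps to compare apples to apples, the kind of readout I mean is nvidia-smi’s own, e.g.:

```
# Dump the driver's power readings (current draw, limits) for the card...
nvidia-smi -q -d POWER

# ...or log the draw and performance state once per second while the desktop idles.
nvidia-smi --query-gpu=power.draw,pstate --format=csv -l 1
```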
4070 Super here driving dual 1200p monitors, power usage at the desktop is 25W. I don’t have to lock my clocks, and oddly enough as an Nvidia user I don’t experience any desktop fps issues running Wayland.
Maybe I don’t have enough tasks routinely running, or enough total display pixels to push, to trigger my card into a high-enough power state naturally… or maybe I just got a “heavy sleeper” of a card.
Hi there and thanks for the input. After comparing the output from nvtop and nvidia-smi, I’m fairly certain the values come from the same source, as they are essentially identical.
I must admit it’s interesting to read the values you both are getting.
For some context:
My idle clock speed is at 210 MHz and the power draw is somewhere above 30 W. I’m running an MSI Suprim X 4080 connected to a single 55" 4K OLED 120 Hz TV.
Guess I shouldn’t wonder about the power draw.