Apps not using Nvidia GPU (Brave, Chrome, Vesktop) on KDE Neon Wayland

Hi,

I’ve recently noticed a drop in performance in Brave (and also in Chrome and Vesktop, my custom Discord client). The framerate feels much lower than before. For example, when I run the WebGL Aquarium test (https://webglsamples.org/aquarium/aquarium.html), I only get around 40-60 FPS (sometimes up to 70), whereas I used to get a steady 144 FPS, matching my 144Hz screen.

I checked my Nvidia GPU (RTX 4060 Laptop) and it works fine in general, but these apps don’t seem to be using it, even though hardware acceleration is enabled in all of them. In nvidia-smi I can see that Xorg and KWin are using the GPU, but not the browsers or Vesktop.

 ⚙ martin@martin-pulse17b13vfk  ~  nvidia-smi
Mon Aug 11 09:36:42 2025       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.169                Driver Version: 570.169        CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4060 ...    Off |   00000000:01:00.0  On |                  N/A |
| N/A   51C    P8              6W /  115W |     133MiB /   8188MiB |      0%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            2313      G   /usr/lib/xorg/Xorg                       42MiB |
|    0   N/A  N/A          713364      G   /usr/bin/kwin_wayland                     2MiB |
+-----------------------------------------------------------------------------------------+

Here’s my system info:

  • OS: KDE neon User Edition

  • KDE Plasma: 6.4.3

  • KDE Frameworks: 6.16.0

  • Qt: 6.9.1

  • Kernel: 6.14.0-24-generic (64-bit)

  • Graphics Platform: Wayland

  • CPU: 13th Gen Intel® Core™ i7-13700H

  • RAM: 32GB

  • GPUs: Nvidia GeForce RTX 4060 Laptop + Intel® Graphics

I’m running a triple-monitor setup:

  • Internal 1080p 144Hz screen

  • External 2K 144Hz (HDMI)

  • External 1080p 60Hz (HDMI via USB-C)

For scaling and stylus/tablet compatibility reasons, I also have custom Exec entries in my .desktop files:

Brave:

Exec=env __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia brave-browser %U --enable-features=UseOzonePlatform,WaylandWindowDecorations,VaapiVideoDecoder,VaapiIgnoreDriverChecks --ozone-platform=wayland --ozone-platform-hint=wayland --use-gl=angle --enable-wayland-ime --enable-zero-copy --disable-features=GlobalShortcutsPortal

Chrome:

Exec=env __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia /usr/bin/google-chrome-stable --enable-features=UseOzonePlatform,WaylandWindowDecorations,WebRTCPipeWireCapturer --ozone-platform=wayland --ozone-platform-hint=wayland --enable-wayland-ime --disable-features=GlobalShortcutsPortal %U
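For reference, the PRIME offload variables can be sanity-checked outside the browser (assuming glxinfo from the mesa-utils package is installed); with offload working, the renderer string should name the Nvidia GPU:

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"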

I’m not sure exactly when this started, but I suspect it might have been after a recent update. Does anyone know how I can force Brave, Chrome, and Vesktop to use my Nvidia GPU for full performance?

Thanks in advance for your help!

Up, please! Does no one have an idea?

I threw everything but the kitchen sink at it. Probably not all of it is necessary, but whatevs…

__GLX_VENDOR_LIBRARY_NAME=nvidia __NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only LIBVA_DRIVER_NAME=nvidia DRI_PRIME=pci-0000_03_00_0 google-chrome-stable --render-node-override=/dev/dri/renderD129

DRI_PRIME would need to be modified to match the pci address of your nvidia GPU.

render-node-override likewise for your own sysfs path /dev/dri/renderxxx.
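To find both values on your machine, something like this should do the trick (a quick sketch, assuming lspci from pciutils):

lspci | grep -i nvidia    # PCI bus address of the dGPU, e.g. 01:00.0 or 03:00.0
ls -l /dev/dri/by-path/   # maps pci-0000:xx:00.0-render to its renderD12x node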

Hope one of these works for you! 🤞

I usually just do prime-run <application>; just the prime-run “prefix”, nothing else needed, with my RTX 4070M.
I don’t use any of the mentioned applications and do this mainly with games, though.
It works for me with Firefox, for example, but only if another instance of Firefox is not already running, which may be important here as well.
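For what it’s worth, prime-run is typically just a tiny wrapper script (on Arch it comes with the nvidia-prime package), so it should be roughly equivalent to setting the same variables by hand, something along these lines:

#!/bin/bash
__NV_PRIME_RENDER_OFFLOAD=1 __VK_LAYER_NV_optimus=NVIDIA_only __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"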

Both of your solutions work and do allow Brave to use the Nvidia GPU; however, there’s a major issue: I’m running on Wayland, and without the following flags:

--enable-features=UseOzonePlatform,WaylandWindowDecorations,VaapiVideoDecoder,VaapiIgnoreDriverChecks --ozone-platform=wayland --ozone-platform-hint=wayland --use-gl=angle --enable-wayland-ime --enable-zero-copy --disable-features=GlobalShortcutsPortal

…Brave doesn’t behave correctly in my multi-monitor setup.

Each of my displays has its own scale factor, and only with those flags does the browser render properly across all of them. On X11, this kind of per-monitor scaling isn’t possible, which is one of the reasons I use Wayland in the first place.

Also, without the --enable-wayland-ime --disable-features=GlobalShortcutsPortal flags, my graphic tablet stylus doesn’t work correctly in Brave (input doesn’t register). So unfortunately, just using prime-run or switching to X11 isn’t really a viable workaround for me.

Any idea how I could keep using these Wayland-related flags while still offloading rendering to the Nvidia GPU?

Thanks again for your help!

Ok so, apparently with these flags it’s a bit better, but not as much as when using prime-run directly (likely because of the Wayland compatibility-related flags):

env __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __VK_LAYER_NV_optimus=NVIDIA_only LIBVA_DRIVER_NAME=nvidia DRI_PRIME=pci-0000_01_00_0 brave-browser %U --enable-features=UseOzonePlatform,WaylandWindowDecorations,VaapiVideoDecoder,VaapiIgnoreDriverChecks --ozone-platform=wayland --ozone-platform-hint=wayland --use-gl=angle --enable-accelerated-2d-canvas --enable-gpu-rasterization --enable-zero-copy --ignore-gpu-blocklist --enable-wayland-ime --disable-features=GlobalShortcutsPortal

(I added --enable-accelerated-2d-canvas --enable-gpu-rasterization --enable-zero-copy --ignore-gpu-blocklist)

If I add the --render-node-override=/dev/dri/renderD129 flag, which I found with:

 martin@martin-pulse17b13vfk  ~  ls /dev/dri/
by-path  card1  card2  renderD128  renderD129
 martin@martin-pulse17b13vfk  ~  ls -l /dev/dri/by-path/
total 0
lrwxrwxrwx 1 root root  8 Aug  17 04:14 pci-0000:00:02.0-card -> ../card1
lrwxrwxrwx 1 root root 13 Aug  17 04:14 pci-0000:00:02.0-render -> ../renderD128
lrwxrwxrwx 1 root root  8 Aug  17 04:14 pci-0000:01:00.0-card -> ../card2
lrwxrwxrwx 1 root root 13 Aug  17 04:14 pci-0000:01:00.0-render -> ../renderD129

…then WebGL is simply broken and doesn’t work at all.
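For what it’s worth, the mapping can also be double-checked through sysfs: each render node’s vendor ID identifies the GPU (0x10de = Nvidia, 0x8086 = Intel):

grep -H . /sys/class/drm/renderD*/device/vendor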

Here are the processes with prime-run and no other flags:

 martin@martin-pulse17b13vfk  ~  nvidia-smi
Wed Aug 20 12:21:47 2025       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.169                Driver Version: 570.169        CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4060 ...    Off |   00000000:01:00.0  On |                  N/A |
| N/A   54C    P0             12W /  115W |    1710MiB /   8188MiB |     21%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            2367      G   /usr/lib/xorg/Xorg                       42MiB |
|    0   N/A  N/A            3045      G   /usr/bin/kwin_wayland                     2MiB |
|    0   N/A  N/A            3729      G   /usr/bin/xwaylandvideobridge              2MiB |
|    0   N/A  N/A         1028813      G   /opt/brave.com/brave/brave                2MiB |
|    0   N/A  N/A         1028852      G   ...dc794e3457ca51991b7cd87e8b74e       1545MiB |
+-----------------------------------------------------------------------------------------+
 martin@martin-pulse17b13vfk  ~  gpu-procs
    PID PROCESS
1028813 /opt/brave.com/brave/brave
1028852 /opt/brave.com/brave/brave --type=gpu-process --crashpad-handler-pid=1028815 --enable-crash-reporter=354cae97-dbac-4877-a425-76f382c3bf5a, --change-stack-guard-on-fork=enable --gpu-preferences=UAAAAAAAAAAgAAAIAAAAAAAAAAAAAGAAAQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAYAAAAAAAAABgAAAAAAAAAAQAAAAAAAAAIAAAAAAAAAAgAAAAAAAAA --shared-files --metrics-shmem-handle=4,i,17876283654401505373,9405295780224856860,262144 --field-trial-handle=3,i,4988437677770479961,13469430681794142690,262144 --disable-features=EyeDropper --variations-seed-version=main@8507055a00cdc794e3457ca51991b7cd87e8b74e
   2367 /usr/lib/xorg/Xorg -dpi 0 -background none -seat seat0 vt2 -auth /run/sddm/xauth_uhttTs -noreset -displayfd 16
   3045 /usr/bin/kwin_wayland --wayland-fd 7 --socket wayland-0 --xwayland-fd 8 --xwayland-fd 9 --xwayland-display :1 --xwayland-xauthority /run/user/1000/xauth_AihDHR --xwayland
   3729 /usr/bin/xwaylandvideobridge
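(gpu-procs is just a small helper of mine; a minimal sketch of something equivalent, assuming fuser from the psmisc package, would be:)

for pid in $(fuser /dev/nvidia* 2>/dev/null); do ps -o pid=,args= -p "$pid"; done | sort -u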

And here are the processes without prime-run, using the flags I mentioned at the start of my message:

 martin@martin-pulse17b13vfk  ~  nvidia-smi
Wed Aug 20 12:23:05 2025       
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 570.169                Driver Version: 570.169        CUDA Version: 12.8     |
|-----------------------------------------+------------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
|                                         |                        |               MIG M. |
|=========================================+========================+======================|
|   0  NVIDIA GeForce RTX 4060 ...    Off |   00000000:01:00.0  On |                  N/A |
| N/A   56C    P0             19W /  115W |     216MiB /   8188MiB |      3%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+
                                                                                         
+-----------------------------------------------------------------------------------------+
| Processes:                                                                              |
|  GPU   GI   CI              PID   Type   Process name                        GPU Memory |
|        ID   ID                                                               Usage      |
|=========================================================================================|
|    0   N/A  N/A            2367      G   /usr/lib/xorg/Xorg                       42MiB |
|    0   N/A  N/A            3045      G   /usr/bin/kwin_wayland                     2MiB |
|    0   N/A  N/A            3729      G   /usr/bin/xwaylandvideobridge              2MiB |
|    0   N/A  N/A         1031249    C+G   ...eatures=GlobalShortcutsPortal         12MiB |
|    0   N/A  N/A         1031876    C+G   ...asma-browser-integration-host         12MiB |
+-----------------------------------------------------------------------------------------+
 martin@martin-pulse17b13vfk  ~  gpu-procs
    PID PROCESS
1031249 /opt/brave.com/brave/brave --enable-features=UseOzonePlatform,WaylandWindowDecorations,VaapiVideoDecoder,VaapiIgnoreDriverChecks --ozone-platform=wayland --ozone-platform-hint=wayland --use-gl=angle --enable-accelerated-2d-canvas --enable-gpu-rasterization --enable-zero-copy --ignore-gpu-blocklist --enable-wayland-ime --disable-features=GlobalShortcutsPortal
1031876 /usr/bin/plasma-browser-integration-host chrome-extension://cimiefiiaegbelhefglklhhakcgmhkai/
   2367 /usr/lib/xorg/Xorg -dpi 0 -background none -seat seat0 vt2 -auth /run/sddm/xauth_uhttTs -noreset -displayfd 16
   3045 /usr/bin/kwin_wayland --wayland-fd 7 --socket wayland-0 --xwayland-fd 8 --xwayland-fd 9 --xwayland-display :1 --xwayland-xauthority /run/user/1000/xauth_AihDHR --xwayland
   3729 /usr/bin/xwaylandvideobridge

On my system Chrome cannot use Wayland – it’s glitchy and unstable to the point of being completely unusable. I’ve tried every switch, lever, and tweak I can find. Tried Thorium as well – it was even worse.

Firefox, OTOH, doesn’t care. It’s happy to run Wayland native or under XWayland.

I suspect something is up with Chrome on Wayland in general because I’m reading reports from others that sound a lot like my own.

I hope this X11/Wayland mess will be fixed and behind us soon, because in 2025, and soon 2026, we can’t keep saying “Linux is better than Windows, just switch to it” with problems like this. People argue about gaming, but for me, as long as we still have issues like this on a computer that costs more than a month’s salary, it’s not even the time to talk about gaming on Linux… We need to fix these “common” problems first!

Hey y’all. I had a similar issue about a month ago. I threw everything I could at it, as you guys said, and nothing worked. What did work for me, however inefficient, was changing a setting in NVIDIA Settings: specifically, setting it to use the dedicated GPU at all times.

NVIDIA Settings > PRIME Profiles > select the GPU you would like to use > select “NVIDIA (Performance Mode)”

It’s definitely not ideal, but I got really tired of the system deciding not to use the GPU whenever it wants, and having to troubleshoot every single time. Hope this helps!!
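On Ubuntu-based systems like KDE neon, the same switch can be made from a terminal with prime-select (from the nvidia-prime package); a logout or reboot is needed afterwards:

prime-select query          # show the current mode
sudo prime-select nvidia    # always use the dGPU (performance mode)
sudo prime-select on-demand # back to render offload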

If it exists:


Maybe I’m missing a specific additional package?

Not that it matters much to me; my i9-13900HX’s iGPU is, IMO, fast enough for the “simple” desktop stuff. I only ask / want to mention it because someone else may be missing that setting as well.
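If that page depends on an extra package, a quick way to check would be something like this (assuming an Ubuntu/neon setup, where nvidia-prime is the package that provides prime-select):

dpkg -l nvidia-prime nvidia-settings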