Display flickers and eventually stops showing an image while some apps are fullscreened, until they are exited

So far I’ve had this issue with the Moonlight client, in both fullscreen and borderless modes, and with Parsec. I haven’t tried anything running natively on the computer, only streaming apps thus far. Mind you, the entire display goes out until I hit the Super key or exit the app (which in Moonlight can be impossible, depending on settings, until I turn off the host…)

The GPU in the computer used for streaming is an RX 7600 on Arch Linux. The host for both Moonlight and Parsec uses an Arc A770. However, I don’t think it’s due to Intel’s drivers or some issue in the server-side software, primarily because on my OneXPlayer, which has an AMD APU running Windows 11, I don’t have this issue when using it as a client, at least in Parsec.

EDIT: It happens with YouTube as well, running in Chromium. It definitely seems like a bug, but I want to make sure it’s not my hardware causing it.

Hi - so folks can give you the best guidance, could you please share the basic system information for the devices involved in a post here? The easiest way to get that is from the Info Center app, using the Copy Details button in the top-right corner.

I would, if it weren’t for the fact that I already found the source of the problem. It seems adaptive sync causes the issue, and it has apparently been a problem since Plasma 5, if the Bugzilla tracker is anything to go by.
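
For anyone else who lands here: adaptive sync (VRR) can be toggled per output from a terminal with kscreen-doctor as a quick test. This is a minimal sketch; the output name (DP-1 here) is a placeholder for whatever `kscreen-doctor -o` reports on your system, and the option names assume a reasonably recent Plasma.

```bash
# List connected outputs and their current settings;
# note the name (e.g. DP-1, HDMI-A-1) of the flickering display
kscreen-doctor -o

# Turn adaptive sync off for that output (DP-1 is a placeholder)
kscreen-doctor output.DP-1.vrrpolicy.never

# To restore the default behaviour later:
kscreen-doctor output.DP-1.vrrpolicy.automatic
```

The same setting is also exposed in System Settings under Display & Monitor as “Adaptive sync”, so the GUI works just as well if you’d rather not use the terminal.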