Hi
I have a big LG evo OLED TV as a monitor, connected via HDMI 2.1 to an AMD 7900 XTX graphics card. The problem is that it shows my pixel color mode as compressed YCbCr 4:2:0… Now how can I change it? Is it even possible on Arch Linux running KDE Wayland? On Windows it's a simple change in the Radeon driver UI… Here I don't know how to do it… I want to change it so I can use 10-bit color mode… for HDR content… games etc…
The simple answer? Switch to Gnome or go back to X11. KDE's Wayland implementation is wholly lacking in color control, and the methods to expose them in the driver are non-trivial and often a matter of "trial and error", which is a no-no for me. So, for the time being, I am back on X11. Wayland is not ready for prime time in KDE. Not ready by far.
- With the open source amdgpu driver, HDMI 2.1 cannot be used (the HDMI 2.1 spec is closed source). If you want uncompressed formats over HDMI, you will need to look at lower-resolution monitors.
- If you are using DisplayPort and the problem still exists, perhaps check out some of these:
1
2
3
Note that the discussions in these links lean more towards using the RGB format, but you might find something useful.
@shadow I won't say this again: stop spreading misinformation! Just don't talk about things you very obviously don't have any clue about!
@Red_Marx unfortunately there isn’t yet any way to set the color format that’s sent to the display. This is being worked on though, see no color format choice in amdgpu (#476) · Issues · drm / amd · GitLab
As @ulterno already mentioned, you’re also most likely affected by 4k@120hz unavailable via HDMI 2.1 (#1417) · Issues · drm / amd · GitLab - it’s not possible to get 4k@120Hz on AMD without chroma subsampling because of legal issues with the HDMI forum.
You don’t need 10 bit color or RGB/YCbCr444 transmission for HDR. Do you actually notice a problem, or were you only concerned about the TV’s UI showing you that?
FWIW I haven’t personally noticed any issues on my TV, outside of test images meant for checking if chroma subsampling is used. This is with some viewing distance of course, but if you also don’t notice a problem when looking at actual videos or games, I would just stop worrying about it.
I have no clue? Buddy this is what I do. There is a whole side of this you yourself seem to have no clue about. There are a whole lot of us who do more with our computers than watch movies on a smart TV. Did you happen to see my recent post about the fact that multiple displays across multiple GPUs still does not work, contrary to your claims otherwise? I have tested that now across three different types of GPUs.
For the record, 10 bit and HDR are not the same thing. Now you are telling this guy “well you don’t really need it”… WHAT? Does it work or not? If so, is there a non-convoluted way to achieve it? If not… just say, “not at this time”.
What you’re doing is spreading misinformation about topics you don’t have any clue about - kernel drivers, Wayland, X11, what Plasma supposedly is lacking in the color department and what Gnome or X11 can supposedly magically do, without any basis in reality.
That you can operate a camera doesn’t make you any more of an expert on any of these topics, than me being an expert in them gives me knowledge about cameras.
Stop dragging your personal grievances into other people’s threads. This thread is from someone that says they want to watch HDR videos and play games on their TV.
Like I said before, if something doesn’t work, make a bug report about it. Whining on a forum and insulting one of the few people that could actually do something about it just ensures that your problem gets ignored.
Why would you bother responding to a post if you don't even read it? No one claimed that 10 bpc and HDR were the same thing; I literally explained that you don't need 10 bpc for HDR, I already explained that they can't set the color format, and that full chroma resolution (4:4:4) won't be possible in their specific case because of HDMI.
Another such response from you on this thread and I’ll contact someone with moderator permissions. This thread is from someone asking for help, they deserve the chance to get it and not be annoyed with your nonsense.
Thanks for the reply… Now I'm not an expert, but when I switch from 120Hz to 60Hz… all is good.
YCbCr 4:4:4 is being used by the driver… when I press the green button on my remote 7 times it shows me the mode… I was thinking… is this a limitation of the graphics driver? I would like to switch from YCbCr to RGB instead; maybe in that mode it can do 4K 120Hz uncompressed… But I don't know how to do it… If it's not possible or a hassle to do it… I'll leave it as it is… since I don't experience any problems running games in HDR… It was just my concern that YCbCr 4:2:0 isn't true HDR… But maybe the TV is showing me wrong telemetry… I don't know…
It is a limitation of bandwidth. Without HDMI 2.1 support, the link runs at HDMI 2.0 rates, and HDMI 2.0 doesn't have enough bandwidth to do 4k 120Hz with RGB or YCbCr 4:4:4.
RGB and YCbCr 4:4:4 require exactly the same amount of bandwidth, and it’s more than YCbCr 4:2:2 or 4:2:0. The latter two are basically using lossy compression on color, so that you can increase the resolution or refresh rate even though you wouldn’t normally have enough bandwidth.
If you look very closely, you might be able to see some color artifacts on thin lines or text. That's pretty much all YCbCr 4:2:0 does, and it's not worth worrying about on a TV.
Most videos are in fact YCbCr 4:2:0 anyways, as it reduces file size.
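To put rough numbers on the bandwidth argument, here's a back-of-the-envelope sketch. The ~10% blanking overhead and the 14.4 Gbps effective HDMI 2.0 payload rate (18 Gbps TMDS minus 8b/10b encoding overhead) are assumptions; real video timings vary.

```python
# Back-of-the-envelope HDMI bandwidth estimate (a sketch, not exact timings).
def required_gbps(width, height, hz, bits_per_component, samples_per_pixel):
    # Pixel clock with a rough ~10% blanking overhead (assumption)
    pixel_clock = width * height * hz * 1.10
    return pixel_clock * bits_per_component * samples_per_pixel / 1e9

HDMI20_EFFECTIVE = 14.4  # Gbps: 18 Gbps TMDS minus 8b/10b overhead (assumption)

rgb_10bit = required_gbps(3840, 2160, 120, 10, 3.0)  # RGB / YCbCr 4:4:4
y420_8bit = required_gbps(3840, 2160, 120, 8, 1.5)   # YCbCr 4:2:0

print(f"4k120 RGB 10-bit:     {rgb_10bit:.1f} Gbps")   # well over budget
print(f"4k120 YCbCr420 8-bit: {y420_8bit:.1f} Gbps")   # fits
print(f"HDMI 2.0 budget:      {HDMI20_EFFECTIVE} Gbps")
```

Even with generous rounding, 4k 120Hz RGB needs roughly double what HDMI 2.0 can carry, while 4:2:0 squeezes under the limit, which is exactly why the driver falls back to it.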
Thanks for the answer… but I have the latest HDMI 2.1 cable… I'm pretty sure it can do 4k 120Hz… It was expensive, and under Windows it works just fine, full RGB 4k 120Hz… A shame that in Linux I can't change the pixel mode like in Windows… despite all that, 4:2:0 looks fine in games though… I can live with that.
Yes, the cable can do HDMI 2.1, but the driver can't. The HDMI Forum isn't allowing open source drivers to implement it.
Ok… I see… I wonder why… I didn't know about the HDMI Forum… long live the profit motive!
…anyway, thanks for the explanation.
HDMI 2.1 is not supported by the open source driver due to HDMI Forum stonewalling, so the bandwidth doesn't exist to achieve what you want to achieve.
The simplest solution is to switch to a DisplayPort connection. If you're using a TV as a monitor, this may not be a possibility, since many consumer-grade TVs don't have DisplayPort inputs.
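If you'd rather check what the driver is actually outputting than trust the TV's info panel, amdgpu exposes some per-connector debug info. A minimal sketch, assuming your kernel provides the amdgpu-specific `output_bpc` debugfs file (it is root-only, may be absent on some kernels, and connector names like HDMI-A-1 vary per system):

```python
# Sketch: peek at the bit depth amdgpu is currently driving per HDMI connector.
# The debugfs path is an assumption: debugfs must be mounted and readable
# (usually root only), and the file may not exist on every kernel version.
import glob

paths = glob.glob("/sys/kernel/debug/dri/*/HDMI-A-*/output_bpc")
if not paths:
    print("no readable amdgpu HDMI connector debug files found")
for path in paths:
    with open(path) as f:
        print(path, "->", f.read().strip())
```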
The solution is a DP to HDMI adapter… It works, I have tried it. But there is a little problem though… when I turn the PC on, it takes a little longer for the TV to recognize the signal… like 15-20 seconds… after that it works fine… or I can just replug the cable at the TV… So yes, it works.