For users who cannot use a physical keyboard but can use a touch screen (for example, due to low mobility, paralysis, or cerebral palsy), having touch-screen gesture controls for keybindings would make a KDE desktop much more accessible.
You might be able to achieve that with tools like touchegg or easystroke, but I’m not sure.
Sadly, touchegg uses multi-touch gestures, which are not accessible for this use case.
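To illustrate the problem: as far as I can tell, every gesture in a Touchégg config requires a multi-finger `fingers` count, so even the simplest mapping from a gesture to a key press assumes the user can perform a multi-touch swipe. A rough sketch from memory of the Touchégg docs (exact element names may differ):

```xml
<touchégg>
  <application name="All">
    <!-- 3-finger swipe up sends Meta+D; note the mandatory multi-finger count -->
    <gesture type="SWIPE" fingers="3" direction="UP">
      <action type="SEND_KEYS">
        <repeat>false</repeat>
        <modifiers>Super_L</modifiers>
        <keys>d</keys>
        <on>begin</on>
      </action>
    </gesture>
  </application>
</touchégg>
```

A single-finger drawn shape, which is what this request needs, doesn’t seem to be expressible there.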
I’ll look into easystroke.
I looked it up a bit and found out that easystroke is natively meant for the mouse, not for touchscreens. I really think you should contact the folks in the #kde input handling channel on Matrix; they really rock and are super friendly. (They also develop the input handling in KDE, so they are the ones who can take a feature request and make it happen, if possible.)
I’m on it, but KDE developers told me to post feature requests here. I’ll try Matrix again, though.
I’m having trouble finding the input channel; the accessibility channel seems dead.
I asked in the channel for someone to look at this thread. Maybe I can invite you if you give me your Matrix handle ^^
Maybe this helps: https://matrix.to/#/#kde-input:kde.org
Actually, I have long held that mouse gestures were the most amazing addition (Opera and Firefox on Windows, way back in the day when I used internet shops in the early 2000s).
The next find (on X11, GNOME 2 first) was that mouse gestures worked not only in browsers, but across the entire desktop.
Later on, I started writing bash scripts for advanced custom actions (like having two different Conky launchers: one minimal, one full with some elements that timed out and left it minimal)… so one gesture to launch my Firefox, another for James’ Firefox profile, all that good stuff; something like the sketch below.
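For instance, the Conky toggle was just a tiny script along these lines (the config paths here are only placeholders, not my actual setup):

```bash
#!/bin/bash
# Bound to a mouse gesture via easystroke: toggle between the full
# and minimal Conky layouts. Paths are placeholders.
FULL="$HOME/.config/conky/full.conf"
MINIMAL="$HOME/.config/conky/minimal.conf"

if pgrep -f "conky.*full.conf" > /dev/null; then
    # Full layout is running: kill it and fall back to the minimal one
    pkill -f "conky.*full.conf"
    conky -c "$MINIMAL" &
else
    pkill -f "conky.*minimal.conf"
    conky -c "$FULL" &
fi
```

…and the Firefox gestures were nothing fancier than `firefox -P default` versus `firefox -P james` (the `-P` profile flag is real; the profile names are made up).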
Right now I’m colour-coding my keyboard to bring back some of those actions (i.e. using coloured pencils to draw icons on the fronts of my PBT keycaps), but it’s very limited… with Ctrl, Alt, Meta, Shift, and the many, many combinations available, it’s impossible to mark or remember them all.
With gestures, a SHAPE or a drawn letter is soooo much simpler to memorise… this is obviously targeted at MOUSE gestures; I’m not sure how much this could be applied to a TOUCHSCREEN (I assume you could draw a shape the same way).
Anyway, for now it’s a waiting game. Wayland has a LONG way to go (and in many ways will never get there) to reach parity with X11 functions because it just doesn’t work that way.
Currently, Easystroke works only on X11, and work is ongoing to bring similar functions to Wayland… and I have no idea how you’d apply that to a touchscreen.
They’ve repeatedly refused to improve anything related to touch and/or keyboard functionality, so I don’t think anyone’s listening.
Things not being where you’d want them to be does not mean that anyone has “repeatedly refused” to improve things - as demonstrated by the active development work on the overall topic of touch interaction on Plasma devices: Plasma / Plasma Keyboard · GitLab.
It’s fine to be unhappy with how something is right now, but let’s be productive and accurate in how we discuss it.
I’d like to see some relevant links for this. I know for a fact that people are thinking about, working on, and desperately trying to factor in further Wayland functionality (libinput etc.). Initially this could bring back mouse gestures, and that would likely mean a lot more functions relating to touch and trackpads as well.