Laptops equipped with touchscreens are no longer uncommon, but in practice they come with several usability issues.
For example, repeatedly lifting your arm to touch the screen can be tiring, the lower part of the display is hard to reach, UI elements are often too small for precise touch input, and the user’s hand frequently blocks the view of the screen while interacting.
I would like to propose a “Full-Screen Trackpad” interaction mode.
In this mode, the entire touchscreen is treated as a large trackpad rather than as a direct manipulation surface, allowing it to be used primarily for pointer control.
For instance, a user could hold the PC in one hand and operate the screen near the left or right edge with the thumb of that same hand. Even when touching only a limited area of the screen, the pointer can comfortably reach any part of the display, with accuracy comparable to a conventional trackpad.
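To illustrate the core idea, here is a minimal sketch of how absolute touch positions could be translated into relative pointer motion, the way a trackpad driver behaves. All names, the sensitivity gain, and the clamping policy are illustrative assumptions, not part of any existing OS API:

```python
# Hypothetical sketch: turning absolute touch coordinates into
# relative pointer motion. The sensitivity gain is an assumption;
# a real implementation would make it user-configurable and likely
# apply pointer acceleration as well.

SENSITIVITY = 2.5  # assumed gain: small thumb motions cover the whole display


def touch_to_pointer(prev_touch, curr_touch, pointer, screen_size,
                     sensitivity=SENSITIVITY):
    """Translate one touch-move step into a new pointer position.

    prev_touch / curr_touch: (x, y) absolute touch coordinates.
    pointer: current (x, y) pointer position.
    screen_size: (width, height) of the display in pixels.
    """
    # Relative delta scaled by sensitivity, as on a trackpad.
    dx = (curr_touch[0] - prev_touch[0]) * sensitivity
    dy = (curr_touch[1] - prev_touch[1]) * sensitivity
    w, h = screen_size
    # Clamp so the pointer never leaves the display.
    nx = min(max(pointer[0] + dx, 0), w - 1)
    ny = min(max(pointer[1] + dy, 0), h - 1)
    return (nx, ny)
```

With a gain of 2.5, a 20-pixel thumb movement near the screen edge moves the pointer 50 pixels, so a small reachable strip of the screen can drive the pointer across the full display.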
Common multi-touch gestures such as two-finger scrolling and pinch-to-zoom could also be supported.
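As a sketch of how such gestures could be distinguished, the snippet below classifies a two-finger movement as a pinch (finger distance changes) or a scroll (both fingers translate together). The thresholds and function names are assumptions for illustration; a production compositor would track gestures continuously rather than from two snapshots:

```python
# Hypothetical sketch: classifying a two-finger gesture from its start
# and end snapshots. Thresholds are assumed values, not tuned constants.
import math


def classify_two_finger_gesture(start, end,
                                pinch_ratio=1.3, scroll_threshold=20):
    """Classify a two-finger gesture as 'pinch', 'scroll', or 'none'.

    start / end: pairs of (x, y) points, one per finger.
    pinch_ratio: assumed change in finger distance that counts as a pinch.
    scroll_threshold: assumed centroid travel (pixels) that counts as a scroll.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    d0 = dist(*start)
    d1 = dist(*end)
    ratio = d1 / d0 if d0 else 1.0

    # Centroid travel: how far the midpoint of the two fingers moved.
    c0 = ((start[0][0] + start[1][0]) / 2, (start[0][1] + start[1][1]) / 2)
    c1 = ((end[0][0] + end[1][0]) / 2, (end[0][1] + end[1][1]) / 2)

    if ratio > pinch_ratio or ratio < 1 / pinch_ratio:
        return "pinch"      # fingers spread apart or moved together
    if dist(c0, c1) > scroll_threshold:
        return "scroll"     # fingers translated together
    return "none"
```

The same event stream that feeds the pointer in trackpad mode could be routed through a classifier like this, so existing scroll and zoom semantics carry over unchanged.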
When a touchscreen is detected on the target system, the OS could allow the user to choose between:
- a traditional “Direct Touch Mode” (standard touchscreen behavior), and
- a new “Full-Screen Trackpad Mode”.
This approach could offer a novel interaction experience that, to my knowledge, no other operating system currently implements, and would showcase Linux as a platform for innovative input paradigms.
Note: The original idea was created by me. I used AI assistance for proofreading and for translating the text from Japanese into English.
Note: A similar interaction model has already proven practical in smartphone RDP and VNC applications; however, to the best of my knowledge, it has not been implemented directly at the operating system level.