Interaction with eye and hand input is attractive for extended reality (XR). We explore how further capabilities of the human hand can be combined with gaze for interaction. We introduce a gaze- and air-tap-based interaction technique that enables mobile, one-handed mid-air gestures for XR. By reimagining trackpad multi-touch gestures in 3D space as multi-air-taps on a hand-attached control interface, our technique lets users instantly access shortcuts to commands such as object dragging, scrolling, and window switching. We present a prototype application that explores the technique's capacity for expressive input in compound object-manipulation and UI-navigation tasks. Our insights highlight the novel technique's potential as a natural and efficient input method, expanding the interaction repertoire of extended reality users.
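To make the idea concrete, here is a minimal sketch of how multi-air-taps could be resolved against the current gaze target, mirroring trackpad multi-touch conventions. All names (`AirTap`, `dispatch`, the finger-count mapping) are hypothetical illustrations, not the paper's actual implementation or API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Command(Enum):
    DRAG = auto()
    SCROLL = auto()
    SWITCH_WINDOW = auto()

# Hypothetical mapping from the number of fingers in a simultaneous
# mid-air tap to a shortcut command, echoing trackpad conventions
# (one-finger drag, two-finger scroll, three-finger window switch).
TAP_COMMANDS = {1: Command.DRAG, 2: Command.SCROLL, 3: Command.SWITCH_WINDOW}

@dataclass
class AirTap:
    fingers: int      # simultaneous fingertips detected in the tap
    gaze_target: str  # object or UI element the user is looking at

def dispatch(tap: AirTap) -> str:
    """Resolve a multi-air-tap against the current gaze target."""
    command = TAP_COMMANDS.get(tap.fingers)
    if command is None:
        return "ignored"  # unmapped finger counts do nothing
    return f"{command.name} on {tap.gaze_target}"
```

For example, a two-finger air tap while gazing at a document would resolve to a scroll command on that document (`dispatch(AirTap(2, "document"))` → `"SCROLL on document"`).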

Full paper here: https://ieeexplore.ieee.org/abstract/document/11236166
Copyright © 2026 Lauren T. Zerbin