Toward Designing Usable Body-Based Gesture Interactions

Author:
Azim, Md Aashikur Rahman, Computer Science - School of Engineering and Applied Science, University of Virginia (ORCID: orcid.org/0000-0001-9744-8263)
Advisor:
Heo, Seongkook, Computer Science - School of Engineering and Applied Science, University of Virginia
Abstract:

The integration of wearables into our lives has revolutionized the way we interact with technology. A diverse array of form factors, including glasses, earphones, rings, watches, pendants, and VR headsets, is now equipped with advanced processors and sensors that enable seamless communication with other smart devices. This integration has given rise to quick microinteractions in which users rely on body-based gestures to perform various tasks. However, as we increasingly depend on body-based gestures to interact with technology, ensuring their usability and security becomes crucial.

Designing and implementing usable body-based gestures presents a multifaceted challenge that requires a fine balance between minimizing false activations and keeping the gestures intuitive and user-friendly. A good gesture is easy to perform and remember, compatible across devices, accurately recognizable with a minimal error rate, and socially acceptable. In addition, maintaining the integrity of the performed gesture is paramount: the gesture must be reliable and consistently interpreted by the system.

This dissertation addresses the challenges of designing and implementing body-based gestures that are user-friendly, memorable, and resistant to false activation. It introduces SequenceSense, a tool that lets gesture designers easily modify gestures, assess recognition performance, and pinpoint potential false activations without extensive data collection or experimentation. To address gesture compatibility and reusability across devices, the dissertation presents UnifiedSense, a novel method that detects device-dependent gestures using sensors from various wearable devices, even when the device originally intended to detect the gesture is absent. Finally, to preserve gesture integrity, this dissertation proposes ManipulaSense, an autoencoder-based anomaly detection technique that leverages users' inherent hand-movement patterns to identify manipulated hand movements, thereby preserving the integrity of application use.

This research significantly improves the usability, efficiency, and reliability of gesture interactions on wearable devices, promoting their broader adoption and enhancing the overall user experience.

Degree:
PhD (Doctor of Philosophy)
Keywords:
Usable Gesture, Conflict-Free Gesture, Compatible Gesture, Gesture Integrity
Language:
English
Issued Date:
2024/04/22