HeadText: Exploring Hands-free Text Entry using Head Gestures by Motion Sensing on a Smart Earpiece

by Songlin Xu et al.

We present HeadText, a hands-free text entry technique based on motion sensing with a smart earpiece. Users input text using only 7 head gestures, which cover key selection, word selection, word commitment, and word cancellation. Head gesture recognition is supported by motion sensing on the smart earpiece, which captures head-motion signals, combined with a machine learning algorithm: K-Nearest-Neighbor (KNN) with a Dynamic Time Warping (DTW) distance measure. A 10-participant user study showed that HeadText recognizes the 7 head gestures with an accuracy of 94.29%, and achieves a maximum text entry speed of 10.65 WPM and an average speed of 9.84 WPM. Finally, we demonstrate potential applications of HeadText in hands-free scenarios: (a) text entry for people with motor impairments, (b) private text entry, and (c) socially acceptable text entry.
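The recognition approach described above, KNN with a DTW distance measure, can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the 1-D toy signals, and the example gesture labels below are assumptions for illustration, whereas a real earpiece would supply multi-axis IMU sequences.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D signals,
    computed with the standard O(n*m) dynamic-programming recurrence."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three allowed warping moves.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def knn_dtw_classify(query, templates, k=3):
    """Label a query signal by majority vote among its k nearest
    templates under DTW distance. `templates` is a list of
    (signal, label) pairs (hypothetical data layout)."""
    dists = sorted((dtw_distance(query, sig), label) for sig, label in templates)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy usage: two made-up gesture classes, "nod" (ramp) and "shake" (oscillation).
templates = [
    ([0, 1, 2, 3], "nod"),
    ([0, 1, 2, 4], "nod"),
    ([0, 2, 0, 2], "shake"),
    ([0, 2, 0, 1], "shake"),
]
print(knn_dtw_classify([0, 1, 2, 3.5], templates))  # a ramp-like query
```

DTW is a natural fit here because head gestures performed at different speeds produce time-warped versions of the same motion profile, which DTW aligns before measuring distance.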




