"I Don't Want People to Look At Me Differently": Designing User-Defined Above-the-Neck Gestures for People with Upper Body Motor Impairments

02/12/2022
by   Xuan Zhao, et al.

Recent research has proposed eyelid gestures that allow people with upper-body motor impairments (UMI) to interact with smartphones without finger touch. However, those gestures were designed by researchers; it remains unknown which eyelid gestures people with UMI would want and be able to perform. Moreover, other above-the-neck body parts (e.g., the mouth and head) could be used to form more gestures. We conducted a user study in which 17 people with UMI designed above-the-neck gestures for 26 common smartphone commands. We collected a total of 442 user-defined gestures involving the eyes, the mouth, and the head. Participants were more likely to make gestures with their eyes and preferred gestures that were simple, easy to remember, and unlikely to draw attention from others. We further conducted a survey (N=24) to validate the usability and acceptance of these user-defined gestures. The results show that the user-defined gestures were acceptable to people both with and without motor impairments.
