Around-Body Interaction: Leveraging Limb Movements for Interacting in a Digitally Augmented Physical World

03/28/2023
by   Florian Müller, et al.

Recent technological advances have made head-mounted displays (HMDs) smaller and untethered, fostering the vision of ubiquitous interaction with information in a digitally augmented physical world. For interacting with such devices, three main types of input have emerged so far, besides finger gestures, which are not very intuitive: 1) touch input on the frame of the device, 2) touch input on accessories (controllers), and 3) voice input. While each of these techniques has advantages and disadvantages depending on the user's current situation, they largely ignore the skills and dexterity that we exhibit when interacting with the real world: throughout our lives, we have trained extensively to use our limbs to interact with and manipulate the physical world around us. This thesis explores how the skills and dexterity of our upper and lower limbs, acquired and trained in interacting with the real world, can be transferred to interaction with HMDs. To this end, the thesis develops the vision of around-body interaction, in which we use the space around our body, defined by the reach of our limbs, for fast, accurate, and enjoyable interaction with such devices. This work contributes four interaction techniques, two for the upper limbs and two for the lower limbs: The first contribution shows how the proximity between our head and hand can be used to interact with HMDs. The second contribution extends upper-limb interaction to multiple users and illustrates how the registration of augmented information in the real world can support cooperative use cases. The third contribution shifts the focus to the lower limbs and discusses how foot taps can be leveraged as an input modality for HMDs. The fourth contribution presents how lateral shifts of the walking path can be exploited for mobile, hands-free interaction with HMDs while walking.

