Intention estimation from gaze and motion features for human-robot shared-control object manipulation

08/18/2022
by Anna Belardinelli, et al.

Shared control can assist in teleoperated object manipulation by supporting the execution of the user's intention. To this end, robust and prompt intention estimation is needed, relying on observations of the user's behavior. Here, an intention estimation framework is presented that uses natural gaze and motion features to predict the current action and the target object. The system is trained and tested in a simulated environment on pick-and-place sequences performed with both hands in a relatively cluttered scene, including possible hand-overs to the other hand. Validation across different users and hands shows good prediction accuracy and earliness. An analysis of the predictive power of individual features shows that the grasping trigger and the gaze features dominate the early identification of the current action. In the current framework, the same probabilistic model can be used for the two hands working in parallel and independently, while a rule-based model is proposed to identify the resulting bimanual action. Finally, limitations and perspectives of extending this approach to more complex, fully bimanual manipulations are discussed.
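The abstract states that each hand's intention (current action plus target object) is inferred with the same probabilistic model, and that a rule-based layer combines the two per-hand estimates into a bimanual action. The minimal sketch below only illustrates that overall structure: the feature set (gaze and hand distances to objects, hand speed, grasp trigger), the likelihood terms, the action and object sets, and the combination rules are all hypothetical placeholders chosen for the example, not the model described in the paper.

```python
# Illustrative sketch only: per-hand recursive Bayesian intention estimation
# plus a rule-based bimanual combination. All feature names and rules are
# assumptions made for this example, not the authors' implementation.
from dataclasses import dataclass
import numpy as np

ACTIONS = ["pick", "place", "idle"]   # hypothetical per-hand action set
OBJECTS = ["cup", "box", "ball"]      # hypothetical target objects

@dataclass
class HandObservation:
    gaze_dist: np.ndarray   # distance of the gaze point to each object
    hand_dist: np.ndarray   # distance of the hand to each object
    hand_speed: float       # current hand speed
    grasp_trigger: bool     # whether the grasp trigger is pressed

class HandIntentionEstimator:
    """Recursive Bayesian update over (action, target object) for one hand."""

    def __init__(self):
        # Uniform prior over all (action, object) hypotheses.
        n = len(ACTIONS) * len(OBJECTS)
        self.belief = np.full((len(ACTIONS), len(OBJECTS)), 1.0 / n)

    def _likelihood(self, obs: HandObservation) -> np.ndarray:
        lik = np.ones((len(ACTIONS), len(OBJECTS)))
        for j in range(len(OBJECTS)):
            # Gaze and hand proximity to an object make it a more likely target.
            closeness = np.exp(-obs.gaze_dist[j]) * np.exp(-obs.hand_dist[j])
            # Open hand favors an upcoming "pick"; a held trigger favors a "place".
            lik[ACTIONS.index("pick"), j] = closeness * (0.2 if obs.grasp_trigger else 1.0)
            lik[ACTIONS.index("place"), j] = closeness * (1.0 if obs.grasp_trigger else 0.2)
            lik[ACTIONS.index("idle"), j] = 0.1 * (0.5 if obs.hand_speed > 0.05 else 1.0)
        return lik

    def update(self, obs: HandObservation):
        self.belief *= self._likelihood(obs)
        self.belief /= self.belief.sum()
        a, o = np.unravel_index(self.belief.argmax(), self.belief.shape)
        return ACTIONS[a], OBJECTS[o], self.belief[a, o]

def bimanual_action(left, right) -> str:
    """Toy rule-based combination of the two independent per-hand estimates."""
    (la, lo, _), (ra, ro, _) = left, right
    if la == "place" and ra == "pick" and lo == ro:
        return f"hand-over of {lo} from left to right"
    if ra == "place" and la == "pick" and lo == ro:
        return f"hand-over of {lo} from right to left"
    if "idle" in (la, ra):
        return f"unimanual: left={la}({lo}), right={ra}({ro})"
    return f"parallel independent actions: left={la}({lo}), right={ro and la}({lo})"

# Usage with fabricated observation values (one time step per hand):
left_est, right_est = HandIntentionEstimator(), HandIntentionEstimator()
obs_l = HandObservation(np.array([0.1, 0.8, 0.9]), np.array([0.2, 0.7, 0.9]), 0.1, True)
obs_r = HandObservation(np.array([0.1, 0.9, 0.8]), np.array([0.3, 0.8, 0.9]), 0.2, False)
print(bimanual_action(left_est.update(obs_l), right_est.update(obs_r)))
```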

Related research

- Gaze-based intention estimation: principles, methodologies, and applications in HRI (02/09/2023). Intention prediction has become a relevant field of research in Human-Ma...
- Early Estimation of User's Intention of Tele-Operation Using Object Affordance and Hand Motion in a Dual First-Person Vision (10/05/2019). This paper describes a method of estimating the intention of a user's mo...
- What Are You Looking at? Detecting Human Intention in Gaze based Human-Robot Interaction (09/17/2019). In gaze based Human-Robot Interaction (HRI), it is important to determin...
- MIDAS: Deep learning human action intention prediction from natural eye movement patterns (01/22/2022). Eye movements have long been studied as a window into the attentional me...
- Fast human motion prediction for human-robot collaboration with wearable interfaces (05/28/2019). In this paper, we aim at improving human motion prediction during human-...
- Towards Intention Prediction for Handheld Robots: a Case of Simulated Block Copying (10/15/2018). Within this work, we explore intention inference for user actions in the...
- Gaze-based, Context-aware Robotic System for Assisted Reaching and Grasping (09/21/2018). Assistive robotic systems endeavour to support those with movement disab...
