Communicating Robot Arm Motion Intent Through Mixed Reality Head-mounted Displays

08/11/2017
by   Eric Rosen, et al.

Efficient motion intent communication is necessary for safe and collaborative work environments with collocated humans and robots. Humans efficiently communicate their motion intent to other humans through gestures, gaze, and social cues. However, robots often have difficulty communicating their motion intent to humans via these methods. Many existing methods for robot motion intent communication rely on 2D displays, which require the human to continually pause their work and check a visualization. We propose a mixed reality head-mounted display (HMD) visualization that overlays the proposed robot motion on the wearer's real-world view of the robot and its environment. To evaluate the effectiveness of this system against a 2D display visualization and against no visualization, we asked 32 participants to label different robot arm motions as either colliding or non-colliding with blocks on a table. We found a 16% increase in accuracy and a 62% decrease in task completion time compared to the next best system. This demonstrates that a mixed reality HMD allows a human to more quickly and accurately tell where the robot is going to move than the compared baselines.
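The study's core measure is labeling accuracy: each participant judges whether a motion collides, and responses are scored against ground truth per display condition. A minimal sketch of that scoring, with made-up condition names and illustrative (not the study's) data:

```python
def accuracy(labels, ground_truth):
    """Fraction of motions labeled correctly (colliding = True)."""
    correct = sum(l == g for l, g in zip(labels, ground_truth))
    return correct / len(ground_truth)

# Illustrative ground truth and participant responses under two conditions.
ground_truth   = [True, False, True, True, False, False]
hmd_labels     = [True, False, True, True, False, True]   # mixed reality HMD
monitor_labels = [True, True, False, True, False, True]   # 2D display baseline

hmd_acc = accuracy(hmd_labels, ground_truth)
mon_acc = accuracy(monitor_labels, ground_truth)
print(f"HMD accuracy: {hmd_acc:.2f}, 2D display accuracy: {mon_acc:.2f}")

# Relative improvement over the baseline, analogous in form to the
# reported 16% accuracy gain (numbers here are illustrative only).
print(f"Relative accuracy change: {(hmd_acc - mon_acc) / mon_acc:+.0%}")
```

The same per-condition comparison applies to task completion time, where a lower mean time under the HMD condition corresponds to the reported 62% decrease.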

Related research:

- 06/30/2022, "Mixed Reality as Communication Medium for Human-Robot Collaboration": Humans engaged in collaborative activities are naturally able to convey ...
- 07/03/2023, "Advantages of Multimodal versus Verbal-Only Robot-to-Human Communication with an Anthropomorphic Robotic Mock Driver": Robots are increasingly used in shared environments with humans, making ...
- 03/01/2023, "How to Communicate Robot Motion Intent: A Scoping Review": Robots are becoming increasingly omnipresent in our daily lives, support...
- 09/09/2023, "RICO-MR: An Open-Source Architecture for Robot Intent Communication through Mixed Reality": This article presents an open-source architecture for conveying robots' ...
- 10/05/2020, "Projection Mapping Implementation: Enabling Direct Externalization of Perception Results and Action Intent to Improve Robot Explainability": Existing research on non-verbal cues, e.g., eye gaze or arm movement, ma...
- 09/03/2021, "Communicating Inferred Goals with Passive Augmented Reality and Active Haptic Feedback": Robots learn as they interact with humans. Consider a human teleoperatin...
- 01/15/2022, "A new approach to evaluating legibility: Comparing legibility frameworks using framework-independent robot motion trajectories": Robots that share an environment with humans may communicate their inten...
