Human-Robot Commensality: Bite Timing Prediction for Robot-Assisted Feeding in Groups

by Jan Ondras, et al.

We develop data-driven models to predict when a robot should feed a user during social dining scenarios. Being able to eat independently with friends and family is considered one of the most memorable and important activities for people with mobility limitations. Robots can potentially help with this activity, but robot-assisted feeding is a multi-faceted problem with challenges in bite acquisition, bite timing, and bite transfer. Bite timing in particular becomes uniquely challenging in social dining scenarios because of the risk of interrupting a social human-robot group interaction during commensality. Our key insight is that bite timing strategies that account for the delicate balance of social cues can lead to seamless interactions during robot-assisted feeding in a social dining scenario. We approach this problem by collecting a multimodal Human-Human Commensality Dataset (HHCD) containing 30 groups of three people eating together. We use this dataset to analyze human-human commensality behaviors and to develop bite timing prediction models for social dining scenarios, which we then transfer to human-robot commensality scenarios. Our user studies show that bite timing prediction improves when our algorithm models the multimodal social signaling cues exchanged between diners. The HHCD dataset, videos of user studies, and code will be publicly released after acceptance.
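To make the idea concrete, here is a minimal, hypothetical sketch of what a multimodal bite-timing predictor could look like. The feature names, weights, and logistic form below are illustrative assumptions, not the paper's actual model or feature set: each time window is summarized by simple social-signal features (gaze, speaking activity, time since the last bite) and scored for whether it is a good moment to feed.

```python
import numpy as np

# Assumed per-window feature schema (illustrative, not from the paper).
FEATURES = ["target_gaze_at_food", "target_is_speaking",
            "others_speaking_ratio", "time_since_last_bite_s"]

def extract_features(window):
    """Map a raw window dict to a fixed-order feature vector."""
    return np.array([window[k] for k in FEATURES], dtype=float)

def bite_timing_score(window, weights, bias):
    """Logistic score: estimated probability that now is a good moment to feed."""
    z = weights @ extract_features(window) + bias
    return 1.0 / (1.0 + np.exp(-z))

def should_feed(window, weights, bias, threshold=0.5):
    """Binary feed/wait decision from the score."""
    return bite_timing_score(window, weights, bias) >= threshold

# Hand-set illustrative weights: feeding is favored when the target diner
# looks at the food and nobody is mid-conversation, and disfavored otherwise.
w = np.array([2.0, -3.0, -1.5, 0.05])
b = -1.0

good_moment = {"target_gaze_at_food": 1.0, "target_is_speaking": 0.0,
               "others_speaking_ratio": 0.2, "time_since_last_bite_s": 30.0}
bad_moment = {"target_gaze_at_food": 0.0, "target_is_speaking": 1.0,
              "others_speaking_ratio": 0.8, "time_since_last_bite_s": 5.0}

print(should_feed(good_moment, w, b))  # True
print(should_feed(bad_moment, w, b))   # False
```

In practice such weights would be learned from the HHCD recordings rather than hand-set, and richer temporal models over the multimodal streams would replace the single-window logistic score.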


Crafting with a Robot Assistant: Use Social Cues to Inform Adaptive Handovers in Human-Robot Collaboration

We study human-robot handovers in a naturalistic collaboration scenario,...

Toward Designing Social Human-Robot Interactions for Deep Space Exploration

In planning for future human space exploration, it is important to consi...

Expressive Robot Motion Timing

Our goal is to enable robots to time their motion in a way that is purpo...

Enabling a Social Robot to Process Social Cues to Detect when to Help a User

It is important for socially assistive robots to be able to recognize wh...

Multimodal Signal Processing and Learning Aspects of Human-Robot Interaction for an Assistive Bathing Robot

We explore new aspects of assistive living on smart human-robot interact...

Long-Term, in-the-Wild Study of Feedback about Speech Intelligibility for K-12 Students Attending Class via a Telepresence Robot

Telepresence robots offer presence, embodiment, and mobility to remote u...

Learning and Blending Robot Hugging Behaviors in Time and Space

We introduce an imitation learning-based physical human-robot interactio...
