A MultiModal Social Robot Toward Personalized Emotion Interaction

10/08/2021
by Baijun Xie, et al.

Human emotions are expressed through multiple modalities, including verbal and non-verbal cues. Moreover, a user's affective state can serve as an indicator of engagement and interaction success, making it suitable as a reward signal the robot can use to optimize its behavior through interaction. This study demonstrates a multimodal human-robot interaction (HRI) framework that uses reinforcement learning to improve the robot's interaction policy and personalize emotional interaction for a human user. The goal is to apply this framework in social scenarios so that robots can produce more natural and engaging interactions.
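The abstract does not specify the learning algorithm, so the following is only a minimal illustrative sketch of the general idea: an estimated affective state is used as the reward in a bandit-style Q-learning loop that adapts the robot's choice of social behavior to an individual user. All names here (ACTIONS, estimate_affect, simulated_user) are hypothetical stand-ins, not the authors' implementation.

```python
import random

# Hypothetical discrete robot behaviors the policy chooses among.
ACTIONS = ["greet", "tell_joke", "ask_question", "mirror_expression"]

def estimate_affect(user_signal):
    """Placeholder for a multimodal affect estimator (e.g., fusing facial
    expression and speech cues). Here it simply passes through a valence
    score in [-1, 1] supplied by the simulated user."""
    return user_signal

def simulated_user(action):
    """Toy stand-in for a human user whose hidden preferences over robot
    behaviors produce noisy affective responses."""
    preferences = {"greet": 0.2, "tell_joke": 0.8,
                   "ask_question": 0.4, "mirror_expression": 0.6}
    return preferences[action] + random.uniform(-0.2, 0.2)

# Tabular Q-values for a single interaction context (a contextual-bandit view).
q_values = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.1, 0.2  # learning rate and exploration rate

for step in range(500):
    # Epsilon-greedy selection over robot behaviors.
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(q_values, key=q_values.get)

    # The user's estimated affective state serves as the reward signal.
    reward = estimate_affect(simulated_user(action))

    # Incremental update toward the observed reward.
    q_values[action] += alpha * (reward - q_values[action])

print("Learned behavior preferences:",
      {a: round(v, 2) for a, v in q_values.items()})
```

Run repeatedly, the loop shifts probability mass toward whichever behaviors elicit more positive affect from this particular user, which is the personalization effect the framework aims for.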
