HATSUKI: An anime character like robot figure platform with anime-style expressions and imitation learning based action generation

by   Pin-Chu Yang, et al.

Japanese character figurines are popular and hold a pivotal position in otaku culture. Although numerous robots have been developed, few have focused on otaku culture or on embodying an anime character figurine. We therefore take the first steps toward bridging this gap by developing Hatsuki, a humanoid robot platform with an anime-based design. Hatsuki's novelty lies in its aesthetic design, 2D facial expressions, and anime-style behaviors, which allow it to deliver rich interaction experiences resembling those of anime characters. We explain our design and implementation process for Hatsuki, followed by our evaluations. To explore user impressions of and opinions about Hatsuki, we conducted a questionnaire at the world's largest anime-figurine event. The results indicate that participants were generally very satisfied with Hatsuki's design and proposed various use case scenarios and deployment contexts for it. The second evaluation focused on imitation learning, as such methods can provide better interaction ability in the real world and generate rich, context-adaptive behavior across situations. We made Hatsuki learn 11 actions, combining voice, facial expressions, and motions, through a neural-network-based policy model with our proposed interface. Results show that our approach successfully generated the actions through self-organized contexts, which suggests its potential to generalize to further actions under different contexts. Lastly, we present our future research directions for Hatsuki and provide our conclusion.
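To give a flavor of the imitation-learning setting the abstract describes, the sketch below shows the core idea of behavior cloning: fitting a policy to demonstrated state-action pairs so that it reproduces the expert's actions. This is a minimal illustration only; the paper's actual policy architecture, state representation, and action encoding (voice, facial expression, and motion channels) are not given in the abstract, and every name here is hypothetical. For simplicity the policy is linear and fit by least squares rather than by the authors' neural network model.

```python
import numpy as np

# Hypothetical behavior-cloning sketch; not the authors' actual model.
rng = np.random.default_rng(0)

# Demonstration data: each state (e.g. joint angles plus context features)
# is paired with the expert's action for that time step.
states = rng.normal(size=(200, 6))            # 200 demo steps, 6-dim state
true_W = rng.normal(size=(6, 3))              # unknown expert mapping
actions = states @ true_W + 0.01 * rng.normal(size=(200, 3))  # noisy demos

# Fit a linear policy a = s @ W by least squares on the demonstrations.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

# The cloned policy can then be queried on new states at run time.
test_state = rng.normal(size=(1, 6))
predicted_action = test_state @ W
```

A neural-network policy, as used in the paper, replaces the linear map with a learned nonlinear function trained by gradient descent on the same kind of demonstration loss, which is what lets it capture context-dependent combinations of voice, expression, and motion.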



