Atari-HEAD: Atari Human Eye-Tracking and Demonstration Dataset

03/15/2019
by Ruohan Zhang et al.

We introduce a large-scale dataset of human actions and eye movements recorded while playing Atari video games. The dataset currently contains 44 hours of gameplay data from 16 games, totaling 2.97 million demonstrated actions. Human subjects played the games frame by frame, giving them ample decision time and yielding near-optimal demonstrations. This dataset could potentially be used for research in imitation learning, reinforcement learning, and visual saliency.
