Where Are You Looking?: A Large-Scale Dataset of Head and Gaze Behavior for 360-Degree Videos and a Pilot Study

08/08/2022
by   Yili Jin, et al.

360° videos have experienced booming development in recent years. Compared to traditional videos, 360° videos are characterized by uncertain user behaviors, which bring opportunities as well as challenges. Datasets are necessary for researchers and developers to explore new ideas and conduct reproducible analyses for fair comparisons among different solutions. However, existing related datasets have mostly focused on users' field of view (FoV), ignoring the more informative eye-gaze data, not to mention the integrated extraction and analysis of both FoV and eye gaze. Moreover, users' behavior patterns are highly dependent on the video content, yet most existing datasets classify videos only subjectively and qualitatively by genre, which lacks quantitative analysis and fails to characterize the intrinsic properties of a video scene. To this end, we first propose a quantitative taxonomy for 360° videos that contains three objective technical metrics. Based on this taxonomy, we collect a dataset that captures users' head and gaze behaviors simultaneously, outperforming existing datasets in dimensional richness, scale, diversity, and sampling frequency. We then conduct a pilot study on users' behaviors and report several interesting findings, such as that a user's head direction tends to follow his or her gaze direction after a characteristic time interval. We further present a case study of tile-based 360° video streaming based on our dataset, demonstrating a substantial performance improvement over existing works by leveraging the provided gaze information. Our dataset is available at https://cuhksz-inml.github.io/head_gaze_dataset/

