STAIR Actions: A Video Dataset of Everyday Home Actions

04/12/2018
by Yuya Yoshikawa, et al.

We introduce STAIR Actions, a new large-scale video dataset for human action recognition. STAIR Actions contains 100 categories of action labels representing fine-grained everyday home actions, so that it can support research on various home tasks such as nursing, caregiving, and security. Each video in STAIR Actions has a single action label, and each action category contains around 1,000 videos obtained from YouTube or produced by crowdsourcing workers. Most videos are five to six seconds long, and the dataset contains 102,462 videos in total. We explain how we constructed STAIR Actions and compare its characteristics with those of existing datasets for human action recognition. Experiments with three major models for action recognition show that STAIR Actions can train large models and achieve good performance. STAIR Actions can be downloaded from https://actions.stair.center.
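Since each video carries a single action label and videos are grouped by category, a common way to work with such a dataset is to index it as (video path, label) pairs. The sketch below assumes a hypothetical directory layout of `root/<action_label>/<video>.mp4`; the actual archive structure may differ, so check the dataset's own documentation after downloading.

```python
import os

def index_dataset(root):
    """Index a dataset laid out as root/<action_label>/<video>.mp4.

    NOTE: this directory layout is an assumption for illustration,
    not the documented STAIR Actions archive structure.
    Returns a list of (path, label) pairs, one per video.
    """
    samples = []
    for label in sorted(os.listdir(root)):
        label_dir = os.path.join(root, label)
        if not os.path.isdir(label_dir):
            continue  # skip stray files at the top level
        for fname in sorted(os.listdir(label_dir)):
            if fname.endswith(".mp4"):
                samples.append((os.path.join(label_dir, fname), label))
    return samples
```

An index like this can then be fed to any video-loading pipeline (e.g. a PyTorch `Dataset`) that decodes clips and maps the label strings to class indices.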


Related research

- ARID: A New Dataset for Recognizing Action in the Dark (06/06/2020). The task of action recognition in dark videos is useful in various scena...
- BABEL: Bodies, Action and Behavior with English Labels (06/17/2021). Understanding the semantics of human movement – the what, how and why of...
- SLAC: A Sparsely Labeled Dataset for Action Classification and Localization (12/26/2017). This paper describes a procedure for the creation of large-scale video d...
- HAUAR: Home Automation Using Action Recognition (04/23/2019). Today, many of the home automation systems deployed are mostly controlle...
- Actions Transformations (12/02/2015). What defines an action like "kicking ball"? We argue that the true meani...
- Multi-Moments in Time: Learning and Interpreting Models for Multi-Action Video Understanding (11/01/2019). An event happening in the world is often made of different activities an...
- HAA500: Human-Centric Atomic Action Dataset with Curated Videos (09/11/2020). We contribute HAA500, a manually annotated human-centric atomic action d...
