Knowledge Distillation for Action Anticipation via Label Smoothing

04/16/2020
by Guglielmo Camporese, et al.

The human ability to anticipate the near future from visual observations and non-verbal cues is essential for developing intelligent systems that interact with people. Several research areas, such as human-robot interaction (HRI), assisted living, and autonomous driving, need to foresee future events, for example to avoid collisions or to help visually impaired people. Such a challenging task requires capturing and understanding the underlying structure of the analyzed domain in order to reduce prediction uncertainty. Since action anticipation can be seen as a multi-label problem with missing labels, we design and extend the idea of label smoothing by extracting semantics from the target labels. We show that this generalization is equivalent to a knowledge distillation framework in which a teacher injects useful semantic information into the model during training. In our experiments, we implement a multi-modal framework based on long short-term memory (LSTM) networks that anticipates future actions, summarizing past observations while making predictions of the future at different time steps. To validate our soft-labeling procedure, we perform extensive experiments on the egocentric EPIC-Kitchens dataset, which includes more than 2500 action classes. The experiments show that label smoothing systematically improves the performance of state-of-the-art models.
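To make the soft-labeling idea concrete, below is a minimal sketch (not the authors' code) of how uniform label smoothing can be generalized so that the smoothing mass follows a semantic prior over related classes, which acts like distillation from a fixed teacher distribution. The class-similarity matrix, parameter names, and shapes here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def smooth_targets(hard_labels, num_classes, alpha=0.1, prior=None):
    """Build soft targets: (1 - alpha) * one_hot + alpha * prior.

    With prior=None this is classic uniform label smoothing; with a
    class-similarity prior it resembles distillation from a fixed teacher.
    """
    one_hot = F.one_hot(hard_labels, num_classes).float()
    if prior is None:
        prior = torch.full_like(one_hot, 1.0 / num_classes)   # uniform mass
    else:
        prior = prior[hard_labels]                             # one row per sample
        prior = prior / prior.sum(dim=1, keepdim=True)         # normalize rows
    return (1.0 - alpha) * one_hot + alpha * prior


def soft_cross_entropy(logits, soft_targets):
    # Cross-entropy against soft targets (temperature-1 distillation loss).
    return -(soft_targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes, batch = 10, 4
    logits = torch.randn(batch, num_classes, requires_grad=True)
    labels = torch.randint(0, num_classes, (batch,))

    # Hypothetical class-similarity matrix (e.g. actions sharing a verb or
    # noun); here a random nonnegative matrix purely for illustration.
    similarity = torch.rand(num_classes, num_classes)

    loss_uniform = soft_cross_entropy(logits, smooth_targets(labels, num_classes))
    loss_semantic = soft_cross_entropy(
        logits, smooth_targets(labels, num_classes, prior=similarity)
    )
    print(loss_uniform.item(), loss_semantic.item())
```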
