Cyclical Focal Loss

02/16/2022
by Leslie N. Smith et al.

The cross-entropy softmax loss is the primary loss function used to train deep neural networks. On the other hand, the focal loss function has been demonstrated to provide improved performance when there is an imbalance in the number of training samples in each class, such as in long-tailed datasets. In this paper, we introduce a novel cyclical focal loss and demonstrate that it is a more universal loss function than cross-entropy softmax loss or focal loss. We describe the intuition behind the cyclical focal loss and our experiments provide evidence that cyclical focal loss provides superior performance for balanced, imbalanced, or long-tailed datasets. We provide numerous experimental results for CIFAR-10/CIFAR-100, ImageNet, balanced and imbalanced 4,000 training sample versions of CIFAR-10/CIFAR-100, and ImageNet-LT and Places-LT from the Open Long-Tailed Recognition (OLTR) challenge. Implementing the cyclical focal loss function requires only a few lines of code and does not increase training time. In the spirit of reproducibility, our code is available at <https://github.com/lnsmith54/CFL>.
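The abstract notes that the loss takes only a few lines to implement. As a rough illustration of the idea of cycling between emphasizing confident samples and hard samples over training, here is a minimal sketch: the exact functional form, the schedule, and the parameter names (`gamma_fl`, `gamma_hc`, `fc`) are assumptions, not the paper's definitive formulation; consult the authors' repository for the real implementation.

```python
import math

def focal_term(p_t, gamma=2.0):
    # Standard focal-loss term: down-weights confident (easy) samples.
    # p_t is the predicted probability of the true class.
    return -((1.0 - p_t) ** gamma) * math.log(p_t)

def confident_term(p_t, gamma=2.0):
    # Assumed confidence-emphasizing counterpart: up-weights confident samples.
    return -((1.0 + p_t) ** gamma) * math.log(p_t)

def xi_schedule(epoch, num_epochs, fc=4.0):
    # Assumed linear cyclical schedule: the mixing weight xi starts at 1
    # (emphasize confident samples), falls to 0 at num_epochs / fc
    # (emphasize hard samples), then rises back to 1 by the end of training.
    frac = fc * epoch / num_epochs
    return 1.0 - frac if frac <= 1.0 else (frac - 1.0) / (fc - 1.0)

def cyclical_focal_loss(p_t, epoch, num_epochs,
                        gamma_fl=2.0, gamma_hc=2.0, fc=4.0):
    # Interpolate between the two terms according to the training schedule.
    xi = xi_schedule(epoch, num_epochs, fc)
    return xi * confident_term(p_t, gamma_hc) + (1.0 - xi) * focal_term(p_t, gamma_fl)
```

In this sketch the schedule alone carries the "cyclical" behavior, so the per-sample loss stays a simple closed-form expression and adds no training-time overhead, consistent with the abstract's claim.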


Related research

- Long-tailed Visual Recognition via Gaussian Clouded Logit Adjustment (05/19/2023)
- Learning Imbalanced Data with Vision Transformers (12/05/2022)
- Orthogonal Projection Loss (03/25/2021)
- Aggregation Cross-Entropy for Sequence Recognition (04/17/2019)
- You Only Need End-to-End Training for Long-Tailed Recognition (12/11/2021)
- Stochastic Smoothing of the Top-K Calibrated Hinge Loss for Deep Imbalanced Classification (02/04/2022)
- RSG: A Simple but Effective Module for Learning Imbalanced Datasets (06/18/2021)
