TART: Improved Few-shot Text Classification Using Task-Adaptive Reference Transformation

06/03/2023
by Shuo Lei, et al.

Meta-learning has emerged as a trending technique to tackle few-shot text classification and achieve state-of-the-art performance. However, the performance of existing approaches heavily depends on the inter-class variance of the support set. As a result, existing methods perform well when the semantics of the sampled classes are distinct, but fail to differentiate classes with similar semantics. In this paper, we propose a novel Task-Adaptive Reference Transformation (TART) network, which aims to enhance generalization by transforming the class prototypes to per-class fixed reference points in task-adaptive metric spaces. To further maximize the divergence between transformed prototypes in these task-adaptive metric spaces, TART introduces a discriminative reference regularization among the transformed prototypes. Extensive experiments are conducted on four benchmark datasets, and our method demonstrates clear superiority over the state-of-the-art models on all of them. In particular, our model surpasses the state-of-the-art method on the 20 Newsgroups dataset in both 1-shot and 5-shot classification, by 7.4 points in the 1-shot setting.
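The abstract's core idea, building class prototypes from the support set, mapping them into a task-adaptive metric space, and regularizing the transformed prototypes to stay far apart, can be illustrated with a minimal NumPy sketch. This is an assumption-laden simplification, not the paper's implementation: the transformation here is a plain linear map `W` (the paper constructs it per task), and `discriminative_reference_reg` is one plausible form of a divergence-maximizing regularizer; all function and variable names are illustrative.

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    # Class prototype = mean embedding of that class's support examples.
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def transform(protos, W):
    # Map prototypes into the task-adaptive metric space.
    return protos @ W

def discriminative_reference_reg(t_protos):
    # Reward large pairwise divergence between transformed prototypes:
    # mean negative squared distance, to be minimized alongside the
    # classification loss (more negative = better separated).
    n = len(t_protos)
    dists = [np.sum((t_protos[i] - t_protos[j]) ** 2)
             for i in range(n) for j in range(i + 1, n)]
    return -np.mean(dists)

def classify(query_emb, t_protos, W):
    # Assign each query to its nearest transformed prototype.
    tq = query_emb @ W
    d = ((tq[:, None, :] - t_protos[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)
```

Under this sketch, a training step would compute the prototypes from the support set, classify the queries in the transformed space, and add the regularizer to the loss so that semantically close classes are pushed apart per task.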

Related research:
- Meta-Learning Adversarial Domain Adaptation Network for Few-Shot Text Classification (07/26/2021)
- ContrastNet: A Contrastive Learning Framework for Few-Shot Text Classification (05/16/2023)
- Few-Shot Text Classification with Induction Network (02/27/2019)
- MGIMN: Multi-Grained Interactive Matching Network for Few-shot Text Classification (04/11/2022)
- Dynamic Memory Induction Networks for Few-Shot Text Classification (05/12/2020)
- Sibylvariant Transformations for Robust Text Classification (05/10/2022)
- Class Regularization: Improve Few-shot Image Classification by Reducing Meta Shift (12/18/2019)
