Retrieval-Augmented Meta Learning for Low-Resource Text Classification

09/10/2023
by Rongsheng Li, et al.

Meta learning has achieved promising performance in low-resource text classification, which aims to identify target classes with knowledge transferred from source classes via sets of small tasks called episodes. However, due to the limited training data in the meta-learning scenario and the inherent properties of parameterized neural networks, poor generalization has become a pressing problem. To address this issue, we propose a meta-learning based method called Retrieval-Augmented Meta Learning (RAML). RAML not only uses parameterization for inference but also retrieves non-parametric knowledge from an external corpus, which greatly alleviates the poor generalization caused by the lack of diverse training data in meta-learning. Unlike previous models that rely solely on parameters, it explicitly emphasizes the importance of non-parametric knowledge, aiming to strike a balance between parameterized neural networks and non-parametric knowledge: the model must determine which knowledge to access and utilize during inference. Additionally, our multi-view passages fusion network module effectively and efficiently integrates the retrieved information into the low-resource classification task. Extensive experiments demonstrate that RAML significantly outperforms current SOTA low-resource text classification models.
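To make the retrieve-then-fuse idea concrete, here is a minimal, self-contained sketch of episode-style inference augmented with non-parametric retrieval. It is not the paper's implementation: the bag-of-words `embed`, the cosine-based `retrieve`, the additive `fuse` (a crude stand-in for the multi-view passages fusion network), and the prototype-based `classify` are all hypothetical simplifications chosen so the example runs without a trained encoder.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words embedding; a real system would use a trained neural encoder.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, corpus_vecs, k=1):
    # Non-parametric step: fetch the k external passages most similar to the query.
    ranked = sorted(corpus_vecs, key=lambda p: cosine(query_vec, p), reverse=True)
    return ranked[:k]

def fuse(query_vec, passages):
    # Stand-in for multi-view fusion: merge query and retrieved views by summation.
    fused = Counter(query_vec)
    for p in passages:
        fused.update(p)
    return fused

def classify(query, support, corpus, k=1):
    # Episode-style inference: compare the retrieval-augmented query representation
    # against class prototypes built from the few-shot support set.
    corpus_vecs = [embed(p) for p in corpus]
    q_vec = embed(query)
    fused = fuse(q_vec, retrieve(q_vec, corpus_vecs, k))
    prototypes = {}
    for label, examples in support.items():
        proto = Counter()
        for ex in examples:
            proto.update(embed(ex))
        prototypes[label] = proto
    return max(prototypes, key=lambda lbl: cosine(fused, prototypes[lbl]))
```

In this sketch, the support set plays the role of an episode's few labeled examples, while the corpus supplies external knowledge that compensates for the scarcity of training data; swapping `embed` for a learned encoder and `fuse` for a learned fusion module would recover the structure the abstract describes.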


