Zero-Shot Learners for Natural Language Understanding via a Unified Multiple Choice Perspective

10/16/2022
by Ping Yang, et al.

We propose a new paradigm for zero-shot learners that is format agnostic: it is compatible with any task format and applicable to a wide range of language tasks, such as text classification, commonsense reasoning, coreference resolution, and sentiment analysis. Zero-shot learning aims to train a model on given tasks such that it can address new tasks without any additional training. Our approach converts zero-shot learning into multiple-choice tasks, avoiding the problems of commonly used large-scale generative models such as FLAN. It not only improves generalization but also significantly reduces the number of parameters required, making both training and deployment efficient. Our approach achieves state-of-the-art performance on several benchmarks and produces strong results on tasks such as natural language inference and text classification. It does so with only 235M parameters, substantially fewer than the billions used by comparable state-of-the-art models. The code and pre-trained models are available at https://github.com/IDEA-CCNL/Fengshenbang-LM .
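As a concrete illustration of the multiple-choice framing, the sketch below recasts a sentiment-classification instance as a shared prompt paired with candidate options and scores the options with a multiple-choice head. This is a minimal sketch under stated assumptions, not the authors' implementation: the bert-base-uncased checkpoint and the example task are illustrative choices, and its multiple-choice head is untrained here, so the scores are arbitrary; in practice one would load the pre-trained checkpoints from the linked repository.

```python
# Minimal sketch of the unified multiple-choice framing (not the authors'
# code). A task instance becomes a shared prompt paired with each candidate
# option, and a multiple-choice model scores the options jointly.
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

# Illustrative assumption: a generic encoder. The multiple-choice head of
# bert-base-uncased is randomly initialized, so real use would load a
# fine-tuned or released checkpoint instead.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMultipleChoice.from_pretrained("bert-base-uncased")

# Sentiment analysis recast as multiple choice.
context = "The movie was a delight from start to finish."
question = "What is the sentiment of this review?"
options = ["positive", "negative"]

# Encode (prompt, option) pairs; one row per candidate option.
prompt = f"{context} {question}"
enc = tokenizer([prompt] * len(options), options,
                return_tensors="pt", padding=True)

# Multiple-choice models expect shape (batch_size, num_choices, seq_len).
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

print("Predicted option:", options[logits.argmax(dim=-1).item()])
```

Because every task is expressed in the same option-scoring form, the same model can be applied to unseen tasks at inference time simply by supplying a new prompt and option set, which is the essence of the format-agnostic zero-shot claim above.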


Related research

Zero-Shot Text Classification via Self-Supervised Tuning (05/19/2023)
Existing solutions to zero-shot text classification either conduct promp...

Improving Few-shot Text Classification via Pretrained Language Representations (08/22/2019)
Text classification tends to be difficult when the data is deficient or ...

Zero-shot Text Classification With Generative Language Models (12/10/2019)
This work investigates the use of natural language to enable zero-shot m...

Approximating Human-Like Few-shot Learning with GPT-based Compression (08/14/2023)
In this work, we conceptualize the learning process as information compr...

ZeroSearch: Local Image Search from Text with Zero Shot Learning (05/01/2023)
The problem of organizing and finding images in a user's directory has b...

Zero-shot Approach to Overcome Perturbation Sensitivity of Prompts (05/25/2023)
Recent studies have demonstrated that natural-language prompts can help ...

EVA-CLIP: Improved Training Techniques for CLIP at Scale (03/27/2023)
Contrastive language-image pre-training, CLIP for short, has gained incr...
