Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt

02/23/2022
by   Lianzhe Huang, et al.

Prompt-based tuning has proven effective for pretrained language models (PLMs). While most existing work focuses on monolingual prompts, we study multilingual prompts for multilingual PLMs, especially in the zero-shot cross-lingual setting. To reduce the effort of designing different prompts for multiple languages, we propose a novel model, called UniPrompt, that uses a unified prompt for all languages. Unlike discrete prompts and soft prompts, the unified prompt is model-based and language-agnostic. Specifically, the unified prompt is initialized by a multilingual PLM to produce a language-independent representation, after which it is fused with the text input. During inference, the prompt can be pre-computed, so no extra computation cost is incurred. To complement the unified prompt, we propose a new initialization method for the target label word that further improves the model's transferability across languages. Extensive experiments show that our proposed methods significantly outperform strong baselines across different languages. We will release data and code to facilitate future research.
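The inference-time flow described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the paper's released code: `encode`, `PROMPT_TOKENS`, and `build_input` are made-up names, and the toy encoder stands in for a frozen multilingual PLM. The point it shows is that the unified prompt is produced once from fixed pseudo-tokens and cached, so fusing it with each new input adds no extra encoder passes.

```python
# Hypothetical sketch of UniPrompt-style inference (illustrative names only).

def encode(tokens):
    # Stand-in for a frozen multilingual PLM encoder:
    # maps each token to a small vector.
    return [[float(len(t)), float(i)] for i, t in enumerate(tokens)]

# The unified prompt is language-agnostic: it is computed once from fixed
# pseudo-tokens and cached, so inference incurs no extra prompt-encoding cost.
PROMPT_TOKENS = ["<p0>", "<p1>"]
CACHED_PROMPT = encode(PROMPT_TOKENS)  # pre-computed, reused for every input

def build_input(text_tokens):
    # Fuse the cached prompt representation with the embedded input text.
    return CACHED_PROMPT + encode(text_tokens)

fused = build_input(["this", "movie", "rocks"])
print(len(fused))  # prompt length + text length
```

Because `CACHED_PROMPT` does not depend on the input language or content, the same cached vectors are prepended for every language, which is what makes the prompt "unified" in the zero-shot cross-lingual setting.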


Related research

02/28/2022 · Cross-Lingual Text Classification with Multilingual Distillation and Zero-Shot-Aware Training
Multilingual pre-trained language models (MPLMs) not only can handle tas...

03/15/2022 · Multilingual Generative Language Models for Zero-Shot Cross-Lingual Event Argument Extraction
We present a study on leveraging multilingual pre-trained generative lan...

08/26/2023 · ZC3: Zero-Shot Cross-Language Code Clone Detection
Developers introduce code clones to improve programming productivity. Ma...

07/04/2022 · Unify and Conquer: How Phonetic Feature Representation Affects Polyglot Text-To-Speech (TTS)
An essential design decision for multilingual Neural Text-To-Speech (NTT...

05/09/2023 · Boosting Zero-shot Cross-lingual Retrieval by Training on Artificially Code-Switched Data
Transferring information retrieval (IR) models from a high-resource lang...

10/25/2022 · Multilingual Relation Classification via Efficient and Effective Prompting
Prompting pre-trained language models has achieved impressive performanc...

10/05/2021 · Analyzing the Effects of Reasoning Types on Cross-Lingual Transfer Performance
Multilingual language models achieve impressive zero-shot accuracies in ...
