Parameter-Efficient Cross-lingual Transfer of Vision and Language Models via Translation-based Alignment

05/02/2023
by Zhen Zhang, et al.

Pre-trained vision-and-language models such as CLIP have achieved remarkable success in connecting images and texts, with a primary focus on English. Despite recent efforts to extend CLIP to other languages, performance disparities among languages persist due to uneven resource availability. Moreover, current cross-lingual transfer methods for such pre-trained models consume excessive resources when scaled to a large number of languages. We therefore propose a new parameter-efficient cross-lingual transfer learning framework that uses a translation-based alignment method to mitigate multilingual disparities, and we explore parameter-efficient fine-tuning methods for the transfer. Extensive experiments on the XTD and Multi30K datasets, covering 11 languages under zero-shot, few-shot, and full-dataset learning scenarios, show that our framework significantly reduces the disparities among languages and improves cross-lingual transfer results, especially in low-resource scenarios, while keeping and fine-tuning only an extremely small number of parameters relative to the full model (e.g., only 0.16% additional parameters per language in the few-shot learning scenario).
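To make the parameter budget concrete, the following is a minimal sketch of one common parameter-efficient fine-tuning scheme, a LoRA-style low-rank adapter, paired with a simple translation-based alignment objective. The abstract does not specify the exact adapter module or loss the authors use, so the hidden size, rank, and mean-squared-error alignment loss below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sizes (not from the paper): a CLIP-like text-encoder
# linear layer of width d, adapted with a rank-r update.
rng = np.random.default_rng(0)
d = 512   # assumed hidden size
r = 4     # assumed adapter rank

W = rng.standard_normal((d, d))         # frozen pre-trained weight
A = rng.standard_normal((r, d)) * 0.01  # trainable, stored per language
B = np.zeros((d, r))                    # trainable, zero-init so the
                                        # adapted model starts identical
                                        # to the frozen one

def adapted_forward(x):
    """Forward pass through the frozen weight plus the low-rank update B @ A."""
    return x @ (W + B @ A).T

def alignment_loss(z_src, z_en):
    """Translation-based alignment (sketch): pull the embedding of a
    machine-translated caption toward the frozen English embedding."""
    return float(np.mean((z_src - z_en) ** 2))

# Per-language storage: only A and B, not the full weight matrix.
full_params = W.size
adapter_params = A.size + B.size
ratio = adapter_params / full_params
print(f"adapter params per language: {adapter_params} "
      f"({ratio:.2%} of this frozen layer)")
```

Because only `A` and `B` are kept per language, the per-language footprint shrinks by orders of magnitude versus duplicating the full model, which is the kind of saving behind the 0.16% figure quoted above (the exact percentage depends on which layers are adapted and the chosen rank).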

