Unsupervised Natural Language Inference via Decoupled Multimodal Contrastive Learning

10/16/2020
by Wanyun Cui, et al.

We propose to solve the natural language inference problem without any supervision from inference labels, via task-agnostic multimodal pretraining. Although recent studies of multimodal self-supervised learning also represent linguistic and visual context, their encoders for the different modalities are coupled, so they cannot incorporate visual information when encoding plain text alone. In this paper, we propose the Multimodal Aligned Contrastive Decoupled learning (MACD) network. MACD forces the decoupled text encoder to represent visual information via contrastive learning, so it embeds visual knowledge even when performing plain-text inference. We conducted comprehensive experiments on plain-text inference datasets (i.e., SNLI and STS-B). The unsupervised MACD even outperforms the fully supervised BiLSTM and BiLSTM+ELMo on STS-B.
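As a rough illustration of the alignment idea described above (not the authors' exact objective), the sketch below shows a symmetric InfoNCE-style contrastive loss between embeddings produced by two fully separate encoders; the function name, temperature value, and PyTorch framing are assumptions for illustration only.

import torch
import torch.nn.functional as F

def alignment_loss(text_emb, image_emb, temperature=0.07):
    """Symmetric contrastive loss pulling paired text/image embeddings together.

    text_emb:  (batch, dim) from a text-only encoder (never sees images)
    image_emb: (batch, dim) from a separate image encoder (never sees text)
    """
    # Cosine-similarity logits between every text/image pair in the batch
    text_emb = F.normalize(text_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = text_emb @ image_emb.t() / temperature

    # Matched pairs sit on the diagonal; treat them as classification targets
    targets = torch.arange(text_emb.size(0), device=text_emb.device)
    loss_t2i = F.cross_entropy(logits, targets)      # text -> image
    loss_i2t = F.cross_entropy(logits.t(), targets)  # image -> text
    return 0.5 * (loss_t2i + loss_i2t)

Because the two encoders never share inputs during this kind of pretraining, the text encoder can later be used on its own (e.g., on SNLI premise/hypothesis pairs) while still carrying the visual signal it was aligned to.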
