CausalAPM: Generalizable Literal Disentanglement for NLU Debiasing

by Songyang Gao, et al.

Dataset bias, i.e., over-reliance on dataset-specific literal heuristics, has drawn increasing attention for its detrimental effect on the generalization ability of NLU models. Existing works focus on eliminating dataset bias by down-weighting problematic data during training, which discards valid feature information while mitigating bias. In this work, we analyze the causes of dataset bias from the perspective of causal inference and propose CausalAPM, a generalizable literal-disentangling framework that ameliorates the bias problem at the feature granularity. The proposed approach projects literal and semantic information into independent feature subspaces and constrains the involvement of literal information in subsequent predictions. Extensive experiments on three NLP benchmarks (MNLI, FEVER, and QQP) demonstrate that our framework significantly improves OOD generalization performance while maintaining ID performance.
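The abstract's core mechanism — projecting an encoder representation into separate literal and semantic subspaces and predicting mainly from the semantic one — can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual implementation: the module name, the two linear projection heads, and the cross-covariance independence penalty are all assumptions standing in for whatever constraint CausalAPM actually uses.

```python
import torch
import torch.nn as nn


class LiteralSemanticDisentangler(nn.Module):
    """Hypothetical sketch of literal/semantic disentanglement:
    project a hidden representation into two subspaces and classify
    from the semantic one, penalizing correlation between the two."""

    def __init__(self, hidden_dim=768, sub_dim=128, num_labels=3):
        super().__init__()
        self.literal_proj = nn.Linear(hidden_dim, sub_dim)   # literal subspace
        self.semantic_proj = nn.Linear(hidden_dim, sub_dim)  # semantic subspace
        self.classifier = nn.Linear(sub_dim, num_labels)     # predicts from semantics only

    def forward(self, h):
        # h: (batch, hidden_dim) encoder output, e.g. a [CLS] vector
        z_lit = self.literal_proj(h)
        z_sem = self.semantic_proj(h)

        # Independence surrogate: penalize the squared cross-covariance
        # between the centered subspace features (an assumption; the paper
        # may constrain independence differently).
        z_lit_c = z_lit - z_lit.mean(dim=0)
        z_sem_c = z_sem - z_sem.mean(dim=0)
        cov = (z_lit_c.T @ z_sem_c) / max(h.size(0) - 1, 1)
        indep_loss = cov.pow(2).mean()

        logits = self.classifier(z_sem)  # literal features are excluded here
        return logits, indep_loss
```

In training, `indep_loss` would be added (with some weight) to the task loss so that literal cues are isolated in their own subspace rather than leaking into the prediction path.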


Kernel-Whitening: Overcome Dataset Bias with Isotropic Sentence Embedding

Dataset bias has attracted increasing attention recently for its detrime...

Word Embeddings via Causal Inference: Gender Bias Reducing and Semantic Information Preserving

With widening deployments of natural language processing (NLP) in daily ...

General Debiasing for Multimodal Sentiment Analysis

Existing work on Multimodal Sentiment Analysis (MSA) utilizes multimodal...

Imitating Targets from all sides: An Unsupervised Transfer Learning method for Person Re-identification

Person re-identification (Re-ID) models usually show a limited performan...

Towards Unbiased Visual Emotion Recognition via Causal Intervention

Although much progress has been made in visual emotion recognition, rese...

Accounting for recall bias in case-control studies: a causal inference approach

A case-control study is designed to help determine if an exposure is ass...
