Individual and Structural Graph Information Bottlenecks for Out-of-Distribution Generalization

06/28/2023
by   Ling Yang, et al.

Out-of-distribution (OOD) graph generalization is critical for many real-world applications. Existing methods neglect to discard spurious or noisy input features that are irrelevant to the label. Moreover, they mainly conduct instance-level class-invariant graph learning and fail to exploit the structural class relationships between graph instances. In this work, we address these issues in a unified framework, dubbed Individual and Structural Graph Information Bottlenecks (IS-GIB). To remove class-irrelevant spurious features caused by distribution shifts, we propose the Individual Graph Information Bottleneck (I-GIB), which discards irrelevant information by minimizing the mutual information between the input graph and its embedding. To leverage structural intra- and inter-domain correlations, we propose the Structural Graph Information Bottleneck (S-GIB). Specifically, for a batch of graphs drawn from multiple domains, S-GIB first computes the pairwise input-input, embedding-embedding, and label-label correlations. It then minimizes the mutual information between input-graph pairs and embedding pairs while maximizing the mutual information between embedding pairs and label pairs. The key insight of S-GIB is to simultaneously discard spurious features and learn invariant features from a high-order perspective by maintaining class relationships under multiple distribution shifts. Notably, we unify the proposed I-GIB and S-GIB into the complementary framework IS-GIB. Extensive experiments on both node- and graph-level tasks consistently demonstrate the superior generalization ability of IS-GIB. The code is available at https://github.com/YangLing0818/GraphOOD.
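The two objectives described above can be sketched in simplified form: a variational (VIB-style) upper bound on I(X; Z) for the individual compression term, and a pairwise relational term that aligns embedding-embedding similarities with label-label agreement. This is a minimal illustration of the general idea, not the paper's exact estimators; the function names and the choice of KL surrogate and MSE alignment are assumptions for the sketch.

```python
import torch
import torch.nn.functional as F

def i_gib_compression(mu, logvar):
    # VIB-style upper bound on I(X; Z): KL(q(z|x) || N(0, I)),
    # averaged over the batch. Encoder outputs mu, logvar per graph.
    # (Assumed surrogate; the paper may use a different MI estimator.)
    return 0.5 * torch.mean(
        torch.sum(mu.pow(2) + logvar.exp() - logvar - 1.0, dim=-1)
    )

def s_gib_relational(z, y):
    # Structural term (simplified): align the pairwise
    # embedding-embedding similarity matrix with the pairwise
    # label-label agreement matrix, so class relationships are
    # preserved across domains in a batch.
    z = F.normalize(z, dim=-1)
    emb_sim = z @ z.t()                                      # (B, B) cosine similarities
    label_sim = (y.unsqueeze(0) == y.unsqueeze(1)).float()   # (B, B) label agreement
    return F.mse_loss(emb_sim, label_sim)
```

In training, these two terms would be weighted and added to a standard prediction loss (e.g. cross-entropy on the graph or node labels), trading off compression against invariant, class-relationship-preserving embeddings.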

