Multi network InfoMax: A pre-training method involving graph convolutional networks

11/01/2021
by   Usman Mahmood, et al.

Discovering distinct features and their relations from data can help us uncover valuable knowledge crucial for various tasks, e.g., classification. In neuroimaging, these features could help to understand, classify, and possibly prevent brain disorders. Model introspection of highly performant, overparameterized deep learning (DL) models could help find these features and relations. However, to achieve high performance, DL models require numerous labeled training samples (n), which are rarely available in many fields. This paper presents a pre-training method involving graph convolutional/neural networks (GCNs/GNNs), based on maximizing the mutual information between two high-level embeddings of an input sample. Many recently proposed pre-training methods pre-train only one of the several possible networks in an architecture. Since almost every DL model is an ensemble of multiple networks, we take our high-level embeddings from two different networks of the model: a convolutional network and a graph network. The learned high-level graph latent representations increase performance on downstream graph classification tasks and reduce the need for large numbers of labeled data samples. We apply our method to a neuroimaging dataset to classify subjects into healthy control (HC) and schizophrenia (SZ) groups. Our experiments show that the pre-trained model significantly outperforms the non-pre-trained model and requires 50% less data to reach similar performance.
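The pre-training objective described above, maximizing mutual information between two embeddings of the same sample, is commonly estimated with a contrastive (InfoNCE-style) lower bound: matching embedding pairs are pulled together while mismatched pairs within the batch act as negatives. The sketch below is a minimal, hypothetical illustration of that idea in NumPy; the variable names (`z_cnn`, `z_gcn`), the temperature value, and the use of cosine similarity are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def infonce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE lower-bound estimator of mutual information between
    paired embeddings. z_a, z_b: (n, d) arrays; row i of each is an
    embedding of the same input sample (e.g. from a CNN and a GCN)."""
    # L2-normalize so the dot product is cosine similarity.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature  # (n, n) pairwise similarities
    # Diagonal entries are the positive pairs; off-diagonal entries in
    # each row serve as in-batch negatives.
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_softmax))

rng = np.random.default_rng(0)
n, d = 8, 16
z_cnn = rng.normal(size=(n, d))                   # stand-in CNN embeddings
z_gcn = z_cnn + 0.05 * rng.normal(size=(n, d))    # correlated graph embeddings
loss_corr = infonce_loss(z_cnn, z_gcn)            # low: embeddings share info
loss_rand = infonce_loss(z_cnn, rng.normal(size=(n, d)))  # high: independent
```

Minimizing this loss over both encoders drives their embeddings of the same sample to be predictive of each other, which is the sense in which mutual information is "maximized" during pre-training.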


