Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns

by Susheel Suresh, et al.

Graph neural networks (GNNs) have achieved tremendous success on multiple graph-based learning tasks by fusing network structure and node features. Modern GNN models are built upon iterative aggregation of neighbor/proximity features via message passing. Their prediction performance has been shown to be strongly bounded by assortative mixing in the graph, a key property whereby nodes with similar attributes mix or connect with each other. We observe that real-world networks exhibit heterogeneous or diverse mixing patterns, and that conventional global measurements of assortativity, such as the global assortativity coefficient, may not be representative statistics for quantifying this mixing. We adopt a generalized concept, node-level assortativity, defined at the node level to better represent these diverse patterns and to accurately quantify the learnability of GNNs. We find that the prediction performance of a wide range of GNN models is highly correlated with node-level assortativity. To break this limit, in this work we focus on transforming the input graph into a computation graph that contains both proximity and structural information as distinct types of edges. The resulting multi-relational graph has an enhanced level of assortativity and, more importantly, preserves rich information from the original graph. We then propose to run GNNs on this computation graph and show that adaptively choosing between structure and proximity leads to improved performance under diverse mixing. Empirically, we show the benefits of adopting our transformation framework for the semi-supervised node classification task on a variety of real-world graph learning benchmarks.
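To make the idea of node-level assortativity concrete, here is a minimal sketch of one common per-node proxy: the fraction of a node's neighbors that share its class label (local label homophily). This is an illustrative measure, not the paper's exact definition; the function name, the adjacency-dict representation, and the toy graph are all assumptions for the example.

```python
def node_level_assortativity(adj, labels):
    """Per-node assortativity proxy: the fraction of a node's neighbors
    that share its label (a simple local homophily measure).

    adj: dict mapping node -> iterable of neighbor nodes
    labels: dict mapping node -> class label
    NOTE: illustrative sketch only, not the paper's exact definition.
    """
    scores = {}
    for v, nbrs in adj.items():
        nbrs = list(nbrs)
        if not nbrs:
            scores[v] = 0.0  # isolated node: no mixing to measure
            continue
        scores[v] = sum(labels[u] == labels[v] for u in nbrs) / len(nbrs)
    return scores

# Toy graph: two triangles (nodes 0-2 labeled "a", nodes 3-5 labeled "b")
# joined by the single bridge edge (2, 3).
adj = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
labels = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
scores = node_level_assortativity(adj, labels)
# Interior nodes are perfectly assortative (score 1.0), while the
# bridge endpoints 2 and 3 mix across classes (score 2/3).
print(scores[0], round(scores[2], 2))  # → 1.0 0.67
```

A graph with a high global assortativity coefficient can still contain pockets of low-scoring nodes like the bridge endpoints above, which is exactly the heterogeneity a single global statistic hides.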

