On Asymptotic Behaviors of Graph CNNs from Dynamical Systems Perspective

by Kenta Oono, et al.

Graph Convolutional Neural Networks (graph CNNs) are a promising deep learning approach for analyzing graph-structured data. However, it is known that their predictive performance does not improve (and sometimes worsens) as we stack more layers and make them deeper. To tackle this problem, we investigate the expressive power of graph CNNs by analyzing their asymptotic behaviors as the number of layers tends to infinity. Our strategy is to generalize the forward propagation of a Graph Convolutional Network (GCN), one of the most popular graph CNN variants, as a specific dynamical system. In the case of GCNs, we show that when the weights satisfy conditions determined by the spectra of the (augmented) normalized Laplacian, the output of GCNs exponentially approaches the set of signals that carry only the information of connected components and node degrees for distinguishing nodes. Our theory enables us to directly relate the expressive power of GCNs to the topological information of the underlying graphs, which is inherent in the graph spectra. To demonstrate this, we characterize the asymptotic behavior of GCNs on the Erdős–Rényi graph. We show that when the Erdős–Rényi graph is sufficiently dense and large, a wide range of GCNs on it suffers from this "information loss" in the limit of infinitely many layers with high probability. Furthermore, our theory provides principled guidelines for the weight normalization of graph CNNs. We experimentally confirmed that weight scaling based on our theory enhanced the predictive performance of GCNs on real data.
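The "information loss" described above can be illustrated numerically. The sketch below (an informal toy example, not the paper's formal construction) iterates GCN-style layers X ← ReLU(ÂXW) on a small graph, where Â = D̃^{-1/2}(A+I)D̃^{-1/2} is the augmented normalized adjacency. When the weight matrices have operator norm small enough relative to the non-principal spectrum of Â, the features collapse exponentially onto the subspace spanned by the degree vector, which carries only connectivity and degree information:

```python
import numpy as np

# Toy graph: 5 nodes, single connected component.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

A_tilde = A + np.eye(5)                       # add self-loops
d = A_tilde.sum(axis=1)                       # augmented degrees
A_hat = A_tilde / np.sqrt(np.outer(d, d))     # D^{-1/2} (A + I) D^{-1/2}

# The principal eigenvector of A_hat is proportional to sqrt(d);
# the distance to its span measures the "informative" part of the features.
u = np.sqrt(d)
u /= np.linalg.norm(u)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))               # random initial node features

for _ in range(50):
    # Orthogonal weights scaled to operator norm 0.9 (a choice satisfying
    # the contraction-type condition; not tuned to the paper's exact bound).
    W = 0.9 * np.linalg.qr(rng.standard_normal((3, 3)))[0]
    X = np.maximum(A_hat @ X @ W, 0.0)        # GCN-style layer with ReLU

# Residual component of X orthogonal to the degree-only subspace.
residual = np.linalg.norm(X - np.outer(u, u @ X))
print(f"distance to degree-only subspace after 50 layers: {residual:.2e}")
```

After 50 layers the residual is numerically negligible: whatever the initial features encoded beyond degrees and connectivity has been lost, matching the exponential convergence the abstract describes.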




