Over-smoothing Effect of Graph Convolutional Networks

01/30/2022
by Fang Sun, et al.

Over-smoothing is a severe problem that limits the depth of Graph Convolutional Networks (GCNs). This article gives a comprehensive analysis of the mechanism behind GCNs and the over-smoothing effect, and proposes an upper bound for the occurrence of over-smoothing, which offers insight into its key contributing factors. The results presented in the article also explain the feasibility of several existing algorithms that alleviate over-smoothing.
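The over-smoothing effect discussed above can be illustrated with a small numerical sketch. The snippet below is an assumption-laden illustration, not the authors' analysis: the toy graph, feature dimension, and depth schedule are all made up, and learned weights and nonlinearities are deliberately omitted to isolate the standard symmetrically normalized propagation operator used in GCNs. Repeatedly applying that operator collapses node representations toward a subspace determined largely by node degrees, so distinct input features become nearly indistinguishable as depth grows.

```python
# Minimal numpy sketch (not from the paper) of how repeated GCN-style
# propagation H <- A_hat @ H smooths node features on a toy graph.
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: two triangles joined by a single bridge edge (6 nodes).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Symmetric normalization with self-loops: A_hat = D^{-1/2} (A + I) D^{-1/2}.
A_tilde = A + np.eye(n)
d = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(d, d))

# Random node features; propagate without weights or nonlinearities to
# isolate the smoothing behavior of the graph operator itself.
H = rng.normal(size=(n, 4))
for layer in range(1, 33):
    H = A_hat @ H
    # Node representations collapse toward a degree-determined direction,
    # so the mean pairwise distance between nodes shrinks with depth.
    dists = np.linalg.norm(H[:, None, :] - H[None, :, :], axis=-1)
    if layer in (1, 2, 4, 8, 16, 32):
        print(f"layers={layer:2d}  mean pairwise distance={dists.mean():.4f}")
```

Running the sketch shows the mean pairwise distance decaying rapidly toward a small residual value as the effective depth increases, which is the qualitative behavior that an upper bound on the occurrence of over-smoothing aims to characterize.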


Related research

06/22/2023  On Addressing the Limitations of Graph Neural Networks
08/19/2022  Graph Convolutional Networks from the Perspective of Sheaves and the Neural Tangent Kernel
10/28/2021  On Provable Benefits of Depth in Training Graph Convolutional Networks
08/22/2020  Tackling Over-Smoothing for General Graph Convolutional Networks
12/22/2021  SkipNode: On Alleviating Over-smoothing for Deep Graph Convolutional Networks
06/21/2023  Structure-Aware DropEdge Towards Deep Graph Convolutional Networks
07/04/2018  BayesGrad: Explaining Predictions of Graph Convolutional Networks
