Research reports

A Survey on Oversmoothing in Graph Neural Networks

by T. K. Rusch and M. M. Bronstein and S. Mishra

(Report number 2023-17)

Abstract
Node features of graph neural networks (GNNs) tend to become increasingly similar as network depth increases. This effect is known as over-smoothing, which we axiomatically define as the exponential convergence of suitable similarity measures on the node features. Our definition unifies previous approaches and gives rise to new quantitative measures of over-smoothing. Moreover, we empirically demonstrate this behavior for several over-smoothing measures on different graphs (small-, medium-, and large-scale). We also review several approaches for mitigating over-smoothing and empirically test their effectiveness on real-world graph datasets. Through illustrative examples, we demonstrate that mitigating over-smoothing is a necessary but not sufficient condition for building deep GNNs that are expressive on a wide range of graph learning tasks. Finally, we extend our definition of over-smoothing to the rapidly emerging field of continuous-time GNNs.
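The abstract characterizes over-smoothing as the convergence of a node-feature similarity measure as layers are stacked; one widely used such measure is the graph Dirichlet energy. The NumPy sketch below is an illustrative assumption, not the authors' implementation: it stacks untrained GCN-style layers on a random graph and tracks the energy with depth. The function names (dirichlet_energy, gcn_layer), the normalization constants, and the random-weight setup are all hypothetical choices made for the example.

import numpy as np

def dirichlet_energy(X, A):
    # Graph Dirichlet energy of node features X (n x d) w.r.t. adjacency A (n x n).
    # E(X) = (1/n) * sum_{(i,j) edge} || x_i/sqrt(1+d_i) - x_j/sqrt(1+d_j) ||^2
    # (one common degree normalization; exact constants vary across papers).
    deg = A.sum(axis=1)
    Xn = X / np.sqrt(1.0 + deg)[:, None]          # degree-normalized features
    diff = Xn[:, None, :] - Xn[None, :, :]        # all pairwise feature differences
    sq = (diff ** 2).sum(axis=-1)                 # squared pairwise distances
    return (A * sq).sum() / A.shape[0]            # sum over edges only

def gcn_layer(X, A, W):
    # One vanilla GCN-style layer: symmetric normalized propagation + ReLU.
    A_hat = A + np.eye(A.shape[0])                # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    P = D_inv_sqrt @ A_hat @ D_inv_sqrt           # normalized propagation matrix
    return np.maximum(P @ X @ W, 0.0)             # ReLU activation

# Small random graph and random node features (illustrative only).
rng = np.random.default_rng(0)
n, d = 20, 16
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T                    # symmetric adjacency, no self-loops
X = rng.standard_normal((n, d))

# Track the similarity measure across depth: in this simplified untrained setup,
# the energy shrinks rapidly as layers are stacked, illustrating the kind of
# collapse of node-feature diversity that the report quantifies.
for layer in range(31):
    if layer % 5 == 0:
        print(f"layer {layer:2d}: Dirichlet energy = {dirichlet_energy(X, A):.3e}")
    W = rng.standard_normal((d, d)) / np.sqrt(d)  # random weight matrix
    X = gcn_layer(X, A, W)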

Keywords: Graph Neural Network (GNN), Oversmoothing, Deep Learning

BibTeX
@Techreport{RBM23_1054,
  author = {T. K. Rusch and M. M. Bronstein and S. Mishra},
  title = {A Survey on Oversmoothing in Graph Neural Networks},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2023-17},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2023/2023-17.pdf},
  year = {2023}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Please reference the published version if possible (see SAM Publications).