Gradient Gating for Deep Multi-Rate Learning on Graphs

by T. K. Rusch, B. P. Chamberlain, M. W. Mahoney, M. M. Bronstein and S. Mishra

(Report number 2022-41)

Abstract
We present Gradient Gating (G\(^2\)), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework gates the output of GNN layers with a mechanism that enables multi-rate flow of message-passing information across the nodes of the underlying graph. Local gradients are harnessed to further modulate the message-passing updates. The framework is flexible: any basic GNN layer can serve as the core around which the multi-rate gradient gating mechanism is built. We rigorously prove that G\(^2\) alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
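To make the mechanism described above concrete, the following is a minimal NumPy sketch of one gradient-gated update. It is an illustration assembled from the abstract alone, not the report's exact formulation: `mean_aggregate` is a hypothetical stand-in for the wrapped base GNN layer, and the choice of sigmoid/tanh squashing and the exponent `p` are assumptions. The key idea shown is that per-node, per-channel rates `tau` are derived from local graph gradients (differences to neighbours), so channels whose neighbourhood already agrees stop updating.

```python
import numpy as np

def mean_aggregate(X, A):
    """Hypothetical stand-in base GNN layer: mean aggregation over
    neighbours. Any message-passing update could be wrapped instead."""
    deg = A.sum(axis=1, keepdims=True)  # assumes no isolated nodes
    return (A @ X) / deg

def g2_layer(X, A, gnn_update, p=2.0):
    """One gradient-gated update (sketch only; see the report for the
    exact scheme). A is a dense 0/1 adjacency matrix, X is (nodes, channels)."""
    # Auxiliary message-passing pass, squashed to (0, 1).
    tau_hat = 1.0 / (1.0 + np.exp(-gnn_update(X, A)))
    # Local graph gradient per node and channel: p-th powers of absolute
    # differences to each neighbour, summed over the neighbourhood.
    diffs = np.abs(tau_hat[None, :, :] - tau_hat[:, None, :]) ** p
    tau = np.tanh(np.einsum('ij,ijk->ik', A, diffs))  # rates in [0, 1)
    # Multi-rate gating: where tau is ~0 (neighbourhood agrees) the node
    # state is frozen, which is what counteracts oversmoothing.
    return (1.0 - tau) * X + tau * np.tanh(gnn_update(X, A))
```

A useful sanity check of the anti-oversmoothing behaviour: on constant node features the graph gradients vanish, so `tau = 0` and the layer leaves the features unchanged rather than smoothing further.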

Keywords: GNNs, message-passing, oversmoothing, heterophilic graphs, multi-rate learning, gating, large graphs

BibTeX
@Techreport{RCMBM22_1029,
  author = {T. K. Rusch and B. P. Chamberlain and M. W. Mahoney and M. M. Bronstein and S. Mishra},
  title = {Gradient Gating for Deep Multi-Rate Learning on Graphs},
  institution = {Seminar for Applied Mathematics, ETH Z{\"u}rich},
  number = {2022-41},
  address = {Switzerland},
  url = {https://www.sam.math.ethz.ch/sam_reports/reports_final/reports2022/2022-41.pdf},
  year = {2022}
}

Disclaimer
© Copyright for documents on this server remains with the authors. Copies of these documents made by electronic or mechanical means, including information storage and retrieval systems, may only be employed for personal use. The administrators respectfully request that authors inform them when any paper is published, to avoid copyright infringement. Note that unauthorised copying of copyright material is illegal and may lead to prosecution. Neither the administrators nor the Seminar for Applied Mathematics (SAM) accept any liability in this respect. The most recent version of a SAM report may differ in formatting and style from the published journal version. Do reference the published version if possible (see SAM Publications).