DRew: Dynamically Rewired Message Passing with Delay

13 May 2023 · Benjamin Gutteridge, Xiaowen Dong, Michael Bronstein, Francesco Di Giovanni

Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
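To make the two mechanisms concrete, below is a minimal NumPy sketch of a DRew-style forward pass with mean aggregation: hop k only becomes visible at layer k-1 (gradual densification), and the delay mechanism reads a k-hop neighbour's message from an earlier layer's state. The helper names (`shortest_path_distances`, `drew_forward`), the mean aggregation, the tanh nonlinearity, and the delay rule `max(ceil(k / nu) - 1, 0)` are illustrative assumptions, not the paper's exact parameterisation.

```python
import numpy as np

def shortest_path_distances(adj):
    """All-pairs shortest-path distances by BFS on an unweighted graph
    given as a dense 0/1 adjacency matrix (fine for small toy graphs)."""
    n = adj.shape[0]
    dist = np.full((n, n), np.inf)
    for s in range(n):
        dist[s, s] = 0
        frontier, d = [s], 0
        while frontier:
            d += 1
            nxt = []
            for v in frontier:
                for u in np.flatnonzero(adj[v]):
                    if np.isinf(dist[s, u]):
                        dist[s, u] = d
                        nxt.append(u)
            frontier = nxt
    return dist

def drew_forward(adj, x, weights, nu=1):
    """Sketch of a DRew-style forward pass with mean aggregation.

    At layer t (0-indexed), node v aggregates over neighbours at exact
    distance k = 1, ..., t + 1, so the graph densifies gradually rather
    than every hop being available at every layer. The delay draws a
    k-hop neighbour's message from max(ceil(k / nu) - 1, 0) layers
    earlier (assumed parameterisation: nu = 1 delays by k - 1 layers;
    large nu recovers the undelayed variant).

    weights[t] = (w_self, w_hops): a self matrix and one matrix per hop.
    """
    dist = shortest_path_distances(adj)
    history = [x]                 # history[t] = states entering layer t
    h = x
    for t, (w_self, w_hops) in enumerate(weights):
        out = h @ w_self          # self/residual term
        for k in range(1, min(t + 1, len(w_hops)) + 1):  # hops switched on so far
            delay = max(int(np.ceil(k / nu)) - 1, 0)
            src = history[max(t - delay, 0)]    # delayed states for distant hops
            mask = (dist == k).astype(x.dtype)  # selects exact k-hop pairs
            deg = mask.sum(axis=1, keepdims=True)
            agg = (mask @ src) / np.maximum(deg, 1.0)  # mean over the k-hop set
            out = out + agg @ w_hops[k - 1]
        h = np.tanh(out)
        history.append(h)
    return h

# Toy path graph 0-1-2-3: node 0 first "hears" node 3 (3 hops away) at layer 2.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
feat = 4
weights = [(0.1 * rng.standard_normal((feat, feat)),
            [0.1 * rng.standard_normal((feat, feat)) for _ in range(t + 1)])
           for t in range(3)]
h = drew_forward(adj, rng.standard_normal((4, feat)), weights, nu=1)
print(h.shape)  # (4, 4)
```

With nu = 1, the most distant hops contribute their oldest (input-level) states, preserving the inductive bias of graph distance; raising nu shrinks the delay toward the instantly-communicating rewired setting the abstract cautions against.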


Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Node Classification | PascalVOC-SP | DRew-GatedGCN+LapPE | macro F1 | 0.3314±0.0024 | #5 |
| Link Prediction | PCQM-Contact | DRew-GCN | MRR | 0.3444±0.0017 | #6 |
| Graph Classification | Peptides-func | DRew-GCN+LapPE | AP | 0.7150±0.0044 | #1 |
| Graph Regression | Peptides-struct | DRew-GCN+LapPE | MAE | 0.2536±0.0015 | #16 |
