SoftEdge: Regularizing Graph Classification with Random Soft Edges

21 Apr 2022 · Hongyu Guo, Sun Sun

Augmented graphs play a vital role in regularizing Graph Neural Networks (GNNs), which learn by exchanging information along graph edges in the form of message passing. Owing to their effectiveness, simple edge and node manipulations (e.g., addition and deletion) are widely used for graph augmentation. Nevertheless, such manipulations can dramatically change the semantics of the original graph, causing overly aggressive augmentation and thus under-fitting during GNN training. To address this problem arising from dropping or adding graph edges and nodes, we propose SoftEdge, which assigns random weights to a portion of the edges of a given graph for augmentation. The synthetic graph generated by SoftEdge retains the same nodes and connectivity as the original graph, thus mitigating changes to its semantics. We empirically show that this simple method achieves superior accuracy to popular node and edge manipulation approaches and notable resilience to accuracy degradation as GNN depth increases.
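The abstract describes the core operation: select a subset of edges and replace their unit weights with random soft weights, leaving nodes and connectivity intact. Below is a minimal sketch of this idea; the function name, the fraction `p`, and the uniform weight distribution are illustrative assumptions, not details confirmed by the paper's released code.

```python
import torch

def soft_edge_weights(edge_index: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    """Sketch of a SoftEdge-style augmentation.

    edge_index: LongTensor of shape [2, num_edges] (PyG-style connectivity).
    Returns per-edge weights: a random fraction p of edges receives a
    uniformly sampled weight in (0, 1); all other edges keep weight 1.
    The graph's nodes and edge set are left unchanged.
    """
    num_edges = edge_index.size(1)
    edge_weight = torch.ones(num_edges)

    # Randomly pick the edges to soften.
    mask = torch.rand(num_edges) < p
    # Assign random soft weights to the selected edges only.
    edge_weight[mask] = torch.rand(int(mask.sum()))

    return edge_weight
```

In practice, such weights would be passed to any GNN layer that accepts per-edge weights (for example, `GCNConv(x, edge_index, edge_weight)` in PyTorch Geometric), so the augmentation slots into an existing training loop without altering the graph structure.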
