DyG2Vec: Efficient Representation Learning for Dynamic Graphs

30 Oct 2022 · Mohammad Ali Alomrani, Mahdi Biparva, Yingxue Zhang, Mark Coates

Temporal graph neural networks have shown promising results in learning inductive representations by automatically extracting temporal patterns. However, previous works often rely on complex memory modules or inefficient random walk methods to construct temporal representations. To address these limitations, we present an efficient yet effective attention-based encoder that leverages temporal edge encodings and window-based subgraph sampling to generate task-agnostic embeddings. Moreover, we propose a joint-embedding architecture using non-contrastive self-supervised learning (SSL) to learn rich temporal embeddings without labels. Experimental results on 7 benchmark datasets indicate that, on average, our model outperforms state-of-the-art (SoTA) baselines on the future link prediction task by 4.23% in the transductive setting and 3.30% in the inductive setting, while requiring 5-10x less training/inference time. Lastly, different aspects of the proposed framework are investigated through experimental analysis and ablation studies. The code is publicly available at https://github.com/huawei-noah/noah-research/tree/master/graph_atlas.
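To make the abstract's two core ideas concrete, the sketch below illustrates (i) fixed-window temporal subgraph sampling and (ii) a non-contrastive, VICReg-style joint-embedding loss. This is a minimal PyTorch sketch under assumed names and tensor shapes (`sample_window_subgraph`, `noncontrastive_loss`, `edge_index`, `edge_time`, and all hyper-parameters are illustrative), not the authors' implementation; the official code lives in the repository linked above.

```python
# Hedged sketch of the two ideas named in the abstract; not the authors' code.
import torch
import torch.nn.functional as F


def sample_window_subgraph(edge_index, edge_time, t, window_size):
    """Keep only the `window_size` most recent edges occurring before time t.

    edge_index: (2, E) source/destination node ids, assumed sorted by time.
    edge_time:  (E,) edge timestamps.
    Returns the sliced edges forming the encoder's input subgraph.
    """
    idx = (edge_time < t).nonzero(as_tuple=True)[0]  # only past edges visible
    idx = idx[-window_size:]                         # fixed-size recent window
    return edge_index[:, idx], edge_time[idx]


def noncontrastive_loss(z1, z2, var_weight=1.0, cov_weight=0.04):
    """A VICReg-flavoured non-contrastive loss on two embedding views.

    z1, z2: (N, D) embeddings of two views of the same temporal subgraph.
    The invariance term pulls the views together; the variance and covariance
    terms prevent representational collapse without negative samples.
    """
    inv = F.mse_loss(z1, z2)                                     # invariance
    std1 = torch.sqrt(z1.var(dim=0) + 1e-4)
    std2 = torch.sqrt(z2.var(dim=0) + 1e-4)
    var = F.relu(1.0 - std1).mean() + F.relu(1.0 - std2).mean()  # variance
    z1c, z2c = z1 - z1.mean(dim=0), z2 - z2.mean(dim=0)
    n, d = z1.shape
    cov1 = (z1c.T @ z1c) / (n - 1)
    cov2 = (z2c.T @ z2c) / (n - 1)
    cov = ((cov1.pow(2).sum() - cov1.pow(2).diagonal().sum())
           + (cov2.pow(2).sum() - cov2.pow(2).diagonal().sum())) / d
    return inv + var_weight * var + cov_weight * cov
```

Because the encoder only ever sees the most recent `window_size` edges, the per-prediction cost stays bounded regardless of history length; under this assumption, that is the plausible source of the 5-10x speed-up reported over memory-based and random-walk baselines.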


Datasets


Task                     Dataset           Model    Metric  Value  Global Rank
Dynamic Link Prediction  Enron             DyG2Vec  AP      99.1   #1
Dynamic Link Prediction  LastFM            DyG2Vec  AP      96     #1
Dynamic Link Prediction  MOOC              DyG2Vec  AP      98     #1
Dynamic Link Prediction  Reddit            DyG2Vec  AP      99.6   #1
Dynamic Link Prediction  Social Evolution  DyG2Vec  AP      98.7   #1
Dynamic Link Prediction  UCI               DyG2Vec  AP      98.8   #1
Dynamic Link Prediction  Wikipedia         DyG2Vec  AP      99.5   #1