Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction

22 May 2023 · Adrian Kochsiek, Apoorv Saxena, Inderjeet Nair, Rainer Gemulla

We propose KGT5-context, a simple sequence-to-sequence model for link prediction (LP) in knowledge graphs (KGs). Our work expands on KGT5, a recent LP model that exploits textual features of the KG, has a small model size, and is scalable. To reach good predictive performance, however, KGT5 relies on an ensemble with a knowledge graph embedding (KGE) model, which itself is excessively large and costly to use. In this short paper, we show empirically that adding contextual information, i.e., information about the direct neighborhood of the query entity, alleviates the need for a separate KGE model to obtain good performance. The resulting KGT5-context model is simple, reduces model size significantly, and obtains state-of-the-art performance in our experimental study.
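
To make the idea concrete, below is a minimal sketch of how a link-prediction query plus the query entity's one-hop neighborhood can be verbalized into a single text sequence for a T5-style seq2seq model. This is an illustration only: the prompt prefix, the separator tokens, the build_input helper, and the "t5-small" stand-in checkpoint are assumptions, not the authors' exact verbalization scheme or released model.

```python
# Illustrative sketch (not the authors' exact input format): verbalize a
# link-prediction query together with the query entity's direct neighborhood
# so that a T5-style seq2seq model can decode candidate answer entities as text.
from transformers import T5Tokenizer, T5ForConditionalGeneration

def build_input(query_entity, query_relation, neighbors, description=None):
    """Serialize the query and its neighborhood context into one text sequence.

    `neighbors` is assumed to be a list of (relation, entity) pairs from the
    query entity's one-hop neighborhood; prefixes and separators are made up.
    """
    parts = [f"predict tail: {query_entity} | {query_relation}"]
    if description:
        parts.append(f"description: {description}")
    for rel, ent in neighbors:
        parts.append(f"context: {rel} {ent}")
    return " ; ".join(parts)

# Hypothetical Wikidata5M-style query.
text = build_input(
    query_entity="Douglas Adams",
    query_relation="educated at",
    neighbors=[("occupation", "writer"), ("country of citizenship", "United Kingdom")],
)

tokenizer = T5Tokenizer.from_pretrained("t5-small")              # small stand-in checkpoint
model = T5ForConditionalGeneration.from_pretrained("t5-small")
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, num_return_sequences=5, max_new_tokens=16)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))  # ranked candidate strings
```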


Datasets

Wikidata5M

Results from the Paper


Task             Dataset      Model                         MRR          Hits@1       Hits@3       Hits@10
Link Prediction  Wikidata5M   KGT5-context + Description    0.426 (#1)   0.406 (#1)   0.44  (#1)   0.46  (#1)
Link Prediction  Wikidata5M   KGT5 + Description            0.381 (#2)   0.357 (#2)   0.397 (#2)   0.422 (#5)
Link Prediction  Wikidata5M   KGT5-context                  0.378 (#3)   0.35  (#3)   0.396 (#3)   0.427 (#3)
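
The numbers above are standard link-prediction ranking metrics. As a quick reference, the sketch below shows how MRR and Hits@k are typically computed from the rank of the correct entity for each test query; the example ranks are made up for illustration.

```python
# Compute MRR and Hits@k from per-query ranks of the correct entity (1 = best).
def mrr(ranks):
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 12, 2, 1]  # hypothetical filtered ranks over test queries
print(f"MRR     = {mrr(ranks):.3f}")
print(f"Hits@1  = {hits_at_k(ranks, 1):.3f}")
print(f"Hits@10 = {hits_at_k(ranks, 10):.3f}")
```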

Methods


No methods listed for this paper.