no code implementations • 7 Nov 2023 • Sang-hyun Je, Wontae Choi, Kwangjin Oh
Recently, pre-trained language model (PLM) based methods that utilize both textual and structural information have emerged, but their performance lags behind state-of-the-art (SOTA) structure-based methods, or they lose their inductive inference capability in the process of fusing structure embeddings into the text encoder.
1 code implementation • 12 Oct 2022 • Sang-hyun Je
The proposed method can generate high-quality negative samples regardless of the negative sample size and effectively mitigates the influence of false negative samples.
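The snippet above does not describe the sampling procedure itself, but the false-negative problem it addresses is standard in knowledge graph embedding: when the tail of a true triple is corrupted at random, the corrupted triple may itself be a true fact. A minimal, generic sketch of corruption with a false-negative filter (the toy triples, entity names, and `sample_negatives` helper are all hypothetical illustrations, not the paper's method):

```python
import random

# Hypothetical toy knowledge graph of (head, relation, tail) triples.
triples = [
    ("seoul", "capital_of", "korea"),
    ("tokyo", "capital_of", "japan"),
    ("korea", "neighbor_of", "japan"),
]
entities = sorted({e for h, _, t in triples for e in (h, t)})
known = set(triples)  # membership test used to detect false negatives

def sample_negatives(triple, k, rng=random):
    """Corrupt the tail of a true triple, skipping corruptions that are
    themselves known true triples (i.e., false negatives)."""
    h, r, t = triple
    negatives = []
    for e in rng.sample(entities, len(entities)):  # shuffled candidates
        if e != t and (h, r, e) not in known:
            negatives.append((h, r, e))
        if len(negatives) == k:
            break
    return negatives

negs = sample_negatives(("seoul", "capital_of", "korea"), 2)
```

Filtering against the known triple set is only one simple way to reduce false negatives; the paper's contribution is presumably a more principled sampling scheme.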
1 code implementation • 29 Jun 2022 • Minsang Kim, Sang-hyun Je, Eunjoo Park
We provide both a human-annotated test dataset and an auto-generated dataset.