1 code implementation • NAACL 2022 • Huda Hakami, Mona Hakami, Angrosh Mandya, Danushka Bollegala
In this paper, we propose and evaluate several methods to address this problem, where we borrow LDPs from the entity pairs that co-occur in sentences in the corpus (i.e., with-mention entity pairs) to represent entity pairs that do not co-occur in any sentence in the corpus (i.e., without-mention entity pairs).
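One simple instance of this borrowing idea can be sketched as follows: represent a without-mention pair by the averaged LDP vectors of its nearest with-mention pairs. This is an illustrative assumption, not necessarily the exact method of the paper; the function name, the use of cosine similarity, and the k-nearest averaging are all hypothetical.

```python
import numpy as np

def borrow_ldp(pair_vec, with_mention_pairs, ldp_vecs, k=2):
    """Hypothetical sketch of LDP borrowing.

    pair_vec: embedding of the without-mention entity pair
              (e.g. concatenated entity vectors), shape (d,).
    with_mention_pairs: (n, d) embeddings of pairs that do co-occur in sentences.
    ldp_vecs: (n, m) LDP representations of those co-occurring pairs.
    Returns the mean LDP vector of the k most similar with-mention pairs.
    """
    # cosine similarity between the target pair and each with-mention pair
    sims = with_mention_pairs @ pair_vec / (
        np.linalg.norm(with_mention_pairs, axis=1) * np.linalg.norm(pair_vec) + 1e-12)
    nearest = np.argsort(-sims)[:k]          # indices of the k nearest neighbours
    return ldp_vecs[nearest].mean(axis=0)    # borrowed LDP representation
```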
no code implementations • COLING 2020 • Angrosh Mandya, Danushka Bollegala, Frans Coenen
We propose a contextualised graph convolution network over multiple dependency-based sub-graphs for relation extraction.
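The core operation such a model repeats on each dependency-based sub-graph is a graph-convolution step. A minimal sketch, assuming the standard symmetrically normalised formulation (the paper's contextualised, multi-sub-graph combination is not reproduced here):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step over a dependency sub-graph.

    A: (n, n) adjacency matrix of the sub-graph.
    H: (n, d) node (token) features.
    W: (d, k) learned weight matrix.
    Uses self-loops and the symmetric normalisation D^-1/2 (A+I) D^-1/2.
    """
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d = A_hat.sum(axis=1)                        # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)  # ReLU
```

A multi-sub-graph model would apply such a layer to each sub-graph's adjacency and combine the resulting node representations.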
no code implementations • 12 May 2020 • Angrosh Mandya, James O'Neill, Danushka Bollegala, Frans Coenen
The Conversational Question Answering (CoQA) task involves answering a sequence of inter-related conversational questions about a contextual paragraph.
1 code implementation • 22 Apr 2020 • Angrosh Mandya, Danushka Bollegala, Frans Coenen
This paper presents a contextualized graph attention network that combines edge features and multiple sub-graphs for improving relation extraction.
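One plausible way to fold edge features into graph attention is to score each neighbour on the concatenation of both node vectors and the edge feature (e.g. a dependency-label embedding). This sketch is an assumption about the general mechanism, not the paper's exact formulation; `edge_attention` and its arguments are hypothetical names.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def edge_attention(h_i, neighbours, edge_feats, a):
    """Attention over the neighbours j of node i, where the score also
    sees the edge feature e_ij: score_j = a . [h_i ; h_j ; e_ij].
    Returns the attention-weighted sum of neighbour vectors."""
    scores = np.array([a @ np.concatenate([h_i, h_j, e_ij])
                       for h_j, e_ij in zip(neighbours, edge_feats)])
    alpha = softmax(scores)                       # normalised attention weights
    return (alpha[:, None] * np.asarray(neighbours)).sum(axis=0)
```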
no code implementations • AKBC 2019 • Angrosh Mandya, Danushka Bollegala, Frans Coenen, Katie Atkinson
In this paper, we propose a combined model of Long Short-Term Memory and Convolutional Neural Networks (LSTM-CNN) that exploits word embeddings and positional embeddings for cross-sentence n-ary relation extraction.
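The positional part of the input can be sketched as follows: each token's word embedding is concatenated with position embeddings that encode its relative distance to each entity mention, and the resulting matrix is what an LSTM-CNN encoder would consume. Function names, the distance clipping, and table sizes here are illustrative assumptions.

```python
import numpy as np

def build_inputs(word_vecs, entity_positions, pos_table, max_dist=10):
    """Assemble token features for relation extraction.

    word_vecs: (n, d) word embeddings for the token sequence.
    entity_positions: token indices of the entity mentions (one per entity).
    pos_table: (2*max_dist+1, p) lookup table of position embeddings,
               indexed by clipped relative distance shifted by max_dist.
    Returns an (n, d + len(entity_positions)*p) feature matrix.
    """
    feats = [word_vecs]
    for e in entity_positions:
        # signed distance of every token to this entity, clipped to +/- max_dist
        dist = np.clip(np.arange(len(word_vecs)) - e, -max_dist, max_dist)
        feats.append(pos_table[dist + max_dist])
    return np.concatenate(feats, axis=1)  # fed to the LSTM-CNN encoder
```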