no code implementations • 19 Apr 2023 • Hao Fei, Tat-Seng Chua, Chenliang Li, Donghong Ji, Meishan Zhang, Yafeng Ren
In this study, we propose to enhance the ABSA robustness by systematically rethinking the bottlenecks from all possible angles, including model, data, and training.
Aspect-Based Sentiment Analysis (ABSA) +2
no code implementations • 6 Oct 2022 • Hao Fei, Shengqiong Wu, Meishan Zhang, Yafeng Ren, Donghong Ji
In this work, we investigate the integration of a latent graph for CSRL.
no code implementations • IEEE 2021 • Hao Fei, Yafeng Ren, Yue Zhang, Donghong Ji
Aspect-based sentiment triplet extraction (ASTE) aims at jointly recognizing triplets from text, i.e., aspect terms, opinion expressions, and the correlated sentiment polarities.
Ranked #3 on Aspect Sentiment Triplet Extraction on ASTE-Data-V2
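The triplet structure targeted by ASTE can be illustrated with a minimal sketch; the class name, example sentence, and annotations below are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SentimentTriplet:
    aspect: str    # aspect term, e.g. "battery life"
    opinion: str   # opinion expression, e.g. "amazing"
    polarity: str  # sentiment polarity: "POS", "NEG", or "NEU"

# Hypothetical gold annotation for one review sentence
sentence = "The battery life is amazing but the screen is dim ."
triplets = [
    SentimentTriplet("battery life", "amazing", "POS"),
    SentimentTriplet("screen", "dim", "NEG"),
]
```

An ASTE model must extract all such triplets from the raw sentence, so the three elements are predicted jointly rather than in separate pipeline stages.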
1 code implementation • Conference 2021 • Hao Fei, Fei Li, Bobo Li, Yijiang Liu, Yafeng Ren, Donghong Ji
A majority of research interest in irregular (e.g., nested or discontinuous) named entity recognition (NER) has been paid to nested entities, while discontinuous entities have received limited attention.
1 code implementation • 6 May 2021 • Shengqiong Wu, Hao Fei, Yafeng Ren, Donghong Ji, Jingye Li
In this paper, we propose to enhance the pair-wise aspect and opinion terms extraction (PAOTE) task by incorporating rich syntactic knowledge.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Hao Fei, Yafeng Ren, Donghong Ji
Recent studies show that integrating syntactic tree models with sequential semantic models can improve task performance; however, these methods mostly employ a shallow integration of syntax and semantics.
no code implementations • 19 Sep 2020 • Bobo Li, Hao Fei, Yafeng Ren, Donghong Ji
A lexical chain consists of cohesive words in a document; it implies the underlying structure of a text and thus facilitates downstream NLP tasks.
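As a rough illustration of the idea, the sketch below groups sentence indices by recurring content words. This is a crude stand-in for true lexical chains, which also link words via synonymy and other lexical relations; the function name, stopword list, and example document are assumptions for this sketch:

```python
from collections import defaultdict

def naive_lexical_chains(sentences):
    """Group sentence indices by content words that recur across
    sentences -- a simplified repetition-only notion of a chain."""
    occurrences = defaultdict(list)
    stop = {"the", "a", "is", "of", "and", "in", "it", "'s"}
    for i, sent in enumerate(sentences):
        for tok in sent.lower().split():
            if tok not in stop:
                occurrences[tok].append(i)
    # only words spanning more than one sentence form a chain
    return {w: idxs for w, idxs in occurrences.items()
            if len(set(idxs)) > 1}

doc = [
    "The bank raised interest rates",
    "Higher rates worry the bank customers",
]
print(naive_lexical_chains(doc))  # → {'bank': [0, 1], 'rates': [0, 1]}
```

The chains ("bank", "rates") hint at the document's topical backbone, which is the structural signal the abstract refers to.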
no code implementations • EMNLP 2020 • Hao Fei, Yafeng Ren, Donghong Ji
We consider retrofitting a structure-aware Transformer-based language model to facilitate end tasks, proposing to exploit syntactic distance to encode both phrasal constituency and dependency connections into the language model.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Hao Fei, Yafeng Ren, Donghong Ji
Syntax has been shown to be useful for various NLP tasks, yet existing work mostly encodes a single syntactic tree using one hierarchical neural network.
no code implementations • Asian Chapter of the Association for Computational Linguistics 2020 • Hao Fei, Yafeng Ren, Donghong Ji
Current end-to-end semantic role labeling is mostly accomplished via graph-based neural models.
no code implementations • COLING 2016 • Yafeng Ren, Yue Zhang
Deceptive opinion spam detection has attracted significant attention from both business and research communities.