no code implementations • ICLR 2020 • Yuxian Meng, Muyu Li, Xiaoya Li, Wei Wu, Jiwei Li
In this paper, we aim to tackle a general issue in NLP tasks where some negative examples are highly similar to the positive examples, i.e., hard-negative examples.
2 code implementations • NeurIPS 2019 • Yuxian Meng, Wei Wu, Fei Wang, Xiaoya Li, Ping Nie, Fan Yin, Muyu Li, Qinghong Han, Xiaofei Sun, Jiwei Li
However, due to the lack of rich pictographic evidence in glyphs and the weak generalization ability of standard computer vision models on character data, an effective way to utilize the glyph information remains to be found.
Ranked #1 on Chinese Sentence Pair Classification on LCQMC
1 code implementation • EMNLP 2018 • Junyang Lin, Xu Sun, Xuancheng Ren, Muyu Li, Qi Su
Most of the Neural Machine Translation (NMT) models are based on the sequence-to-sequence (Seq2Seq) model with an encoder-decoder framework equipped with the attention mechanism.
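The attention mechanism mentioned above can be sketched as plain dot-product attention over encoder states — a simplified illustration only, not the specific model from this paper (all names and values here are hypothetical):

```python
import numpy as np

def dot_product_attention(decoder_state, encoder_states):
    """Score each encoder state against the current decoder state,
    normalize the scores with softmax, and return the weighted
    context vector plus the attention weights."""
    scores = encoder_states @ decoder_state        # (T,) one score per source position
    weights = np.exp(scores - scores.max())        # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states             # (d,) weighted sum of encoder states
    return context, weights

# Toy example: 4 source positions, hidden size 3 (illustrative values only)
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
context, weights = dot_product_attention(dec, enc)
```

At each decoding step the context vector is typically concatenated with the decoder state before predicting the next token; real NMT models use learned projections and multiplicative or additive scoring rather than raw dot products.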
Ranked #7 on Machine Translation on IWSLT2015 English-Vietnamese