Search Results for author: Longtu Zhang

Found 6 papers, 1 paper with code

Simpson's Bias in NLP Training

no code implementations • 13 Mar 2021 • Fei Yuan, Longtu Zhang, Huang Bojun, Yaobo Liang

In most machine learning tasks, we evaluate a model $M$ on a given data population $S$ by measuring a population-level metric $F(S;M)$.

Tasks: Multi-class Classification, Sentence (+1 more)
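The abstract's distinction between a population-level metric $F(S;M)$ and the average of per-group values of the same metric can be illustrated with a small, hypothetical example (the counts below are invented for illustration, not taken from the paper):

```python
# Hypothetical illustration (not the authors' code): a ratio-form metric
# computed over the whole population can disagree with the average of the
# same metric computed per group -- the kind of discrepancy behind
# Simpson's bias. Each group holds invented (correct, total) prediction
# counts for some model M.
groups = [(9, 10), (10, 100)]

# Population-level accuracy F(S; M): aggregate the counts first,
# then take the ratio over the whole population S.
population_level = sum(c for c, _ in groups) / sum(t for _, t in groups)

# Group-averaged accuracy: take each group's ratio, then average them.
group_averaged = sum(c / t for c, t in groups) / len(groups)

print(f"population-level: {population_level:.3f}")  # 19/110, about 0.173
print(f"group-averaged:   {group_averaged:.3f}")    # (0.9 + 0.1)/2 = 0.500
```

Because the groups differ in size, the two aggregation orders give very different numbers for the same predictions, which is why the choice of training objective versus evaluation metric matters.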

Chinese-Japanese Unsupervised Neural Machine Translation Using Sub-character Level Information

no code implementations • 1 Mar 2019 • Longtu Zhang, Mamoru Komachi

Unsupervised neural machine translation (UNMT) requires only monolingual data of similar language pairs during training and can produce bi-directional translation models with relatively good performance on alphabetic languages (Lample et al., 2018).

Tasks: Machine Translation, Translation

Neural Machine Translation of Logographic Language Using Sub-character Level Information

no code implementations • WS 2018 • Longtu Zhang, Mamoru Komachi

Recent neural machine translation (NMT) systems have been greatly improved by encoder-decoder models with attention mechanisms and sub-word units.

Tasks: Decoder, Machine Translation (+3 more)
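The sub-word units mentioned in the abstract are commonly produced by byte-pair encoding (BPE). The sketch below shows a single BPE merge step on an invented toy vocabulary; it is only meant to show the mechanism, not the paper's actual pipeline, which extends this idea down to sub-character level:

```python
from collections import Counter

# Hypothetical sketch of one byte-pair encoding (BPE) merge step.
# Words are stored as space-separated symbol sequences with frequencies.

def most_frequent_pair(corpus):
    """Count adjacent symbol pairs across the corpus, weighted by word
    frequency, and return the most frequent one."""
    pairs = Counter()
    for word, freq in corpus.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0]

def merge_pair(corpus, pair):
    """Replace every occurrence of the pair with its concatenation,
    creating a new sub-word unit."""
    separated = " ".join(pair)
    joined = "".join(pair)
    return {word.replace(separated, joined): f for word, f in corpus.items()}

# Invented toy corpus: {symbol sequence: frequency}.
corpus = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6}
pair = most_frequent_pair(corpus)   # ('w', 'e') is most frequent here
corpus = merge_pair(corpus, pair)   # "l o w e r" becomes "l o we r", etc.
```

Iterating this merge step a fixed number of times yields the sub-word vocabulary used by the encoder-decoder model; the paper applies the same principle below the character level for logographic scripts.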

Neural Machine Translation of Logographic Languages Using Sub-character Level Information

no code implementations • 7 Sep 2018 • Longtu Zhang, Mamoru Komachi

Recent neural machine translation (NMT) systems have been greatly improved by encoder-decoder models with attention mechanisms and sub-word units.

Tasks: Decoder, Machine Translation (+2 more)
