COLING 2018 • Hao Zhang, Axel Ng, Richard Sproat
Compared to a strong attention-based RNN baseline, our ITG RNN reordering model reaches the same reordering accuracy with only 1/10 of the training data and is 2.5x faster in decoding.