Search Results for author: Ashleigh Richardson

Found 1 paper, 0 papers with code

A Systematic Study Reveals Unexpected Interactions in Pre-Trained Neural Machine Translation

no code implementations • LREC 2022 • Ashleigh Richardson, Janet Wiles

When pre-training an NMT system for low-resource translation, the pre-training task is often chosen based on data abundance and similarity to the main task.

Tasks: Low-Resource Neural Machine Translation, NMT, +2
