1 code implementation • 19 Jul 2022 • Félix Gaschi, François Plesse, Parisa Rastin, Yannick Toussaint
Some Transformer-based models can perform cross-lingual transfer learning: they can be trained on a specific task in one language and achieve relatively good results on the same task in another language, despite having been pre-trained only on monolingual tasks.
no code implementations • 28 May 2018 • François Plesse, Alexandru Ginsca, Bertrand Delezoide, Françoise Prêteux
To this end, we propose a framework that uses semantic knowledge to estimate the relevance of object pairs during both the training and test phases.