no code implementations • 15 Mar 2024 • Cong Wang, Jinshan Pan, Yeying Jin, Liyan Wang, Wei Wang, Gang Fu, Wenqi Ren, Xiaochun Cao
Our designs provide a closer look at the attention mechanism and reveal that some simple operations can significantly affect model performance.
1 code implementation • 17 Aug 2023 • Liyan Wang, Qinyu Yang, Cong Wang, Wei Wang, Jinshan Pan, Zhixun Su
Specifically, our C2F-DFT contains a diffusion self-attention (DFSA) module and a diffusion feed-forward network (DFN) within a new coarse-to-fine training scheme.
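The excerpt names the two components (DFSA and DFN) but not their internals, so the following is only a generic sketch of a self-attention plus feed-forward block conditioned on a diffusion timestep embedding; all function names, shapes, and the additive conditioning are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # single-head scaled dot-product attention over a (tokens, dim) matrix
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def feed_forward(x, W1, W2):
    # position-wise MLP with ReLU
    return np.maximum(0.0, x @ W1) @ W2

def diffusion_block(x, params, t_emb):
    # hypothetical conditioning: add the diffusion timestep embedding,
    # then apply attention and feed-forward with residual connections
    h = x + t_emb
    h = h + self_attention(h, params["Wq"], params["Wk"], params["Wv"])
    return h + feed_forward(h, params["W1"], params["W2"])
```

In this sketch the timestep embedding is simply added to the token features before the block; real diffusion transformers typically use learned modulation of normalization layers instead.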
no code implementations • JEPTALNRECITAL 2020 • Valentin Taillandier, Liyan Wang, Yves Lepage
This article proposes a neural network model for solving analogical equations at the semantic level, between sentences, in the framework of example-based machine translation.
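The abstract does not spell out how the analogical equations are solved, so here is only a minimal sketch of the classic vector-offset heuristic for an analogy a : b :: c : d over sentence embeddings (d ≈ c + b − a, nearest candidate by cosine similarity); the function name and the use of this particular heuristic are assumptions, not the paper's method.

```python
import numpy as np

def solve_analogy(a, b, c, candidates):
    # vector-offset heuristic: the solution d should satisfy d ≈ c + (b - a)
    target = c + b - a
    # pick the candidate embedding closest to the target by cosine similarity
    sims = [float(np.dot(target, d) / (np.linalg.norm(target) * np.linalg.norm(d)))
            for d in candidates]
    return int(np.argmax(sims))
```

In practice the candidates would be embeddings of full sentences produced by the trained network, rather than raw vectors as in this toy setup.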