no code implementations • Findings (NAACL) 2022 • Xiaozhi Zhu, Tianyong Hao, Sijie Cheng, Fu Lee Wang, Hai Liu
Pretrained language models such as BERT have been successfully applied to a wide range of natural language processing tasks and have also achieved impressive performance on document reranking.
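A minimal sketch of the BERT-style cross-encoder reranking setup this line refers to: each (query, document) pair is scored jointly and documents are sorted by relevance. The checkpoint name and the example inputs are illustrative assumptions, not the paper's own model or data.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed public reranker checkpoint, used only for illustration.
MODEL = "cross-encoder/ms-marco-MiniLM-L-6-v2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.eval()

def rerank(query: str, docs: list[str]) -> list[tuple[str, float]]:
    """Score each (query, doc) pair jointly and sort documents by relevance."""
    enc = tokenizer([query] * len(docs), docs, padding=True,
                    truncation=True, return_tensors="pt")
    with torch.no_grad():
        scores = model(**enc).logits.squeeze(-1)  # one relevance logit per pair
    return sorted(zip(docs, scores.tolist()), key=lambda p: p[1], reverse=True)

print(rerank("what causes rain",
             ["Rain forms when water vapor condenses.",
              "BERT is a pretrained language model."]))
```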
no code implementations • 11 Apr 2022 • Pan Du, Xiaozhi Zhu, Jian-Xun Wang
An efficient supervised learning approach is proposed to map geometric inputs to hemodynamic predictions in a latent space.
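A minimal sketch of the latent-space surrogate idea described here: an autoencoder compresses high-dimensional hemodynamic fields, and a small supervised network maps geometric parameters to the latent codes, so new geometries can be evaluated without a full CFD solve. The dimensions, network sizes, and random training data are illustrative assumptions, not the paper's setup.

```python
import torch
import torch.nn as nn

GEOM_DIM, FIELD_DIM, LATENT_DIM = 8, 1024, 16  # assumed sizes

encoder = nn.Sequential(nn.Linear(FIELD_DIM, 128), nn.ReLU(), nn.Linear(128, LATENT_DIM))
decoder = nn.Sequential(nn.Linear(LATENT_DIM, 128), nn.ReLU(), nn.Linear(128, FIELD_DIM))
geo2lat = nn.Sequential(nn.Linear(GEOM_DIM, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM))

opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters(),
                        *geo2lat.parameters()], lr=1e-3)
geom = torch.randn(256, GEOM_DIM)    # synthetic geometric inputs
field = torch.randn(256, FIELD_DIM)  # synthetic hemodynamic fields (e.g. wall shear stress)

for step in range(200):
    z = encoder(field)
    recon_loss = nn.functional.mse_loss(decoder(z), field)           # learn the latent space
    latent_loss = nn.functional.mse_loss(geo2lat(geom), z.detach())  # geometry -> latent
    loss = recon_loss + latent_loss
    opt.zero_grad(); loss.backward(); opt.step()

# Inference: predict the flow field for a new geometry without a CFD solve.
pred_field = decoder(geo2lat(torch.randn(1, GEOM_DIM)))
```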