no code implementations • 30 Apr 2024 • Xiaoming Liu, Chen Liu, Zhaohan Zhang, Chengzhengxu Li, Longtian Wang, Yu Lan, Chao Shen
Large language models have shown their ability to become effective few-shot learners with prompting, revolutionizing the paradigm of learning under data scarcity.
no code implementations • 1 Feb 2024 • Shengchao Liu, Xiaoming Liu, Yichen Wang, Zehua Cheng, Chengzhengxu Li, Zhaohan Zhang, Yu Lan, Chao Shen
Hence, we propose a novel fine-tuned detector, Pecola, which bridges metric-based and fine-tuned detectors through contrastive learning on selective perturbation.
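To make the "contrastive learning on selective perturbation" idea concrete, here is a generic InfoNCE-style sketch in which each text's embedding is pulled toward the embedding of its own perturbed copy and pushed away from other texts in the batch. This is an illustrative assumption about the training objective, not Pecola's actual implementation; the embeddings, dimensions, and temperature below are all hypothetical.

```python
import math

def contrastive_loss(orig, pert, temperature=0.1):
    """InfoNCE-style contrastive loss (generic sketch, not Pecola's exact
    objective): for each original embedding, its own perturbed version is
    the positive and all other perturbed embeddings in the batch are
    negatives."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def normalize(u):
        n = math.sqrt(dot(u, u))
        return [a / n for a in u]

    orig = [normalize(u) for u in orig]
    pert = [normalize(v) for v in pert]

    loss = 0.0
    for i, u in enumerate(orig):
        # Cosine similarities to every perturbed embedding, scaled by temperature
        logits = [dot(u, v) / temperature for v in pert]
        # Numerically stable negative log-softmax of the positive (index i)
        m = max(logits)
        log_prob_pos = logits[i] - m - math.log(sum(math.exp(l - m) for l in logits))
        loss -= log_prob_pos
    return loss / len(orig)

# Toy batch: two 3-d embeddings and slightly perturbed copies of each
orig = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
pert = [[0.9, 0.1, 0.0], [0.1, 0.9, 0.0]]
loss = contrastive_loss(orig, pert)
print(loss)  # small, since each text is closest to its own perturbation
```

When the positive pairs are already well aligned, as in this toy batch, the loss is close to zero; training drives real embeddings toward that regime.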
1 code implementation • 20 Dec 2022 • Xiaoming Liu, Zhaohan Zhang, Yichen Wang, Hang Pu, Yu Lan, Chao Shen
Machine-Generated Text (MGT) detection, the task of discriminating MGT from Human-Written Text (HWT), plays a crucial role in preventing misuse of text generative models, which have recently become adept at mimicking human writing styles.
1 code implementation • 3 Dec 2020 • Xiaoming Liu, Shaocong Wu, Zhaohan Zhang, Chao Shen
To bridge this research gap, we propose a novel duet representation learning framework, \sysname, that fuses local information (user-item interaction data) with global information (an external knowledge graph) for top-$N$ recommendation; it is composed of two separate sub-models.
1 code implementation • 26 May 2020 • Zhaohan Zhang, Mu Li, Katharine Flores, Rohan Mishra
The model uses easily accessible elemental properties as descriptors and has a mean absolute error (MAE) of 0.025 eV/atom in predicting the formation enthalpy of stable binary intermetallics reported in the Materials Project database.
Materials Science
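The MAE quoted above is the average absolute deviation between predicted and reference formation enthalpies. A minimal sketch of how it is computed, using made-up enthalpy values rather than data from the paper:

```python
# Hypothetical predicted vs. reference formation enthalpies (eV/atom);
# the values are illustrative, not taken from the Materials Project.
predicted = [-0.30, -0.55, -0.12, -0.41]
reference = [-0.32, -0.50, -0.15, -0.40]

# Mean absolute error: average of |prediction - reference|
mae = sum(abs(p - r) for p, r in zip(predicted, reference)) / len(predicted)
print(f"MAE = {mae:.4f} eV/atom")  # MAE = 0.0275 eV/atom
```

An MAE of 0.025 eV/atom thus means the model's formation-enthalpy predictions deviate from the database values by about 25 meV/atom on average.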