no code implementations • AACL (NLP-TEA) 2020 • Xuan Zhang, Huizhou Zhao, Kexin Zhang, Yiyang Zhang
Text simplification is an important branch of natural language processing.
no code implementations • 31 Aug 2023 • Yongxin Shao, Aihong Tan, Binrui Wang, Tianhong Yan, Zhetao Sun, Yiyang Zhang, Jiaxin Liu
For feature points from shallow levels, we retain those on the object's surface to describe the object's geometric features.
no code implementations • 26 Apr 2023 • Yiyang Zhang, Junyi Liu, Xiaobo Zhao
Focusing on stochastic programming (SP) with covariate information, this paper proposes an empirical risk minimization (ERM) method embedded within a nonconvex piecewise affine decision rule (PADR), which aims to learn the direct mapping from features to optimal decisions.
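To make the idea concrete, here is a minimal PyTorch sketch of ERM over a piecewise affine decision rule, assuming a toy newsvendor-style cost and a difference-of-max parametrization; the cost, synthetic data, and plain Adam optimizer are illustrative stand-ins, not the paper's exact formulation or solver.

```python
import torch

# Sketch: learn a piecewise affine decision rule z(x) by empirical risk
# minimization, assuming a newsvendor-style cost (holding cost h, backorder
# cost b). The PADR is a difference of two max-affine maps, which can
# represent any continuous piecewise affine function.
torch.manual_seed(0)
n, d, K = 512, 3, 4                      # samples, feature dim, affine pieces
X = torch.randn(n, d)                    # covariates
Y = (X @ torch.tensor([1.0, -0.5, 2.0]) + 0.3 * torch.randn(n)).clamp(min=0.0)

A = torch.randn(K, d, requires_grad=True)    # "positive" max-affine piece
a = torch.zeros(K, requires_grad=True)
B = torch.randn(K, d, requires_grad=True)    # "negative" max-affine piece
c = torch.zeros(K, requires_grad=True)

def padr(x):
    # z(x) = max_k (A_k . x + a_k) - max_k (B_k . x + c_k)
    return (x @ A.T + a).max(dim=1).values - (x @ B.T + c).max(dim=1).values

h, b = 1.0, 4.0
opt = torch.optim.Adam([A, a, B, c], lr=0.05)
for step in range(500):
    z = padr(X)
    # newsvendor cost: holding cost if we over-order, backorder if we under-order
    loss = (h * torch.relu(z - Y) + b * torch.relu(Y - z)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
print(f"empirical risk: {loss.item():.3f}")
```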
no code implementations • 23 Apr 2023 • Wenxiong Liao, Zhengliang Liu, Haixing Dai, Shaochen Xu, Zihao Wu, Yiyang Zhang, Xiaoke Huang, Dajiang Zhu, Hongmin Cai, Tianming Liu, Xiang Li
We focus on analyzing the differences between medical texts written by human experts and those generated by ChatGPT, and on designing machine learning workflows that effectively detect and differentiate ChatGPT-generated medical texts.
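As a rough illustration of such a workflow (not the paper's actual pipeline), a TF-IDF plus logistic-regression baseline can separate the two text sources; the two-example toy corpus below is a placeholder for real paired data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder corpus: in practice this would be parallel sets of
# expert-written and ChatGPT-generated medical texts on the same topics.
human_texts = [
    "Patient presented with acute chest pain radiating to the left arm.",
    "MRI revealed a small hyperintense lesion in the right temporal lobe.",
]
chatgpt_texts = [
    "Certainly! Chest pain can arise from many causes, including cardiac ones.",
    "As an AI language model, I can explain that MRI is a common imaging tool.",
]
texts = human_texts + chatgpt_texts
labels = [0] * len(human_texts) + [1] * len(chatgpt_texts)  # 1 = ChatGPT

# Word/bigram TF-IDF features feeding a linear classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["The biopsy confirmed the lesion was benign."]))
```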
no code implementations • 21 Feb 2023 • Wenxiong Liao, Zhengliang Liu, Haixing Dai, Zihao Wu, Yiyang Zhang, Xiaoke Huang, Yuzhong Chen, Xi Jiang, Wei Liu, Dajiang Zhu, Tianming Liu, Sheng Li, Xiang Li, Hongmin Cai
The main challenge of few-shot learning (FSL) is training robust models on small numbers of samples, which frequently leads to overfitting.
no code implementations • 5 Nov 2022 • Hongmin Cai, Wenxiong Liao, Zhengliang Liu, Yiyang Zhang, Xiaoke Huang, Siqi Ding, Hui Ren, Zihao Wu, Haixing Dai, Sheng Li, Lingfei Wu, Ninghao Liu, Quanzheng Li, Tianming Liu, Xiang Li
In this framework, we apply distant supervision to cross-domain knowledge graph adaptation.
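For context, classic distant supervision weakly labels a sentence with a relation whenever the sentence mentions an entity pair that the knowledge graph links with that relation. A generic sketch follows; the KG triples and sentences are invented examples, and the paper's cross-domain adaptation step is not shown.

```python
# Generic distant supervision: weakly label a sentence with a relation when
# it mentions an entity pair that the knowledge graph links with that relation.
knowledge_graph = {
    ("aspirin", "headache"): "treats",
    ("insulin", "diabetes"): "treats",
    ("smoking", "lung cancer"): "causes",
}

sentences = [
    "Aspirin is commonly prescribed for headache relief.",
    "Smoking is a major risk factor for lung cancer.",
    "The patient reported a mild headache after exercise.",
]

def distant_label(sentence):
    s = sentence.lower()
    for (head, tail), relation in knowledge_graph.items():
        if head in s and tail in s:
            return (head, relation, tail)
    return None  # no KG match -> no weak label

for sent in sentences:
    print(distant_label(sent), "<-", sent)
```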
1 code implementation • 4 Aug 2020 • Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu
We consider two cases of this setting: in one, the source domain contains only complementary-label data (completely complementary unsupervised domain adaptation, CC-UDA); in the other, the source domain has plenty of complementary-label data and a small amount of true-label data (partly complementary unsupervised domain adaptation, PC-UDA).
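A complementary label marks a class an example does not belong to. One simple surrogate for learning from such labels is to minimize the predicted probability of the complementary class; the sketch below uses this surrogate on toy data and is not the paper's exact risk estimator.

```python
import torch
import torch.nn.functional as F

# A complementary label y_bar marks a class the example does NOT belong to.
# One simple surrogate pushes down p(y_bar | x) via the loss -log(1 - p_ybar).
def complementary_loss(logits, y_bar):
    p_bar = F.softmax(logits, dim=1).gather(1, y_bar.unsqueeze(1)).squeeze(1)
    return -torch.log1p(-p_bar.clamp(max=1 - 1e-6)).mean()

# Toy data: three Gaussian blobs, so a linear classifier suffices.
torch.manual_seed(0)
centers = torch.tensor([[2.0, 0.0], [-1.0, 2.0], [-1.0, -2.0]])
true_y = torch.randint(0, 3, (256,))
X = centers[true_y] + 0.5 * torch.randn(256, 2)
# Draw each complementary label uniformly from the two wrong classes.
y_bar = (true_y + torch.randint(1, 3, (256,))) % 3

model = torch.nn.Linear(2, 3)
opt = torch.optim.SGD(model.parameters(), lr=0.5)
for _ in range(300):
    loss = complementary_loss(model(X), y_bar)
    opt.zero_grad(); loss.backward(); opt.step()
acc = (model(X).argmax(dim=1) == true_y).float().mean()
print(f"accuracy trained on complementary labels only: {acc:.2f}")
```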
1 code implementation • 29 Jul 2020 • Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu
To mitigate this problem, we consider a novel problem setting, budget-friendly UDA (BFUDA), in which the classifier for the target domain has to be trained with complementary-label data from the source domain and unlabeled data from the target domain.
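A hedged sketch of how the two ingredients might combine: a complementary-label loss on source features plus a simple feature-alignment penalty (mean matching) toward the unlabeled target. The alignment term, architecture, and hyperparameters are illustrative choices, not BFUDA's actual estimator.

```python
import torch
import torch.nn.functional as F

def complementary_loss(logits, y_bar):
    # minimize the predicted probability of the known-wrong class
    p_bar = F.softmax(logits, dim=1).gather(1, y_bar.unsqueeze(1)).squeeze(1)
    return -torch.log1p(-p_bar.clamp(max=1 - 1e-6)).mean()

def mean_alignment(f_src, f_tgt):
    # crude distribution alignment: match mean features across domains
    return (f_src.mean(dim=0) - f_tgt.mean(dim=0)).pow(2).sum()

# Toy BFUDA-style objective: complementary-label risk on the source domain
# plus feature alignment toward the unlabeled target domain.
torch.manual_seed(0)
encoder = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.ReLU())
head = torch.nn.Linear(16, 3)
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-2)

x_src = torch.randn(128, 2)                   # source inputs
y_bar = torch.randint(0, 3, (128,))           # complementary labels only
x_tgt = torch.randn(128, 2) + 0.5             # shifted, unlabeled target inputs
for _ in range(200):
    f_src, f_tgt = encoder(x_src), encoder(x_tgt)
    loss = complementary_loss(head(f_src), y_bar) + mean_alignment(f_src, f_tgt)
    opt.zero_grad(); loss.backward(); opt.step()
print(f"final objective: {loss.item():.3f}")
```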