no code implementations • 14 May 2024 • Yihong Chen, Zhen Fan, Shuai Dong, Zhiwei Chen, Wenjie Li, Minghui Qin, Min Zeng, Xubing Lu, Guofu Zhou, Xingsen Gao, Jun-Ming Liu
Here, we propose a simple yet efficient stereo image super-resolution (SR) model called NAFRSSR, which modifies the previous state-of-the-art model NAFSSR by introducing recursive connections and making the constituent modules more lightweight.
no code implementations • 27 Feb 2024 • Izia Xiaoxiao Wang, Xihan Wu, Edith Coates, Min Zeng, Jiexin Kuang, Siliang Liu, Mengyang Qiu, Jungyeul Park
The utilization of technology in second language learning and teaching has become ubiquitous.
no code implementations • 24 Feb 2024 • Min Zeng, Jiexin Kuang, Mengyang Qiu, Jayoung Song, Jungyeul Park
The writing examples of English language learners may be different from those of native speakers.
no code implementations • 13 Sep 2023 • Min Zeng, Wei Xue, Qifeng Liu, Yike Guo
Recent data-driven task-oriented dialogue systems (ToDs) struggle with incremental learning because of computational constraints and the time cost of retraining.
no code implementations • 9 Mar 2023 • Min Zeng, Haimiao Mo, Zhiming Liang, Hua Wang
The "LFWA+FD" focuses on searching the ideal feature subset by simplifying the fireworks algorithm and constraining the dimensionality of selected features by fractal dimensionality, which in turn reduces the approximate features and reduces the noise in the original data to improve the accuracy of the model.
no code implementations • 7 Jan 2023 • Haimiao Mo, Min Zeng
The fireworks algorithm is an optimization algorithm inspired by the explosion of fireworks.
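As a rough illustration of the fireworks-algorithm idea (a minimal sketch, not the authors' implementation — all names and parameters here are made up): each firework explodes into sparks, fitter fireworks get smaller explosion amplitudes so the search concentrates around good solutions, and the best candidates survive to the next generation.

```python
import numpy as np

def fireworks_minimize(f, dim=2, n_fireworks=5, n_sparks=10, iters=50, seed=0):
    """Toy fireworks-style minimizer of f over [-5, 5]^dim (illustrative only)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, size=(n_fireworks, dim))
    for _ in range(iters):
        fit = np.array([f(x) for x in pop])
        # Explosion amplitude shrinks for fitter fireworks
        # (smaller f -> smaller search radius around that point).
        amp = 0.1 + (fit - fit.min()) / (fit.max() - fit.min() + 1e-12)
        sparks = [x + rng.uniform(-a, a, size=dim)
                  for x, a in zip(pop, amp) for _ in range(n_sparks)]
        cand = np.vstack([pop, np.array(sparks)])
        # Elitist selection: keep the best candidates for the next generation.
        order = np.argsort([f(x) for x in cand])
        pop = cand[order[:n_fireworks]]
    return pop[0]

best = fireworks_minimize(lambda x: float(np.sum(x**2)))
print(best)  # should land near the origin for the sphere function
```

Real fireworks-algorithm variants additionally scale the number of sparks by fitness and add Gaussian mutation sparks; the sketch keeps only the core explosion-and-select loop.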
1 code implementation • 3 Dec 2022 • Ziwei Ji, Zihan Liu, Nayeon Lee, Tiezheng Yu, Bryan Wilie, Min Zeng, Pascale Fung
Dialogue systems can leverage large pre-trained language models and knowledge to generate fluent and informative responses.
no code implementations • 30 Mar 2022 • Holy Lovenia, Bryan Wilie, Willy Chung, Min Zeng, Samuel Cahyawijaya, Su Dan, Pascale Fung
Task-adaptive pre-training (TAPT) alleviates the lack of labelled data and improves performance by adapting a model to the downstream task using unlabelled data.
no code implementations • 1 Mar 2022 • Ziwei Ji, Yan Xu, I-Tsun Cheng, Samuel Cahyawijaya, Rita Frieske, Etsuko Ishii, Min Zeng, Andrea Madotto, Pascale Fung
In order to offer a customized script tool and inspire professional scriptwriters, we present VScript.
no code implementations • 12 Jun 2021 • Yifan Wu, Min Zeng, Ying Yu, Min Li
The label-wise attention mechanism is widely used in automatic ICD coding because it can assign a weight to every word in the full Electronic Medical Record (EMR) separately for each ICD code.
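The mechanism described above can be sketched as follows (a hypothetical NumPy illustration, not the paper's model — the query-per-label formulation and all names are assumptions): each ICD code owns a learnable query vector, which produces its own softmax weighting over the document's words and hence a label-specific document representation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def label_wise_attention(H, U):
    """H: (num_words, hidden) word representations of one EMR.
    U: (num_labels, hidden) one learnable query vector per ICD code.
    Returns (num_labels, hidden): one attended document vector per code."""
    scores = U @ H.T                   # (num_labels, num_words)
    alpha = softmax(scores, axis=-1)   # per-label weights over all words
    return alpha @ H                   # label-specific document vectors

rng = np.random.default_rng(0)
H = rng.normal(size=(50, 8))   # 50 words, hidden size 8
U = rng.normal(size=(5, 8))    # 5 ICD codes
V = label_wise_attention(H, U)
print(V.shape)  # (5, 8)
```

Because each code attends independently, the same word can be highly weighted for one diagnosis code and ignored for another, which is the property that makes this mechanism attractive for multi-label EMR coding.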
1 code implementation • 29 Jan 2021 • Yifan Wu, Min Gao, Min Zeng, Feiyang Chen, Min Li, Jie Zhang
Therefore, we aim to develop a novel supervised learning method that learns the PPAs and DDAs effectively and thereby improves prediction performance on the specific task of DPI.
no code implementations • IJCNLP 2019 • Min Zeng, Yisen Wang, Yuan Luo
Based on this, we further find that there is redundancy among the dimensions of the latent variable, and that the lengths and sentence patterns of the responses correlate strongly with individual dimensions of the latent variable.