no code implementations • 23 Apr 2024 • Weifeng Chen, Jiacheng Zhang, Jie Wu, Hefeng Wu, Xuefeng Xiao, Liang Lin
The rapid development of diffusion models has triggered diverse applications.
no code implementations • 21 Apr 2024 • Yuxi Ren, Xin Xia, Yanzuo Lu, Jiacheng Zhang, Jie Wu, Pan Xie, Xing Wang, Xuefeng Xiao
Current distillation techniques generally fall into two distinct categories: i) ODE Trajectory Preservation; and ii) ODE Trajectory Reformulation.
no code implementations • 8 Apr 2024 • Jiacheng Zhang, Jie Wu, Yuxi Ren, Xin Xia, Huafeng Kuang, Pan Xie, Jiashi Li, Xuefeng Xiao, Weilin Huang, Min Zheng, Lean Fu, Guanbin Li
Diffusion models have revolutionized the field of image generation, leading to the proliferation of high-quality models and diverse downstream applications.
no code implementations • 26 Mar 2024 • Jiacheng Zhang, Jiaming Li, Xiangru Lin, Wei Zhang, Xiao Tan, Junyu Han, Errui Ding, Jingdong Wang, Guanbin Li
Additionally, we present a Depth Gradient Projection (DGP) module to mitigate optimization conflicts caused by the noisy depth supervision of pseudo-labels: it decouples the depth gradient and removes its conflicting component.
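The gradient-projection idea in this snippet can be sketched as follows. This is an illustrative sketch in the spirit of PCGrad-style projection, not the paper's actual DGP implementation; the function name and the simple two-dimensional example are assumptions made here for clarity:

```python
import numpy as np

def project_conflicting_gradient(g_depth, g_main):
    """Illustrative sketch (not the paper's code): if the noisy depth
    gradient conflicts with the main-task gradient (negative dot
    product), project out the conflicting component along g_main."""
    dot = float(np.dot(g_depth, g_main))
    if dot < 0.0:
        # Subtract the component of g_depth that opposes g_main.
        g_depth = g_depth - (dot / np.dot(g_main, g_main)) * g_main
    return g_depth

g_main = np.array([1.0, 0.0])
g_depth = np.array([-1.0, 1.0])   # conflicts with g_main (dot = -1)
g_proj = project_conflicting_gradient(g_depth, g_main)
print(g_proj)  # → [0. 1.], the opposing component has been removed
```

After projection, the depth gradient no longer pushes the shared parameters against the main detection objective, which is the intuition behind decoupling the depth gradient.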
no code implementations • ICCV 2023 • Fengyu Yang, Jiacheng Zhang, Andrew Owens
An emerging line of work has sought to generate plausible imagery from touch.
no code implementations • 1 Aug 2023 • Zhenyu Zhong, Qiliang Fan, Jiacheng Zhang, Minghua Ma, Shenglin Zhang, Yongqian Sun, Qingwei Lin, Yuzhi Zhang, Dan Pei
Internet-based services have seen remarkable success, generating vast amounts of monitored key performance indicators (KPIs) as univariate or multivariate time series.
3 code implementations • CVPR 2023 • Jiacheng Zhang, Xiangru Lin, Wei Zhang, Kuo Wang, Xiao Tan, Junyu Han, Errui Ding, Jingdong Wang, Guanbin Li
Specifically, we propose a Stage-wise Hybrid Matching strategy that combines one-to-many and one-to-one assignment: the former improves training efficiency in the first stage, which in turn provides high-quality pseudo labels for training the second stage.
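The two assignment schemes contrasted in this snippet can be sketched on a toy IoU matrix. This is a simplified illustration under assumed names and a greedy one-to-one matcher, not the paper's matching code:

```python
import numpy as np

def one_to_many(iou, thresh=0.5):
    """Each prediction is assigned to its best-overlapping GT box
    (several predictions may share one GT). Returns {pred: gt}."""
    best_gt = iou.argmax(axis=1)
    return {p: int(g) for p, g in enumerate(best_gt) if iou[p, g] >= thresh}

def one_to_one(iou, thresh=0.5):
    """Greedy one-to-one assignment: repeatedly take the highest
    remaining IoU pair, then retire that prediction and that GT."""
    iou = iou.copy()
    match = {}
    while True:
        p, g = np.unravel_index(iou.argmax(), iou.shape)
        if iou[p, g] < thresh:
            break
        match[int(p)] = int(g)
        iou[p, :] = -1.0   # retire this prediction
        iou[:, g] = -1.0   # retire this ground-truth box
    return match

iou = np.array([[0.9, 0.1],
                [0.8, 0.2],
                [0.3, 0.7]])   # 3 predictions x 2 ground-truth boxes
print(one_to_many(iou))  # {0: 0, 1: 0, 2: 1} — preds 0 and 1 share GT 0
print(one_to_one(iou))   # {0: 0, 2: 1} — pred 1 is left unmatched
```

One-to-many assignment yields denser supervision (faster convergence early on), while one-to-one assignment avoids duplicate predictions, which is why a stage-wise hybrid of the two is attractive.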
no code implementations • 15 Feb 2023 • Shuoqing Deng, Xiang Yu, Jiacheng Zhang
When the sufficient condition on the attitude function is violated, we illustrate through various examples that the characterization of the optimal equilibrium may differ significantly from existing results for an individual agent.
no code implementations • 24 Nov 2022 • Jiacheng Zhang, Wenyi Yan, Ye Zhang
In this paper, a new speech feature fusion method is proposed for speaker recognition, based on a cross-gate parallel convolutional neural network (CG-PCNN).
no code implementations • 22 Nov 2022 • Fengyu Yang, Chenyang Ma, Jiacheng Zhang, Jing Zhu, Wenzhen Yuan, Andrew Owens
The ability to associate touch with sight is essential for tasks that require physically interacting with objects in the world.
1 code implementation • ACL 2022 • Xueqing Wu, Jiacheng Zhang, Hang Li
We first employ a seq2seq model fine-tuned from a pre-trained language model to perform the task.
1 code implementation • 14 Jul 2020 • Xuancheng Huang, Jiacheng Zhang, Zhixing Tan, Derek F. Wong, Huanbo Luan, Jingfang Xu, Maosong Sun, Yang Liu
System combination is an important technique for combining the hypotheses of different machine translation systems to improve translation performance.
no code implementations • 26 Nov 2019 • Jiacheng Zhang, Huanbo Luan, Maosong Sun, Feifei Zhai, Jingfang Xu, Yang Liu
The lack of alignment in NMT models leads to three problems: it is hard to (1) interpret the translation process, (2) impose lexical constraints, and (3) impose structural constraints.
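For context on why alignment matters, the common attention-based proxy can be sketched as follows. This is an illustrative sketch, not the paper's method: attention is known to be an imperfect alignment signal, which is precisely the motivation for modeling alignment explicitly; the function name and toy matrix are assumptions:

```python
import numpy as np

def extract_alignment(attn):
    """Read off a hard word alignment from a target-by-source attention
    matrix by taking, for each target word, the source word with the
    highest attention weight. Returns {target_idx: source_idx}."""
    return {t: int(s) for t, s in enumerate(attn.argmax(axis=1))}

# Toy 3-target x 3-source attention matrix (rows sum to 1).
attn = np.array([[0.7, 0.2, 0.1],
                 [0.1, 0.8, 0.1],
                 [0.2, 0.1, 0.7]])
print(extract_alignment(attn))  # {0: 0, 1: 1, 2: 2}
```

With such an explicit alignment in hand, one can trace which source word produced each target word, enforce lexical constraints on specific source spans, and check structural constraints, addressing the three problems listed above.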
1 code implementation • ACL 2017 • Jiacheng Zhang, Yang Liu, Huanbo Luan, Jingfang Xu, Maosong Sun
Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge.
3 code implementations • EMNLP 2018 • Jiacheng Zhang, Huanbo Luan, Maosong Sun, Feifei Zhai, Jingfang Xu, Min Zhang, Yang Liu
Although the Transformer translation model (Vaswani et al., 2017) has achieved state-of-the-art performance on a variety of translation tasks, how to use document-level context to handle discourse phenomena that are problematic for the Transformer remains a challenge.
6 code implementations • 20 Jun 2017 • Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng, Maosong Sun, Huanbo Luan, Yang Liu
This paper introduces THUMT, an open-source toolkit for neural machine translation (NMT) developed by the Natural Language Processing Group at Tsinghua University.