no code implementations • 19 Apr 2024 • Peng Liu, Cong Xu, Ming Zhao, Jiawei Zhu, Bin Wang, Yi Ren
Recently, Reinforcement Learning (RL) has been used for Multi-Task Fusion (MTF) in industry to optimize long-term user satisfaction within a recommendation session.
no code implementations • 18 Feb 2024 • Kun Ma, Cong Xu, Zeyuan Chen, Wei Zhang
However, achieving both model transparency and recommendation performance simultaneously is challenging, especially for models that take the entire sequence of items as input without screening.
no code implementations • 9 Feb 2024 • Cong Xu, Zhangchi Zhu, Jun Wang, Jianyong Wang, Wei Zhang
Large language models (LLMs) have gained much attention in the recommendation community; some studies have observed that LLMs fine-tuned with the cross-entropy loss over a full softmax can already achieve state-of-the-art performance.
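The "cross-entropy loss with a full softmax" mentioned above can be sketched as follows. This is an illustrative NumPy version, not the paper's implementation: the catalog size, embedding dimension, and random vectors are all hypothetical stand-ins for a trained model's representations.

```python
import numpy as np

rng = np.random.default_rng(0)
num_items, dim, batch = 1000, 32, 8                # hypothetical catalog/model sizes
item_emb = rng.standard_normal((num_items, dim))   # one embedding per catalog item
user_repr = rng.standard_normal((batch, dim))      # model output per interaction
targets = rng.integers(0, num_items, size=batch)   # the next item actually consumed

# "Full softmax": logits are computed over every item in the catalog,
# rather than over a sampled subset of negatives.
logits = user_repr @ item_emb.T                    # (batch, num_items)
logits -= logits.max(axis=1, keepdims=True)        # shift for numerical stability
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -log_probs[np.arange(batch), targets].mean()
```

The key point is the normalization over all `num_items` columns; sampled-softmax variants trade this exact normalizer for cheaper training on very large catalogs.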
1 code implementation • 6 Nov 2023 • Hao Zhang, Cong Xu, Shuaijie Zhang
Based on this, we first analyze the bounding box regression (BBR) model and conclude that distinguishing between different regression samples and using auxiliary bounding boxes of different scales to compute the loss can effectively accelerate the bounding box regression process.
Ranked #1 on Object Detection on AI-TOD (mAP50 metric)
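One way to realize the auxiliary-box idea described above is to recompute IoU on boxes rescaled about their centers. The sketch below is illustrative only; the `ratio` parameter, helper names, and box convention `(x1, y1, x2, y2)` are assumptions for the demo, not the paper's exact formulation.

```python
def iou(b1, b2):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(b1[0], b2[0]), max(b1[1], b2[1])
    ix2, iy2 = min(b1[2], b2[2]), min(b1[3], b2[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda b: (b[2] - b[0]) * (b[3] - b[1])
    union = area(b1) + area(b2) - inter
    return inter / union if union > 0 else 0.0

def scaled_box(box, ratio):
    """Auxiliary box: same center, width and height scaled by `ratio`."""
    cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
    w, h = (box[2] - box[0]) * ratio, (box[3] - box[1]) * ratio
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def aux_iou(pred, target, ratio=0.8):
    # IoU computed on the auxiliary boxes: ratio < 1 shrinks them
    # (stricter overlap), ratio > 1 enlarges them (more forgiving).
    return iou(scaled_box(pred, ratio), scaled_box(target, ratio))
```

Shrinking or enlarging the auxiliary boxes changes the loss gradient's behavior for high- vs low-quality matches, which is the lever the entry above attributes to faster regression.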
no code implementations • 24 Sep 2023 • Cong Xu, Jun Wang, Jianyong Wang, Wei Zhang
Embeddings play a critical role in modern recommender systems: they are virtual representations of real-world entities and the foundation for subsequent decision models.
1 code implementation • 3 Apr 2023 • Giacomo Pedretti, John Moon, Pedro Bruel, Sergey Serebryakov, Ron M. Roth, Luca Buonanno, Tobias Ziegler, Cong Xu, Martin Foltin, Paolo Faraboschi, Jim Ignowski, Catherine E. Graves
In this work, we focus on an overall analog-digital architecture implementing a novel increased-precision analog CAM and a programmable network on chip, enabling inference of state-of-the-art tree-based ML models such as XGBoost and CatBoost.
1 code implementation • 23 Oct 2022 • Yufeng Wang, Cong Xu, Min Yang, Jin Zhang
Although Physics-Informed Neural Networks (PINNs) have been successfully applied in a wide variety of science and engineering fields, they can fail to accurately predict the underlying solution in slightly challenging convection-diffusion-reaction problems.
no code implementations • 25 Jul 2022 • Huaying Hao, Cong Xu, Dan Zhang, Qifeng Yan, Jiong Zhang, Yue Liu, Yitian Zhao
To be more specific, we first apply a simple degradation to the 3×3 mm² high-resolution (HR) image to obtain a synthetic low-resolution (LR) image.
1 code implementation • 25 Feb 2022 • Cong Xu, Wei Zhang, Jun Wang, Min Yang
Our theoretical analysis shows that larger convolutional feature maps before average pooling contribute to better resistance to perturbations, but this conclusion does not hold for max pooling.
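A toy illustration of that asymmetry, with an assumed 8×8 feature map and a unit perturbation of a single activation: global average pooling dilutes the perturbation by the number of activations in the map, while max pooling can pass it through unattenuated.

```python
import numpy as np

rng = np.random.default_rng(1)

def avg_pool(x):
    # global average pooling over a feature map
    return x.mean()

def max_pool(x):
    # global max pooling over a feature map
    return x.max()

# A toy 8x8 feature map; perturb its largest activation by +1.
fmap = rng.standard_normal((8, 8))
pert = fmap.copy()
i, j = np.unravel_index(fmap.argmax(), fmap.shape)
pert[i, j] += 1.0

delta_avg = abs(avg_pool(pert) - avg_pool(fmap))  # diluted by map size: 1/64
delta_max = abs(max_pool(pert) - max_pool(fmap))  # passes straight through: 1.0
```

A larger map shrinks `delta_avg` further (1/HW), which matches the intuition in the abstract; `delta_max` is unaffected by map size.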
no code implementations • 13 Feb 2022 • Yufeng Wang, Dan Li, Cong Xu, Min Yang
Deep image inpainting research mainly focuses on constructing various neural network architectures or imposing novel optimization objectives.
1 code implementation • ACM Transactions on Knowledge Discovery from Data 2022 • Jianliang Gao, Xiaoting Ying, Cong Xu, Jianxin Wang, Shichao Zhang, Zhao Li
For a given group of stocks, the proposed TRAN model can output the ranking results of stocks according to their return ratios.
1 code implementation • 31 Jul 2021 • Yufeng Wang, Dan Li, Cong Xu, Min Yang
However, data augmentation, as a simple yet effective method, has not received enough attention in this area.
1 code implementation • 19 May 2021 • Cong Xu, Xiang Li, Min Yang
Neural networks are susceptible to artificially designed adversarial perturbations.
Ranked #1 on Adversarial Attack on CIFAR-10
1 code implementation • 24 Dec 2020 • Cong Xu, Dan Li, Min Yang
Recently proposed adversarial self-supervised learning methods usually require large batches and many training epochs to extract robust features, which incurs heavy computational overhead on platforms with limited resources.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Zhong Zhang, Chongming Gao, Cong Xu, Rui Miao, Qinli Yang, Junming Shao
They call it the representation degeneration problem and propose a cosine regularization to solve it.
no code implementations • 8 Oct 2020 • Cong Xu, Dan Li, Min Yang
It is well-known that deep neural networks are vulnerable to adversarial attacks.
no code implementations • 3 Dec 2019 • Cong Xu, Min Yang, Jin Zhang
The implementation of conventional sparse principal component analysis (SPCA) on high-dimensional data sets has become time-consuming.
1 code implementation • 21 May 2018 • Wei Wen, Yandan Wang, Feng Yan, Cong Xu, Chunpeng Wu, Yiran Chen, Hai Li
It remains an open question whether escaping sharp minima can improve generalization.
1 code implementation • NeurIPS 2017 • Wei Wen, Cong Xu, Feng Yan, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Li
We mathematically prove the convergence of TernGrad under the assumption of a bound on gradients.
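The core ternarization step of TernGrad can be sketched as follows: each gradient component is stochastically mapped to {-s, 0, +s} with s = max|g|, and the keep probability |g_i|/s is chosen so the quantized gradient equals the true gradient in expectation. This is a minimal NumPy sketch; the array size is illustrative, and details such as layer-wise scalers and gradient clipping from the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def ternarize(grad, rng):
    """Stochastically quantize a gradient to {-s, 0, +s} with s = max|g|.

    P(component i is kept) = |g_i| / s, so E[output] = grad (unbiased).
    """
    s = np.abs(grad).max()
    if s == 0:
        return grad.copy()
    keep = rng.random(grad.shape) < np.abs(grad) / s
    return s * np.sign(grad) * keep

grad = rng.standard_normal(10_000)
tern = ternarize(grad, rng)
```

Each worker then communicates only the scalar `s` plus a 2-bit code per component, which is where the bandwidth saving comes from.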
5 code implementations • ICCV 2017 • Wei Wen, Cong Xu, Chunpeng Wu, Yandan Wang, Yiran Chen, Hai Li
Moreover, Force Regularization better initializes the low-rank DNNs such that the fine-tuning can converge faster toward higher accuracy.