1 code implementation • 24 Apr 2024 • Junfeng Tian, Rui Wang, Cong Li, Yudong Zhou, Jun Liu, Jun Wang
This report details the development and key achievements of our latest language model, designed for building custom large language models.
no code implementations • 18 Feb 2024 • Cong Li, Meng Xiao, Pengfei Wang, Guihai Feng, Xin Li, Yuanchun Zhou
Despite the inherent limitations of existing Large Language Models in directly reading and interpreting single-cell omics data, they demonstrate significant potential and flexibility as foundation models.
no code implementations • 6 Jan 2024 • Luyuan Xie, Cong Li, Xin Zhang, Shengfang Zhai, Yuejian Fang, Qingni Shen, Zhonghai Wu
Representation learning frameworks for unlabeled time series have been proposed for medical signal processing.
no code implementations • 25 Oct 2023 • Cong Li, Tianjiao Feng, Xiudeng Zheng, Sabin Lessard, Yi Tao
In order to better understand the impact of environmental stochastic fluctuations on the evolution of animal behavior, we introduce the concept of a stochastic Nash equilibrium (SNE) that extends the classical concept of a Nash equilibrium (NE).
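As a rough illustration of the extension (the payoff function and expectation below are assumptions for exposition, not the paper's definitions): a classical NE requires that no unilateral deviation improves the payoff, while an SNE requires the same inequality in expectation over a fluctuating environmental state.

```latex
% Classical NE: no unilateral deviation from x^* improves the payoff \pi
\pi(x^*, x^*) \ge \pi(x, x^*) \quad \forall x .
% Illustrative SNE: payoffs depend on a random environmental state \omega,
% and the NE inequality is imposed in expectation
\mathbb{E}_{\omega}\!\left[\pi(x^*, x^*; \omega)\right]
  \ge \mathbb{E}_{\omega}\!\left[\pi(x, x^*; \omega)\right] \quad \forall x .
```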
no code implementations • 16 Sep 2023 • Shengbo Wang, Shuo Gao, Chenyu Tang, Edoardo Occhipinti, Cong Li, Shurui Wang, Jiaqi Wang, Hubin Zhao, Guohua Hu, Arokia Nathan, Ravinder Dahiya, Luigi Occhipinti
By mimicking the intrinsic nature of human low-level perception mechanisms, the electronic memristive neuromorphic circuit-based method presented here shows the potential to adapt to diverse sensing technologies and to help intelligent machines make smart high-level decisions in the real world.
1 code implementation • 25 Jun 2023 • Luyuan Xie, Cong Li, ZiRui Wang, Xin Zhang, Boyan Chen, Qingni Shen, Zhonghai Wu
The CF module extracts and fuses the multi-scale features of the SR images for classification.
Histopathological Image Classification, Image Classification, +1
no code implementations • 28 Mar 2023 • Jia-Wei Guo, Cong Li, Sen-Hua Zhu, Chang-Zheng Zhang, Ming Ouyang, Ning Ding, Hung-Chyun Chou
Our approach builds upon the state-of-the-art ensemble distillation method, in which we introduce a stereo-based model as a teacher model to improve the accuracy of the student model for depth completion.
no code implementations • 3 Mar 2023 • Shengfang Zhai, Qingni Shen, Xiaoyi Chen, Weilong Wang, Cong Li, Yuejian Fang, Zhonghai Wu
Backdoor attacks currently attract attention because they can do great harm to deep learning models.
1 code implementation • 27 Aug 2022 • Yihong Ge, Sudan Yan, Shaolin Lü, Cong Li
This paper proposes a new method for RTK (real-time kinematic) post-processing.
no code implementations • 20 Jun 2022 • Songjia Fan, Yi Tao, Cong Li
We highlight the importance of selection intensity and fitness, as well as their counterparts in the human mind, termed attention degree and meta-fitness, in the decision-making process.
no code implementations • 1 Nov 2021 • Zhe Zhou, Cong Li, Xuechao Wei, Xiaoyang Wang, Guangyu Sun
However, realizing efficient GNN training is challenging, especially on large graphs.
no code implementations • 28 Oct 2021 • Cong Li, Fangzhou Liu, Yongchao Wang, Martin Buss
The learning inefficiency of reinforcement learning (RL) from scratch hinders its practical application to continuous robotic tracking control, especially for high-dimensional robots.
no code implementations • 9 Jun 2021 • Cong Li, Zengjie Zhang, Ahmed Nesrin, Qingchen Liu, Fangzhou Liu, Martin Buss
This paper presents an integrated perception and control approach to accomplish safe autonomous navigation in unknown environments.
no code implementations • 4 May 2021 • Cong Li, Yongchao Wang, Fangzhou Liu, Qingchen Liu, Martin Buss
This paper presents a new formulation for model-free robust optimal regulation of continuous-time nonlinear systems.
no code implementations • 12 Apr 2021 • Cong Li, Min Shi, Bo Qu, Xiang Li
In this paper, we propose a deep attributed network representation learning via attribute enhanced neighborhood (DANRL-ANE) model to improve the robustness and effectiveness of node representations.
no code implementations • 16 Dec 2020 • Defa Liu, Xianxin Wu, Fangsen Li, Yong Hu, Jianwei Huang, Yu Xu, Cong Li, Yunyi Zang, Junfeng He, Lin Zhao, Shaolong He, Chenjia Tang, Zhi Li, Lili Wang, Qingyan Wang, Guodong Liu, Zuyan Xu, Xu-Cun Ma, Qi-Kun Xue, Jiangping Hu, X. J. Zhou
These observations provide the first direct evidence that the electronic structure of single-layer FeSe/SrTiO3 films originates from bulk FeSe through the combined effect of an electronic phase transition and an interfacial charge transfer. They also provide a quantitative basis for theoretical models describing the electronic structure and the superconducting mechanism in single-layer FeSe/SrTiO3 films.
Band Gap, Superconductivity, Strongly Correlated Electrons
no code implementations • 23 Aug 2020 • Jinsong Li, Jianhua Peng, Shuxin Liu, Lintianran Weng, Cong Li
In this paper, we address the problem of temporal link prediction in directed networks and propose a deep learning model based on GCN and self-attention mechanism, namely TSAM.
no code implementations • 10 Jun 2020 • Cong Li, Qingchen Liu, Zhehua Zhou, Martin Buss, Fangzhou Liu
By introducing pseudo controls and risk-sensitive input and state penalty terms, the constrained robust stabilization problem of the original system is converted into an equivalent optimal control problem of an auxiliary system.
no code implementations • 19 Nov 2019 • Chao Tian, Cong Li, Jianping Shi
Recently, FCN-based methods have made great progress in semantic segmentation.
no code implementations • 10 Jul 2019 • Manas R. Joglekar, Cong Li, Jay K. Adams, Pranav Khaitan, Quoc V. Le
During training, we use reinforcement learning to find the optimal vocabulary size for each feature and the optimal embedding dimension for each value of the feature.
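A search of this kind can be sketched as a bandit-style loop over candidate (vocabulary size, embedding dimension) pairs. Everything below is an illustrative assumption, not the paper's actual controller or reward: the toy reward stands in for validation quality minus a memory penalty, and in practice each pull would briefly train and evaluate the model under that configuration.

```python
import random

# Candidate vocabulary sizes and embedding dims for one feature (illustrative).
VOCAB_SIZES = [1000, 10_000, 100_000]
EMBED_DIMS = [8, 32, 128]

def reward(vocab_size, embed_dim):
    """Toy reward: a stand-in for validation quality minus a memory penalty."""
    quality = 1.0 - 1.0 / (vocab_size ** 0.25 * embed_dim ** 0.5)
    memory_cost = 1e-8 * vocab_size * embed_dim
    return quality - memory_cost

def epsilon_greedy_search(steps=2000, eps=0.1, seed=0):
    """Epsilon-greedy bandit over all (vocab_size, embed_dim) configurations."""
    rng = random.Random(seed)
    arms = [(v, d) for v in VOCAB_SIZES for d in EMBED_DIMS]
    counts = {a: 0 for a in arms}
    values = {a: 0.0 for a in arms}
    for _ in range(steps):
        if rng.random() < eps or not any(counts.values()):
            arm = rng.choice(arms)          # explore a random configuration
        else:
            arm = max(arms, key=lambda a: values[a])  # exploit the best so far
        r = reward(*arm)  # in practice: train/evaluate briefly with this config
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # running mean of reward
    return max(arms, key=lambda a: values[a])

best = epsilon_greedy_search()
print(best)
```

The actual paper uses reinforcement learning jointly over all features; the bandit above only conveys the explore/exploit trade-off for a single feature.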
no code implementations • 28 May 2019 • Shaozhu Xiao, Darren C. Peets, Wei Liu, Shiju Zhang, Ya Feng, Wen-He Jiao, Guang-Han Cao, Eike F. Schwier, Kenya Shimada, Cong Li, Xingjiang Zhou, Shaolong He
The iron-based superconductors represent a promising platform for high-temperature superconductivity, but the interactions underpinning their pairing present a puzzle.
Superconductivity, Strongly Correlated Electrons
no code implementations • 21 Feb 2019 • Yinjie Huang, Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
We propose a new method for local distance metric learning that uses sample similarity as side information.
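Distance metric learning of this kind typically parameterizes a (Mahalanobis-style) distance and fits it so that pairs labeled similar end up closer than pairs labeled dissimilar. The sketch below is a generic toy version with a diagonal metric, not the paper's algorithm: weights shrink on dimensions where similar pairs differ and grow where dissimilar pairs differ.

```python
import math

def mahalanobis(x, y, w):
    """Diagonal Mahalanobis distance: sqrt(sum_i w_i * (x_i - y_i)^2), w_i >= 0."""
    return math.sqrt(sum(wi * (xi - yi) ** 2 for wi, xi, yi in zip(w, x, y)))

def learn_diagonal_metric(similar_pairs, dissimilar_pairs, dim, lr=0.05, steps=200):
    """Toy metric learning: down-weight dimensions along which similar pairs
    differ, up-weight those along which dissimilar pairs differ."""
    w = [1.0] * dim
    for _ in range(steps):
        for x, y in similar_pairs:        # pull similar pairs together
            for i in range(dim):
                w[i] -= lr * (x[i] - y[i]) ** 2
        for x, y in dissimilar_pairs:     # push dissimilar pairs apart
            for i in range(dim):
                w[i] += lr * (x[i] - y[i]) ** 2
        w = [max(wi, 0.0) for wi in w]    # keep the diagonal metric valid (PSD)
    return w

similar = [((0.0, 0.0), (1.0, 0.1)), ((2.0, 0.0), (3.0, 0.05))]  # differ along dim 0
dissimilar = [((0.0, 0.0), (0.1, 1.0))]                          # differ along dim 1
w = learn_diagonal_metric(similar, dissimilar, dim=2)
# dimension 0 (noisy for similar pairs) ends up down-weighted relative to dimension 1
```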
no code implementations • 16 Feb 2019 • Jiangmiao Pang, Cong Li, Jianping Shi, Zhihai Xu, Huajun Feng
To tackle these problems, we propose a unified, self-reinforced network called the remote sensing region-based convolutional neural network ($\mathcal{R}^2$-CNN), composed of a Tiny-Net backbone, an intermediate global attention block, and a final classifier and detector.
no code implementations • 11 Jul 2017 • Niloofar Yousefi, Cong Li, Mansooreh Mollaghasemi, Georgios Anagnostopoulos, Michael Georgiopoulos
As our empirical results show, our algorithm consistently outperforms traditional kernel learning algorithms such as the uniform-combination solution and convex combinations of base kernels, as well as several kernel alignment-based models, all of which have given promising results in the past.
1 code implementation • 20 Aug 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
Traditionally, Multi-task Learning (MTL) models optimize the average of the task-related objective functions, an intuitive approach that we refer to as Average MTL.
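Average MTL scalarizes the multi-task problem by taking the mean of the per-task losses; a minimal sketch (the loss values are illustrative stand-ins):

```python
def average_mtl_objective(task_losses):
    """Average MTL: scalarize by taking the mean of the per-task losses."""
    return sum(task_losses) / len(task_losses)

# Illustrative per-task losses, e.g. from three related regression tasks.
losses = [1.0, 0.5, 0.0]
print(average_mtl_objective(losses))  # 0.5
```

Averaging weights every task equally, which is exactly the design choice this line of work questions.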
no code implementations • 11 Apr 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
A traditional and intuitively appealing Multi-Task Multiple Kernel Learning (MT-MKL) approach is to optimize the sum (and thus the average) of the objective functions with a (partially) shared kernel function, which allows information sharing among tasks.
no code implementations • 21 Jan 2014 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
Over the past few years, Multi-Kernel Learning (MKL) has received significant attention among data-driven feature selection techniques in the context of kernel-based learning.
no code implementations • 9 Dec 2013 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
This paper presents an RKHS of (in general vector-valued) functions intended to be used as a hypothesis space for multi-task classification.
no code implementations • 9 Dec 2013 • Cong Li, Michael Georgiopoulos, Georgios C. Anagnostopoulos
In this paper we present two related, kernel-based Distance Metric Learning (DML) methods.