no code implementations • 20 Apr 2024 • Yuling Jiao, Lican Kang, Huazhen Lin, Jin Liu, Heng Zuo
Our theoretical analysis establishes end-to-end error bounds for learning distributions via the latent Schrödinger bridge diffusion model.
no code implementations • 24 Apr 2022 • Han Yuan, Mingxuan Liu, Lican Kang, Chenkui Miao, Ying Wu
In our empirical study on the MIMIC-III dataset, we show that the two core explanations, SHAP values and variable rankings, fluctuate when different background datasets are drawn by random sampling, indicating that users cannot uncritically trust a one-shot SHAP interpretation.
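The background-dataset sensitivity can be illustrated without the SHAP library itself: for a linear model, exact SHAP values reduce to `coef * (x - background_mean)`, so two random background samples yield two different attributions for the same instance. A minimal sketch on synthetic data (not MIMIC-III; the linear-model shortcut stands in for a full SHAP explainer):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a clinical dataset (not MIMIC-III).
X = rng.normal(size=(1000, 4))
coef = np.array([2.0, -1.0, 0.5, 0.1])  # hypothetical linear model f(x) = coef @ x

def linear_shap(x, background, coef):
    """For a linear model with independent features, the exact SHAP value
    of feature j is coef[j] * (x[j] - mean of feature j over the background)."""
    return coef * (x - background.mean(axis=0))

x = X[0]
# Two background datasets obtained by random sampling, mirroring the paper's setup.
bg1 = X[rng.choice(len(X), size=50, replace=False)]
bg2 = X[rng.choice(len(X), size=50, replace=False)]

phi1 = linear_shap(x, bg1, coef)
phi2 = linear_shap(x, bg2, coef)
print(phi1)
print(phi2)  # attributions (and potentially rankings) shift with the background
```

The attributions still satisfy the SHAP efficiency property, summing to `f(x)` minus the model's prediction at the background mean, but that reference point itself moves with each random background sample.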
no code implementations • 21 Nov 2021 • Peili Li, Yuling Jiao, Xiliang Lu, Lican Kang
In this work, we consider an algorithm for (nonlinear) regression problems with the $\ell_0$ penalty.
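The abstract snippet does not name the algorithm, so as a generic point of reference only: iterative hard thresholding (IHT) is a standard approach to $\ell_0$-constrained least squares, alternating a gradient step with projection onto the set of $s$-sparse vectors. A minimal sketch (not the authors' method; the sparsity level `s` replaces an explicit penalty weight):

```python
import numpy as np

def iht(X, y, s, step=None, iters=500):
    """Iterative hard thresholding: gradient step on 0.5 * ||y - X w||^2,
    then keep only the s largest-magnitude coefficients (a standard
    surrogate for l0-penalized regression)."""
    n, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L for the quadratic loss
    w = np.zeros(p)
    for _ in range(iters):
        w = w + step * X.T @ (y - X @ w)      # gradient step
        keep = np.argsort(np.abs(w))[-s:]     # hard-threshold to support size s
        mask = np.zeros(p, dtype=bool)
        mask[keep] = True
        w[~mask] = 0.0
    return w

# Example: recover a 3-sparse signal from noiseless measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
w_true = np.zeros(20)
w_true[[1, 5, 9]] = [3.0, -2.0, 1.5]
w_hat = iht(X, X @ w_true, s=3)
print(np.flatnonzero(w_hat))  # recovered support
```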
no code implementations • 10 Jul 2021 • Yuling Jiao, Lican Kang, Yanyan Liu, Youzhou Zhou
The Schrödinger–Föllmer sampler (SFS) is a novel and efficient approach for sampling from possibly unnormalized distributions without requiring ergodicity.
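The idea can be sketched concretely: SFS simulates an SDE on the fixed time interval $[0,1]$, started at $X_0 = 0$, whose drift $b(x,t) = \nabla_x \log \mathbb{E}_Z[f(x + \sqrt{1-t}\,Z)]$ (with $f$ the ratio of the target density to the standard normal, $Z \sim N(0,I)$) transports the point mass at the origin to the target at $t = 1$, so no burn-in or ergodicity argument is needed. A 1-D sketch with a Monte Carlo drift estimate (target, step counts, and Monte Carlo sizes are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: an unnormalized 1-D density q(x) proportional to exp(-(x - 2)^2 / 2),
# i.e. N(2, 1) up to a constant (chosen only for illustration).
mu = 2.0

def ratio(x):
    """f(x) = q(x) / phi(x): target over standard normal density.
    Normalizing constants cancel inside the drift's log-gradient."""
    return np.exp(-(x - mu) ** 2 / 2 + x ** 2 / 2)

def drift(x, t, n_mc=500):
    """Monte Carlo estimate of b(x,t) = d/dx log E_Z[f(x + sqrt(1-t) Z)],
    using a central finite difference with common random numbers."""
    z = rng.normal(size=n_mc)
    eps = 1e-4
    def m(xv):
        return np.mean(ratio(xv + np.sqrt(max(1.0 - t, 1e-12)) * z))
    return (np.log(m(x + eps)) - np.log(m(x - eps))) / (2 * eps)

def sfs_sample(n_steps=50):
    """Euler-Maruyama discretization of the SFS SDE on [0, 1], X_0 = 0."""
    dt = 1.0 / n_steps
    x = 0.0
    for k in range(n_steps):
        x = x + drift(x, k * dt) * dt + np.sqrt(dt) * rng.normal()
    return x

samples = np.array([sfs_sample() for _ in range(200)])
print(samples.mean(), samples.std())  # roughly 2 and 1 for the N(2, 1) target
```

For this Gaussian target the exact drift is the constant $\mu$, so the scheme's error comes only from the Monte Carlo estimate and time discretization; for general unnormalized targets the same recipe applies with a different `ratio`.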
no code implementations • 27 Jan 2020 • Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu, Yuanyuan Yang
Based on this KKT system, a built-in working set of relatively small size is first determined from the sum of the primal and dual variables generated at the previous iteration; the primal variable is then updated by solving a least-squares problem on the working set, and the dual variable is updated via a closed-form expression.
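One iteration of that scheme can be sketched as follows. This is a schematic of a primal-dual active set step in the style the abstract describes, not the authors' exact update rules: the working-set size `T`, the least-squares formulation, and the residual-correlation dual update are assumptions for illustration.

```python
import numpy as np

def pdas_step(X, y, beta, d, T):
    """One primal-dual active set iteration (schematic):
    1. working set A = indices of the T largest entries of |beta + d|,
    2. primal update: least squares restricted to the columns in A,
    3. dual update: closed-form correlation of features with the residual."""
    p = X.shape[1]
    A = np.argsort(np.abs(beta + d))[-T:]          # built-in working set
    beta_new = np.zeros(p)
    sol, *_ = np.linalg.lstsq(X[:, A], y, rcond=None)
    beta_new[A] = sol                               # primal: LS on working set
    d_new = X.T @ (y - X @ beta_new) / len(y)       # dual: closed form
    d_new[A] = 0.0                                  # complementarity on A
    return beta_new, d_new

# Example: iterate from (beta, d) = (0, X^T y / n) until the support stabilizes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))
beta_true = np.zeros(30)
beta_true[[2, 7, 11]] = [3.0, -2.0, 1.5]
y = X @ beta_true
beta, d = np.zeros(30), X.T @ y / len(y)
for _ in range(10):
    beta, d = pdas_step(X, y, beta, d, T=3)
print(np.flatnonzero(beta))  # recovered support
```

Because each step only solves a small least-squares system on the working set, the per-iteration cost stays low even when the ambient dimension is large.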
no code implementations • 16 Jan 2020 • Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu
Feature selection is important for modeling high-dimensional data, where the number of variables can be much larger than the sample size.