no code implementations • 25 Feb 2023 • Koki Okajima, Xiangming Meng, Takashi Takahashi, Yoshiyuki Kabashima
The obtained bound for perfect support recovery generalizes earlier results, which cover only the case of Gaussian noise and diverging $d$.
2 code implementations • 2 Feb 2023 • Xiangming Meng, Yoshiyuki Kabashima
In practical compressed sensing (CS), the measurements must typically be quantized to a limited number of bits before transmission or storage.
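As a toy illustration of such quantization, a uniform B-bit scalar quantizer might look as follows; the clipping range, bit depth, and midpoint reconstruction are assumptions for this sketch, not the paper's scheme:

```python
import numpy as np

# Hypothetical uniform B-bit quantizer for real-valued CS measurements.
# The clipping range [-4, 4] and cell-midpoint reconstruction are
# illustrative assumptions, not the paper's quantization scheme.
def quantize(z, bits=3, lo=-4.0, hi=4.0):
    levels = 2 ** bits
    step = (hi - lo) / levels
    idx = np.clip(np.floor((z - lo) / step), 0, levels - 1)
    return lo + (idx + 0.5) * step  # reconstruct at the cell midpoint

z = np.array([-3.7, 0.0, 2.2])
print(quantize(z))  # → [-3.5  0.5  2.5]
```

Within the clipping range, the reconstruction error is bounded by half the step size, which is the usual trade-off between bit budget and measurement fidelity.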
2 code implementations • 20 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima
We consider the ubiquitous linear inverse problems with additive Gaussian noise and propose an unsupervised sampling approach called diffusion model based posterior sampling (DMPS) to reconstruct the unknown signal from noisy linear measurements.
3 code implementations • 2 Nov 2022 • Xiangming Meng, Yoshiyuki Kabashima
We consider the general problem of recovering a high-dimensional signal from noisy quantized measurements.
1 code implementation • 17 Oct 2022 • Jiang Zhu, Xiangming Meng, Xupeng Lei, Qinghua Guo
We consider the problem of recovering an unknown signal ${\mathbf x}\in {\mathbb R}^n$ from general nonlinear measurements obtained through a generalized linear model (GLM), i.e., ${\mathbf y}= f\left({\mathbf A}{\mathbf x}+{\mathbf w}\right)$, where $f(\cdot)$ is a componentwise nonlinear function.
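The GLM measurement model above can be sketched numerically; here the choice $f = \mathrm{sign}$ (1-bit measurements) and the Gaussian sensing matrix are illustrative assumptions, not the paper's specific setup:

```python
import numpy as np

# Hedged sketch of the GLM measurement model y = f(Ax + w).
# f = sign (1-bit quantization) and the i.i.d. Gaussian A are
# illustrative assumptions only.
rng = np.random.default_rng(0)
n, m = 200, 400                            # signal dimension, measurements
x = rng.normal(size=n)                     # unknown signal
A = rng.normal(size=(m, n)) / np.sqrt(n)   # sensing matrix
w = 0.1 * rng.normal(size=m)               # additive pre-nonlinearity noise
y = np.sign(A @ x + w)                     # componentwise nonlinearity f
```

With $f = \mathrm{sign}$ each measurement retains only one bit of information, which is what makes recovery from such a GLM harder than from a standard linear model.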
no code implementations • 10 Feb 2022 • Liu Ziyin, Botao Li, Xiangming Meng
This work finds the analytical expression of the global minima of a deep linear network with weight decay and stochastic neurons, a fundamental model for understanding the landscape of neural networks.
no code implementations • 30 Jan 2022 • Liu Ziyin, Hanlin Zhang, Xiangming Meng, Yuting Lu, Eric Xing, Masahito Ueda
This work theoretically studies stochastic neural networks, a major class of neural networks in practical use.
no code implementations • 16 Oct 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Moreover, we provide a rigorous proof of the model selection consistency of Lasso with post-thresholding for general tree-like graphs in the paramagnetic phase without further assumptions on the dependency and incoherence conditions.
no code implementations • NeurIPS 2021 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
We theoretically analyze the typical learning performance of $\ell_{1}$-regularized linear regression ($\ell_1$-LinR) for Ising model selection using the replica method from statistical mechanics.
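A minimal numerical sketch of $\ell_1$-LinR as a neighborhood estimator: regress one spin on the others and keep the couplings with nonzero coefficients. The toy data (one strongly correlated spin pair, the rest independent) and the plain ISTA solver are assumptions for illustration, not the paper's replica analysis:

```python
import numpy as np

# Toy sketch (assumed setup): l1-regularized linear regression used to
# estimate the neighborhood of one spin in an Ising-like model.
rng = np.random.default_rng(0)
N, p = 2000, 5
spins = rng.choice([-1.0, 1.0], size=(N, p))
flip = rng.random(N) < 0.1                     # couple spin 0 to spin 1
spins[:, 0] = np.where(flip, -spins[:, 1], spins[:, 1])
y, X = spins[:, 0], spins[:, 1:]

def l1_linr(X, y, lam, iters=500):
    """Minimize (1/2N)||y - Xb||^2 + lam*||b||_1 by proximal gradient (ISTA)."""
    N = len(y)
    eta = N / np.linalg.norm(X, 2) ** 2        # step = 1 / Lipschitz constant
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        u = b - eta * (X.T @ (X @ b - y) / N)  # gradient step
        b = np.sign(u) * np.maximum(np.abs(u) - eta * lam, 0.0)  # soft-threshold
    return b

beta = l1_linr(X, y, lam=0.1)
# The coupled neighbor keeps a large coefficient; the rest are thresholded to ~0.
```

The soft-threshold step is what produces exact zeros on the non-neighbors, i.e., the support recovery behavior the analysis above characterizes.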
no code implementations • 25 Jan 2021 • Man Luo, Qinghua Guo, Ming Jin, Yonina C. Eldar, Defeng Huang, Xiangming Meng
Sparse Bayesian learning (SBL) can be implemented with low complexity based on the approximate message passing (AMP) algorithm.
no code implementations • 19 Aug 2020 • Xiangming Meng, Tomoyuki Obuchi, Yoshiyuki Kabashima
Further, to access the underdetermined region $M < N$, we examine the effect of the $\ell_2$ regularization, and find that biases appear in all the coupling estimates, preventing the perfect identification of the network structure.
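The $\ell_2$-induced bias can be seen in a toy underdetermined regression (an assumed Gaussian setup, not the paper's Ising experiment): ridge estimates shrink toward zero, and more strongly as the regularization grows.

```python
import numpy as np

# Toy illustration (assumed setup) of l2-regularization bias in the
# underdetermined regime M < N: estimates shrink toward zero as the
# regularization strength lam increases.
rng = np.random.default_rng(2)
M, N = 50, 100                             # fewer samples than parameters
X = rng.normal(size=(M, N))
beta_true = rng.normal(size=N)
y = X @ beta_true + 0.1 * rng.normal(size=M)

def ridge(X, y, lam):
    N = X.shape[1]
    # Closed-form ridge solution; X^T X + lam*I is positive definite for lam > 0.
    return np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

b_small, b_large = ridge(X, y, 0.1), ridge(X, y, 10.0)
# ||b_large|| < ||b_small||: stronger regularization, stronger shrinkage bias.
```

The norm of the ridge solution is monotonically decreasing in the regularization strength, which is the systematic shrinkage bias described above.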
no code implementations • 9 Jul 2020 • Xiangming Meng
However, training RBMs with binary synapses is challenging because of their discrete nature.
4 code implementations • ICML 2020 • Xiangming Meng, Roman Bachmann, Mohammad Emtiyaz Khan
Our work provides a principled approach for training binary neural networks which justifies and extends existing approaches.
no code implementations • 26 Aug 2018 • Jiang Zhu, Qi Zhang, Xiangming Meng, Zhiwei Xu
In this paper, we consider a general form of noisy compressive sensing (CS) where the sensing matrix is not precisely known.
Signal Processing
no code implementations • 29 Dec 2017 • Xiangming Meng, Sheng Wu, Jiang Zhu
In this letter, we present a unified Bayesian inference framework for generalized linear models (GLM) which iteratively reduces the GLM problem to a sequence of standard linear model (SLM) problems.
Information Theory
no code implementations • 4 Jan 2016 • Xiangming Meng, Sheng Wu, Linling Kuang, Defeng Huang, Jianhua Lu
We consider the problem of recovering clustered sparse signals with no prior knowledge of the sparsity pattern.