$\ell_1$-K-SVD: A Robust Dictionary Learning Algorithm With Simultaneous Update

26 Aug 2014 · Subhadip Mukherjee, Rupam Basu, Chandra Sekhar Seelamantula

We develop a dictionary learning algorithm by minimizing the $\ell_1$ distortion metric on the data term, which is known to be robust to non-Gaussian noise contamination. The proposed algorithm exploits the idea of iteratively minimizing a weighted $\ell_2$ error. We refer to this algorithm as $\ell_1$-K-SVD; the dictionary atoms and the corresponding sparse coefficients are updated simultaneously to minimize the $\ell_1$ objective, which results in robustness to noise. We demonstrate through experiments that the $\ell_1$-K-SVD algorithm achieves a higher atom recovery rate than K-SVD and the robust dictionary learning (RDL) algorithm proposed by Lu et al., under both Gaussian and non-Gaussian noise. We also show that, for fixed values of the sparsity, the number of dictionary atoms, and the data dimension, $\ell_1$-K-SVD outperforms K-SVD and RDL when the available training set is small. We apply the proposed algorithm to denoising natural images corrupted by additive Gaussian and Laplacian noise. For Laplacian noise, the images denoised using $\ell_1$-K-SVD have only a slightly higher peak signal-to-noise ratio (PSNR) than those denoised using K-SVD, but the improvement in the structural similarity index (SSIM) is significant (approximately $0.1$) at lower input PSNRs, indicating the efficacy of the $\ell_1$ metric.
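The abstract describes minimizing the $\ell_1$ data term via iterative minimization of a weighted $\ell_2$ error, i.e., an iteratively reweighted least-squares (IRLS) scheme applied within a K-SVD-style atom-by-atom update. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the function names `l1_rank1_update` and `l1_ksvd_atom_pass`, the weight rule $w = 1/(|r| + \epsilon)$, the fixed iteration counts, and the alternating closed-form updates of the atom and its coefficients inside each IRLS iteration are all illustrative assumptions; the paper's exact simultaneous update may differ.

```python
import numpy as np

def l1_rank1_update(E, d, x, n_iters=10, eps=1e-8):
    """IRLS sketch for the rank-1 problem min_{d,x} ||E - d x^T||_1.

    Each iteration reweights the squared residual by 1/(|residual| + eps),
    so the weighted l2 minimizer is pushed toward the l1 minimizer. The
    weighted rank-1 fit is handled by alternating closed-form
    weighted-least-squares updates of x and d (an illustrative choice,
    not necessarily the paper's exact simultaneous update).
    """
    for _ in range(n_iters):
        W = 1.0 / (np.abs(E - np.outer(d, x)) + eps)   # IRLS weights
        # Weighted LS update of the coefficients (per column of E)
        x = (W * E).T @ d / ((W * d[:, None] ** 2).sum(axis=0) + eps)
        # Weighted LS update of the atom (per row of E), then renormalize
        d = (W * E) @ x / ((W * x[None, :] ** 2).sum(axis=1) + eps)
        d /= np.linalg.norm(d) + eps
    return d, x

def l1_ksvd_atom_pass(Y, D, X, **kw):
    """One sweep of atom/coefficient updates (sketch).

    Follows the K-SVD convention: for each atom, form the residual with
    that atom's contribution removed, restrict to the signals currently
    using the atom, and refit the rank-1 term under the l1 criterion.
    """
    for j in range(D.shape[1]):
        support = np.nonzero(X[j])[0]          # signals that use atom j
        if support.size == 0:
            continue
        E = Y[:, support] - D @ X[:, support] + np.outer(D[:, j], X[j, support])
        D[:, j], X[j, support] = l1_rank1_update(E, D[:, j], X[j, support], **kw)
    return D, X
```

In a full dictionary learning loop, each such sweep would be interleaved with a sparse coding stage (e.g., OMP), as in standard K-SVD; the $\ell_1$ fit above replaces only the SVD-based rank-1 dictionary update.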
