Search Results for author: Gavin Zhang

Found 5 papers, 1 paper with code

Fast and Accurate Estimation of Low-Rank Matrices from Noisy Measurements via Preconditioned Non-Convex Gradient Descent

no code implementations • 26 May 2023 • Gavin Zhang, Hong-Ming Chiu, Richard Y. Zhang

Recently, the technique of preconditioning was shown to be highly effective at accelerating the local convergence of non-convex gradient descent when the measurements are noiseless.

Image Denoising • Medical Image Denoising
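
To make the preconditioning idea concrete, here is a minimal NumPy sketch of preconditioned gradient descent on $f(X)=\frac{1}{2}\|XX^{T}-M\|_{F}^{2}$, where each step right-multiplies the gradient by $(X^{T}X+\lambda I)^{-1}$. The dimensions, step size, damping $\lambda$, and objective are illustrative assumptions for the demo, not the paper's exact algorithm or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3

# Noisy low-rank target: M = X* X*^T + small symmetric noise (illustrative).
X_star = rng.standard_normal((n, r))
noise = 0.01 * rng.standard_normal((n, n))
M = X_star @ X_star.T + (noise + noise.T) / 2

def grad_f(X):
    # Gradient of f(X) = 0.5 * ||X X^T - M||_F^2.
    return 2.0 * (X @ X.T - M) @ X

X = rng.standard_normal((n, r))          # random initialization (assumption)
eta, lam = 0.1, 1e-6                     # step size and damping (assumptions)
for _ in range(500):
    P = X.T @ X + lam * np.eye(r)        # r x r preconditioner
    X = X - eta * np.linalg.solve(P, grad_f(X).T).T   # X <- X - eta * grad @ P^{-1}

print(np.linalg.norm(X @ X.T - M))       # residual settles near the noise floor
```

The right-multiplication by the small $r \times r$ inverse rescales the search direction when the factor $X$ is ill-conditioned, which is what makes the local convergence rate insensitive to the condition number.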

Simple Alternating Minimization Provably Solves Complete Dictionary Learning

no code implementations • 23 Oct 2022 • Geyu Liang, Gavin Zhang, Salar Fattahi, Richard Y. Zhang

This paper focuses on the complete dictionary learning problem, where the goal is to reparametrize a set of given signals as linear combinations of atoms from a learned dictionary.

Dictionary Learning
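
As an illustration of the alternating scheme, here is a minimal NumPy sketch for the complete (square, orthogonal) dictionary case: a hard-thresholding sparse-coding step alternates with an orthogonal Procrustes dictionary update. The threshold, dimensions, and random initialization are assumptions for the demo, not the paper's stated algorithm or guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, s = 20, 400, 3                     # complete case: atom count = signal dim

# Ground truth: orthogonal dictionary A*, s-sparse codes X*, signals Y = A* X*.
A_star, _ = np.linalg.qr(rng.standard_normal((n, n)))
X_star = np.zeros((n, m))
for j in range(m):
    support = rng.choice(n, size=s, replace=False)
    X_star[support, j] = rng.standard_normal(s)
Y = A_star @ X_star

def hard_threshold(Z, tau):
    # Keep entries with magnitude above tau; zero the rest.
    return Z * (np.abs(Z) > tau)

A, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal start
for _ in range(50):
    X = hard_threshold(A.T @ Y, tau=0.3)           # sparse-coding step
    U, _, Vt = np.linalg.svd(Y @ X.T)              # Procrustes dictionary step:
    A = U @ Vt                                     # argmin over orthogonal A of ||Y - A X||_F
```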

Accelerating SGD for Highly Ill-Conditioned Huge-Scale Online Matrix Completion

1 code implementation • 24 Aug 2022 • Gavin Zhang, Hong-Ming Chiu, Richard Y. Zhang

The matrix completion problem seeks to recover a $d\times d$ ground truth matrix of low rank $r\ll d$ from observations of its individual elements.

Collaborative Filtering • Matrix Completion
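
For context, here is a minimal sketch of the baseline online setup: plain SGD on a symmetric factorization $M \approx XX^{T}$, touching only the two factor rows involved in each observed entry. The sampling model, step size, and initialization scale are illustrative assumptions; the paper's contribution is accelerating this style of iteration for ill-conditioned problems, which this baseline does not implement.

```python
import numpy as np

rng = np.random.default_rng(2)
d, r = 100, 3

# Rank-r symmetric ground truth M* = X* X*^T, revealed one entry at a time.
X_star = rng.standard_normal((d, r))
X = 0.1 * rng.standard_normal((d, r))             # small random init (assumption)
eta = 0.02                                        # step size (assumption)

for _ in range(100_000):
    i, j = rng.integers(d), rng.integers(d)       # next observed index pair
    if i == j:
        continue
    e = X[i] @ X[j] - X_star[i] @ X_star[j]       # residual on entry (i, j)
    gi, gj = e * X[j], e * X[i]                   # gradients of 0.5 * e^2
    X[i] -= eta * gi                              # touch only rows i and j
    X[j] -= eta * gj

rel_err = np.linalg.norm(X @ X.T - X_star @ X_star.T) / np.linalg.norm(X_star @ X_star.T)
print(rel_err)
```

Plain per-entry SGD like this slows down sharply as the condition number of the ground truth grows, which is the ill-conditioned regime the paper targets.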

Preconditioned Gradient Descent for Overparameterized Nonconvex Burer–Monteiro Factorization with Global Optimality Certification

no code implementations • 7 Jun 2022 • Gavin Zhang, Salar Fattahi, Richard Y. Zhang

We consider using gradient descent to minimize the nonconvex function $f(X)=\phi(XX^{T})$ over an $n\times r$ factor matrix $X$, in which $\phi$ is an underlying smooth convex cost function defined over $n\times n$ matrices.
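
As background for the global optimality certification in the title, a classical first-order certificate for such factorized convex programs (a standard fact about convex $\phi$ over the PSD cone, not necessarily the paper's exact certificate) can be stated as follows: if $S=\nabla\phi(XX^{T})$ satisfies $S\succeq 0$ and $SX=0$, then $XX^{T}$ globally minimizes $\phi$ over positive semidefinite matrices:

```latex
\phi(M) \;\ge\; \phi(XX^{T}) + \langle S,\, M - XX^{T} \rangle  % convexity of \phi
        \;=\;  \phi(XX^{T}) + \langle S,\, M \rangle            % \langle S, XX^T \rangle = \mathrm{tr}(SXX^T) = 0 since SX = 0
        \;\ge\; \phi(XX^{T})                                    % \langle S, M \rangle \ge 0: both S and M are PSD
\qquad \text{for all } M \succeq 0.
```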

How Many Samples is a Good Initial Point Worth in Low-rank Matrix Recovery?

no code implementations • NeurIPS 2020 • Gavin Zhang, Richard Y. Zhang

Optimizing the threshold over regions of the landscape, we see that for initial points around the ground truth, a linear improvement in the quality of the initial guess amounts to a constant factor improvement in the sample complexity.
