no code implementations • 27 Sep 2022 • Sangmin Lee, Byeongsu Sim, Jong Chul Ye
To understand the learning dynamics of deep ReLU networks, we investigate the dynamical system of the gradient flow $w(t)$ by decomposing it into magnitude $\|w(t)\|$ and angle $\phi(t) := \pi - \theta(t)$ components.
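As a rough illustration of such a magnitude–angle decomposition (notation assumed here, not taken from the paper; $v$ stands for a fixed reference direction against which the angle $\theta(t)$ is measured):

$$
w(t) = \underbrace{\|w(t)\|}_{\text{magnitude}}\,\underbrace{\frac{w(t)}{\|w(t)\|}}_{\text{direction}}, \qquad
\cos\theta(t) = \frac{\langle w(t), v\rangle}{\|w(t)\|\,\|v\|}, \qquad
\phi(t) := \pi - \theta(t).
$$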
2 code implementations • 2 Jun 2022 • Hyungjin Chung, Byeongsu Sim, Dohoon Ryu, Jong Chul Ye
Recently, diffusion models have been used to solve various inverse problems in an unsupervised manner with appropriate modifications to the sampling process.
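A minimal sketch of the generic recipe behind such unsupervised solvers, not this paper's exact algorithm: interleave an unconditional reverse-diffusion (denoising) update with a data-consistency step toward the measurements $y = Ax$. The names `denoiser`, `A`, `A_T`, `sigmas`, and the step size `lam` are hypothetical placeholders.

```python
import numpy as np

def diffusion_inverse_solver(y, A, A_T, denoiser, sigmas, lam=1.0):
    """Schematic unsupervised solver: alternate a reverse-diffusion
    denoising step with a gradient step on the data-fidelity term
    ||y - A x||^2.  Every component here is a placeholder."""
    x = np.random.randn(*A_T(y).shape)      # initialize from Gaussian noise
    for sigma in sigmas:                    # decreasing noise levels
        x = denoiser(x, sigma)              # unconditional reverse-diffusion step
        x = x - lam * A_T(A(x) - y)         # data-consistency (gradient) step
    return x
```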
no code implementations • 11 Feb 2022 • Sangmin Lee, Byeongsu Sim, Jong Chul Ye
Understanding the implicit bias of gradient descent, and its role in the generalization capability of ReLU networks, has been an important topic in machine learning research.
no code implementations • CVPR 2022 • Hyungjin Chung, Byeongsu Sim, Jong Chul Ye
In this work, we show that starting from Gaussian noise is unnecessary.
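A minimal sketch of that idea, under assumed notation (the schedule `alpha_bar`, the starting time `t0`, and `reverse_step` are placeholders, not this paper's API): forward-diffuse a cheap initial estimate to an intermediate time and run only the remaining reverse steps, rather than starting from pure noise at $t = T$.

```python
import numpy as np

def reverse_diffusion_from_estimate(x_init, reverse_step, alpha_bar, t0):
    """Schematic accelerated sampling: noise an initial estimate x_init
    up to time t0 < T, then run the reverse diffusion only from t0 down
    to 0, instead of from pure Gaussian noise at t = T."""
    a = alpha_bar[t0]                       # cumulative noise schedule at t0
    x = np.sqrt(a) * x_init + np.sqrt(1.0 - a) * np.random.randn(*x_init.shape)
    for t in range(t0, -1, -1):             # remaining reverse steps only
        x = reverse_step(x, t)
    return x
```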
no code implementations • 29 Aug 2020 • Gyutaek Oh, Byeongsu Sim, Hyungjin Chung, Leonard Sunwoo, Jong Chul Ye
Recently, deep learning approaches for accelerated MRI have been extensively studied thanks to their high-quality reconstruction despite significantly reduced runtime complexity.
no code implementations • 25 Sep 2019 • Byeongsu Sim, Gyutaek Oh, Jeongsol Kim, Chanyong Jung, Jong Chul Ye
To improve the performance of the classical generative adversarial network (GAN), the Wasserstein generative adversarial network (W-GAN) was developed as a Kantorovich dual formulation of the optimal transport (OT) problem using the Wasserstein-1 distance.
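For context, the Kantorovich–Rubinstein dual of the Wasserstein-1 distance, which yields the W-GAN objective once the critic $f$ is restricted to 1-Lipschitz functions (standard formulation, not quoted from this paper):

$$
W_1(\mu,\nu) = \sup_{\|f\|_{\mathrm{Lip}} \le 1} \mathbb{E}_{x\sim\mu}[f(x)] - \mathbb{E}_{x\sim\nu}[f(x)],
\qquad
\min_G \max_{\|f\|_{\mathrm{Lip}} \le 1} \mathbb{E}_{x\sim\mu}[f(x)] - \mathbb{E}_{z\sim\rho}[f(G(z))].
$$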
no code implementations • 25 Sep 2019 • Byeongsu Sim, Gyutaek Oh, Sungjun Lim, and Jong Chul Ye
Specifically, we reveal that a cycleGAN architecture can be derived as a dual formulation of the optimal transport problem if penalized least squares (PLS) with a deep learning penalty is used as the transport cost between the two probability measures associated with the measurements and the unknown images.
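For orientation, the general Kantorovich dual of the optimal transport problem with cost $c$, which is the template such a derivation specializes by choosing $c$ as the PLS cost with a deep learning penalty (standard duality; the paper's specific cost is not reproduced here):

$$
\min_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, d\pi(x,y)
= \sup_{\varphi(x) + \psi(y) \le c(x,y)} \int \varphi \, d\mu + \int \psi \, d\nu .
$$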