1 code implementation • 1 May 2023 • Namuk Park, Wonjae Kim, Byeongho Heo, Taekyung Kim, Sangdoo Yun
We present a comparative study of how and why contrastive learning (CL) and masked image modeling (MIM) differ in their representations and in their performance on downstream tasks.
3 code implementations • ICLR 2022 • Namuk Park, Songkuk Kim
In particular, we demonstrate the following properties of MSAs and Vision Transformers (ViTs): (1) MSAs improve not only accuracy but also generalization by flattening the loss landscapes.
2 code implementations • 26 May 2021 • Namuk Park, Songkuk Kim
Neural network ensembles, such as Bayesian neural networks (BNNs), have shown success in the areas of uncertainty estimation and robustness.
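To illustrate the general idea behind ensemble-based uncertainty estimation (a generic sketch, not this paper's method): each ensemble member makes its own prediction, and the spread across members serves as an uncertainty estimate. All names below are hypothetical, and the "members" are toy linear functions standing in for independently trained networks.

```python
import statistics

def ensemble_predict(models, x):
    """Return the mean prediction and the standard deviation across members.

    The standard deviation is a simple proxy for predictive uncertainty:
    the more the members disagree, the less confident the ensemble is.
    """
    preds = [m(x) for m in models]
    return statistics.mean(preds), statistics.stdev(preds)

# Toy "trained" members: linear models with slightly different weights,
# standing in for networks trained from different random initializations.
members = [lambda x, w=w: w * x for w in (0.9, 1.0, 1.1)]

mean, std = ensemble_predict(members, 2.0)
# mean is the ensemble prediction; std grows with member disagreement
```

In practice the members would be full neural networks (or posterior samples in a BNN), and the disagreement would be computed over class probabilities rather than scalar outputs.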
no code implementations • 25 Sep 2019 • Namuk Park, Taekyu Lee, Songkuk Kim
Instead of generating a separate prediction for each data sample independently, this model estimates the increment of the prediction for a new data sample from the previous predictions.
1 code implementation • 12 Jul 2019 • Namuk Park, Taekyu Lee, Songkuk Kim
The computational cost of this model is almost the same as that of non-Bayesian NNs.