1 code implementation • 13 May 2024 • Kyungeun Lee, Ye Seul Sim, Hye-Seung Cho, Moonjung Eo, Suhee Yoon, Sanghyu Yoon, Woohyung Lim
The ability of deep networks to learn superior representations hinges on leveraging inductive biases that suit the inherent properties of the dataset.
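A minimal sketch of this idea, independent of the paper's method: the architecture itself encodes an inductive bias. A convolutional layer assumes locality and translation equivariance, which suits images; a fully connected layer assumes neither, which is one reason other data modalities call for different biases. The numbers below follow from the layer shapes chosen purely for illustration.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 1, 28, 28)           # one 28x28 single-channel "image"

conv = nn.Conv2d(1, 8, kernel_size=3)   # weight sharing: locality + translation equivariance
mlp = nn.Linear(28 * 28, 8 * 26 * 26)   # same output size, but no structural assumptions

print(conv(x).shape)                              # torch.Size([1, 8, 26, 26])
print(sum(p.numel() for p in conv.parameters()))  # 80 parameters
print(sum(p.numel() for p in mlp.parameters()))   # 4,245,280 parameters
```

The convolution fits the same output size with roughly 50,000x fewer parameters, precisely because its bias matches the structure of image data.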
no code implementations • 19 Nov 2023 • Chanhui Lee, Juhyeon Kim, Yongjun Jeong, Juhyun Lyu, Junghee Kim, Sangmin Lee, Sangjun Han, Hyeokjun Choe, Soyeon Park, Woohyung Lim, Sungbin Lim, Sanghack Lee
Scaling laws have brought Pre-trained Language Models (PLMs) into the field of causal reasoning.
no code implementations • 10 Oct 2023 • Sung Moon Ko, Sumin Lee, Dae-Woong Jeong, Woohyung Lim, Sehui Han
Transfer learning is a crucial technique for learning from a small amount of data that is potentially related to other, more abundant data.
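A minimal transfer-learning sketch, not the paper's specific method: reuse an encoder assumed to have been trained on the abundant source data, freeze it, and fit only a small head on the scarce target data. The encoder, data shapes, and hyperparameters here are all illustrative stand-ins.

```python
import torch
import torch.nn as nn

# Stand-in encoder; assume its weights were already trained on abundant source data.
encoder = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))
for p in encoder.parameters():
    p.requires_grad = False             # freeze source knowledge

head = nn.Linear(32, 1)                 # small task-specific head
opt = torch.optim.Adam(head.parameters(), lr=1e-3)

x_small = torch.randn(20, 16)           # only 20 target samples
y_small = torch.randn(20, 1)

for _ in range(100):                    # fine-tune the head only
    loss = nn.functional.mse_loss(head(encoder(x_small)), y_small)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Freezing the encoder keeps the trainable parameter count tiny, which is what makes fitting on 20 samples tractable at all.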
no code implementations • 10 Jul 2023 • Seohui Bae, Seoyoon Kim, Hyemin Jung, Woohyung Lim
Recent right-to-be-forgotten regulations have generated considerable interest in unlearning pre-trained machine learning models.
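For context, a minimal sketch of one common unlearning baseline, gradient ascent on the forget set; the paper's actual approach may well differ. The model and data below are synthetic placeholders.

```python
import torch
import torch.nn as nn

model = nn.Linear(8, 2)                 # stand-in "pre-trained" model
x_forget = torch.randn(16, 8)           # samples the model must forget
y_forget = torch.randint(0, 2, (16,))

opt = torch.optim.SGD(model.parameters(), lr=1e-2)
for _ in range(10):
    loss = nn.functional.cross_entropy(model(x_forget), y_forget)
    opt.zero_grad()
    (-loss).backward()                  # ascend: *increase* loss on the forget set
    opt.step()
```

In practice such ascent is usually interleaved with ordinary training on retained data so that forgetting does not destroy overall accuracy.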
no code implementations • 1 Jan 2021 • Dae-Woong Jeong, Kiyoung Kim, ChangYoung Park, Sehui Han, Woohyung Lim
We assume that sufficient unlabeled data following the true distribution are available, and that this distribution can be roughly estimated from domain knowledge or a few samples.
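A minimal sketch of that assumption in action: with a large unlabeled pool drawn from the true distribution, its parameters can be estimated directly. Taking the true distribution to be Gaussian here is purely illustrative and not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Unlabeled pool, assumed to follow the (unknown) true distribution.
unlabeled = rng.normal(loc=2.0, scale=1.5, size=10_000)

mu_hat, sigma_hat = unlabeled.mean(), unlabeled.std()
print(f"estimated N({mu_hat:.2f}, {sigma_hat:.2f}^2)")  # close to N(2.00, 1.50^2)
```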