Search Results for author: Jaehyeon Son

Found 3 papers, 2 papers with code

Learning to Continually Learn with the Bayesian Principle

1 code implementation • 29 May 2024 • Soochan Lee, Hyeonseong Jeon, Jaehyeon Son, Gunhee Kim

On the other hand, in the more classical statistical machine learning literature, many models have sequential Bayesian update rules that yield the same learning outcome as batch training, i.e., they are completely immune to catastrophic forgetting.
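The "sequential equals batch" property is easy to check on a conjugate model. Below is a minimal sketch (not code from the paper; the model choice is mine): a Gaussian posterior over an unknown mean with known noise variance, updated once on all data versus point by point. The two posteriors coincide exactly, which is the sense in which such models cannot forget.

```python
# Sketch (not from the paper): a conjugate Gaussian model with known
# noise variance. Sequential Bayesian updates over a data stream reach
# exactly the same posterior as one batch update -- no forgetting.

def update(mu, prec, xs, sigma2=1.0):
    """Posterior (mean, precision) over the latent mean after seeing xs,
    given a N(mu, 1/prec) prior and a N(mean, sigma2) likelihood."""
    new_prec = prec + len(xs) / sigma2
    new_mu = (prec * mu + sum(xs) / sigma2) / new_prec
    return new_mu, new_prec

data = [0.5, 1.2, -0.3, 2.0]

# Batch: condition on all data at once.
mu_b, prec_b = update(0.0, 1.0, data)

# Sequential: one point at a time, as in a continual-learning stream.
mu_s, prec_s = 0.0, 1.0
for x in data:
    mu_s, prec_s = update(mu_s, prec_s, [x])

assert abs(mu_b - mu_s) < 1e-12 and abs(prec_b - prec_s) < 1e-12
```

The equality holds because the Gaussian posterior depends on the data only through sufficient statistics that accumulate additively, regardless of how the data is split into updates.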

When Meta-Learning Meets Online and Continual Learning: A Survey

no code implementations • 9 Nov 2023 • Jaehyeon Son, Soochan Lee, Gunhee Kim

Over the past decade, deep neural networks have demonstrated significant success under a training scheme of mini-batch stochastic gradient descent over extensive datasets.

Continual Learning • Meta-Learning

Recasting Continual Learning as Sequence Modeling

1 code implementation • NeurIPS 2023 • Soochan Lee, Jaehyeon Son, Gunhee Kim

That is, we propose to formulate continual learning as a sequence modeling problem, allowing advanced sequence models to be utilized for continual learning.

Continual Learning
