1 code implementation • 20 Oct 2023 • Jaeyoung Choe, Keonwoong Noh, Nayeon Kim, Seyun Ahn, Woohwan Jung
Over the past few years, various domain-specific pretrained language models (PLMs) have been proposed and have outperformed general-domain PLMs in specialized areas such as biomedical, scientific, and clinical domains.
Ranked #1 on Sentiment Analysis on Financial PhraseBank
1 code implementation • 18 Oct 2023 • Su ah Lee, Seokjin Oh, Woohwan Jung
Although $K$-shot learning techniques can be applied, their performance tends to saturate once the number of annotations exceeds a few dozen labels.
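A $K$-shot setup of the kind referred to above can be built by capping the number of labeled examples per class. The sketch below is a generic per-class sampler under assumed `(text, label)` pairs, not the sampling procedure from the paper:

```python
import random
from collections import defaultdict

def sample_k_shot(examples, k, seed=0):
    """Select up to k examples per label to form a K-shot training set.

    `examples` is assumed to be a list of (text, label) pairs; this is a
    generic illustration of K-shot subset construction.
    """
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for text, label in examples:
        by_label[label].append((text, label))
    subset = []
    for label, items in by_label.items():
        rng.shuffle(items)
        subset.extend(items[:k])  # keep at most k examples of this label
    return subset
```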
no code implementations • 26 Jul 2023 • Seokjin Oh, Su ah Lee, Woohwan Jung
Data augmentation is a technique that improves the performance of data-hungry models by generating synthetic data rather than collecting new data.
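As a concrete instance of generating synthetic variants from existing examples, here is a minimal random-deletion augmenter (an EDA-style operation); the authors' actual augmentation method may differ:

```python
import random

def augment_by_deletion(sentence, p=0.1, seed=0):
    """Create a synthetic variant of `sentence` by randomly deleting tokens.

    A generic text-augmentation sketch (random deletion with probability p),
    not the specific method proposed in the paper.
    """
    rng = random.Random(seed)
    tokens = sentence.split()
    kept = [t for t in tokens if rng.random() > p]
    # If everything was deleted, fall back to a single surviving token.
    return " ".join(kept) if kept else rng.choice(tokens)
```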
1 code implementation • COLING 2020 • Woohwan Jung, Kyuseok Shim
To take advantage of the high accuracy of human annotation and the low cost of distant supervision, we propose a dual supervision framework that effectively utilizes both types of data.
Ranked #45 on Relation Extraction on DocRED
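The core idea of mixing the two supervision sources can be illustrated as a weighted combination of per-source losses, down-weighting the noisier distantly supervised labels. This is a simplified sketch of combining supervision types, not the dual supervision framework from the paper itself; the `ds_weight` value is an illustrative choice:

```python
import math

def cross_entropy(probs, label):
    """Negative log-likelihood of the gold label under predicted probs."""
    return -math.log(probs[label])

def dual_supervision_loss(ha_batch, ds_batch, ds_weight=0.5):
    """Combine losses from human-annotated (HA) and distantly supervised
    (DS) examples.

    Each batch is a list of (predicted_probs, gold_label) pairs. DS labels
    are noisier, so their average loss is down-weighted by ds_weight.
    """
    ha_loss = sum(cross_entropy(p, y) for p, y in ha_batch) / len(ha_batch)
    ds_loss = sum(cross_entropy(p, y) for p, y in ds_batch) / len(ds_batch)
    return ha_loss + ds_weight * ds_loss
```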