1 code implementation • Findings (EMNLP) 2021 • Hwiyeol Jo, Jaeseo Lim, Byoung-Tak Zhang
We present a new form of ensemble method, Devil's Advocate, which uses a deliberately dissenting model to force the other submodels within the ensemble to collaborate more effectively.
no code implementations • 22 Sep 2022 • Seonil Son, Jaeseo Lim, Youwon Jang, Jaeyoung Lee, Byoung-Tak Zhang
We compare our approach with Unlikelihood (UL) training in a text continuation task on commonsense natural language inference (NLI) corpora to show which method better models coherence by avoiding unlikely continuations.
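Unlikelihood training, used here as the baseline, is a known objective (Welleck et al.) that pushes probability mass away from unwanted tokens. A minimal sketch of the per-example UL loss, assuming a softmax output and one negative candidate token per example (the function name and setup are illustrative, not this paper's exact formulation):

```python
import numpy as np

def unlikelihood_loss(logits, negative_ids):
    """UL loss: -log(1 - p(negative token)), averaged over the batch.
    logits: (batch, vocab) raw scores; negative_ids: (batch,) token ids
    that the model should NOT continue with."""
    z = logits - logits.max(axis=-1, keepdims=True)  # stable softmax
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    p_neg = probs[np.arange(len(negative_ids)), negative_ids]
    # Clip to avoid log(0) when the model is fully confident in the negative.
    return float(-np.log(np.clip(1.0 - p_neg, 1e-5, None)).mean())
```

The loss grows as the model assigns more probability to the unlikely continuation, so minimizing it suppresses those tokens.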
no code implementations • 7 Nov 2020 • Jaeseo Lim, Hwiyeol Jo, Byoung-Tak Zhang, Jooyong Park
In the end, we showed not only that we can build a better machine training framework from the human experiment results, but also empirically confirmed those results through imitated machine experiments; human-like active learning has a crucial effect on learning performance.
no code implementations • 9 May 2019 • Sungjae Cho, Jaeseo Lim, Chris Hickey, Jung Ae Park, Byoung-Tak Zhang
Problem difficulty was operationalized by the number of carries involved in solving a given problem.
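The carry-based difficulty measure can be made concrete: for base-10 addition, difficulty is the number of digit positions whose sum (including an incoming carry) reaches 10. A minimal sketch with a hypothetical helper for non-negative integers:

```python
def count_carries(a, b):
    """Number of carry operations when adding a and b digit by digit
    in base 10 (a, b non-negative integers)."""
    carries, carry = 0, 0
    while a > 0 or b > 0:
        digit_sum = a % 10 + b % 10 + carry
        carry = 1 if digit_sum >= 10 else 0
        carries += carry
        a //= 10
        b //= 10
    return carries
```

For example, 15 + 27 involves one carry (5 + 7 = 12), while 99 + 1 involves two (units and tens both overflow).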
no code implementations • 1 Apr 2019 • Yu-Jung Heo, Kyoung-Woon On, SeongHo Choi, Jaeseo Lim, Jinah Kim, Jeh-Kwang Ryu, Byung-Chull Bae, Byoung-Tak Zhang
Video understanding is emerging as a new paradigm for studying human-like AI.
2 code implementations • IJCNLP 2019 • Gi-Cheon Kang, Jaeseo Lim, Byoung-Tak Zhang
Specifically, REFER module learns latent relationships between a given question and a dialog history by employing a self-attention mechanism.
Ranked #2 on Visual Dialog on VisDial v0.9 val
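The core idea of relating a question to the dialog history via attention can be sketched in simplified form: a single attention step in which the question embedding scores each history-round embedding (names, shapes, and the single-head cross-attention setup are assumptions for illustration; the actual REFER module stacks self-attention layers):

```python
import numpy as np

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attend_history(question, history, d_k):
    """question: (1, d_k) embedding of the current question.
    history: (rounds, d_k), one row per previous dialog round.
    Returns a question-conditioned summary of the history."""
    scores = question @ history.T / np.sqrt(d_k)  # (1, rounds) relevance
    weights = softmax(scores, axis=-1)            # attention distribution
    return weights @ history                      # (1, d_k) weighted summary
```

Rounds whose embeddings align with the question receive higher attention weight, which is one way to resolve references back into the dialog history.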