1 code implementation • ICML 2020 • Jay Heo, Junhyeon Park, Hyewon Jeong, Kwang Joon Kim, Juho Lee, Eunho Yang, Sung Ju Hwang
Moreover, it is almost infeasible for human annotators to examine attention weights across vast numbers of instances and features.
1 code implementation • ICLR 2020 • Joonyoung Yi, Juhyuk Lee, Kwang Joon Kim, Sung Ju Hwang, Eunho Yang
Among many approaches, the simplest and most intuitive way is zero imputation, which treats the value of a missing entry simply as zero.
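Zero imputation as described above can be sketched in a few lines. This is a minimal illustration, not the paper's method; the toy matrix and the use of NaN to mark missing entries are assumptions for the example.

```python
import numpy as np

# Hypothetical toy feature matrix; NaN marks a missing entry.
X = np.array([[1.0, np.nan, 3.0],
              [np.nan, 2.0, np.nan]])

# Zero imputation: treat the value of each missing entry simply as zero.
X_zero = np.nan_to_num(X, nan=0.0)
```

Note that after this step a true measurement of 0 and an imputed missing value become indistinguishable to the model, which is the sparsity bias the paper examines.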
2 code implementations • 5 Jun 2018 • Ingyo Chung, Saehoon Kim, Juho Lee, Kwang Joon Kim, Sung Ju Hwang, Eunho Yang
We present a personalized and reliable prediction model for healthcare, which can provide individually tailored medical services such as diagnosis, disease treatment, and prevention.
2 code implementations • NeurIPS 2018 • Jay Heo, Hae Beom Lee, Saehoon Kim, Juho Lee, Kwang Joon Kim, Eunho Yang, Sung Ju Hwang
The attention mechanism is effective both in focusing deep learning models on relevant features and in interpreting them.
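The feature-level attention idea above can be sketched as follows. This is a generic illustration, not the model from the paper; the feature values and relevance scores are hypothetical placeholders.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: normalizes scores into weights that sum to 1.
    e = np.exp(z - z.max())
    return e / e.sum()

features = np.array([0.2, 1.5, 0.3])  # hypothetical input feature values
scores = np.array([0.1, 2.0, 0.2])    # hypothetical learned relevance scores

weights = softmax(scores)    # attention weights: nonnegative, sum to 1
attended = weights * features  # features re-weighted by attention
```

Because the weights sum to one, inspecting them shows which features the model attends to, which is what makes attention useful for interpretation.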