no code implementations • EMNLP 2020 • Zheng Li, Mukul Kumar, William Headden, Bing Yin, Ying Wei, Yu Zhang, Qiang Yang
The recent emergence of multilingual pre-trained language models (mPLMs) has enabled breakthroughs on various downstream cross-lingual transfer (CLT) tasks.
1 code implementation • 14 May 2020 • Joydip Dhar, Ashaya Shukla, Mukul Kumar, Prashant Gupta
kNN is a very effective instance-based learning method, and it is easy to implement.
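As the abstract notes, kNN is simple to implement: classification reduces to finding the k closest training points and taking a majority vote. A minimal sketch (the dataset and function name here are illustrative, not from the paper):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points, using Euclidean distance."""
    # Sort training examples by distance to the query point.
    neighbors = sorted(
        train,
        key=lambda item: math.dist(item[0], query),
    )[:k]
    # Majority vote over the labels of the k closest points.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two clusters labeled "a" and "b".
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.2, 0.1), k=3))  # → a
```

Because kNN defers all computation to query time, there is no training step at all, which is what makes it so easy to implement.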
no code implementations • 20 Dec 2019 • Mukul Kumar, Youna Hu, Will Headden, Rahul Goutam, Heran Lin, Bing Yin
Recent works such as BERT have demonstrated the success of a large transformer encoder architecture with language model pre-training on a variety of NLP tasks.