Search Results for author: Shi Ya

Found 1 paper, 0 papers with code

A Robustly Optimized BERT Pre-training Approach with Post-training

no code implementations • CCL 2021 • Liu Zhuang, Lin Wayne, Shi Ya, Zhao Jun

“In the paper we present a ‘pre-training’ + ‘post-training’ + ‘fine-tuning’ three-stage paradigm, a supplementary framework for the standard ‘pre-training’ + ‘fine-tuning’ language-model approach.”

Extractive Question-Answering • Question Answering
