Search Results for author: Haytham ElFadeel

Found 2 papers, 0 papers with code

Decoupled Transformer for Scalable Inference in Open-domain Question Answering

no code implementations · 5 Aug 2021 · Haytham ElFadeel, Stan Peshterliev

To reduce computational cost and latency, we propose decoupling the transformer MRC model into an input component and a cross component.

Knowledge Distillation · Machine Reading Comprehension · +1
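
The abstract only names the two components, so the following is a minimal sketch of the decoupling idea, not the authors' implementation: an "input component" encodes question and passage independently (so passage representations can be precomputed and cached), while a smaller "cross component" runs joint layers over the concatenated representations at query time. All class names, layer counts, and sizes (`InputComponent`, `CrossComponent`, `d_model=256`, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class InputComponent(nn.Module):
    """Encodes a single sequence (question or passage) with no cross-attention."""

    def __init__(self, vocab_size=30522, d_model=256, n_layers=4, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids):
        return self.encoder(self.embed(token_ids))


class CrossComponent(nn.Module):
    """Joint layers over concatenated question and passage representations."""

    def __init__(self, d_model=256, n_layers=2, n_heads=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.span_head = nn.Linear(d_model, 2)  # start/end logits for extractive MRC

    def forward(self, question_repr, passage_repr):
        joint = torch.cat([question_repr, passage_repr], dim=1)
        return self.span_head(self.encoder(joint))


# Passage representations can be computed once and cached offline; only the
# question encoding and the smaller cross component run per query.
input_component = InputComponent()
cross_component = CrossComponent()

passage_ids = torch.randint(0, 30522, (1, 128))
question_ids = torch.randint(0, 30522, (1, 16))

with torch.no_grad():
    cached_passage = input_component(passage_ids)   # offline / cached
    question_repr = input_component(question_ids)   # at query time
    start_end_logits = cross_component(question_repr, cached_passage)
print(start_end_logits.shape)  # torch.Size([1, 144, 2])
```

The latency saving in this sketch comes from moving most of the encoder depth into the cacheable input component, leaving only a shallow cross component on the query path.
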

Robustly Optimized and Distilled Training for Natural Language Understanding

no code implementations · 16 Mar 2021 · Haytham ElFadeel, Stan Peshterliev

In this paper, we explore multi-task learning (MTL) as a second pretraining step to learn an enhanced universal language representation for transformer language models.

Knowledge Distillation · Machine Reading Comprehension · +3
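
The abstract does not specify the training recipe, so the following is a minimal sketch of MTL as a second pretraining step under common assumptions, not the authors' method: a shared transformer encoder is trained jointly on several tasks, each with its own lightweight head, sampling one task per step so gradients from every task flow into the shared parameters. The task names, data, and hyperparameters below are placeholders.

```python
import torch
import torch.nn as nn


class SharedEncoder(nn.Module):
    """Shared representation model updated by every task."""

    def __init__(self, vocab_size=30522, d_model=256, n_layers=4, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, token_ids):
        return self.encoder(self.embed(token_ids))  # (batch, seq, d_model)


# One classification head per task, all sharing the same encoder.
tasks = {"nli": 3, "sentiment": 2, "paraphrase": 2}  # task name -> num labels
encoder = SharedEncoder()
heads = nn.ModuleDict({name: nn.Linear(256, n) for name, n in tasks.items()})
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(heads.parameters()), lr=3e-5
)
loss_fn = nn.CrossEntropyLoss()

for step in range(3):  # toy loop; random tensors stand in for real task batches
    task = list(tasks)[step % len(tasks)]
    token_ids = torch.randint(0, 30522, (8, 32))
    labels = torch.randint(0, tasks[task], (8,))

    pooled = encoder(token_ids).mean(dim=1)      # simple mean pooling
    loss = loss_fn(heads[task](pooled), labels)  # task-specific head and loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: task={task} loss={loss.item():.3f}")
```

After this MTL step, the shared encoder (not the task heads) would be carried forward as the starting point for downstream fine-tuning.
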
