Search Results for author: Harel Haskey

Found 1 paper, 0 papers with code

HeRo: RoBERTa and Longformer Hebrew Language Models

no code implementations · 18 Apr 2023 · Vitaly Shalumov, Harel Haskey

In this paper, we fill an existing gap in resources available to the Hebrew NLP community by providing it with the largest Hebrew pre-training dataset to date, HeDC4; a state-of-the-art pre-trained language model, HeRo, for standard-length inputs; and an efficient transformer, LongHeRo, for long input sequences.

Tasks: Document Classification, Language Modelling, +4
