Search Results for author: Dung Dao

Found 1 paper, 0 papers with code

VinaLLaMA: LLaMA-based Vietnamese Foundation Model

no code implementations • 18 Dec 2023 • Quan Nguyen, Huy Pham, Dung Dao

In this technical report, we present VinaLLaMA, an open-weight, state-of-the-art (SOTA) Large Language Model for the Vietnamese language, built upon LLaMA-2 with an additional 800 billion trained tokens.

Language Modelling • Large Language Model
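
Because VinaLLaMA is described as an open-weight, LLaMA-2-derived model, a minimal sketch of loading such a checkpoint with the Hugging Face transformers library follows. The repository id "vilm/vinallama-7b-chat", the Vietnamese prompt, and the generation settings are assumptions for illustration, not details taken from this listing.

# Sketch: load an open-weight LLaMA-2-style checkpoint and generate text.
# The model id below is assumed; replace it with the official repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vilm/vinallama-7b-chat"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",          # requires the accelerate package
)

# Vietnamese prompt: "Hello, please introduce yourself briefly."
prompt = "Xin chào, hãy giới thiệu ngắn gọn về bản thân."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))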
