Search Results for author: Anh-Duc Vu

Found 3 papers, 1 paper with code

What do Transformers Know about Government?

1 code implementation · 22 Apr 2024 · Jue Hou, Anisia Katinskaia, Lari Kotilainen, Sathianpong Trangcasanchai, Anh-Duc Vu, Roman Yangarber

This paper investigates what insights about linguistic features and what knowledge about the structure of natural language can be obtained from the encodings in transformer language models. In particular, we explore how BERT encodes the government relation between constituents in a sentence.
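The kind of probing study described here can be sketched in miniature: freeze a model's token encodings and train a small linear classifier ("probe") to predict a linguistic label from them. The sketch below is an assumption-laden stand-in, not the authors' implementation: toy random vectors replace real BERT hidden states, and the binary "government" label is invented for illustration.

```python
import math
import random

random.seed(0)

DIM = 16  # toy embedding size standing in for BERT's hidden states

def toy_embedding(label):
    # Stand-in for a frozen BERT token encoding: class-dependent mean plus noise.
    center = 1.0 if label == 1 else -1.0
    return [center + random.gauss(0, 0.5) for _ in range(DIM)]

# Toy dataset: a hypothetical binary government label per token (invented data).
data = [(toy_embedding(y), y) for y in [0, 1] * 50]
random.shuffle(data)
train, test = data[:80], data[80:]

# Linear probe: logistic regression trained with plain SGD.
w = [0.0] * DIM
b = 0.0
lr = 0.1
for _ in range(200):
    for x, y in train:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        z = max(-30.0, min(30.0, z))  # clamp to avoid overflow in exp()
        p = 1.0 / (1.0 + math.exp(-z))
        g = p - y  # gradient of the log loss w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

accuracy = sum(predict(x) == y for x, y in test) / len(test)
print(f"probe accuracy: {accuracy:.2f}")
```

If the probe recovers the label well above chance from frozen encodings, that is taken as evidence the encodings carry the corresponding linguistic information; with real BERT states one would extract a chosen layer's hidden vectors instead of the toy vectors above.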

Sentence

Effects of sub-word segmentation on performance of transformer language models

no code implementations · 9 May 2023 · Jue Hou, Anisia Katinskaia, Anh-Duc Vu, Roman Yangarber

Lastly, we show that LMs of smaller size using morphological segmentation can perform comparably to models of larger size trained with BPE -- both in terms of perplexity and scores on downstream tasks.

Language Modelling · Segmentation
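The perplexity measure used to compare segmentations can be illustrated with a minimal sketch: perplexity is the exponentiated average negative log-probability per token. The per-token probabilities below are invented for illustration and do not come from the paper; note that the two segmentations yield different token counts, which is why cross-segmentation perplexity comparisons require care.

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(mean negative log-probability over the tokens)."""
    n = len(token_logprobs)
    return math.exp(-sum(token_logprobs) / n)

# Hypothetical per-token log-probabilities for the same sentence under two
# segmentations (numbers invented): a BPE segmentation with fewer, less
# predictable pieces vs. a morphological one with more, more predictable pieces.
bpe_logprobs = [math.log(p) for p in [0.10, 0.05, 0.20, 0.08]]
morph_logprobs = [math.log(p) for p in [0.30, 0.25, 0.20, 0.35, 0.30]]

print(f"BPE perplexity:   {perplexity(bpe_logprobs):.2f}")
print(f"morph perplexity: {perplexity(morph_logprobs):.2f}")
```

A sanity check on the formula: a model assigning probability 0.5 to every token has perplexity exactly 2, regardless of sequence length.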
