EURALI (LREC) 2022 • Irene Sucameli, Michele De Quattro, Arash Eshghi, Alessandro Suglia, Maria Simi
Since the advent of Transformer-based, pretrained language models (LMs) such as BERT, Natural Language Understanding (NLU) components for dialogue systems, in the form of Dialogue Act Recognition (DAR) and Slot Recognition (SR), have become both more accurate and easier to create for specific application domains.