no code implementations • 16 Feb 2024 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann
This paper tackles the scarcity of benchmarking data in disentangled auditory representation learning.
no code implementations • 12 Feb 2024 • Mohamad Ballout, Ulf Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger
Our research demonstrates that fine-tuning with explanations significantly improves the performance of language models.
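To make the idea concrete, here is a minimal sketch of what "fine-tuning with explanations" can look like in practice; the task, field names, and formatting below are illustrative assumptions, not the paper's actual setup:

```python
# Hypothetical sketch: building seq2seq fine-tuning examples whose
# targets append a natural-language explanation to the label.
def build_example(question, label, explanation):
    # Input stays the task prompt; the target carries answer + rationale.
    source = f"question: {question}"
    target = f"answer: {label} explanation: {explanation}"
    return {"source": source, "target": target}

example = build_example(
    question="Is 17 a prime number?",
    label="yes",
    explanation="17 has no divisors other than 1 and itself.",
)
```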
no code implementations • 4 Nov 2023 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann
This benchmark dataset and framework address the gap in the rigorous evaluation of state-of-the-art disentangled speech representation learning methods.
no code implementations • 7 Sep 2023 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann
This study provides an empirical analysis of Barlow Twins (BT), a self-supervised learning (SSL) technique inspired by theories of redundancy reduction in human perception.
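For context, the Barlow Twins objective pushes the cross-correlation matrix between the embeddings of two augmented views toward the identity: the diagonal term enforces invariance across views while the off-diagonal term reduces redundancy between embedding dimensions. A minimal PyTorch sketch of the standard loss; the batch size, embedding dimension, and lambda value are illustrative, not the study's settings:

```python
import torch

def barlow_twins_loss(z_a, z_b, lambd=5e-3):
    """Barlow Twins loss: drive the cross-correlation matrix of two
    batch-normalized embedding views toward the identity matrix."""
    n, d = z_a.shape
    # Normalize each embedding dimension across the batch.
    z_a = (z_a - z_a.mean(0)) / z_a.std(0)
    z_b = (z_b - z_b.mean(0)) / z_b.std(0)
    # Empirical cross-correlation matrix (d x d).
    c = (z_a.T @ z_b) / n
    # Invariance term: diagonal entries should equal 1 (views agree).
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    # Redundancy-reduction term: off-diagonal entries should be 0.
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lambd * off_diag

# Example: embeddings of two augmented views of the same audio batch.
z1, z2 = torch.randn(256, 128), torch.randn(256, 128)
loss = barlow_twins_loss(z1, z2)
```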
no code implementations • 21 Jun 2023 • Mohamad Ballout, Ulf Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger
All models achieve similar performance and outperform transformers trained from scratch by a large margin.
1 code implementation • 21 Jun 2023 • Mohamad Ballout, Ulf Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger
Investigating deep learning language models has always been a significant research area due to the "black box" nature of most advanced models.
no code implementations • 1 Jan 2021 • Michael Marino, Pascal Nieters, Gunther Heidemann, Joachim Hertzberg
Further, we use a new method for analyzing class hierarchy in hidden representations, Neurodynamical Agglomerative Analysis (NAA), to show that latent class relationships in this analysis model tend toward the relationships among the label vectors as the data is projected deeper into the network.
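NAA itself is the paper's contribution and its exact procedure is not reproduced here; the sketch below only illustrates the underlying comparison under stated assumptions (the cosine-similarity choice and all names are hypothetical): per-layer class-mean representations can be compared against the geometry of the label vectors.

```python
import numpy as np

def class_similarity(features, labels, num_classes):
    """Cosine-similarity matrix between per-class mean feature vectors.

    Illustrative only -- not the paper's NAA procedure.
    """
    means = np.stack([features[labels == c].mean(axis=0)
                      for c in range(num_classes)])
    unit = means / np.linalg.norm(means, axis=1, keepdims=True)
    return unit @ unit.T

# Computing this matrix layer by layer and comparing it against the
# (near-orthogonal) similarity of one-hot label vectors would show
# whether latent class relationships converge toward the label
# geometry with depth.
```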