no code implementations • 3 May 2024 • Patrick Krauss, Jannik Hösch, Claus Metzner, Andreas Maier, Peter Uhrig, Achim Schilling
We found that the activation vectors of the hidden units cluster according to stylistic variation in earlier layers of BERT (layer 1) than according to narrative content (layers 4-5).
no code implementations • 22 Dec 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
Cognitive maps, as represented by the entorhinal-hippocampal complex in the brain, organize and retrieve context from memories. This suggests that large language models (LLMs) like ChatGPT could harness similar architectures to function as a high-level processing center, akin to how the hippocampus operates within the cortical hierarchy.
no code implementations • 28 Nov 2023 • Claus Metzner, Achim Schilling, Patrick Krauss
In the evolving landscape of data science, the accurate quantification of clustering in high-dimensional data sets remains a significant challenge, especially in the absence of predefined labels.
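The paper develops its own measures for this, which are not reproduced here. As a classical point of comparison, a label-free quantification of clustering tendency can be sketched with the Hopkins statistic, which compares nearest-neighbor distances of uniform probe points against those of points sampled from the data itself (all names below are illustrative, not from the paper):

```python
import numpy as np

def hopkins_statistic(X, m=50, rng=None):
    """Label-free clustering tendency of a data set X (rows = samples).
    H is roughly 0.5 for uniformly random data and approaches 1 for
    strongly clustered data. No cluster labels are required."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # m probe points drawn uniformly from the bounding box of the data
    lo, hi = X.min(axis=0), X.max(axis=0)
    U = rng.uniform(lo, hi, size=(m, d))
    # m probe points sampled from the data itself
    idx = rng.choice(n, size=m, replace=False)
    S = X[idx]

    def nn_dist(P, exclude_self=False):
        # distance from each probe point to its nearest data point
        D = np.linalg.norm(P[:, None, :] - X[None, :, :], axis=2)
        if exclude_self:
            D[np.arange(m), idx] = np.inf  # a sampled point is not its own neighbor
        return D.min(axis=1)

    u = nn_dist(U)                      # uniform probes -> data
    w = nn_dist(S, exclude_self=True)   # data probes -> rest of data
    return u.sum() / (u.sum() + w.sum())

# two tight Gaussian blobs in 10 dimensions: clearly clustered data
rng = np.random.default_rng(0)
blobs = np.vstack([rng.normal(0.0, 0.05, (200, 10)),
                   rng.normal(5.0, 0.05, (200, 10))])
print(hopkins_statistic(blobs, rng=1))  # well above 0.5 for clustered data
```

This is only a baseline sketch of the label-free setting the abstract describes; the measures proposed in the paper itself differ.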
no code implementations • 4 Jul 2023 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
The human brain possesses the extraordinary capability to contextualize the information it receives from our environment.
no code implementations • 15 Feb 2023 • Kishore Surendra, Achim Schilling, Paul Stoewer, Andreas Maier, Patrick Krauss
Strikingly, we find that the internal representations of nine-word input sequences cluster according to the word class of the tenth word to be predicted as output, even though the neural network did not receive any explicit information about syntactic rules or word classes during training.
no code implementations • 30 Jan 2023 • Claus Metzner, Marius E. Yamakou, Dennis Voelkl, Achim Schilling, Patrick Krauss
We find that in networks with moderately strong connections, the mutual information $I$ is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between pairs of neurons, a quantity that can be efficiently computed even in large systems.
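The correlation-based quantity named in the abstract is straightforward to compute. A minimal sketch (variable names are illustrative; rows of `X` are taken to be neurons, columns time steps):

```python
import numpy as np

def rms_pearson(X):
    """Root-mean-square average of the Pearson correlations between
    all distinct pairs of rows (neurons) of X."""
    C = np.corrcoef(X)                 # N x N pairwise correlation matrix
    iu = np.triu_indices_from(C, k=1)  # distinct neuron pairs only
    return np.sqrt(np.mean(C[iu] ** 2))

rng = np.random.default_rng(0)
drive = rng.normal(size=1000)
# shared drive plus small private noise -> strongly correlated "neurons"
coupled = drive + 0.3 * rng.normal(size=(20, 1000))
# fully independent "neurons" -> correlations near zero
independent = rng.normal(size=(20, 1000))
print(rms_pearson(coupled), rms_pearson(independent))
```

The RMS (rather than a plain mean) keeps positively and negatively correlated pairs from cancelling out, which matters when relating the average to mutual information.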
no code implementations • 17 Jan 2023 • Claus Metzner, Achim Schilling, Maximilian Traxdorf, Holger Schulze, Konstantin Tziridis, Patrick Krauss
The human sleep-cycle has been divided into discrete sleep stages that can be recognized in electroencephalographic (EEG) and other bio-signals by trained specialists or machine learning systems.
no code implementations • 28 Oct 2022 • Paul Stoewer, Achim Schilling, Andreas Maier, Patrick Krauss
The neural network successfully learns the similarities between different animal species and constructs a cognitive map of 'animal space' based on the principle of successor representations, with an accuracy of around 30%, which is close to the theoretical maximum given that every animal species has more than one possible successor, i.e. nearest neighbor in feature space.
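The successor representation that underlies this cognitive-map approach has a standard closed form, $M = (I - \gamma T)^{-1}$, for a Markov transition matrix $T$ and discount $\gamma$. A minimal sketch of that textbook formula (this is the generic SR, not the paper's specific network):

```python
import numpy as np

def successor_representation(T, gamma=0.9):
    """Closed-form successor representation M = (I - gamma*T)^(-1),
    i.e. the discounted expected future occupancy of each state,
    for a row-stochastic transition matrix T."""
    n = T.shape[0]
    return np.linalg.inv(np.eye(n) - gamma * T)

# toy environment: a ring of 4 states with symmetric random-walk dynamics
n = 4
T = np.zeros((n, n))
for s in range(n):
    T[s, (s + 1) % n] = 0.5  # step clockwise
    T[s, (s - 1) % n] = 0.5  # step counter-clockwise

M = successor_representation(T, gamma=0.9)
```

Because $T$ is row-stochastic, each row of $M$ sums to $1/(1-\gamma)$, a quick sanity check on any SR implementation.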
no code implementations • 4 Jun 2022 • Claus Metzner, Achim Schilling, Maximilian Traxdorf, Konstantin Tziridis, Holger Schulze, Patrick Krauss
Remarkably, the accuracy limit is not affected by applying non-linear transformations to the data, even if these transformations are non-reversible and drastically reduce the information content of the input data.
no code implementations • 7 Apr 2022 • Achim Schilling, William Sedley, Richard Gerum, Claus Metzner, Konstantin Tziridis, Andreas Maier, Holger Schulze, Fan-Gang Zeng, Karl J. Friston, Patrick Krauss
How is information processed in the brain during perception?
no code implementations • 22 Feb 2022 • Paul Stoewer, Christian Schlieker, Achim Schilling, Claus Metzner, Andreas Maier, Patrick Krauss
We conclude that cognitive maps and neural network-based successor representations of structured knowledge provide a promising way to overcome some of the shortcomings of deep learning on the path towards artificial general intelligence.
no code implementations • 10 Aug 2021 • Andreas Maier, Harald Köstler, Marco Heisig, Patrick Krauss, Seung Hee Yang
In this article, we perform a review of the state-of-the-art of hybrid machine learning in medical imaging.
no code implementations • 5 Aug 2021 • Claus Metzner, Patrick Krauss
Moreover, we find a completely new type of resonance phenomenon, called 'Import Resonance' (IR), where the information import shows a maximum, i.e. a peak-like dependence on the coupling strength between the RNN and its input.
no code implementations • 5 Oct 2020 • Patrick Krauss, Achim Schilling
In order to gain a mechanistic understanding of how tinnitus emerges in the brain, we must build biologically plausible computational models that mimic both tinnitus development and perception, and test the tentative models with brain and behavioral experiments.
no code implementations • 31 Mar 2020 • Patrick Krauss, Andreas Maier
The question of whether artificial beings or machines could become self-aware or conscious has been a philosophical question for centuries.
no code implementations • 7 Nov 2019 • Richard C. Gerum, André Erpenbeck, Patrick Krauss, Achim Schilling
We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches.
no code implementations • 5 Nov 2018 • Achim Schilling, Claus Metzner, Jonas Rietsch, Richard Gerum, Holger Schulze, Patrick Krauss
Deep neural networks typically outperform more traditional machine learning models in their ability to classify complex data, and yet it is not clear how the individual hidden layers of a deep network contribute to the overall classification performance.