no code implementations • 15 Feb 2024 • Ruiqi Chen, Giacomo Vedovati, Todd Braver, ShiNung Ching
Evaluating the dynamics in such networks is key to understanding their learned generative mechanisms.
no code implementations • 22 Dec 2023 • Lulu Gong, Xudong Chen, ShiNung Ching
We are specifically interested in how the attractor landscapes of such networks are reshaped as a function of the strength and nature (Hebbian vs. anti-Hebbian) of learning, which may bear on the ability of such rules to mediate large-scale optimization problems.
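The way a Hebbian rule sculpts an attractor landscape can be illustrated with a minimal Hopfield-style sketch (an assumption for illustration, not the paper's model): an outer-product Hebbian rule embeds patterns as fixed points of the network dynamics, and flipping the sign of the rule would instead make those states repelling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch, not the paper's model: Hebbian learning in a
# small Hopfield-style network embeds stored patterns as attractors.
N = 64
patterns = rng.choice([-1, 1], size=(3, N))

# Hebbian outer-product rule (an anti-Hebbian rule would flip the sign,
# turning the stored states into repellers instead of attractors).
W = sum(np.outer(p, p) for p in patterns) / N
np.fill_diagonal(W, 0)

def settle(state, steps=20):
    """Run synchronous sign-threshold dynamics until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

# A corrupted cue is pulled back toward the stored attractor.
cue = patterns[0].copy()
flip = rng.choice(N, size=8, replace=False)
cue[flip] *= -1
recovered = settle(cue)
print(np.array_equal(recovered, patterns[0]))
```

The stored patterns are (with high probability) fixed points of `settle`, which is one concrete sense in which the learning rule has altered the attractor landscape.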
no code implementations • 6 Nov 2023 • Lulu Gong, Fabio Pasqualetti, Thomas Papouin, ShiNung Ching
We then embed this model in a bandit-based reinforcement learning task environment, and show how the presence of time-scale separated astrocytic modulation enables learning over multiple fluctuating contexts.
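The benefit of a slow modulatory variable in a switching bandit can be sketched as follows. This is a hypothetical toy (names `q`, `a`, `run` are invented here, not from the paper): a slow "astrocyte-like" state integrates reward-prediction surprise on a long time scale and transiently raises the learning rate after a context switch.

```python
import random

random.seed(1)

def run(trials=2000, switch_every=500, tau=50.0):
    """Two-armed bandit whose reward probabilities swap between contexts."""
    q = [0.0, 0.0]   # fast action values
    a = 0.0          # slow modulatory ("astrocyte-like") state
    correct = 0
    for t in range(trials):
        context = (t // switch_every) % 2
        p = [0.8, 0.2] if context == 0 else [0.2, 0.8]
        # epsilon-greedy action selection
        arm = random.randrange(2) if random.random() < 0.1 else q.index(max(q))
        r = 1.0 if random.random() < p[arm] else 0.0
        err = r - q[arm]
        # slow variable integrates |surprise| on time scale tau
        a += (abs(err) - a) / tau
        alpha = 0.05 + 0.5 * a   # surprise-boosted learning rate
        q[arm] += alpha * err
        if p[arm] == max(p):
            correct += 1
    return correct / trials

result = run()
print(result)
```

Because `a` rises after each context switch (when prediction errors spike) and decays during stable periods, the agent relearns quickly after switches without being unstable within a context.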
no code implementations • 17 Nov 2022 • Ciaran Murphy-Royal, ShiNung Ching, Thomas Papouin
The participation of astrocytes in brain computation was formally hypothesized in 1992, coinciding with the discovery that these glial cells display a complex form of Ca2+ excitability.
no code implementations • 12 May 2022 • Yuzhen Qin, Tommaso Menara, Samet Oymak, ShiNung Ching, Fabio Pasqualetti
Humans are capable of adjusting to changing environments flexibly and quickly.
no code implementations • 13 Jan 2022 • Yuzhen Qin, Tommaso Menara, Samet Oymak, ShiNung Ching, Fabio Pasqualetti
In this paper, we study representation learning for multi-task decision-making in non-stationary environments.
no code implementations • 6 Apr 2021 • Matthew F. Singh, Chong Wang, Michael W. Cole, ShiNung Ching
Intuitively, our approach consists of solving for the parameters that generate the most accurate state estimator (Extended Kalman Filter).
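The idea of selecting parameters that yield the most accurate state estimator can be sketched in a scalar linear-Gaussian setting (an assumption made here so a plain Kalman filter stands in for the Extended Kalman Filter; `prediction_error` and the grid search are illustrative, not the paper's procedure): score each candidate parameter by the one-step innovation error of the filter it induces, and pick the minimizer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a scalar linear-Gaussian system x_t = a*x_{t-1} + w, y_t = x_t + v.
a_true, q, r = 0.9, 0.1, 0.2
T = 400
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

def prediction_error(a):
    """Sum of squared one-step innovations for the filter built on parameter a."""
    xh, P, cost = 0.0, 1.0, 0.0
    for t in range(1, T):
        # predict
        xp, Pp = a * xh, a * a * P + q
        # innovation against the new measurement
        e = y[t] - xp
        cost += e * e
        # update
        K = Pp / (Pp + r)
        xh, P = xp + K * e, (1 - K) * Pp
    return cost

# The parameter whose filter predicts best recovers the true dynamics.
grid = np.linspace(0.5, 1.0, 51)
a_hat = grid[np.argmin([prediction_error(a) for a in grid])]
print(round(a_hat, 2))
```

In the nonlinear case the same objective would be evaluated with an Extended Kalman Filter, with gradient-based search replacing the grid.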
1 code implementation • 8 Jan 2021 • Elham Ghazizadeh, ShiNung Ching
Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, thus making it crucial for context-dependent computation.