no code implementations • 22 Feb 2024 • Jules Berman, Benjamin Peherstorfer
This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions.
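The core idea — freeze a pre-trained network and let only a low-rank correction to its weights evolve in time — can be sketched in a few lines. This is an illustrative sketch under assumptions (the variable names and the scalar time-dependent coefficient are mine, not the paper's API):

```python
import numpy as np

# Sketch of continuous low-rank adaptation of a layer's weights:
# W(t) = W0 + alpha(t) * (A @ B), where W0 and the low-rank factors
# A, B are fixed by pre-training, and only the coefficient alpha(t)
# is adapted online as the PDE solution evolves.
rng = np.random.default_rng(0)
d_out, d_in, rank = 8, 8, 2

W0 = rng.standard_normal((d_out, d_in))  # pre-trained base weights (frozen)
A = rng.standard_normal((d_out, rank))   # low-rank factors (frozen)
B = rng.standard_normal((rank, d_in))

def weights_at(alpha):
    """Adapted weight matrix for a given value of the online coefficient alpha(t)."""
    return W0 + alpha * (A @ B)

# The online update touches far fewer degrees of freedom than the
# full d_out * d_in matrix, and the correction itself has rank <= 2.
update = weights_at(0.5) - W0
print(np.linalg.matrix_rank(update))
```

The point of the low-rank structure is that time-stepping at a new physics parameter only requires updating the small adapted component, not retraining the full network.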
no code implementations • 6 Jan 2024 • Siavash Golkar, Jules Berman, David Lipshutz, Robert Mihai Haret, Tim Gollisch, Dmitri B. Chklovskii
Such variation in the temporal filter with input SNR resembles that observed experimentally in biological neurons.
1 code implementation • 11 Oct 2023 • Paul Schwerdtner, Philipp Schulze, Jules Berman, Benjamin Peherstorfer
This work focuses on the conservation of quantities such as Hamiltonians, mass, and momentum when solution fields of partial differential equations are approximated with nonlinear parametrizations such as deep networks.
2 code implementations • NeurIPS 2023 • Jules Berman, Benjamin Peherstorfer
Training neural networks sequentially in time to approximate solution fields of time-dependent partial differential equations can be beneficial for preserving causality and other physics properties; however, the sequential-in-time training is numerically challenging because training errors quickly accumulate and amplify over time.
1 code implementation • 21 Nov 2022 • Lyndon R. Duong, Jingyang Zhou, Josue Nassar, Jules Berman, Jeroen Olieslagers, Alex H. Williams
Quantifying similarity between neural representations -- e.g., hidden-layer activation vectors -- is a perennial problem in deep learning and neuroscience research.
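One standard baseline for this task is linear centered kernel alignment (CKA), which compares two activation matrices up to orthogonal transformations and isotropic scaling. The sketch below is illustrative only — it is a common metric from the literature, not necessarily the measure developed in this paper:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between activation matrices X, Y of shape (n_samples, n_features)."""
    X = X - X.mean(axis=0)  # center each feature
    Y = Y - Y.mean(axis=0)
    hsic = np.linalg.norm(Y.T @ X, "fro") ** 2
    return hsic / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 16))          # e.g. hidden-layer activations, one row per stimulus
Q, _ = np.linalg.qr(rng.standard_normal((16, 16)))  # random orthogonal matrix

print(linear_cka(X, X))      # identical representations score 1.0
print(linear_cka(X, X @ Q))  # invariant to orthogonal transforms, also 1.0
```

Invariance properties like these are exactly what distinguishes candidate similarity measures, which is why the choice of metric matters in both deep learning and neuroscience comparisons.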
no code implementations • 3 Dec 2021 • Jules Berman, Dmitri B. Chklovskii, Jingpeng Wu
To address this problem, we propose a novel method based on point cloud representations of neurons.