Search Results for author: Jules Berman

Found 6 papers, 3 papers with code

CoLoRA: Continuous low-rank adaptation for reduced implicit neural modeling of parameterized partial differential equations

no code implementations • 22 Feb 2024 • Jules Berman, Benjamin Peherstorfer

This work introduces reduced models based on Continuous Low Rank Adaptation (CoLoRA) that pre-train neural networks for a given partial differential equation and then continuously adapt low-rank weights in time to rapidly predict the evolution of solution fields at new physics parameters and new initial conditions.
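
From the abstract, the adapted weights plausibly take the form W(t) = W_pre + B diag(α(t)) A, with only the low-rank coefficients varying in time while the pre-trained weights and factors stay frozen. A minimal NumPy sketch under that reading; the layer shape, `rank`, and `alpha_t` are illustrative stand-ins, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 64, 64, 4

# Pre-trained (frozen) weight of one dense layer.
W_pre = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)

# Fixed low-rank factors learned offline; only the scalar
# coefficients alpha(t) change as the PDE solution evolves.
B = rng.standard_normal((d_out, rank)) / np.sqrt(rank)
A = rng.standard_normal((rank, d_in)) / np.sqrt(d_in)

def layer(x, alpha_t):
    """Dense layer with a continuous low-rank correction:
    W(t) = W_pre + B @ diag(alpha_t) @ A, so only `rank`
    numbers per layer are adapted in time."""
    W_t = W_pre + B @ np.diag(alpha_t) @ A
    return np.tanh(W_t @ x)

x = rng.standard_normal(d_in)
y = layer(x, alpha_t=np.array([0.1, -0.3, 0.05, 0.2]))
print(y.shape)  # (64,)
```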

Neuronal Temporal Filters as Normal Mode Extractors

no code implementations • 6 Jan 2024 • Siavash Golkar, Jules Berman, David Lipshutz, Robert Mihai Haret, Tim Gollisch, Dmitri B. Chklovskii

Such variation in the temporal filter with input SNR resembles that observed experimentally in biological neurons.

Time Series

Nonlinear embeddings for conserving Hamiltonians and other quantities with Neural Galerkin schemes

1 code implementation • 11 Oct 2023 • Paul Schwerdtner, Philipp Schulze, Jules Berman, Benjamin Peherstorfer

This work focuses on the conservation of quantities such as Hamiltonians, mass, and momentum when solution fields of partial differential equations are approximated with nonlinear parametrizations such as deep networks.
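
The paper constructs nonlinear embeddings; as a simpler illustration of the underlying constraint (keeping the parameter trajectory on a level set of the conserved quantity), here is a hypothetical tangent-space projection in NumPy. The function `project_conserving` and the toy Hamiltonian are assumptions for illustration, not the paper's construction.

```python
import numpy as np

def project_conserving(theta_dot, grad_H_theta):
    """Remove the component of the parameter velocity that would
    change H(theta), so dH/dt = grad_H . theta_dot = 0 afterwards."""
    g = grad_H_theta
    return theta_dot - (g @ theta_dot) / (g @ g) * g

# Toy check with H(theta) = 0.5 * ||theta||^2, so grad_H = theta.
theta = np.array([1.0, 2.0, -1.0])
theta_dot = np.array([0.5, -0.2, 0.3])
v = project_conserving(theta_dot, theta)
print(theta @ v)  # ~0: H is (instantaneously) conserved
```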

Randomized Sparse Neural Galerkin Schemes for Solving Evolution Equations with Deep Networks

2 code implementations • NeurIPS 2023 • Jules Berman, Benjamin Peherstorfer

Training neural networks sequentially in time to approximate solution fields of time-dependent partial differential equations can be beneficial for preserving causality and other physics properties; however, the sequential-in-time training is numerically challenging because training errors quickly accumulate and amplify over time.
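
Taking the title at face value, one ingredient is updating a randomized sparse subset of parameters at each time step, which limits how quickly errors can accumulate across steps. A hedged sketch of that idea only; `frac`, `step_size`, and the plain gradient step are placeholders rather than the paper's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

def sparse_step(theta, residual_grad, frac=0.1, step_size=1e-2):
    """Update a random sparse subset of parameters at one time step.

    `residual_grad` stands for the gradient of the time-step
    residual; freezing most parameters per step is one way to slow
    error accumulation in sequential-in-time training.
    """
    mask = rng.random(theta.shape) < frac
    theta = theta.copy()
    theta[mask] -= step_size * residual_grad[mask]
    return theta

theta = rng.standard_normal(1000)
grad = rng.standard_normal(1000)
theta = sparse_step(theta, grad)
```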

Representational dissimilarity metric spaces for stochastic neural networks

1 code implementation • 21 Nov 2022 • Lyndon R. Duong, Jingyang Zhou, Josue Nassar, Jules Berman, Jeroen Olieslagers, Alex H. Williams

Quantifying similarity between neural representations -- e.g., hidden layer activation vectors -- is a perennial problem in deep learning and neuroscience research.
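
As one concrete instance of such a similarity measure, here is linear CKA between two activation matrices. This is a standard deterministic metric, not the stochastic-network metric the paper proposes.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA similarity between activation matrices.

    X: (n_stimuli, n_units_x), Y: (n_stimuli, n_units_y).
    Returns a value in [0, 1]; 1 means identical up to
    rotation and isotropic scaling.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    num = np.linalg.norm(X.T @ Y, "fro") ** 2
    den = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return num / den

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 32))        # layer A activations
Q, _ = np.linalg.qr(rng.standard_normal((32, 32)))
Y = X @ Q                                 # rotated copy of X
print(linear_cka(X, Y))                   # ~1.0
```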

Bridging the Gap: Point Clouds for Merging Neurons in Connectomics

no code implementations • 3 Dec 2021 • Jules Berman, Dmitri B. Chklovskii, Jingpeng Wu

To address this problem, we propose a novel method based on point cloud representations of neurons.
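
One way to frame the merge decision is binary classification over a pair of point clouds with a permutation-invariant encoder. A bare-bones sketch under that assumption; the max-pooled per-point features and all sizes are illustrative, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative weights of a tiny shared per-point encoder (3 -> 16).
W1 = rng.standard_normal((16, 3)) * 0.1

def embed(points):
    """Permutation-invariant embedding of a point cloud (n, 3):
    shared per-point features followed by max pooling."""
    feats = np.maximum(points @ W1.T, 0.0)  # ReLU, applied per point
    return feats.max(axis=0)                # order-independent pooling

def merge_score(cloud_a, cloud_b, w):
    """Score whether two neuron fragments should be merged."""
    z = np.concatenate([embed(cloud_a), embed(cloud_b)])
    return 1.0 / (1.0 + np.exp(-(w @ z)))   # sigmoid

a = rng.standard_normal((200, 3))   # fragment A point cloud
b = rng.standard_normal((150, 3))   # fragment B point cloud
w = rng.standard_normal(32) * 0.1
print(merge_score(a, b, w))         # probability-like merge score
```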

Point Cloud Classification
