no code implementations • 8 Dec 2023 • Benny Avelin
In this paper we explore the concept of sequential inductive prediction intervals using theory from sequential testing.
no code implementations • 31 Dec 2022 • Martin Andersson, Benny Avelin
We develop theory and methods that use the graph Laplacian to analyze the geometry of the underlying manifold of point clouds.
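The graph Laplacian construction referenced here can be illustrated concretely. The sketch below builds an unnormalized Laplacian L = D − W from a Gaussian-kernel affinity matrix on a toy point cloud sampled from a circle; the kernel choice, bandwidth `sigma`, and the sampling are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def graph_laplacian(points, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W, with a Gaussian-kernel
    affinity W (an illustrative choice of weighting)."""
    # pairwise squared Euclidean distances
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)        # no self-loops
    D = np.diag(W.sum(axis=1))      # degree matrix
    return D - W

# 100 points on the unit circle: a 1-D manifold embedded in R^2
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
pts = np.column_stack([np.cos(theta), np.sin(theta)])

L = graph_laplacian(pts, sigma=0.5)
# L is symmetric positive semi-definite; its low-lying spectrum
# encodes geometric information about the underlying manifold
eigvals = np.linalg.eigvalsh(L)
```

The small eigenvalues and their eigenvectors are what spectral methods typically read geometric structure from.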
no code implementations • 4 Nov 2022 • Benny Avelin, Lauri Viitasaari
In this article we prove that estimator stability is enough to show that leave-one-out cross validation is a sound procedure, by providing concentration bounds in a general framework.
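The procedure being analyzed, leave-one-out cross validation, can be sketched in a few lines: refit the estimator on n − 1 points and score it on the held-out point, averaging over all n splits. The least-squares estimator and toy data below are assumptions for illustration, not the paper's general framework.

```python
import numpy as np

def loocv_error(fit, predict, X, y):
    """Leave-one-out CV: for each i, train on all points except i
    and record the squared error on the held-out point."""
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        model = fit(X[mask], y[mask])
        errs.append((predict(model, X[i:i + 1])[0] - y[i]) ** 2)
    return float(np.mean(errs))

# toy example: ordinary least squares on a noisy 1-D linear signal
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=30)

fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
predict = lambda w, X: X @ w
err = loocv_error(fit, predict, X, y)
```

Estimator stability, roughly that removing one point perturbs the fitted model only slightly, is what makes the averaged held-out error a reliable estimate of the true risk.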
no code implementations • 21 Apr 2021 • Benny Avelin, Anders Karlsson
We consider dynamical and geometrical aspects of deep learning.
1 code implementation • 18 Jan 2021 • Taro Langner, Fredrik K. Gustafsson, Benny Avelin, Robin Strand, Håkan Ahlström, Joel Kullberg
The results indicate that deep regression ensembles could ultimately provide automated, uncertainty-aware measurements of body composition for the more than 120,000 UK Biobank neck-to-knee body MRI scans to be acquired within the coming years.
no code implementations • 15 Dec 2020 • Benny Avelin, Vesa Julin
We first study the convergence to equilibrium of the stochastic gradient flow associated with the cost function with a quadratic penalization.
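A one-dimensional caricature of such a flow: noisy gradient descent on a quadratic cost plus a quadratic penalty, which contracts toward the penalized minimizer. The cost, penalty weight, noise level, and step size below are all illustrative assumptions; the paper's setting concerns the gradient flow for neural-network training, not this toy problem.

```python
import numpy as np

rng = np.random.default_rng(2)

# cost 0.5*(w - 1)^2 with quadratic penalization 0.5*lam*w^2;
# the penalized minimizer is w* = 1 / (1 + lam)
lam, lr, sigma = 0.5, 0.05, 0.01
w = 5.0
for _ in range(500):
    # noisy gradient: discretized stochastic gradient flow
    grad = (w - 1.0) + lam * w + sigma * rng.normal()
    w -= lr * grad

w_star = 1.0 / (1.0 + lam)
```

Convergence to equilibrium here means the iterates settle into a stationary distribution concentrated near `w_star`, with the quadratic penalty guaranteeing the contraction.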
2 code implementations • arXiv 2019 • Benny Avelin, Kaj Nyström
In this paper we prove that, in the deep limit, stochastic gradient descent on a ResNet-type deep neural network in which each layer shares the same weight matrix converges to stochastic gradient descent for a Neural ODE, and that the corresponding value/loss functions converge.
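The forward-pass side of this correspondence is easy to sketch: a ResNet with a shared weight matrix and residual step 1/depth is the explicit Euler scheme for a Neural ODE, so its output stabilizes as the depth grows. The `tanh` activation, weight scale, and depths below are illustrative assumptions.

```python
import numpy as np

def resnet_forward(x, W, depth):
    """Shared-weight ResNet: x_{k+1} = x_k + (1/depth) * tanh(W x_k).
    This is the explicit Euler discretization of the Neural ODE
    dx/dt = tanh(W x(t)) on t in [0, 1] with step 1/depth."""
    h = 1.0 / depth
    for _ in range(depth):
        x = x + h * np.tanh(W @ x)
    return x

rng = np.random.default_rng(3)
W = 0.2 * rng.normal(size=(4, 4))
x0 = rng.normal(size=4)

shallow = resnet_forward(x0, W, 10)
deep = resnet_forward(x0, W, 1000)
# as depth grows, both approach the Neural ODE flow at t = 1,
# so the gap between them shrinks
gap = float(np.linalg.norm(shallow - deep))
```

The paper's result concerns the harder statement that the training dynamics themselves, not just the forward pass, converge in this deep limit.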