1 code implementation • 30 Apr 2024 • Ziming Liu, Yixuan Wang, Sachin Vaidya, Fabian Ruehle, James Halverson, Marin Soljačić, Thomas Y. Hou, Max Tegmark
Inspired by the Kolmogorov-Arnold representation theorem, we propose Kolmogorov-Arnold Networks (KANs) as promising alternatives to Multi-Layer Perceptrons (MLPs).
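The core idea can be sketched in a few lines: in a KAN layer, every edge carries its own learnable one-dimensional function, and each output is the sum of the edge functions applied to the inputs. The sketch below is a minimal illustration, not the paper's implementation — the `KANLayer` class is hypothetical, and a simple polynomial basis stands in for the learnable B-spline (plus SiLU) parameterization the paper actually uses.

```python
import numpy as np

class KANLayer:
    """Minimal sketch of a Kolmogorov-Arnold layer.

    Every edge (p, q) carries its own 1-D function phi_{q,p}, here
    parameterized as a linear combination of a fixed polynomial basis
    (the paper uses learnable B-splines). Each output is a sum of edge
    functions: y_q = sum_p phi_{q,p}(x_p).
    """

    def __init__(self, in_dim, out_dim, degree=3, rng=None):
        rng = rng or np.random.default_rng(0)
        # one coefficient vector per edge: shape (out, in, degree + 1)
        self.coef = rng.normal(scale=0.1, size=(out_dim, in_dim, degree + 1))
        self.degree = degree

    def __call__(self, x):
        # basis features x^0 .. x^degree: shape (batch, in, degree + 1)
        basis = np.stack([x**k for k in range(self.degree + 1)], axis=-1)
        # contract with per-edge coefficients, then sum over input index
        return np.einsum('bip,qip->bq', basis, self.coef)

layer = KANLayer(in_dim=2, out_dim=3)
y = layer(np.random.default_rng(1).normal(size=(5, 2)))
print(y.shape)  # (5, 3)
```

Note the contrast with an MLP, which has fixed activations on nodes and learnable linear weights on edges; a KAN layer has no linear weights at all, only learnable activations on edges.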
no code implementations • 20 Feb 2024 • Sergei Gukov, James Halverson, Fabian Ruehle
Machine learning techniques are increasingly powerful, leading to many breakthroughs in the natural sciences, but they are often stochastic, error-prone, and black-box.
no code implementations • 30 Oct 2023 • James Halverson, Fabian Ruehle
We develop a theory of flows in the space of Riemannian metrics induced by neural network gradient descent.
no code implementations • 6 Jul 2023 • Mehmet Demirtas, James Halverson, Anindita Maiti, Matthew D. Schwartz, Keegan Stoner
Conversely, the correspondence allows one to engineer architectures realizing a given field theory by representing action deformations as deformations of neural network parameter densities.
no code implementations • 18 Apr 2023 • Sergei Gukov, James Halverson, Ciprian Manolescu, Fabian Ruehle
We apply Bayesian optimization and reinforcement learning to a problem in topology: the question of when a knot bounds a ribbon disk.
no code implementations • 8 Dec 2021 • James Halverson
An approach to field theory is studied in which fields are composed of $N$ constituent random neurons.
no code implementations • 1 Dec 2021 • Di Luo, James Halverson
We study infinite limits of neural network quantum states ($\infty$-NNQS), which exhibit representation power through ensemble statistics, and also tractable gradient descent dynamics.
1 code implementation • 1 Jun 2021 • Anindita Maiti, Keegan Stoner, James Halverson
We demonstrate that symmetries of network densities may be determined via dual computations of network correlation functions, even when the density is unknown and the network is not equivariant.
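A minimal numerical illustration of the idea, under simplifying assumptions not taken from the paper: draw the readout weights of a one-hidden-layer network from an even density (invariant under $v \to -v$), so the output flips sign under that transformation even though the network itself is not equivariant. The symmetry is then visible purely in correlation functions — odd moments of the output are compatible with zero while even moments are not — without ever writing down the density.

```python
import numpy as np

# Monte Carlo estimate of output correlation functions for a random
# one-hidden-layer network f = v . tanh(w) / sqrt(width) at a fixed input.
# The density of v is even, so f -> -f is a symmetry of the ensemble:
# odd correlators G1, G3 should vanish, while G2 should not.
rng = np.random.default_rng(0)
n_samples, width = 200_000, 32
w = rng.normal(size=(n_samples, width))
v = rng.normal(size=(n_samples, width))  # symmetric (even) density
f = (v * np.tanh(w)).sum(axis=1) / np.sqrt(width)

G1, G2, G3 = f.mean(), (f**2).mean(), (f**3).mean()
print(f"G1 = {G1:+.4f}, G2 = {G2:+.4f}, G3 = {G3:+.4f}")
```

Within Monte Carlo error, `G1` and `G3` come out consistent with zero and `G2` does not, matching the dual statement that parameter-space symmetries constrain network correlation functions.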
no code implementations • 28 Oct 2020 • Sergei Gukov, James Halverson, Fabian Ruehle, Piotr Sułkowski
We introduce natural language processing into the study of knot theory, as made natural by the braid word representation of knots.
1 code implementation • 19 Aug 2020 • James Halverson, Anindita Maiti, Keegan Stoner
We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory.
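One concrete handle behind this picture is that infinitely wide random networks have Gaussian output statistics, with finite-width corrections appearing as small non-Gaussian couplings suppressed by the width. The demo below is a sketch under assumptions of my choosing (a one-hidden-layer tanh network with Gaussian parameters at a fixed input), showing the connected 4-point function — measured as excess kurtosis — shrinking as the width grows.

```python
import numpy as np

def excess_kurtosis(f):
    """Connected 4-point function over (2-point)^2 -- zero for a Gaussian."""
    f = f - f.mean()
    return (f**4).mean() / (f**2).mean() ** 2 - 3.0

def sample_outputs(width, n_samples, rng):
    """Outputs f = v . tanh(w) / sqrt(width) of a one-hidden-layer network
    at a fixed input, over random draws of the parameters (w, v)."""
    w = rng.normal(size=(n_samples, width))
    v = rng.normal(size=(n_samples, width))
    return (v * np.tanh(w)).sum(axis=1) / np.sqrt(width)

rng = np.random.default_rng(0)
kurt = {n: excess_kurtosis(sample_outputs(n, 50_000, rng))
        for n in (4, 64, 256)}
for n, k in kurt.items():
    print(f"width {n:4d}: excess kurtosis {k:+.4f}")
```

The measured excess kurtosis falls off roughly as one over the width, the numerical counterpart of treating $1/N$ as the coupling controlling non-Gaussian interaction terms in the effective description.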
1 code implementation • 27 Mar 2019 • James Halverson, Brent Nelson, Fabian Ruehle
In one case, we demonstrate that the agent learns a human-derived strategy for finding consistent string models.
High Energy Physics - Theory
no code implementations • 15 May 2018 • James Halverson, Brent D. Nelson, Fabian Ruehle, Gustavo Salinas
Dark gauge sectors and axions are well-motivated in string theory.
High Energy Physics - Phenomenology • High Energy Physics - Theory