no code implementations • 25 Feb 2024 • Nadav Dym, Hannah Lawrence, Jonathan W. Siegel
Canonicalization provides an architecture-agnostic method for enforcing equivariance, with generalizations such as frame-averaging recently gaining prominence as a lightweight and flexible alternative to equivariant architectures.
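As a toy illustration of both ideas (not the paper's construction), here is a minimal sketch: canonicalization evaluates a function on a canonical representative of the input's orbit, while frame averaging symmetrizes the function over a small set of group elements. The sorting-based canonical form and the sign-flip group below are illustrative choices.

```python
import numpy as np

def canonicalize(x):
    # Canonical form under coordinate permutations: sort the entries.
    return np.sort(x)

def make_permutation_invariant(f):
    # Canonicalization: evaluate an arbitrary function f on the canonical
    # representative of the input's orbit, making the result invariant.
    return lambda x: f(canonicalize(x))

def sign_flip_average(f):
    # Averaging over the two-element group {+1, -1}: the mean of f(x)
    # and f(-x) is invariant to the sign flip x -> -x.
    return lambda x: 0.5 * (f(x) + f(-x))
```

Both wrappers leave the wrapped function f completely unconstrained, which is what makes such schemes architecture-agnostic.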
no code implementations • 4 Feb 2024 • Snir Hordan, Tal Amir, Nadav Dym
Finally, we show that a simple modification of this PPGN architecture can be used to obtain a universal equivariant architecture that can approximate all continuous equivariant functions uniformly.
no code implementations • 3 Feb 2024 • Christopher Morris, Nadav Dym, Haggai Maron, İsmail İlkan Ceylan, Fabrizio Frasca, Ron Levie, Derek Lim, Michael Bronstein, Martin Grohe, Stefanie Jegelka
Machine learning on graphs, especially using graph neural networks (GNNs), has seen a surge in interest due to the wide availability of graph data across a broad spectrum of disciplines, from the life sciences to the social and engineering sciences.

no code implementations • 15 Nov 2023 • Tamir Bendory, Nadav Dym, Dan Edidin, Arun Suresh
In this paper, we study the phase retrieval problem under the prior that the signal lies in a semi-algebraic set.
no code implementations • 20 Oct 2023 • Aviv Navon, Aviv Shamsian, Ethan Fetaya, Gal Chechik, Nadav Dym, Haggai Maron
To accelerate the alignment process and improve its quality, we propose a novel framework aimed at learning to solve the weight alignment problem, which we name Deep-Align.
no code implementations • 31 Jan 2023 • Snir Hordan, Tal Amir, Steven J. Gortler, Nadav Dym
Neural networks for point clouds, which respect their natural invariance to permutation and rigid motion, have enjoyed recent success in modeling geometric phenomena, from molecular dynamics to recommender systems.
no code implementations • 18 Jul 2022 • Tal Amir, Shahar Kovalsky, Nadav Dym
Our relaxation enjoys several theoretical and practical advantages: Theoretically, we prove that our method provides a $\sqrt{2}$-factor approximation to the Robust Procrustes problem, and that, under appropriate assumptions, it exactly recovers the true rigid motion from point correspondences contaminated by outliers.
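For context, a minimal non-robust baseline helps: the classical least-squares Procrustes (Kabsch) solve, which the robust relaxation above generalizes. With exact, outlier-free correspondences, this baseline already recovers the true rigid motion; the paper's contribution concerns the outlier-contaminated case. This sketch is the standard SVD-based solve, not the authors' relaxation.

```python
import numpy as np

def procrustes(P, Q):
    # Least-squares rigid alignment: find rotation R and translation t
    # minimizing sum_i ||R p_i + t - q_i||^2, for matched rows of P and Q.
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    A, B = P - p0, Q - q0          # center both point sets
    U, _, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))   # ensure det(R) = +1 (no reflection)
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = U @ D @ Vt
    t = q0 - R @ p0
    return R, t
```

The robust variant replaces the squared loss with an outlier-tolerant one, which is what makes the problem nonconvex and motivates the relaxation.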
no code implementations • 5 May 2022 • Nadav Dym, Steven J. Gortler
We show that when a continuous family of semi-algebraic separating invariants is available, separation can be obtained by randomly selecting $2D+1$ of these invariants.
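One concrete family of this kind, shown here purely as an illustration rather than the paper's general construction: sorted projections of a point cloud along randomly drawn directions. Each sorted projection is a permutation-invariant feature, and for generic directions, distinct clouds almost surely produce distinct feature vectors.

```python
import numpy as np

def sorted_projections(X, W):
    # X: point cloud of shape (n, d); W: random directions, shape (m, d).
    # Project onto each direction and sort along the point axis; sorting
    # discards the point ordering, so the result is permutation-invariant.
    return np.sort(X @ W.T, axis=0).ravel()
```

The appeal of randomization, as in the result above, is that the number of invariants needed scales with the intrinsic dimension rather than with the size of the full invariant family.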
1 code implementation • 2 Mar 2022 • Ben Finkelshtein, Chaim Baskin, Haggai Maron, Nadav Dym
Equivariance to permutations and rigid motions is an important inductive bias for various 3D learning problems.
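A minimal sketch of what such an inductive bias looks like in features (an illustration, not the architecture studied in the paper): pairwise distances are unchanged by rotations, reflections, and translations, and sorting them additionally removes the point ordering.

```python
import numpy as np

def rigid_perm_invariant_features(X):
    # X: point cloud of shape (n, d).
    # Pairwise distances are preserved by rigid motions; sorting the
    # flattened distance matrix also removes dependence on point order.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.sort(D.ravel())
```

Architectures with this inductive bias guarantee such invariance (or equivariance) by construction, rather than hoping it is learned from data.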
no code implementations • 28 Jul 2021 • Ingrid Daubechies, Ronald DeVore, Nadav Dym, Shira Faigenbaum-Golovin, Shahar Z. Kovalsky, Kung-Ching Lin, Josiah Park, Guergana Petrova, Barak Sober
Namely, we show that refinable functions can be approximated by the outputs of deep ReLU networks of fixed width and increasing depth, with accuracy exponential in the number of network parameters.
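A classic concrete instance of depth buying exponential complexity at fixed width (Telgarsky's sawtooth construction, shown here in the spirit of, but not identical to, the refinable-function result above): composing a width-3 ReLU "hat" layer with itself d times produces a function with on the order of 2^d oscillations, using only O(d) parameters.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # One ReLU layer of width 3 computing the tent map on [0, 1]:
    # x -> 2x on [0, 1/2] and x -> 2 - 2x on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth(x, depth):
    # Composing the tent map `depth` times yields a sawtooth with
    # 2^(depth-1) peaks: exponential oscillation from linear depth.
    for _ in range(depth):
        x = hat(x)
    return x
```

A shallow network of comparable parameter count cannot match this oscillation count, which is the sense in which depth is essential here.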
no code implementations • ICLR 2021 • Nadav Dym, Haggai Maron
We first derive two sufficient conditions for an equivariant architecture to have the universal approximation property, based on a novel characterization of the space of equivariant polynomials.
no code implementations • 27 May 2019 • Nadav Dym, Barak Sober, Ingrid Daubechies
The combination of this phenomenon with the capacity of DNNs, demonstrated here, to efficiently approximate iterated function systems (IFS) may contribute to the success of DNNs, which is particularly striking for image processing tasks, and suggests new algorithms for representing self-similarities in images based on the DNN mechanism.
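To make the IFS notion concrete (this sketch is standard background, not the paper's DNN construction): an IFS is a finite set of contractive maps whose repeated random application, the "chaos game", traces out a self-similar attractor. The three half-scaling maps below, an illustrative choice, generate the Sierpinski triangle.

```python
import numpy as np

def ifs_attractor(maps, n_iter=2000, seed=0):
    # Chaos game: repeatedly apply a randomly chosen contractive affine
    # map (A, b); the orbit converges to the attractor of the IFS.
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    pts = []
    for _ in range(n_iter):
        A, b = maps[rng.integers(len(maps))]
        x = A @ x + b
        pts.append(x.copy())
    return np.array(pts)
```

Self-similarity is exactly what makes such attractors cheap for deep networks to express: each map is a simple affine layer, and depth implements the iteration.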
no code implementations • ICCV 2019 • Nadav Dym, Shahar Ziv Kovalsky
In recent years, several branch-and-bound (BnB) algorithms have been proposed to globally optimize rigid registration problems.