no code implementations • 29 Sep 2023 • Andrea Montanari, Feng Ruan, Basil Saeed, Youngtak Sohn
Working in the high-dimensional regime in which the number of features $p$, the number of samples $n$ and the input dimension $d$ (in the nonlinear featurization setting) diverge, with ratios of order one, we prove a universality result establishing that the asymptotic behavior is completely determined by the expected covariance of feature vectors and by the covariance between features and labels.
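As a rough illustrative sketch (the symbols $\gamma_1$, $\gamma_2$, $\Sigma$, $\mu$ are assumed notation for this summary, not necessarily the paper's), the proportional regime and the two covariance objects that determine the asymptotics can be written as
$$n,\, p,\, d \to \infty, \qquad \frac{p}{n} \to \gamma_1 \in (0,\infty), \qquad \frac{d}{n} \to \gamma_2 \in (0,\infty),$$
$$\Sigma := \mathbb{E}\big[x\, x^{\mathsf T}\big] \ \ \text{(expected covariance of the feature vector $x$)}, \qquad \mu := \mathbb{E}\big[y\, x\big] \ \ \text{(covariance between features and the label $y$)},$$
so the universality claim is that the asymptotic behavior depends on the feature distribution only through the pair $(\Sigma, \mu)$.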
no code implementations • 5 Nov 2019 • Andrea Montanari, Feng Ruan, Youngtak Sohn, Jun Yan
They achieve this by learning nonlinear representations of the inputs that map the data into linearly separable classes.