no code implementations • NeurIPS 2021 • Tao Yu, Christopher M. De Sa
Hyperbolic space is particularly useful for embedding data with hierarchical structure; however, representing hyperbolic space with ordinary floating-point numbers can greatly degrade performance due to ineluctable numerical error.
no code implementations • NeurIPS 2020 • Christopher M. De Sa
Many learning algorithms, such as stochastic gradient descent, are affected by the order in which training examples are used.
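As a minimal sketch of this order sensitivity (a toy illustration, not the paper's analysis or dataset), the following NumPy snippet runs plain SGD on a tiny fixed least-squares problem and shows that visiting the same examples in two different orders produces different final iterates:

```python
import numpy as np

def sgd_final_iterate(order, lr=0.1, epochs=5):
    # Minimize squared error on a tiny fixed dataset with plain SGD,
    # visiting the training examples in the given order each epoch.
    X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    y = np.array([1.0, 2.0, 3.5])
    w = np.zeros(2)
    for _ in range(epochs):
        for i in order:
            # Gradient of (x_i @ w - y_i)^2 with respect to w.
            grad = 2.0 * (X[i] @ w - y[i]) * X[i]
            w = w - lr * grad
    return w

# Same data, same step size, same number of passes -- only the
# visiting order differs, yet the final iterates do not match.
w_forward = sgd_final_iterate([0, 1, 2])
w_reverse = sgd_final_iterate([2, 1, 0])
```

Because single-example gradient updates do not commute, the per-epoch composition of updates depends on the ordering, which is exactly why shuffling schemes (shuffle-once, reshuffling every epoch, etc.) can converge at different rates.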
2 code implementations • NeurIPS 2019 • Tao Yu, Christopher M. De Sa
Hyperbolic embeddings achieve excellent performance when embedding hierarchical data structures like synonym or type hierarchies, but they can be limited by numerical error when ordinary floating-point numbers are used to represent points in hyperbolic space.
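The precision problem can be sketched with the standard Poincaré-ball distance formula, d(x, y) = arccosh(1 + 2·||x−y||² / ((1−||x||²)(1−||y||²))) (a generic illustration, not the paper's proposed representation): for points near the ball's boundary, the term 1−||x||² suffers catastrophic cancellation, and in float32 it can round all the way to zero, making the distance overflow to infinity:

```python
import numpy as np

def poincare_dist(x, y):
    # Distance in the Poincare ball model:
    # d(x, y) = arccosh(1 + 2||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq / denom)

# Two points very close to the boundary of the unit ball, where
# 1 - ||x||^2 is on the order of 2e-8.
x64 = np.array([0.99999999, 0.0], dtype=np.float64)
y64 = np.array([0.0, 0.99999999], dtype=np.float64)
d64 = poincare_dist(x64, y64)  # finite in float64

# In float32, 0.99999999 rounds to exactly 1.0, so 1 - ||x||^2
# becomes 0 and the computed distance blows up to infinity.
with np.errstate(divide="ignore"):
    d32 = poincare_dist(x64.astype(np.float32), y64.astype(np.float32))
```

Deep hierarchies push embedded points toward the boundary, so this loss of representable precision directly limits how deep a hierarchy ordinary floats can faithfully embed.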
no code implementations • NeurIPS 2015 • Christopher M. De Sa, Ce Zhang, Kunle Olukotun, Christopher Ré
Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems.