no code implementations • 13 Feb 2024 • Kordag Mehmet Kilic, Jin Sima, Jehoshua Bruck
Neural networks successfully capture the computational power of the human brain for many tasks.
no code implementations • 13 Feb 2024 • Kordag Mehmet Kilic, Jin Sima, Jehoshua Bruck
It is known that two anchors (the points to which the nearest neighbor is computed) are sufficient for an NN representation of a threshold function; however, the resolution (the maximum number of bits required for the entries of an anchor) is $O(n\log{n})$.
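To make the two-anchor idea concrete, here is a minimal sketch (not the paper's construction): placing two anchors symmetrically about the hyperplane $w \cdot x = t$ makes the nearest-anchor label reproduce the threshold function. The choice of $w$ and $t$ below is illustrative.

```python
import itertools

import numpy as np

def two_anchor_nn(w, t):
    """Build two anchors whose nearest-neighbor rule computes [w.x > t].

    u = t*w/||w||^2 lies on the separating hyperplane; the anchors u - w
    (label 0) and u + w (label 1) are its mirror images, so x is closer
    to u + w exactly when w.x > t.
    """
    w = np.asarray(w, dtype=float)
    u = t * w / np.dot(w, w)          # a point on the hyperplane w.x = t
    return u - w, u + w               # anchor with label 0, anchor with label 1

def nn_label(x, a0, a1):
    # Nearest-neighbor rule: output the label of the closer anchor.
    return int(np.linalg.norm(x - a1) < np.linalg.norm(x - a0))

# Check against the threshold function w.x > t on all 16 binary inputs.
# t = 2.5 is chosen so no binary input lands exactly on the hyperplane.
w, t = np.array([3.0, -2.0, 1.0, 5.0]), 2.5
a0, a1 = two_anchor_nn(w, t)
for bits in itertools.product([0, 1], repeat=4):
    x = np.array(bits, dtype=float)
    assert nn_label(x, a0, a1) == int(np.dot(w, x) > t)
print("two anchors reproduce the threshold function on all 16 inputs")
```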
no code implementations • 1 Feb 2024 • Jin Sima, Changlong Wu, Olgica Milenkovic, Wojciech Szpankowski
We study the problem of online conditional distribution estimation with unbounded label sets under local differential privacy.
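For orientation on the privacy model, below is a minimal sketch of $k$-ary randomized response, a standard mechanism achieving $\epsilon$-local differential privacy over a finite label set. The paper's unbounded-label-set setting requires more than this, and the alphabet size `k=10` is an arbitrary choice for the demo.

```python
import math
import random

def randomized_response(label, k, epsilon):
    """k-ary randomized response: report the true label with probability
    exp(eps) / (exp(eps) + k - 1), otherwise a uniformly random *other*
    label. Satisfies epsilon-local differential privacy for labels in
    range(k)."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_true:
        return label
    other = random.randrange(k - 1)   # uniform over the k - 1 other labels
    return other if other < label else other + 1

# Example: privatize a short stream of labels from a 10-symbol alphabet.
random.seed(0)
print([randomized_response(y, k=10, epsilon=1.0) for y in [3, 3, 7, 0, 3]])
```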
no code implementations • 27 Jan 2024 • Changlong Wu, Jin Sima, Wojciech Szpankowski
We study the problem of oracle-efficient hybrid online learning when the features are generated by an unknown i.i.d. process.
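As a toy illustration of the oracle-efficient access model (the learner touches the hypothesis class only through an empirical-risk-minimization oracle), here is a follow-the-leader sketch over one-dimensional thresholds. The realizable label rule below is an assumption for the demo, whereas the hybrid setting allows adversarial labels, and this is not the paper's algorithm.

```python
import random

def erm_oracle(history):
    """ERM oracle for 1-D thresholds: return the threshold t minimizing
    empirical 0-1 loss of the predictor [x >= t] on (x, y) history."""
    candidates = [0.0] + [x for x, _ in history] + [1.0]
    def loss(t):
        return sum((x >= t) != y for x, y in history)
    return min(candidates, key=loss)

random.seed(1)
history, mistakes = [], 0
for step in range(200):
    x = random.random()                       # i.i.d. feature
    t_hat = erm_oracle(history) if history else 0.5
    y_pred = x >= t_hat                       # follow-the-leader prediction
    y = x >= 0.3                              # demo labels: a fixed threshold
    mistakes += (y_pred != y)
    history.append((x, y))
print(f"mistakes: {mistakes} / 200")
```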
2 code implementations • 14 Aug 2023 • Saurav Prakash, Jin Sima, Chao Pan, Eli Chien, Olgica Milenkovic
Third, we compute the complexity of the convex hulls in hyperbolic spaces to assess the extent of data leakage; at the same time, in order to limit communication cost for the hulls, we propose a new quantization method for the Poincaré disc coupled with Reed-Solomon-like encoding.
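As a hypothetical illustration of quantizing points of the Poincaré disc, the sketch below uses naive uniform polar quantization and omits the Reed-Solomon-like encoding entirely; the paper's quantizer is a different, more careful construction (uniform radial bins are a poor match for the hyperbolic metric near the boundary).

```python
import cmath
import math

def quantize_poincare(z, radial_bits=4, angular_bits=6):
    """Quantize a point z in the open unit (Poincare) disc by encoding its
    radius and angle on separate uniform grids. Illustrative only."""
    r, theta = abs(z), cmath.phase(z)         # polar coordinates of z
    r_levels, a_levels = 2 ** radial_bits, 2 ** angular_bits
    r_idx = min(int(r * r_levels), r_levels - 1)
    a_idx = int(((theta + math.pi) / (2 * math.pi)) * a_levels) % a_levels
    return r_idx, a_idx                       # radial_bits + angular_bits bits

def dequantize_poincare(r_idx, a_idx, radial_bits=4, angular_bits=6):
    # Reconstruct the midpoint of the selected radial/angular cell.
    r = (r_idx + 0.5) / 2 ** radial_bits
    theta = -math.pi + (a_idx + 0.5) * 2 * math.pi / 2 ** angular_bits
    return cmath.rect(r, theta)

z = cmath.rect(0.8, 1.1)
print(z, "->", dequantize_poincare(*quantize_poincare(z)))
```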
no code implementations • 9 May 2023 • Kordag Mehmet Kilic, Jin Sima, Jehoshua Bruck
Specifically, in this paper, we study the representation of Boolean functions in the associative computation model, where the inputs are binary vectors and the corresponding outputs are the labels ($0$ or $1$) of the nearest neighbor anchors.
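A minimal instance of the associative computation model: the output on input $x$ is the label of the nearest anchor. Since 2-bit XOR is not a threshold function, two anchors cannot represent it; the sketch below (illustrative, with anchors placed trivially at the inputs themselves) uses four.

```python
import itertools

import numpy as np

# Anchors with binary labels; the function value at x is the label of the
# anchor nearest to x. Trivially, any n-input Boolean function admits a
# 2^n-anchor representation; the interesting question is how few anchors
# (and how little resolution) suffice.
anchors = {(0.0, 0.0): 0, (1.0, 1.0): 0, (0.0, 1.0): 1, (1.0, 0.0): 1}

def nn_eval(x):
    nearest = min(anchors, key=lambda a: np.linalg.norm(np.array(x) - np.array(a)))
    return anchors[nearest]

for bits in itertools.product([0, 1], repeat=2):
    print(bits, "->", nn_eval(bits))          # matches XOR on all four inputs
```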
1 code implementation • 28 Oct 2022 • Chao Pan, Jin Sima, Saurav Prakash, Vishal Rana, Olgica Milenkovic
We introduce, for the first time, the problem of machine unlearning for federated clustering (FC), and propose an efficient unlearning mechanism for a customized secure FC framework.
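To illustrate the flavor of exact unlearning in a clustering context, here is a toy sketch in which each cluster keeps a running sum and count so that a point's contribution can be deleted in $O(d)$ time without retraining. The class name and structure are invented for illustration and do not reflect the paper's secure federated-clustering mechanism.

```python
import numpy as np

class UnlearnableClusters:
    """Toy exact-unlearning sketch: clusters store running sums and counts,
    so adding or deleting a point is O(d). Illustrative only."""
    def __init__(self, k, d):
        self.sums = np.zeros((k, d))
        self.counts = np.zeros(k, dtype=int)
        self.assignment = {}                  # point id -> cluster id

    def add(self, pid, x, cluster):
        self.sums[cluster] += x
        self.counts[cluster] += 1
        self.assignment[pid] = cluster

    def unlearn(self, pid, x):
        c = self.assignment.pop(pid)          # remove the point's contribution
        self.sums[c] -= x
        self.counts[c] -= 1

    def centroids(self):
        return self.sums / np.maximum(self.counts, 1)[:, None]

clusters = UnlearnableClusters(k=2, d=2)
clusters.add(0, np.array([1.0, 0.0]), 0)
clusters.add(1, np.array([3.0, 0.0]), 0)
clusters.unlearn(1, np.array([3.0, 0.0]))
print(clusters.centroids())                   # cluster 0 centroid back at [1, 0]
```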
no code implementations • 17 May 2022 • Kordag Mehmet Kilic, Jin Sima, Jehoshua Bruck
The expressive power of neural gates (the number of distinct functions a gate can compute) depends on the weight sizes and, in general, large weights (exponential in the number of inputs) are required.
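A small brute-force experiment makes the dependence on weight size concrete: enumerating single threshold gates $[w \cdot x \ge t]$ with integer weights bounded by $W$ shows the number of distinct Boolean functions growing with $W$ (for $n = 3$, saturating once $W$ is large enough). The function name and parameter ranges below are illustrative.

```python
import itertools

def threshold_functions(n, W):
    """Enumerate the distinct Boolean functions computable by one threshold
    gate [w.x >= t] with integer weights |w_i| <= W and integer thresholds
    covering the full range of achievable sums."""
    inputs = list(itertools.product([0, 1], repeat=n))
    funcs = set()
    for w in itertools.product(range(-W, W + 1), repeat=n):
        for t in range(-n * W, n * W + 2):    # includes both constant gates
            funcs.add(tuple(int(sum(wi * xi for wi, xi in zip(w, x)) >= t)
                            for x in inputs))
    return funcs

# The count grows with the weight bound W and saturates once W is large
# enough to realize every threshold function on n = 3 inputs.
for W in (1, 2, 3):
    print(f"W = {W}: {len(threshold_functions(3, W))} distinct functions")
```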