no code implementations • 9 Jan 2023 • Ankit Pensia, Amir R. Asadi, Varun Jog, Po-Ling Loh
For the sample complexity of simple hypothesis testing under pure LDP constraints, we establish instance-optimal bounds for distributions with binary support; minimax-optimal bounds for general distributions; and (approximately) instance-optimal, computationally efficient algorithms for general distributions.
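To make the setting concrete, here is a minimal sketch of simple hypothesis testing under a pure &epsilon;-LDP constraint for distributions with binary support. It uses classical randomized response as the local mechanism and a midpoint test on the privatized bits; this is only an illustration of the problem setup, not the instance-optimal algorithm from the paper, and all function names are our own.

```python
import math
import random

def randomized_response(bit: int, eps: float) -> int:
    """Pure eps-LDP mechanism for one private bit (classical randomized
    response): keep the bit w.p. e^eps/(1+e^eps), otherwise flip it."""
    p_keep = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < p_keep else 1 - bit

def decide(p: float, q: float, eps: float, samples: list) -> float:
    """Simple hypothesis test on privatized bits: the mechanism shifts the
    Bernoulli means, so compare the observed mean of the privatized samples
    to the two shifted means and pick the closer hypothesis."""
    p_keep = math.exp(eps) / (1.0 + math.exp(eps))
    mean_p = p * p_keep + (1 - p) * (1 - p_keep)  # E[output] under Bern(p)
    mean_q = q * p_keep + (1 - q) * (1 - p_keep)  # E[output] under Bern(q)
    obs = sum(samples) / len(samples)
    return p if abs(obs - mean_p) < abs(obs - mean_q) else q

# Distinguish Bernoulli(0.8) from Bernoulli(0.2) from privatized samples.
random.seed(0)
eps, p, q, n = 1.0, 0.8, 0.2, 2000
samples = [randomized_response(int(random.random() < p), eps) for _ in range(n)]
print(decide(p, q, eps, samples))  # recovers the true hypothesis p = 0.8
```

Privatization shrinks the gap between the two hypotheses' means by a factor depending on &epsilon;, which is why the sample complexity under LDP constraints exceeds the unconstrained one.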
no code implementations • 30 Dec 2022 • Amir R. Asadi
In the framework of supervised learning, for a computer to learn from data accurately and efficiently, it must be provided with auxiliary information about the data distribution and the target function through the learning model.
no code implementations • 25 Jun 2020 • Amir R. Asadi, Emmanuel Abbe
For different entropies and arbitrary scale transformations, it is shown that the distribution maximizing a multiscale entropy is characterized by a procedure analogous to the renormalization group procedure in statistical physics.
1 code implementation • 26 Jun 2019 • Amir R. Asadi, Emmanuel Abbe
The bounds are obtained by introducing the notion of generated hierarchical coverings of neural nets and by using the technique of chaining mutual information introduced in Asadi et al. (NeurIPS 2018).
no code implementations • NeurIPS 2018 • Amir R. Asadi, Emmanuel Abbe, Sergio Verdú
Two important difficulties are (i) exploiting the dependencies between the hypotheses, and (ii) exploiting the dependence between the algorithm's input and output.