Search Results for author: Rasa Hosseinzadeh

Found 6 papers, 3 papers with code

Tabular Data Contrastive Learning via Class-Conditioned and Feature-Correlation Based Augmentation

2 code implementations26 Apr 2024 Wei Cui, Rasa Hosseinzadeh, Junwei Ma, Tongzi Wu, Yi Sui, Keyvan Golestan

Contrastive learning is a model pre-training technique that first creates similar views of the original data and then encourages the data and its corresponding views to be close in the embedding space.

Contrastive Learning · Feature Correlation
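The snippet above describes the generic contrastive setup: pull an example toward its augmented view and away from other examples. A minimal, pure-Python sketch of an InfoNCE-style objective (this is the standard contrastive loss, not necessarily the exact objective used in this paper; `cosine` and `info_nce` are illustrative names):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(anchor, positive, negatives, temperature=0.1):
    """InfoNCE loss: pull the positive view close, push negatives away.
    Computed as cross-entropy over similarity logits, with the positive
    in slot 0; the max-subtraction is a standard log-sum-exp stabilizer."""
    logits = [cosine(anchor, positive) / temperature]
    logits += [cosine(anchor, n) / temperature for n in negatives]
    m = max(logits)
    denom = sum(math.exp(l - m) for l in logits)
    return -(logits[0] - m - math.log(denom))

# A view similar to the anchor yields a lower loss than a dissimilar one.
anchor = [1.0, 0.0]
close_view = [0.9, 0.1]
far_view = [0.0, 1.0]
loss_close = info_nce(anchor, close_view, [far_view])
loss_far = info_nce(anchor, far_view, [close_view])
```

The paper's contribution lies in *how* the similar views are constructed (class-conditioned, feature-correlation-based augmentation); the loss itself is the usual contrastive machinery.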

Deep Generative Models through the Lens of the Manifold Hypothesis: A Survey and New Connections

no code implementations3 Apr 2024 Gabriel Loaiza-Ganem, Brendan Leigh Ross, Rasa Hosseinzadeh, Anthony L. Caterini, Jesse C. Cresswell

This manifold lens provides both clarity as to why some DGMs (e.g. diffusion models and some generative adversarial networks) empirically surpass others (e.g. likelihood-based models such as variational autoencoders, normalizing flows, or energy-based models) at sample generation, and guidance for devising more performant DGMs.

Exposing flaws of generative model evaluation metrics and their unfair treatment of diffusion models

2 code implementations NeurIPS 2023 George Stein, Jesse C. Cresswell, Rasa Hosseinzadeh, Yi Sui, Brendan Leigh Ross, Valentin Villecroze, Zhaoyan Liu, Anthony L. Caterini, J. Eric T. Taylor, Gabriel Loaiza-Ganem

Comparing 17 modern metrics for evaluating the overall performance, fidelity, diversity, rarity, and memorization of generative models, we find that the state-of-the-art perceptual realism of diffusion models, as judged by humans, is not reflected in commonly reported metrics such as FID.

Memorization
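FID, the metric the snippet above singles out, is the Fréchet distance between Gaussians fitted to real and generated feature distributions: ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 (S1 S2)^{1/2}). As a hedged illustration of why it can miss perceptual quality, here is the one-dimensional closed form, where the trace term collapses to (sigma1 - sigma2)^2 (`fid_1d` is an illustrative name, not an API from the paper's code):

```python
def fid_1d(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between two 1-D Gaussians.
    The full FID applies this to multivariate Gaussians fitted to
    Inception features; in 1-D it reduces to this closed form."""
    return (mu1 - mu2) ** 2 + (sigma1 - sigma2) ** 2

# Identical feature distributions score 0; any mean shift raises the score.
fid_1d(0.0, 1.0, 0.0, 1.0)  # 0.0
fid_1d(0.0, 1.0, 2.0, 1.0)  # 4.0
```

Note the metric only sees the fitted moments of the chosen feature space, which is one reason (per the paper's argument) it can diverge from human judgments of realism.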

DiMS: Distilling Multiple Steps of Iterative Non-Autoregressive Transformers for Machine Translation

1 code implementation7 Jun 2022 Sajad Norouzi, Rasa Hosseinzadeh, Felipe Perez, Maksims Volkovs

The student is optimized to predict the output of the teacher after multiple decoding steps while the teacher follows the student via a slow-moving average.

Machine Translation · Translation
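The "slow-moving average" teacher described above is an exponential moving average (EMA) of the student's parameters. A minimal sketch of that update rule, under the assumption that parameters are flat vectors (`ema_update` is an illustrative name; DiMS applies this inside transformer training, which is not reproduced here):

```python
def ema_update(teacher, student, decay=0.999):
    """Teacher parameters follow the student as a slow-moving average:
    teacher <- decay * teacher + (1 - decay) * student."""
    return [decay * t + (1.0 - decay) * s for t, s in zip(teacher, student)]

teacher = [0.0, 0.0]
student = [1.0, 1.0]  # in practice the student is also being trained
for _ in range(100):
    teacher = ema_update(teacher, student, decay=0.9)
# After many steps the teacher has drifted essentially all the way
# to the (here fixed) student: 1 - 0.9**100 is within 1e-4 of 1.
```

The decay controls how slowly the teacher tracks the student; a value close to 1 keeps the teacher a stable distillation target while the student chases its multi-step outputs.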

Convergence of Langevin Monte Carlo in Chi-Squared and Renyi Divergence

no code implementations22 Jul 2020 Murat A. Erdogdu, Rasa Hosseinzadeh, Matthew S. Zhang

We prove that, initialized with a Gaussian random vector that has sufficiently small variance, iterating the LMC algorithm for $\widetilde{\mathcal{O}}(\lambda^2 d\epsilon^{-1})$ steps is sufficient to reach $\epsilon$-neighborhood of the target in both Chi-squared and Renyi divergence, where $\lambda$ is the logarithmic Sobolev constant of $\nu_*$.
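The LMC algorithm analyzed above is the unadjusted Langevin iteration x_{k+1} = x_k - eta * grad f(x_k) + sqrt(2 * eta) * xi_k with xi_k ~ N(0, I), targeting nu_* proportional to exp(-f). A self-contained 1-D sketch on a standard Gaussian target (step size, iteration counts, and the `lmc` name are illustrative choices, not the paper's settings; note the discretization leaves a small step-size-dependent bias in the stationary variance):

```python
import math
import random

def lmc(grad_potential, x0, step, n_steps, rng):
    """Unadjusted Langevin Monte Carlo:
    x_{k+1} = x_k - step * grad f(x_k) + sqrt(2 * step) * xi_k."""
    x = x0
    for _ in range(n_steps):
        noise = rng.gauss(0.0, 1.0)
        x = x - step * grad_potential(x) + math.sqrt(2.0 * step) * noise
    return x

# Target: standard Gaussian, f(x) = x^2 / 2, so grad f(x) = x.
rng = random.Random(0)
samples = [lmc(lambda x: x, x0=0.0, step=0.1, n_steps=200, rng=rng)
           for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# mean is near 0 and var near 1, up to discretization bias and noise.
```

The paper's contribution is quantifying how many such steps are needed to get epsilon-close to the target in chi-squared and Renyi divergence, in terms of the log-Sobolev constant of nu_*.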

On the Convergence of Langevin Monte Carlo: The Interplay between Tail Growth and Smoothness

no code implementations27 May 2020 Murat A. Erdogdu, Rasa Hosseinzadeh

This convergence rate, in terms of $\epsilon$ dependency, is not directly influenced by the tail growth rate $\alpha$ of the potential function as long as its growth is at least linear, and it only relies on the order of smoothness $\beta$.
