no code implementations • 25 May 2023 • Daniel Wesego, Amirmohammad Rooshenas
Multimodal Variational Autoencoders (VAEs) represent a promising group of generative models that facilitate the construction of a tractable posterior within the latent space, given multiple modalities.
1 code implementation • ACL 2021 • Sumanta Bhattacharyya, Amirmohammad Rooshenas, Subhajit Naskar, Simeng Sun, Mohit Iyyer, Andrew McCallum
To benefit from this observation, we train an energy-based model to mimic the behavior of the task measure (i.e., the energy-based model assigns lower energy to samples with higher BLEU scores), which yields a re-ranking algorithm over the samples drawn from the NMT model: energy-based re-ranking (EBR).
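The core of the re-ranking step can be sketched in a few lines: score each sampled translation with the trained energy model and return the candidates in ascending-energy order, since lower energy is trained to track higher BLEU. The `energy_fn` callable here is a hypothetical stand-in for the trained energy network, and the toy word-overlap energy is illustration only, not the paper's model.

```python
def energy_rerank(candidates, energy_fn):
    """Sketch of energy-based re-ranking (EBR): sort NMT samples by
    ascending energy, where the energy model has been trained so that
    lower energy correlates with higher BLEU. `energy_fn` maps a
    candidate translation to a scalar energy (a stand-in here)."""
    return sorted(candidates, key=energy_fn)

# Toy illustration (assumption, not the paper's energy network):
# reward overlap with a reference sentence by assigning lower energy.
hyps = ["the cat sat", "a cat sat", "the cat sits"]
energy = lambda h: -sum(w in "the cat sat".split() for w in h.split())
best = energy_rerank(hyps, energy)[0]  # -> "the cat sat"
```

In practice the candidate list comes from sampling or beam search over the NMT model, and the sort replaces the usual selection by model log-probability.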
1 code implementation • 6 Sep 2019 • MohamadAli Torkamani, Shiv Shankar, Amirmohammad Rooshenas, Phillip Wallis
Most deep neural networks use simple, fixed activation functions, such as sigmoids or rectified linear units, regardless of domain or network structure.
no code implementations • 19 May 2019 • MohamadAli Torkamani, Phillip Wallis, Shiv Shankar, Amirmohammad Rooshenas
Most deep neural networks use simple, fixed activation functions, such as sigmoids or rectified linear units, regardless of domain or network structure.
no code implementations • 22 Dec 2018 • Amirmohammad Rooshenas, Dongxu Zhang, Gopal Sharma, Andrew McCallum
In this paper, we instead use efficient truncated randomized search over this reward function to train structured prediction energy networks (SPENs), which provide efficient test-time inference using gradient-based search on a smooth, learned representation of the score landscape, and have previously yielded state-of-the-art results in structured prediction.
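The gradient-based test-time inference mentioned above can be sketched as follows: relax the discrete labels to a continuous vector in [0, 1]^n, descend the smooth learned energy, and round at the end. Everything here is a minimal illustration under assumptions — `energy_grad` stands in for the gradient of a trained SPEN's energy, and the quadratic toy energy is not the paper's model.

```python
def spen_inference(energy_grad, n_labels, steps=200, lr=0.2):
    """Sketch of SPEN test-time inference: projected gradient descent
    on a smooth energy over relaxed labels y in [0, 1]^n, then round
    to a discrete labeling. `energy_grad(y)` returns dE/dy and is a
    stand-in for autodiff through a trained energy network."""
    y = [0.5] * n_labels  # start at the center of the relaxed hypercube
    for _ in range(steps):
        g = energy_grad(y)
        # gradient step followed by projection back onto [0, 1]
        y = [min(1.0, max(0.0, yi - lr * gi)) for yi, gi in zip(y, g)]
    return [1 if yi > 0.5 else 0 for yi in y]

# Toy energy (assumption): E(y) = ||y - t||^2 for a fixed target t,
# so inference should recover t exactly.
target = [1, 0, 1]
grad = lambda y: [2 * (yi - ti) for yi, ti in zip(y, target)]
labels = spen_inference(grad, 3)  # -> [1, 0, 1]
```

A real SPEN's energy is a neural network over (input, relaxed labels), so the gradient would come from backpropagation rather than a closed form, but the descend-then-round loop is the same shape.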
no code implementations • NAACL 2018 • Amirmohammad Rooshenas, Aishwarya Kamath, Andrew McCallum
This paper introduces rank-based training of structured prediction energy networks (SPENs).
no code implementations • 1 Apr 2015 • Daniel Lowd, Amirmohammad Rooshenas
The Libra Toolkit is a collection of algorithms for learning and inference with discrete probabilistic models, including Bayesian networks, Markov networks, dependency networks, and sum-product networks.