Gaussian Mixture Variational Autoencoder

Introduced by Aguilera et al. in Regularizing Transformers With Deep Probabilistic Layers

GMVAE, or Gaussian Mixture Variational Autoencoder, is a stochastic regularization layer for transformers. The GMVAE layer is trained on the 700-dimensional internal representation produced by the first MLP layer. For every output of the first MLP layer, the GMVAE layer first computes a low-dimensional latent representation by sampling from the GMVAE posterior distribution, and then outputs a reconstruction sampled from the generative model.

Source: Regularizing Transformers With Deep Probabilistic Layers
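
As a rough illustration, below is a minimal PyTorch sketch of such a layer. It is not the authors' implementation: the module name (GMVAELayer), the latent and mixture sizes, the diagonal-Gaussian posterior, the learned observation noise, and the kl_term regularizer are all assumptions for illustration; only the overall encode–sample–decode flow follows the description above.

```python
# Minimal sketch of a GMVAE regularization layer (assumptions noted inline).
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class GMVAELayer(nn.Module):
    """Stochastic regularization layer: encodes an MLP activation h into a
    low-dimensional latent z sampled from the approximate posterior q(z|h),
    then returns a reconstruction sampled from the generative model p(h|z)."""

    def __init__(self, d_model=700, d_latent=32, n_components=10):
        super().__init__()
        # Encoder: amortized posterior q(z|h) (diagonal Gaussian, assumed)
        self.enc_mu = nn.Linear(d_model, d_latent)
        self.enc_logvar = nn.Linear(d_model, d_latent)
        # Mixture responsibilities q(c|h) over the Gaussian mixture prior
        self.enc_logits = nn.Linear(d_model, n_components)
        # Learned Gaussian mixture prior p(z|c)
        self.prior_mu = nn.Parameter(torch.randn(n_components, d_latent))
        self.prior_logvar = nn.Parameter(torch.zeros(n_components, d_latent))
        # Decoder: generative model p(h|z); the reconstruction is sampled
        # around the decoded mean with a learned observation noise (assumption)
        self.dec = nn.Linear(d_latent, d_model)
        self.obs_logvar = nn.Parameter(torch.zeros(1))

    def forward(self, h):
        # h: (..., d_model), the output of the first MLP layer
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        # Reparameterized sample from the posterior q(z|h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        # Reconstruction sampled from the generative model p(h|z)
        recon_mu = self.dec(z)
        return recon_mu + torch.randn_like(recon_mu) * (0.5 * self.obs_logvar).exp()

    def kl_term(self, h):
        """Closed-form KL of q(z|h) against each mixture component p(z|c),
        weighted by the responsibilities q(c|h), plus the KL of q(c|h)
        against a uniform prior over components (hypothetical regularizer)."""
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        resp = F.softmax(self.enc_logits(h), dim=-1)            # (..., K)
        var = logvar.exp().unsqueeze(-2)                        # (..., 1, D)
        mu_ = mu.unsqueeze(-2)                                  # (..., 1, D)
        p_var = self.prior_logvar.exp()                         # (K, D)
        # KL(N(mu, var) || N(prior_mu_k, prior_var_k)) per component k
        kl_z = 0.5 * (self.prior_logvar - logvar.unsqueeze(-2)
                      + (var + (mu_ - self.prior_mu) ** 2) / p_var
                      - 1.0).sum(-1)                            # (..., K)
        k = resp.shape[-1]
        kl_c = (resp * (resp.clamp_min(1e-8).log() + math.log(k))).sum(-1)
        return (resp * kl_z).sum(-1) + kl_c


# Hypothetical placement inside a transformer block:
#   h = mlp1(x); h = gmvae(h); y = mlp2(h)
gmvae = GMVAELayer()
h = torch.randn(8, 700)
out = gmvae(h)                   # stochastic reconstruction, same shape as h
reg = gmvae.kl_term(h).mean()    # added to the training loss as a regularizer
```

Because the layer injects sampling noise both in the latent and at the output, it acts as a regularizer in the spirit of dropout while keeping the reconstruction close to the original activation.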

Tasks

Task                      Papers  Share
Dimensionality Reduction  1       33.33%
Protein Folding           1       33.33%
Decoder                   1       33.33%


Categories

Regularization