no code implementations • 1 Sep 2023 • Marcel Hirt, Domenico Campolo, Victoria Leong, Juan-Pablo Ortega
To encode latent variables from different modality subsets, Product-of-Experts (PoE) or Mixture-of-Experts (MoE) aggregation schemes have been routinely used and shown to yield different trade-offs, for instance, regarding their generative quality or consistency across multiple modalities.
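To make the two aggregation schemes concrete, here is a minimal sketch of how Gaussian unimodal posteriors are typically combined in multimodal VAEs: a PoE fuses experts by adding precisions (yielding a single sharper Gaussian), while an MoE samples from one expert chosen at random. The function names and the toy two-modality setup are illustrative, not from the paper.

```python
import numpy as np

def poe(means, logvars):
    """Product of Gaussian experts: precisions add, and the fused
    mean is the precision-weighted average of the expert means."""
    means = np.asarray(means)
    precisions = np.exp(-np.asarray(logvars))
    var = 1.0 / precisions.sum(axis=0)
    mean = var * (precisions * means).sum(axis=0)
    return mean, var

def moe_sample(means, logvars, rng=None):
    """Mixture of Gaussian experts with uniform weights: pick one
    expert at random and draw a sample from it."""
    rng = rng or np.random.default_rng(0)
    k = rng.integers(len(means))
    std = np.exp(0.5 * np.asarray(logvars[k]))
    return means[k] + std * rng.standard_normal(std.shape)

# Toy example: two unimodal posteriors over a 2-D latent,
# both with unit variance (logvar = 0).
means = [np.array([0.0, 1.0]), np.array([2.0, -1.0])]
logvars = [np.zeros(2), np.zeros(2)]

m, v = poe(means, logvars)
# With equal unit variances, the PoE mean is the average of the
# expert means and the fused variance is halved.
z = moe_sample(means, logvars)  # a draw from one randomly chosen expert
```

The contrast visible even in this sketch underlies the trade-off the abstract mentions: the PoE posterior tightens as more modalities are observed, while the MoE posterior stays as broad as its individual experts.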