no code implementations • 1 Sep 2023 • Marcel Hirt, Domenico Campolo, Victoria Leong, Juan-Pablo Ortega
To encode latent variables from different modality subsets, Product-of-Experts (PoE) or Mixture-of-Experts (MoE) aggregation schemes have been routinely used and shown to yield different trade-offs, for instance, regarding their generative quality or consistency across multiple modalities.
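Below is a minimal, illustrative sketch (not the paper's implementation) of the two aggregation schemes mentioned above for Gaussian modality experts: PoE fuses experts via precision-weighted combination (including a standard-normal prior expert), while MoE treats them as mixture components. The function names `poe_aggregate` and `moe_sample` and the toy inputs are assumptions for demonstration only.

```python
import numpy as np

def poe_aggregate(mus, logvars):
    """Product-of-Experts fusion of Gaussian experts q(z|x_m) = N(mu_m, sigma_m^2),
    including a standard-normal prior expert. The combined precision is the sum of
    the experts' precisions; the mean is the precision-weighted average."""
    # prepend the prior expert N(0, I): mean 0, log-variance 0
    mus = np.concatenate([np.zeros_like(mus[:1]), mus], axis=0)
    logvars = np.concatenate([np.zeros_like(logvars[:1]), logvars], axis=0)
    precisions = np.exp(-logvars)                      # 1 / sigma_m^2
    var = 1.0 / precisions.sum(axis=0)                 # combined variance
    mu = var * (precisions * mus).sum(axis=0)          # precision-weighted mean
    return mu, np.log(var)

def moe_sample(mus, logvars, rng=np.random.default_rng()):
    """Mixture-of-Experts fusion: pick one modality expert uniformly at random,
    then draw z from that expert (stratified sampling over experts is also common)."""
    m = rng.integers(len(mus))                         # choose an expert
    std = np.exp(0.5 * logvars[m])
    return mus[m] + std * rng.standard_normal(mus[m].shape)

# toy example: two modality experts over a 3-dimensional latent
mus = np.array([[0.5, -1.0, 0.2], [1.0, 0.0, -0.3]])
logvars = np.array([[0.0, -0.5, 0.1], [-1.0, 0.2, 0.0]])
print(poe_aggregate(mus, logvars))
print(moe_sample(mus, logvars))
```

The trade-off mentioned in the abstract shows up directly here: the PoE posterior sharpens as more experts are combined (precisions add), whereas the MoE posterior stays as broad as its individual components, which affects cross-modal consistency and generative quality differently.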
no code implementations • 14 Apr 2022 • Anil Kurkcu, Cihan Acar, Domenico Campolo, Keng Peng Tee
The efficacy and efficiency of our GloCAL algorithm are compared with other approaches in the domain of grasp learning for 49 objects of varied object complexity and grasp difficulty from the EGAD! dataset.