14 Jul 2019 • Bao Tuyen Huynh, Faicel Chamroukhi
Mixtures-of-Experts (MoE) are conditional mixture models that have proven effective at modeling heterogeneity in data across many statistical learning tasks, including prediction (regression and classification) as well as clustering.
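As a minimal sketch of what "conditional mixture model" means here: an MoE models the conditional density p(y|x) as a mixture whose weights (the gating network) depend on the input x, with each component (expert) being, e.g., a Gaussian regression. The function below is a hypothetical illustration, not the paper's implementation; the linear gating and Gaussian experts are assumptions for the example.

```python
import numpy as np

def moe_density(y, x, gate_w, betas, sigmas):
    """Conditional MoE density p(y|x) with softmax gating and Gaussian
    regression experts. gate_w, betas: (K, 2) arrays of [intercept, slope];
    sigmas: (K,) expert noise standard deviations. Illustrative only."""
    features = np.array([1.0, x])
    # Gating network: softmax over linear scores in x (mixture weights sum to 1)
    scores = gate_w @ features
    g = np.exp(scores - scores.max())
    g /= g.sum()
    # Expert densities: Gaussian with input-dependent mean beta_k0 + beta_k1 * x
    means = betas @ features
    dens = np.exp(-0.5 * ((y - means) / sigmas) ** 2) / (np.sqrt(2 * np.pi) * sigmas)
    # Conditional mixture: weighted sum of expert densities
    return float(g @ dens)
```

Because the weights g_k(x) sum to one and each expert is a proper density, p(y|x) integrates to one in y for any fixed x, which is what makes MoE usable for both prediction (via the conditional mean) and clustering (via the posterior responsibilities).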