Mixture of Experts

Mixture of Experts is a modeling approach in which a number of 'expert' models are combined to produce the output. Each expert can be specialized to a particular subset of the training data, a particular data modality, or some other region of the input domain. Mixture of Experts is similar to Ensemble Learning, except that it does not necessarily use all component models for a given input; instead, a gating mechanism selects a subset of experts to evaluate, as sketched in the example below.
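The following is a minimal sketch of the idea, not a reference implementation: a linear gating network scores a list of hypothetical expert functions, keeps only the top-k scores, and returns the gate-weighted combination of the selected experts' outputs. The function and parameter names (`moe_forward`, `gate_weights`, `top_k`) are illustrative assumptions.

```python
import numpy as np

def moe_forward(x, experts, gate_weights, top_k=2):
    """Sketch of a top-k mixture-of-experts forward pass.

    x            : (d,) input vector
    experts      : list of callables, each mapping (d,) -> (m,)
    gate_weights : (num_experts, d) parameters of a linear gating network
    top_k        : number of experts activated for this input
    """
    # Gating mechanism: score each expert, normalize with a softmax.
    logits = gate_weights @ x
    scores = np.exp(logits - logits.max())
    scores /= scores.sum()

    # Sparse routing: keep only the top-k experts; the rest are skipped,
    # which is what distinguishes this from a plain ensemble.
    top = np.argsort(scores)[-top_k:]
    weights = scores[top] / scores[top].sum()  # renormalize over selected experts

    # Output is the gate-weighted sum of the selected experts' outputs.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Example usage with toy linear "experts" (illustrative only).
rng = np.random.default_rng(0)
d, m, num_experts = 4, 3, 4
experts = [(lambda W: (lambda x: W @ x))(rng.normal(size=(m, d)))
           for _ in range(num_experts)]
gate_weights = rng.normal(size=(num_experts, d))
y = moe_forward(rng.normal(size=d), experts, gate_weights, top_k=2)
print(y.shape)  # (3,)
```

In practice the gating network is usually trained jointly with the experts, and only the selected experts are actually evaluated, which is the source of the approach's computational savings.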
Related concepts:
Ensemble