This paper proposes a fast, sub-optimal method for selecting the model order of a PCA mixture model, namely the number of mixture components and the number of PCA bases, where the model consists of a combination of many PCAs. Once the model order is determined, the model parameters can be estimated easily by expectation-maximization (EM) learning, exploiting the decorrelatedness of the feature data in the PCA-transformed space. The conventional model order selection method takes a long processing time because it requires running the time-consuming EM learning over all possible model orders. We simplify model order selection as follows. First, the time-consuming EM learning over the training data set is performed only once for each given number of mixture components, with all PCA bases kept. Second, by virtue of the ordering property of PCA bases, the fitness of the model selection criterion is evaluated over the validation data set sequentially, pruning PCA bases one by one starting from the least significant. The pair of numbers of mixture components and PCA bases that best satisfies the model selection criterion is selected as the optimal model order for the given problem. Simulation results on synthetic data classification and on a practical alphabet recognition problem show that the proposed method determines the model order appropriately and improves classification and detection performance.
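The two-step selection procedure described in the abstract can be sketched as follows. This is a simplified illustration under stated assumptions, not the paper's implementation: a hard-assignment k-means step followed by a per-cluster SVD stands in for the PCA-mixture EM learning, and a complexity-penalized validation reconstruction error stands in for the model selection criterion. The names `fit_pca_mixture`, `select_model_order`, and the `penalty` weight are all hypothetical.

```python
import numpy as np

def fit_pca_mixture(X, K, n_iter=20, seed=0):
    """Toy stand-in for the EM step: hard-assignment k-means, then a
    per-cluster PCA with ALL bases kept (the method fits once per K)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), K, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for k in range(K):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(0)
    comps = []
    for k in range(K):
        Xk = X[labels == k] - centers[k]
        if len(Xk) == 0:
            continue
        # SVD returns bases ordered by decreasing variance, which is the
        # "ordering property" that makes sequential pruning possible.
        _, _, Vt = np.linalg.svd(Xk, full_matrices=False)
        comps.append((centers[k], Vt))
    return comps

def validation_error(comps, Xv, q):
    """Mean reconstruction error on validation data keeping the first q bases."""
    err = 0.0
    for x in Xv:
        best = np.inf
        for mu, Vt in comps:
            d = x - mu
            r = d - Vt[:q].T @ (Vt[:q] @ d)   # residual off the q-dim subspace
            best = min(best, r @ r)
        err += best
    return err / len(Xv)

def select_model_order(X, Xv, K_max, penalty=0.05):
    """Search sketched from the abstract: one full-basis fit per K, then prune
    bases one by one (least significant first) and score each (K, q) pair on
    the validation set with a complexity-penalized criterion."""
    D = X.shape[1]
    best_score, best_order = np.inf, None
    for K in range(1, K_max + 1):
        comps = fit_pca_mixture(X, K)      # EM-like learning done once per K
        for q in range(D, 0, -1):          # sequential pruning of PCA bases
            score = validation_error(comps, Xv, q) + penalty * K * q
            if score < best_score:
                best_score, best_order = score, (K, q)
    return best_order
```

The key saving this sketch mirrors is that the expensive fit runs once per candidate component count, while each retained-basis count `q` reuses that fit and only re-evaluates the cheap validation criterion.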