Introduction to Statistical Pattern Recognition (2nd ed.).
Soft Competitive Adaptation: Neural Network Learning Algorithms Based on Fitting Statistical Mixtures.
Vector Quantization and Signal Compression.
Relations between prototype, exemplar, and decision bound models of categorization. Journal of Mathematical Psychology.
Maximum entropy interpretation of decision bound and context models of categorization. Journal of Mathematical Psychology.
Testing for the number of components in a mixture of normal distributions using moment estimators. Computational Statistics & Data Analysis.
Hierarchical mixtures of experts and the EM algorithm. Neural Computation.
Categorization as probability density estimation. Journal of Mathematical Psychology.
Mixture models of categorization. Journal of Mathematical Psychology.
The LBG-U method for vector quantization: an improvement over LBG inspired from neural networks. Neural Processing Letters.
On convergence properties of the EM algorithm for Gaussian mixtures. Neural Computation.
IEEE Transactions on Neural Networks.
AI*IA '09: Proceedings of the XIth International Conference of the Italian Association for Artificial Intelligence on Emergent Perspectives in Artificial Intelligence, Reggio Emilia.
Multi-prototype vector-space models of word meaning. HLT '10: Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics.
Improving word representations via global context and multiple word prototypes. ACL '12: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Long Papers, Volume 1.
Many currently popular models of categorization are either strictly parametric (e.g., prototype models, decision bound models) or strictly nonparametric (e.g., exemplar models) (F. G. Ashby & L. A. Alfonso-Reese, 1995, Journal of Mathematical Psychology, 39, 216-233). In this article, a family of semiparametric classifiers is investigated in which each category is represented by a finite mixture distribution. The advantage of these mixture models of categorization is that they contain several parametric and nonparametric models as special cases. Specifically, it is shown that both decision bound models (F. G. Ashby & W. T. Maddox, 1992, Journal of Experimental Psychology: Human Perception and Performance, 16, 598-612; 1993, Journal of Mathematical Psychology, 37, 372-400) and the generalized context model (R. M. Nosofsky, 1986, Journal of Experimental Psychology: General, 115, 39-57) can be interpreted as two extreme cases of a common mixture model. Furthermore, many other (semiparametric) models of categorization can be derived from the same generic mixture framework. Several examples are discussed, and a parameter estimation procedure for fitting these models is outlined. To illustrate the approach, several specific models are fitted to a data set collected by S. C. McKinley and R. M. Nosofsky (1995, Journal of Experimental Psychology: Human Perception and Performance, 21, 128-148). The results suggest that semiparametric models are a promising alternative for future model development.
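As a rough illustration of the generic idea only (not the paper's own implementation or fitting procedure), the sketch below represents each category by a finite Gaussian mixture and classifies a stimulus by comparing prior-weighted category densities. All function names and the toy parameters are assumptions made for this example: with a single component per category the model behaves like a prototype model, while one narrow component per training exemplar approximates an exemplar-style kernel estimate, which is the sense in which the two classes of models are extreme cases of the same mixture framework.

```python
import numpy as np

def gaussian_pdf(x, mean, cov):
    """Multivariate normal density at x (x, mean: 1-D arrays; cov: 2-D array)."""
    d = x.shape[0]
    diff = x - mean
    norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def category_density(x, weights, means, covs):
    """Finite mixture density for one category: f(x) = sum_k w_k * N(x; mu_k, Sigma_k)."""
    return sum(w * gaussian_pdf(x, m, c) for w, m, c in zip(weights, means, covs))

def classify(x, categories, priors):
    """Bayes-rule choice: pick the category with the largest prior-weighted density."""
    scores = [p * category_density(x, *cat) for cat, p in zip(categories, priors)]
    return int(np.argmax(scores))

# Toy example with two categories in a 2-D stimulus space (illustrative values).
# Category A: one component -> prototype-like representation.
cat_A = ([1.0], [np.array([0.0, 0.0])], [np.eye(2)])
# Category B: one narrow component per exemplar -> exemplar-like representation.
exemplars_B = [np.array([2.0, 2.0]), np.array([2.5, 1.5]), np.array([1.8, 2.4])]
cat_B = ([1 / 3] * 3, exemplars_B, [0.25 * np.eye(2)] * 3)

print(classify(np.array([0.2, -0.1]), [cat_A, cat_B], priors=[0.5, 0.5]))  # 0 (category A)
print(classify(np.array([2.1, 1.9]), [cat_A, cat_B], priors=[0.5, 0.5]))   # 1 (category B)
```

The number and spread of components per category is the knob that moves the model between the parametric and nonparametric extremes; in practice these quantities would be estimated from data (e.g., by maximum likelihood) rather than fixed by hand, along the lines of the estimation procedure the article outlines.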