In the general classification context, recourse to the so-called Bayes decision rule requires estimating the class-conditional probability density functions. A mixture model for the observed variables is proposed, derived by assuming that the data have been generated by an independent factor model. Independent factor analysis is in fact a generative latent variable model whose structure closely resembles that of the ordinary factor model, but it assumes that the latent variables are mutually independent and not necessarily Gaussian. The method therefore provides dimension reduction together with a semiparametric estimate of the class-conditional probability density functions. This density approximation is plugged into the classic Bayes rule, and its performance is evaluated on both real and simulated data.
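The plug-in Bayes rule described above can be sketched as follows. This is a minimal illustration, not the paper's method: it substitutes a plain Gaussian kernel density estimate per class for the independent factor analysis density, and the two-class simulated data, the bandwidth, and all variable names are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated two-class data, for illustration only.
X0 = rng.normal(0.0, 1.0, size=(200, 2))
X1 = rng.normal(4.0, 1.0, size=(200, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

def kde_density(train, query, bandwidth=0.5):
    """Gaussian kernel density estimate of p(x) at each query point.

    Stand-in for the class-conditional density estimate; the paper
    instead derives this density from an independent factor model.
    """
    d = train.shape[1]
    # Squared distances between every query point and every training point.
    diff = query[:, None, :] - train[None, :, :]
    sq = (diff ** 2).sum(axis=2)
    kern = np.exp(-0.5 * sq / bandwidth ** 2)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return kern.mean(axis=1) / norm

def bayes_predict(query):
    # Bayes rule: assign the class maximizing p(x | c) * P(c).
    scores = np.column_stack([
        kde_density(X[y == c], query) * np.mean(y == c) for c in (0, 1)
    ])
    return scores.argmax(axis=1)

acc = np.mean(bayes_predict(X) == y)
print(f"training accuracy: {acc:.2f}")
```

The key design point the abstract makes is that any density estimator can be plugged into the same rule; replacing `kde_density` with the mixture density induced by an independent factor model yields the proposed classifier, with the factor structure supplying the dimension reduction that a raw kernel estimate lacks.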