A simple probabilistic model is introduced to generalize classical linear discriminant analysis (LDA) in finding components that are informative of, or relevant for, the data classes. The components maximize the predictability of the class distribution, which is asymptotically equivalent to (1) maximizing mutual information with the classes and (2) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments, the method outperformed both more classical methods and a Rényi entropy-based alternative, at essentially the same computational cost.
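As a rough illustration of the core idea (maximizing how well the class distribution can be predicted from the projected data), the sketch below optimizes a linear projection W so that a leave-one-out Parzen estimate of the conditional class log-likelihood in the projected space is maximized. This is an assumption-laden toy version, not the paper's own algorithm: the Gaussian kernel width sigma, the numerical-gradient optimizer, the QR orthonormalization step, and all function names are choices made only for this example.

import numpy as np

def class_log_likelihood(W, X, y, sigma=1.0):
    # Leave-one-out Parzen estimate of sum_i log p(y_i | W^T x_i) in the
    # projected space; a Gaussian kernel of width sigma is assumed.
    Z = X @ W
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(K, 0.0)                      # leave-one-out
    same_class = (y[:, None] == y[None, :]).astype(float)
    eps = 1e-12
    p = (K * same_class).sum(axis=1) / (K.sum(axis=1) + eps)
    return np.log(p + eps).sum()

def fit_projection(X, y, d=2, sigma=1.0, lr=0.05, iters=200, seed=0):
    # Plain gradient ascent with a numerical gradient; the paper's own
    # optimization procedure is not reproduced here.
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], d))
    h = 1e-4
    for _ in range(iters):
        grad = np.zeros_like(W)
        for idx in np.ndindex(*W.shape):
            Wp, Wm = W.copy(), W.copy()
            Wp[idx] += h
            Wm[idx] -= h
            grad[idx] = (class_log_likelihood(Wp, X, y, sigma)
                         - class_log_likelihood(Wm, X, y, sigma)) / (2 * h)
        W = W + lr * grad / (np.linalg.norm(grad) + 1e-12)
        W, _ = np.linalg.qr(W)                    # keep columns orthonormal
    return W

# Toy usage: the class depends on two input dimensions, the rest are noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
W = fit_projection(X, y, d=1)

Maximizing this conditional log-likelihood is a finite-sample surrogate for maximizing the mutual information between the projected data and the classes, which is the asymptotic equivalence the abstract refers to.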