Given a classification problem, our goal is to find a low-dimensional linear transformation of the feature vectors that retains the information needed to predict the class labels. We present a method based on maximum conditional likelihood estimation of mixture models. Using mixture models allows us to approximate the distributions to any desired accuracy, while using conditional likelihood as the contrast function ensures that the selected subspace retains the maximum possible mutual information between feature vectors and class labels. Classification experiments using Gaussian mixture components show that this method compares favorably to related dimension reduction techniques. Other distributions belonging to the exponential family can be used to reduce dimensions when the data are of a special type, for example binary or integer-valued. We provide an EM-like algorithm for model estimation and present visualization experiments using Gaussian and Bernoulli mixture models.
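To make the core idea concrete, the following is a minimal sketch, not the paper's EM-like bound-maximization algorithm: it models each class with a single Gaussian in the projected space (a one-component-per-class special case of the mixture formulation) and maximizes the conditional log-likelihood of the labels over the projection matrix with a generic L-BFGS optimizer. The function names `neg_cond_loglik` and `fit_projection` and the regularization constant are illustrative assumptions, not part of the original method.

```python
import numpy as np
from scipy.optimize import minimize

def neg_cond_loglik(v_flat, X, y, k, reg=1e-6):
    """Negative conditional log-likelihood of labels given projected data.

    Sketch only: one Gaussian per class in the k-dim subspace, rather
    than the full mixture model of the paper. `reg` is a small ridge
    added to each covariance to keep it invertible (an assumption here).
    """
    d = X.shape[1]
    V = v_flat.reshape(d, k)
    Z = X @ V                                  # projected features, (n, k)
    classes = np.unique(y)
    logp = np.zeros((len(X), len(classes)))
    for j, c in enumerate(classes):
        Zc = Z[y == c]
        mu = Zc.mean(axis=0)
        cov = np.cov(Zc.T) + reg * np.eye(k)
        diff = Z - mu
        _, logdet = np.linalg.slogdet(cov)
        maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        prior = np.log(len(Zc) / len(X))       # class prior p(c)
        logp[:, j] = prior - 0.5 * (logdet + maha + k * np.log(2 * np.pi))
    # log p(y_i | z_i) = log joint minus log evidence (log-sum-exp over classes)
    lse = np.logaddexp.reduce(logp, axis=1)
    idx = np.searchsorted(classes, y)
    return -(logp[np.arange(len(X)), idx] - lse).sum()

def fit_projection(X, y, k, seed=0):
    """Fit a d-by-k projection by maximizing conditional likelihood."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    v0 = rng.standard_normal(d * k)
    res = minimize(neg_cond_loglik, v0, args=(X, y, k), method='L-BFGS-B')
    return res.x.reshape(d, k)
```

In contrast to fitting a generative mixture by ordinary maximum likelihood and projecting afterwards, the objective above spends the subspace's capacity only on directions that change p(label | features), which is what ties the method to mutual information between features and labels.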