Reducing the dimensionality of high-dimensional data without losing its essential information is an important task in information processing. When class labels of the training data are available, Fisher discriminant analysis (FDA) has been widely used. However, the optimality of FDA is guaranteed only under restrictive ideal conditions, and in many real problems FDA fails to provide a good classification surface. This letter treats supervised dimensionality reduction from an information-theoretic viewpoint and proposes a framework based on class-conditional entropy minimization. The proposed linear dimensionality-reduction technique is validated both theoretically and experimentally. The framework is then connected, through kernel Fisher discriminant analysis (KFDA), to the multiple kernel learning problem, and a novel algorithm is proposed that iteratively optimizes the parameters of the classification function and the kernel combination coefficients. Experiments show the algorithm to be comparable to or better than KFDA on large-scale benchmark data sets, and comparable to other multiple kernel learning techniques on a yeast protein function annotation task.
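For reference, the FDA baseline that the letter compares against can be sketched as follows. This is a minimal, generic implementation of classical Fisher discriminant analysis (not the proposed entropy-minimization method): it builds the within-class and between-class scatter matrices and projects onto the leading generalized eigenvectors. The function name and the small regularization constant are illustrative choices, not from the letter.

```python
import numpy as np

def fda_projection(X, y, n_components):
    """Fisher discriminant analysis: find directions that maximize
    between-class scatter relative to within-class scatter."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Solve the generalized eigenproblem Sb w = lambda Sw w by inverting Sw;
    # a small ridge term keeps Sw invertible (an illustrative choice).
    eigvals, eigvecs = np.linalg.eig(
        np.linalg.inv(Sw + 1e-6 * np.eye(d)) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:n_components]]
```

Note that FDA of this form can extract at most (number of classes − 1) discriminant directions, and its optimality assumes homoscedastic Gaussian classes — exactly the restrictive conditions the letter's entropy-based framework is designed to relax.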