Feature extraction by non-parametric mutual information maximization
The Journal of Machine Learning Research
Dimensionality Reduction for Supervised Learning with Reproducing Kernel Hilbert Spaces
The Journal of Machine Learning Research
Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning)
A Projection Pursuit Algorithm for Exploratory Data Analysis
IEEE Transactions on Computers
Divergence estimation for multidimensional densities via k-nearest-neighbor distances
IEEE Transactions on Information Theory
Maximization of Mutual Information for Supervised Linear Feature Extraction
IEEE Transactions on Neural Networks
In this paper we introduce a supervised linear dimensionality reduction algorithm that finds a projection of the input space maximizing the mutual information between the projected inputs and the output values. The algorithm utilizes the recently introduced MeanNN estimator for differential entropy, and we show that this estimator is well suited to the dimensionality reduction task. We then present a nonlinear regression algorithm built on the proposed dimensionality reduction approach; it achieves performance comparable to the state of the art on standard data sets while being three orders of magnitude faster. In addition, we describe applications of the proposed dimensionality reduction algorithm to reduced-complexity supervised and semi-supervised classification tasks.
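The core idea can be sketched as follows. The MeanNN estimator approximates differential entropy (up to an additive constant) by the mean log pairwise distance between samples, and the supervised objective scores a candidate projection direction by the mutual information I(Xw; y) = H(Xw) - Σ_c p_c H(Xw | y=c). The snippet below is a minimal illustration of that objective for a one-dimensional projection with discrete labels; the function names and the small-epsilon regularizer are my own assumptions, not the authors' implementation, and a full method would optimize over the projection matrix rather than merely evaluate fixed directions.

```python
import numpy as np

def meannn_entropy(z):
    """MeanNN differential entropy estimate (up to an additive constant)
    for 1-D samples z: the mean of log pairwise distances."""
    z = np.asarray(z, dtype=float)
    n = len(z)
    dists = np.abs(z[:, None] - z[None, :])          # pairwise |z_i - z_j|
    off_diag = ~np.eye(n, dtype=bool)                # exclude i == j terms
    return np.log(dists[off_diag] + 1e-12).sum() / (n * (n - 1))

def mi_objective(w, X, y):
    """Score a projection direction w by I(Xw; y) up to constants:
    H(Xw) minus the label-weighted class-conditional entropies."""
    z = X @ (w / np.linalg.norm(w))                  # unit-norm 1-D projection
    mi = meannn_entropy(z)
    for c in np.unique(y):
        zc = z[y == c]
        mi -= (len(zc) / len(z)) * meannn_entropy(zc)
    return mi
```

On two Gaussian classes separated along the first axis, the discriminative direction [1, 0] receives a higher score than the uninformative direction [0, 1], which is the behavior a gradient-based search over projections would exploit.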