Dimension Selection for Feature Selection and Dimension Reduction with Principal and Independent Component Analysis

  • Authors:
  • Inge Koch; Kanta Naito

  • Affiliations:
  • Department of Statistics, School of Mathematics, University of New South Wales, Sydney, NSW 2052, Australia, inge@maths.unsw.edu.au; Department of Mathematics, Faculty of Science and Engineering, Shimane University, Matsue 690-8504, Japan, naito@riko.shimane-u.ac.jp

  • Venue:
  • Neural Computation
  • Year:
  • 2007


Abstract

This letter is concerned with the problem of selecting the best or most informative dimension for dimension reduction and feature extraction in high-dimensional data. The dimension of the data is reduced by principal component analysis; subsequent application of independent component analysis to the principal component scores determines the most nongaussian directions in the lower-dimensional space. A criterion for choosing the optimal dimension based on bias-adjusted skewness and kurtosis is proposed. This new dimension selector is applied to real data sets and compared to existing methods. Simulation studies for a range of densities show that the proposed method performs well and is more appropriate for nongaussian data than existing methods.
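As a rough illustration of the pipeline the abstract describes (PCA for dimension reduction, ICA applied to the principal component scores, and a skewness/kurtosis-based measure of nongaussianity used to choose the dimension), the sketch below uses scikit-learn and SciPy. The scoring function is a simplified moment-based stand-in, not the authors' bias-adjusted criterion; the function names, the candidate-dimension search, and the toy data are illustrative assumptions.

```python
# Minimal sketch of a PCA -> ICA -> nongaussianity-score pipeline.
# NOTE: the score below is a generic moment-based proxy (squared skewness plus
# squared excess kurtosis), not the bias-adjusted criterion of Koch and Naito.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.decomposition import PCA, FastICA

def nongaussianity_score(X, k, random_state=0):
    """Reduce X to k principal components, run ICA on the scores, and
    return an average skewness/kurtosis measure of the ICA components."""
    pc_scores = PCA(n_components=k).fit_transform(X)
    sources = FastICA(n_components=k, random_state=random_state).fit_transform(pc_scores)
    return np.mean(skew(sources, axis=0) ** 2 + kurtosis(sources, axis=0) ** 2)

def select_dimension(X, k_max):
    """Return the candidate dimension with the largest nongaussianity score."""
    scores = {k: nongaussianity_score(X, k) for k in range(1, k_max + 1)}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two skewed (exponential) signals embedded in Gaussian noise.
    n = 500
    signals = rng.exponential(size=(n, 2))
    X = np.hstack([signals, rng.normal(size=(n, 8))])
    k_hat, all_scores = select_dimension(X, k_max=5)
    print("selected dimension:", k_hat)
```

In this toy setting the score tends to plateau once the nongaussian signal directions are captured, which is the kind of behavior a dimension selector based on nongaussianity exploits.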