Improvements over adaptive local hyperplane to achieve better classification

  • Authors:
  • Hongmin Cai

  • Affiliations:
  • School of Information Science and Technology, Sun Yat-sen University, P.R. China

  • Venue:
  • ICDM'11: Proceedings of the 11th International Conference on Advances in Data Mining: Applications and Theoretical Aspects
  • Year:
  • 2011

Abstract

A new classification model called adaptive local hyperplane (ALH) has been shown to outperform many state-of-the-art classifiers on benchmark data sets. By representing the data in a local subspace spanned by samples carefully chosen via Fisher's feature-weighting scheme, ALH searches for optimal pruning parameters over a large number of iterations. However, the Fisher weighting scheme is less accurate on multi-class problems and on samples with high redundancy, which leads to unreliable prototype selection and degrades classification performance. In this paper, we propose improvements to standard ALH in two respects. First, we show that feature weighting based on mutual information is more accurate and robust. Second, we propose an economical numerical algorithm for the matrix inversion that is a key step in hyperplane construction. This step greatly lowers the computational cost and makes the method promising for fast applications such as on-line data mining. Experimental results on both synthetic and real benchmark data sets show that the proposed improvements achieve better classification performance.
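The abstract's first improvement replaces Fisher-score feature weighting with mutual-information weighting before prototype selection. The sketch below is a minimal illustration of that idea, not the paper's implementation; the helper `weighted_knn_prototypes`, the Iris data, and the parameter `k` are assumptions introduced here for demonstration only.

```python
# Minimal sketch: mutual-information feature weighting followed by
# weighted k-NN prototype selection, the step ALH uses to span its
# local hyperplane. Illustrative only; not the authors' code.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

X, y = load_iris(return_X_y=True)

# Mutual information between each feature and the class label; unlike the
# Fisher score, it captures non-linear dependence and is less sensitive to
# redundant features in multi-class data.
mi = mutual_info_classif(X, y, random_state=0)
weights = mi / mi.sum()  # normalize into a feature-weight vector


def weighted_knn_prototypes(x_query, X_train, y_train, label, k=5):
    """Pick the k nearest same-class prototypes under a weighted distance."""
    mask = y_train == label
    diffs = X_train[mask] - x_query
    dists = np.sqrt((weights * diffs ** 2).sum(axis=1))
    return X_train[mask][np.argsort(dists)[:k]]


prototypes = weighted_knn_prototypes(X[0], X, y, label=0, k=5)
print(prototypes.shape)  # (5, 4)
```

The second improvement, an economical algorithm for the matrix inversion in hyperplane construction, is not spelled out in the abstract. One standard way to cheapen repeated inversions of a slowly changing Gram matrix is a rank-one inverse update via the Sherman-Morrison formula; the sketch below shows that technique purely as an assumed illustration, not as the paper's algorithm.

```python
# Hedged sketch: updating an existing inverse after a rank-one change
# (e.g. swapping one prototype) in O(k^2) instead of re-inverting in O(k^3).
import numpy as np


def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1} and a rank-one update u v^T."""
    Au = A_inv @ u
    vA = v @ A_inv
    denom = 1.0 + v @ Au
    return A_inv - np.outer(Au, vA) / denom


rng = np.random.default_rng(0)
k = 5
G = rng.standard_normal((k, k))
A = G @ G.T + np.eye(k)        # well-conditioned Gram-like matrix
A_inv = np.linalg.inv(A)

u = rng.standard_normal(k)
v = rng.standard_normal(k)
updated = sherman_morrison_update(A_inv, u, v)

# Matches a full re-inversion up to numerical precision.
assert np.allclose(updated, np.linalg.inv(A + np.outer(u, v)))
```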