This paper presents a hybrid feature extraction framework that combines two complementary optimization problems, risk minimization and independence maximization, to extract features for higher classification performance. Risk minimization, a supervised criterion, pursues maximum generalization capability over the data and thus improves classification performance directly. Independence maximization, an unsupervised criterion, projects the data onto a space of maximally independent components and thus improves classification accuracy indirectly. Because the two criteria relate to classification accuracy in these complementary direct and indirect ways, features from the hybrid framework that simultaneously satisfy both the risk and independence criteria are expected to classify better than features obtained from either criterion alone. Experimental results show that the proposed hybrid framework provides higher classification performance than various existing feature extractors.
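The idea of pairing an unsupervised independence criterion with a supervised risk-oriented criterion can be sketched in code. The following is a minimal, hypothetical illustration only, not the paper's actual method: it extracts one feature by a FastICA-style fixed-point iteration (independence maximization) and one by the Fisher discriminant direction (a simple stand-in for a risk-minimizing projection), then stacks the two into a hybrid feature space. The toy two-class Gaussian data and the nearest-centroid evaluation are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (hypothetical; the paper's datasets are not given here).
n = 200
X = np.vstack([rng.normal([0.0, 0.0], 1.0, size=(n, 2)),
               rng.normal([3.0, 0.5], 1.0, size=(n, 2))])
y = np.array([0] * n + [1] * n)

# --- Unsupervised branch: whiten, then one FastICA-style component ---
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(Xc, rowvar=False))
Z = Xc @ (E @ np.diag(d ** -0.5) @ E.T)     # whitened data

w = rng.normal(size=2)
w /= np.linalg.norm(w)
for _ in range(200):                        # fixed-point update, g(u) = tanh(u)
    u = Z @ w
    w_new = (Z * np.tanh(u)[:, None]).mean(axis=0) \
            - (1.0 - np.tanh(u) ** 2).mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-9
    w = w_new
    if converged:
        break
f_ica = Z @ w                               # independence-maximizing feature

# --- Supervised branch: Fisher discriminant direction as a risk proxy ---
m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
v = np.linalg.solve(Sw, m1 - m0)            # within-class-scatter-weighted mean gap
f_sup = X @ v                               # discriminative feature

# --- Hybrid feature space: stack both projections ---
F = np.column_stack([f_ica, f_sup])

# Nearest-centroid accuracy in the hybrid space (illustrative evaluation).
c0, c1 = F[y == 0].mean(axis=0), F[y == 1].mean(axis=0)
pred = (np.linalg.norm(F - c1, axis=1) < np.linalg.norm(F - c0, axis=1)).astype(int)
acc = float((pred == y).mean())
print(f"training accuracy in hybrid feature space: {acc:.2f}")
```

In this sketch the supervised projection carries most of the class separation while the ICA feature contributes a statistically independent axis; the paper's framework instead optimizes both criteria jointly within one feature extractor.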