Feature extraction is an important step in data mining and machine learning. While many feature extraction methods have been proposed for clustering, classification, and regression, very limited work addresses multi-class classification problems specifically. Yet the accuracy of multi-class classification depends on well-extracted features, quite apart from the choice of model. This paper proposes a new feature extraction method, extracting orientation distance-based discriminative (ODD) features, designed specifically for multi-class classification problems. The proposed method works in two steps. In the first step, we extend the Fisher discriminant idea to select a more appropriate kernel function and map the input data of all classes into a feature space. In the second step, ODD features are extracted under a one-vs-all scheme, producing discriminative features that relate each pattern to each class hyperplane. These newly extracted features are treated as the representative features and are used in the subsequent classification procedure. Extensive experiments on both UCI and real-world datasets investigate the performance of ODD-feature-based multi-class classification. The statistical results show that classification accuracy with ODD features exceeds that obtained with state-of-the-art feature extraction methods.
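The two-step pipeline in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: a fixed RBF kernel stands in for the Fisher-discriminant-tuned kernel of step one, and scikit-learn's one-vs-rest signed distances to each class hyperplane stand in for the ODD features of step two. All names and parameters here are illustrative assumptions.

```python
# Sketch of an ODD-style pipeline (assumptions noted in the lead-in):
# step 1 maps inputs into a kernel feature space via one-vs-all RBF SVMs;
# step 2 uses each sample's signed distance to every class hyperplane as
# a new discriminative feature vector for a downstream classifier.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One RBF-kernel SVM per class (one-vs-all scheme).
ovr = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale")).fit(X_tr, y_tr)

# Distance-based features: one column per class hyperplane.
F_tr = ovr.decision_function(X_tr)  # shape (n_samples, n_classes)
F_te = ovr.decision_function(X_te)

# The extracted features feed an ordinary classifier.
clf = LogisticRegression(max_iter=1000).fit(F_tr, y_tr)
print("held-out accuracy:", clf.score(F_te, y_te))
```

In this sketch the new feature space has one dimension per class, so the subsequent classifier works on a compact, discriminative representation regardless of the original input dimensionality.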