A Weighted Principal Component Analysis and Its Application to Gene Expression Data
IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB)
We propose a two-step preprocessing method to improve the performance of Principal Component Analysis (PCA) in classification problems. In the first step, the weight of each feature is computed with a feature weighting method, and the features whose weights exceed a predefined threshold are selected. The selected relevant features then pass to the second step, in which the variances of the features are rescaled so that they correspond to the features' importance. Because the second step reveals the class structure, we expect PCA to perform better in classification problems. Experimental results confirm the effectiveness of the proposed method.
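The two steps described above can be sketched in Python. This is a minimal illustration, not the paper's exact method: the Fisher-score weighting, the threshold value, and the choice to set each selected feature's standard deviation equal to its normalized weight are all assumptions filled in for concreteness, since the abstract does not fix a particular feature weighting scheme or rescaling rule.

```python
import numpy as np

def weighted_pca_preprocess(X, y, threshold=0.5):
    """Two-step preprocessing sketch.

    Step 1: weight features and keep those above `threshold`.
    Step 2: rescale the kept features so their variances reflect
            their importance, so that PCA favors discriminative axes.

    Fisher-score weighting is an illustrative stand-in; the paper
    does not specify which feature weighting method is used.
    """
    # Step 1: Fisher score per feature (between-class / within-class spread)
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    weights = between / (within + 1e-12)
    weights /= weights.max()              # normalize weights to [0, 1]

    selected = weights > threshold        # keep relevant features only
    Xs = X[:, selected]
    ws = weights[selected]

    # Step 2: standardize each kept feature, then scale it by its weight,
    # so more important features carry more variance into PCA.
    Xs = (Xs - Xs.mean(axis=0)) / (Xs.std(axis=0) + 1e-12)
    return Xs * ws, selected

def pca_transform(X, n_components):
    """Plain PCA via SVD of the centered data matrix."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T
```

A quick usage example: generate data where one feature separates the classes, preprocess, and project onto the first principal component.

```python
rng = np.random.default_rng(0)
y = np.array([0] * 20 + [1] * 20)
X = rng.normal(size=(40, 3))
X[:, 0] += y * 5.0                       # make feature 0 informative
Xp, selected = weighted_pca_preprocess(X, y)
Z = pca_transform(Xp, n_components=1)    # first principal component
```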