A structurally simple yet powerful formalism is presented for adapting attribute combinations in high-dimensional data with categorical class labels. A rank-one Mahalanobis distance is optimized so as to maximize between-class variability while minimizing within-class variability. This optimization target resembles Fisher's linear discriminant analysis (LDA), but the proposed formulation is more general and yields improved class separation, as demonstrated on spectral data and gene expression data.
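The core idea can be illustrated with a minimal sketch. This is not the paper's exact algorithm: as a stand-in for its more general optimization, the sketch finds a single direction w for the rank-one Mahalanobis distance d(a, b) = (w^T (a - b))^2 by maximizing the classical Fisher ratio (w^T S_b w) / (w^T S_w w), i.e. between-class over within-class scatter; the function name `rank1_direction` is hypothetical.

```python
import numpy as np

def rank1_direction(X, y):
    """Illustrative sketch: direction w maximizing between-class vs.
    within-class variability, giving the rank-one Mahalanobis distance
    d(a, b) = (w @ (a - b)) ** 2."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    # The Fisher ratio is maximized by the leading eigenvector of
    # Sw^{-1} Sb (small ridge added for numerical stability).
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-8 * np.eye(d), Sb))
    w = np.real(evecs[:, np.argmax(np.real(evals))])
    return w / np.linalg.norm(w)
```

For two well-separated classes, the recovered w aligns with the axis along which the class means differ, so the induced rank-one distance emphasizes exactly the discriminative attribute combination.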