Kalman filtering: theory and practice
Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Statistical Pattern Recognition: A Review
IEEE Transactions on Pattern Analysis and Machine Intelligence
A Trainable System for Object Detection
International Journal of Computer Vision - special issue on learning and vision at the center for biological and computational learning, Massachusetts Institute of Technology
CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 1
An Experimental Study on Pedestrian Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
Fast linear discriminant analysis using binary bases
Pattern Recognition Letters
Evolutionary Optimization of Wavelet Feature Sets for Real-Time Pedestrian Classification
HIS '07 Proceedings of the 7th International Conference on Hybrid Intelligent Systems
The Journal of Machine Learning Research
Writer Adaptive Online Handwriting Recognition Using Incremental Linear Discriminant Analysis
ICDAR '09 Proceedings of the 2009 10th International Conference on Document Analysis and Recognition
A Problem of Dimensionality: A Simple Example
IEEE Transactions on Pattern Analysis and Machine Intelligence
Incremental linear discriminant analysis for classification of data streams
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
For fast classification under real-time constraints, as required in many image-based pattern recognition applications, linear discriminant functions are a good choice. Linear discriminant analysis (LDA) computes such discriminant functions in a space spanned by real-valued features extracted from the input. The accuracy of the trained classifier depends crucially on these features, and its time complexity on their number. Since the number of available features is immense in most real-world problems, meta-heuristics for feature selection and/or feature optimization become essential. These methods typically involve repeatedly retraining a classifier after substituting or modifying features. We therefore derive an efficient incremental update formula for LDA discriminant functions under feature substitution. It scales linearly in the number of altered features and quadratically in the overall number of features, whereas complete retraining scales cubically in the number of features. The update rule enables efficient feature selection and optimization with any meta-heuristic that iteratively modifies existing solutions. The proposed method was tested on an artificial benchmark problem as well as on a real-world problem. Results show that significant time savings during training are achieved while numerical stability is maintained.
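The quadratic-versus-cubic scaling claim can be illustrated with a generic low-rank inverse update. The sketch below uses the classical Sherman–Morrison identity, which updates a matrix inverse after a rank-one change in O(d²) time instead of re-inverting in O(d³); it is an illustrative analogue of the kind of incremental update the abstract describes, not the paper's own formula, and all variable names here are invented for the example.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1}, in O(d^2) time.

    A full re-inversion of the modified matrix would cost O(d^3);
    low-rank updates like this are the generic mechanism behind
    quadratic-cost incremental training formulas.
    """
    Au = A_inv @ u                     # O(d^2)
    vA = v @ A_inv                     # O(d^2)
    denom = 1.0 + v @ Au               # scalar
    return A_inv - np.outer(Au, vA) / denom

rng = np.random.default_rng(0)
d = 6
# A symmetric positive-definite "scatter" matrix, of the kind LDA inverts.
X = rng.standard_normal((50, d))
S = X.T @ X / 50 + np.eye(d)
S_inv = np.linalg.inv(S)

# A rank-one modification, standing in for one feature's contribution changing.
u = rng.standard_normal(d)
v = rng.standard_normal(d)
S_new_inv = sherman_morrison_update(S_inv, u, v)

# The O(d^2) update agrees with the O(d^3) recomputation.
assert np.allclose(S_new_inv, np.linalg.inv(S + np.outer(u, v)))
```

Substituting a single feature changes one row and one column of the scatter matrix, i.e. a low-rank symmetric change, which is why a small number of such updates suffices per substitution.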