We introduce and discuss new accelerated algorithms for linear discriminant analysis (LDA) of unimodal multiclass Gaussian data. The algorithms use a variable step size, computed optimally in each iteration by (i) steepest descent, (ii) conjugate direction, and (iii) Newton-Raphson methods, in order to accelerate convergence. Existing adaptive methods based on gradient descent use a fixed or monotonically decreasing step size in each iteration, which results in slow convergence, and their convergence depends on an appropriate choice of step size. The new algorithms instead select the optimal step size automatically from the current data samples. Based on these adaptive algorithms, we present self-organizing neural networks for adaptive computation of Σ^{-1/2} and use them in cascade with a PCA network to perform LDA. Experimental results demonstrate the fast convergence and robustness of the new algorithms and justify their advantages for on-line pattern recognition applications with stationary and non-stationary multidimensional input data.
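The abstract does not give the algorithmic details, so the following is only a minimal NumPy sketch of the kind of cascade it describes: Σ_w^{-1/2} is estimated from streaming within-class samples with the fixed-step self-organizing rule W ← W + η(I − W x xᵀ W) (the baseline that the paper's variable-step-size variants are meant to accelerate), and a PCA step on the whitened between-class scatter then yields LDA directions. The names `adaptive_inverse_sqrt` and `lda_directions`, the fixed step size `eta`, and the batch PCA stage are illustrative assumptions, not the paper's exact formulation.

```python
# Hedged sketch (not the paper's algorithm): fixed-step adaptive estimation of
# Sigma^{-1/2} from streaming samples, followed by PCA of the whitened
# between-class scatter to obtain LDA directions.
import numpy as np

def adaptive_inverse_sqrt(samples, eta=0.01, w0=None):
    """Estimate Sigma^{-1/2} of zero-mean `samples` with a fixed step size."""
    d = samples.shape[1]
    w = np.eye(d) if w0 is None else w0.copy()
    for x in samples:
        x = x[:, None]                                   # column vector
        w = w + eta * (np.eye(d) - w @ x @ x.T @ w)      # fixed point: Sigma^{-1/2}
    return w

def lda_directions(x, y, n_components, eta=0.01):
    """Whiten with the adaptive Sigma_w^{-1/2}, then PCA of the class-mean scatter."""
    classes = np.unique(y)
    # Within-class (per-class centered) samples feed the adaptive whitening rule.
    centered = np.vstack([x[y == c] - x[y == c].mean(axis=0) for c in classes])
    w = adaptive_inverse_sqrt(centered, eta=eta)
    # Between-class scatter of the class means around the global mean.
    mu = x.mean(axis=0)
    sb = sum(
        x[y == c].shape[0] * np.outer(x[y == c].mean(axis=0) - mu,
                                      x[y == c].mean(axis=0) - mu)
        for c in classes
    )
    # PCA (eigendecomposition) of the whitened between-class scatter.
    eigvals, eigvecs = np.linalg.eigh(w @ sb @ w.T)
    order = np.argsort(eigvals)[::-1][:n_components]
    # Map the top eigenvectors back to directions in the original space.
    return w.T @ eigvecs[:, order]

# Tiny usage example on synthetic two-class Gaussian data.
rng = np.random.default_rng(0)
x0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(500, 2))
x1 = rng.normal(loc=[3.0, 1.0], scale=1.0, size=(500, 2))
x = np.vstack([x0, x1])
y = np.array([0] * 500 + [1] * 500)
print(lda_directions(x, y, n_components=1))
```

The on-line character of the method lies in the sample-by-sample update of W; replacing the fixed `eta` with a step size recomputed each iteration (by steepest descent, conjugate direction, or Newton-Raphson, as the abstract states) is the acceleration the paper proposes and is not reproduced in this sketch.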