Radial basis function approximations to polynomials
Numerical Analysis 1987
Finding Prototypes For Nearest Neighbor Classifiers
IEEE Transactions on Computers
The condensed nearest neighbor rule (Corresp.)
IEEE Transactions on Information Theory
The condensed nearest neighbor rule using the concept of mutual nearest neighborhood (Corresp.)
IEEE Transactions on Information Theory
An evolution-oriented learning algorithm for the optimal interpolative net
IEEE Transactions on Neural Networks
Local Subspace Classifier in Reproducing Kernel Hilbert Space
ICMI '00 Proceedings of the Third International Conference on Advances in Multimodal Interfaces
Image interpretation using contextual feedback
ICIP '95 Proceedings of the 1995 International Conference on Image Processing, Volume 2
Rectified nearest feature line segment for pattern classification
Pattern Recognition
A novel classifier based on shortest feature line segment
Pattern Recognition Letters
A training sample sequence planning method for pattern recognition problems
Automatica (Journal of IFAC)
The Optimal Interpolative (OI) net, derived earlier by one of the authors from a generalized Fock (GF) space formulation, provides a closed-form solution for the synaptic weights of a two-layer feedforward neural network. In this paper, a recursive learning algorithm called RLS-OI is introduced, which extends the original OI solution to a recursive least squares solution. It is an enhanced version of a learning procedure we derived recently, also based on the OI solution, and adds new features such as a procedure for selecting a good training sequence and a scaling factor in the nonlinear activation function of the hidden neurons. A more general approach is used in its derivation, which makes it possible to apply the same technique with different forms of activation functions or basis functions. The proposed algorithm selects a small but meaningful (consistent) subset of exemplars, known as prototypes, and recursively updates the synaptic weights of the neural network as each new prototype is found. RLS-OI is a noniterative algorithm and is guaranteed to produce a solution in a number of recursions no larger than the number of exemplars in the training set. It also requires no prespecification of the network architecture; instead, it evolves a net from scratch, only to the extent needed to solve a given problem. The method for picking prototypes and the recursive relations needed to implement the algorithm are described in detail. Simulation results are provided to illustrate our approach.
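The abstract does not give the recursive relations themselves, but the general scheme it describes (pick an exemplar as a prototype when the current net cannot fit it, then update the output weights by recursive least squares) can be sketched as follows. This is an illustrative toy with a Gaussian basis and hypothetical names (`IncrementalRLSNet`, `tol`, `scale`), not the authors' RLS-OI algorithm; the specific basis function, prototype-selection criterion, and initialization are assumptions for the sketch.

```python
import numpy as np

def rbf(x, c, scale=1.0):
    # Gaussian basis; the paper's derivation allows other activation/basis choices.
    return np.exp(-scale * np.linalg.norm(x - c) ** 2)

class IncrementalRLSNet:
    """Toy evolving net in the spirit described by the abstract: an exemplar
    becomes a prototype (hidden unit) when the current net misfits it, and
    output weights are then updated by recursive least squares (RLS)."""

    def __init__(self, scale=1.0, tol=0.1):
        self.scale, self.tol = scale, tol
        self.prototypes = []   # selected exemplars serving as hidden units
        self.w = None          # output-layer weights
        self.P = None          # inverse-correlation matrix maintained by RLS

    def _phi(self, x):
        # Hidden-layer response vector for input x.
        return np.array([rbf(x, c, self.scale) for c in self.prototypes])

    def predict(self, x):
        if not self.prototypes:
            return 0.0
        return float(self._phi(x) @ self.w)

    def fit(self, X, y):
        for x, t in zip(X, y):
            if abs(t - self.predict(x)) > self.tol:
                # Misfit exemplar becomes a new prototype: grow w and P.
                self.prototypes.append(x)
                n = len(self.prototypes)
                w_new, P_new = np.zeros(n), np.eye(n) * 1e3
                if n > 1:
                    w_new[:-1] = self.w
                    P_new[:-1, :-1] = self.P
                self.w, self.P = w_new, P_new
            if self.prototypes:
                # Standard RLS update of the output weights.
                phi = self._phi(x)
                k = self.P @ phi / (1.0 + phi @ self.P @ phi)
                self.w = self.w + k * (t - phi @ self.w)
                self.P = self.P - np.outer(k, phi @ self.P)
        return self
```

One pass over the training set performs at most one prototype insertion per exemplar, mirroring the abstract's guarantee that the number of recursions never exceeds the number of exemplars; the network architecture is never prespecified but grows as prototypes are found.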