An algorithm for Sequential Approximation with Optimal Coefficients and Interacting Frequencies (SAOCIF) for feed-forward neural networks is presented. SAOCIF combines two key ideas: the optimization of the coefficients (the linear part of the approximation), and the strategy for choosing the frequencies (the non-linear weights) so that each new frequency takes into account its interactions with the previously selected ones. The resulting method combines the locality of sequential approximations, where only one frequency is found at each step, with the globality of non-sequential methods, where every frequency interacts with the others. The idea behind SAOCIF extends theoretically to general Hilbert spaces. Experimental results show very satisfactory performance.
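To make the two key ideas concrete, the following is a minimal sketch (not the authors' implementation) of a sequential approximation scheme in this spirit: at each step a new candidate frequency (hidden-unit weight vector) is sampled, and all linear coefficients are re-solved by least squares so that the new frequency interacts with the previously selected ones. The function name, the candidate-sampling strategy, and the `tanh` activation are illustrative assumptions.

```python
import numpy as np

def saocif_sketch(X, y, n_units=10, n_candidates=200, rng=None):
    """Illustrative sketch of sequential approximation with optimal
    coefficients: each new "frequency" (non-linear weight vector) is
    chosen among random candidates, and ALL linear coefficients are
    re-optimized jointly, so the new unit interacts with earlier ones.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = np.ones((n, 1))                # bias column; hidden outputs appended here
    freqs, coefs = [], None
    for _ in range(n_units):
        best_err, best_w, best_coef = np.inf, None, None
        for _ in range(n_candidates):
            w = rng.normal(size=d + 1)                 # candidate frequency (weights, bias)
            h = np.tanh(X @ w[:d] + w[d])              # candidate hidden output
            Hc = np.column_stack([H, h])
            # re-optimize every coefficient, not just the new one
            coef, *_ = np.linalg.lstsq(Hc, y, rcond=None)
            err = np.sum((Hc @ coef - y) ** 2)
            if err < best_err:
                best_err, best_w, best_coef = err, w, coef
        H = np.column_stack([H, np.tanh(X @ best_w[:d] + best_w[d])])
        freqs.append(best_w)
        coefs = best_coef

    def predict(Xn):
        Hn = np.column_stack(
            [np.ones(len(Xn))] + [np.tanh(Xn @ w[:-1] + w[-1]) for w in freqs]
        )
        return Hn @ coefs

    return predict, freqs, coefs
```

Random candidate sampling stands in here for whatever frequency-selection strategy is used; the essential point the abstract makes is the joint re-optimization of the linear part at every step.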