In this paper, we study a number of objective functions for training new hidden units in constructive algorithms for multilayer feedforward networks. The aim is to derive a class of objective functions whose evaluation, together with the corresponding weight updates, can be done in O(N) time, where N is the number of training patterns. Moreover, even though input weights are frozen during the process for computational efficiency, the convergence property of constructive algorithms using these objective functions is still preserved. We also propose several computational tricks for improving the optimization of these objective functions in practical situations, and we compare their relative performance on a set of two-dimensional regression problems.
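To make the setting concrete, below is a minimal NumPy sketch of one constructive scheme of this kind, under stated assumptions: each new hidden unit is trained by gradient ascent on a correlation-style objective (the squared covariance between the unit's activation and the current residual), its input weights are then frozen, and the output weights are refit by least squares. The specific objective, the sigmoid activation, and the least-squares output layer are illustrative choices, not necessarily those of the paper; the point is that each gradient step over the candidate unit touches every training pattern once, i.e. costs O(N).

```python
import numpy as np

# Hypothetical sketch of a constructive feedforward network with input
# weight freezing. The correlation-style objective below is an assumed
# example of an O(N)-per-step criterion, not the paper's exact one.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_candidate(Xb, residual, steps=200, lr=0.5):
    """Gradient ascent on J(w) = cov(h, r)^2, the squared covariance
    between the candidate's activation h and the current residual r.
    Each step touches every pattern once, so it costs O(N)."""
    N = Xb.shape[0]
    w = rng.normal(scale=0.5, size=Xb.shape[1])
    rc = residual - residual.mean()          # centered residual
    for _ in range(steps):
        h = sigmoid(Xb @ w)
        num = (h - h.mean()) @ rc / N        # covariance term
        # exact gradient of J: since rc sums to zero, centering h drops out
        grad = 2.0 * num * (Xb.T @ (rc * h * (1.0 - h))) / N
        w += lr * grad
    return w

def constructive_fit(X, y, n_units=8):
    N = X.shape[0]
    Xb = np.hstack([X, np.ones((N, 1))])     # inputs plus bias
    H = np.ones((N, 1))                      # output-layer design matrix
    for _ in range(n_units):
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        residual = y - H @ beta
        w = train_candidate(Xb, residual)    # input weights frozen after this
        H = np.hstack([H, sigmoid(Xb @ w)[:, None]])
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return H @ beta

# toy one-dimensional regression target
X = np.linspace(-1.0, 1.0, 200)[:, None]
y = np.sin(3.0 * X[:, 0])
pred = constructive_fit(X, y)
mse = float(np.mean((pred - y) ** 2))
```

Refitting the output weights by least squares after each addition is one common design choice in such algorithms; alternatives that only adjust the new unit's output weight keep the per-unit cost lower at some loss in accuracy.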