The generalization performance of feedforward layered perceptrons can, in many cases, be improved by smoothing the target via convolution, by regularizing the training error with a smoothing constraint, by decreasing the gain (i.e., slope) of the sigmoid nonlinearities, or by adding noise (i.e., jitter) to the input training data. In certain important cases, these procedures yield highly similar results, although at different computational costs. Training with jitter, for example, requires significantly more computation than sigmoid gain scaling.
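The jitter procedure described above can be sketched as follows: at each training pass, Gaussian noise is added to the inputs before the forward and backward steps, so the network effectively fits a smoothed version of the target. This is a minimal illustrative sketch (toy 1-D regression data, a single hidden layer, and the noise standard deviation `sigma` are assumptions, not details from the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (illustrative only).
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-hidden-layer perceptron, trained by batch gradient descent.
H = 16
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)

lr = 0.1      # learning rate (assumed hyperparameter)
sigma = 0.1   # jitter standard deviation (assumed hyperparameter)

for epoch in range(2000):
    # Training with jitter: perturb the inputs afresh each epoch.
    Xj = X + rng.normal(scale=sigma, size=X.shape)
    h = sigmoid(Xj @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                 # linear output layer
    err = pred - y                     # squared-error gradient term
    # Backpropagation through the two layers.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1.0 - h)  # sigmoid derivative
    gW1 = Xj.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on the clean (unjittered) inputs.
mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"clean-input MSE after jittered training: {mse:.4f}")
```

Note the extra cost the abstract mentions: every epoch draws a fresh noise sample and repeats the full forward/backward pass on the perturbed inputs, whereas sigmoid gain scaling only rescales the slope of the existing nonlinearities.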