Diffusions for global optimization
SIAM Journal on Control and Optimization
SIAM Journal on Applied Mathematics
Adaptive pattern recognition and neural networks
A stochastic version of the delta rule
CNLS '89 Proceedings of the ninth annual international conference of the Center for Nonlinear Studies on Self-organizing, Collective, and Cooperative Phenomena in Natural and Artificial Computing Networks on Emergent computation
A back-propagation algorithm with optimal use of hidden units
Advances in neural information processing systems 1
Generalization by weight-elimination with application to forecasting
NIPS-3 Proceedings of the 1990 conference on Advances in neural information processing systems 3
Neural Computation
Simplifying neural networks by soft weight-sharing
Neural Computation
Optimal network construction by minimum description length
Neural Computation
Keeping the neural networks simple by minimizing the description length of the weights
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
On Langevin updating in multilayer perceptrons
Neural Computation
Training with noise is equivalent to Tikhonov regularization
Neural Computation
Stochastic Complexity in Statistical Inquiry
Neural Computation
Noise injection: theoretical prospects
Neural Computation
Randomizing Outputs to Increase Prediction Accuracy
Machine Learning
Smoothed Bagging with Kernel Bandwidth Selectors
Neural Processing Letters
Virtual sample generation using a population of networks
Neural Processing Letters
Neural computing increases robot adaptivity
Natural Computing: an international journal
Classifier's Complexity Control while Training Multilayer Perceptrons
Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition
The Role of Combining Rules in Bagging and Boosting
Proceedings of the Joint IAPR International Workshops on Advances in Pattern Recognition
Neural Modeling of an Industrial Process with Noisy Data
Proceedings of the 14th International conference on Industrial and engineering applications of artificial intelligence and expert systems: engineering of intelligent systems
Boosting in Linear Discriminant Analysis
MCS '00 Proceedings of the First International Workshop on Multiple Classifier Systems
Bagging and the Random Subspace Method for Redundant Feature Spaces
MCS '01 Proceedings of the Second International Workshop on Multiple Classifier Systems
On different facets of regularization theory
Neural Computation
Class imbalances versus small disjuncts
ACM SIGKDD Explorations Newsletter - Special issue on learning from imbalanced datasets
Architecture-Independent Approximation of Functions
Neural Computation
A framework to deal with interference in connectionist systems
AI Communications
A neural network ensemble method with jittered training data for time series forecasting
Information Sciences: an International Journal
Monitoring MLP's free parameters for generalization
AIKED'09 Proceedings of the 8th WSEAS international conference on Artificial intelligence, knowledge engineering and data bases
Fast training MLP networks with Lo-Shu data sampling
AIKED'09 Proceedings of the 8th WSEAS international conference on Artificial intelligence, knowledge engineering and data bases
Cheating for problem solving: a genetic algorithm with social interactions
Proceedings of the 11th Annual conference on Genetic and evolutionary computation
A study of the effect of noise injection on the training of artificial neural networks
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
Using XCS to describe continuous-valued problem spaces
IWLCS'03-05 Proceedings of the 2003-2005 international conference on Learning classifier systems
IEEE Transactions on Neural Networks
The Journal of Machine Learning Research
A novel virtual sample generation method based on Gaussian distribution
Knowledge-Based Systems
Injecting Chaos in Feedforward Neural Networks
Neural Processing Letters
Expert Systems with Applications: An International Journal
Natural inspiration for artificial adaptivity: some neurocomputing experiences in robotics
UC'05 Proceedings of the 4th international conference on Unconventional Computation
VTG schemes for using back propagation for multivariate time series prediction
Applied Soft Computing
The dropout learning algorithm
Artificial Intelligence
We study the effects of adding noise to the inputs, outputs, weight connections, and weight changes of multilayer feedforward neural networks during backpropagation training. We rigorously derive and analyze the objective functions that are minimized by the noise-affected training processes. We show that input noise and weight noise encourage the neural-network output to be a smooth function of the input or its weights, respectively. In the weak-noise limit, noise added to the output of the neural networks only changes the objective function by a constant. Hence, it cannot improve generalization. Input noise introduces penalty terms in the objective function that are related to, but distinct from, those found in the regularization approaches. Simulations have been performed on a regression and a classification problem to further substantiate our analysis. Input noise is found to be effective in improving the generalization performance for both problems. However, weight noise is found to be effective in improving the generalization performance only for the classification problem. Other forms of noise have practically no effect on generalization.
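The input-noise scheme the abstract analyzes can be illustrated with a minimal sketch: a one-hidden-layer feedforward network trained by plain stochastic backpropagation, where fresh Gaussian noise of standard deviation `sigma` is added to each input presentation. The toy data, network size, and hyperparameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (assumed for illustration).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(2 * X) + 0.1 * rng.standard_normal(X.shape)

def train(sigma, epochs=200, lr=0.05, hidden=16):
    """SGD backprop training; `sigma` controls input-noise injection."""
    W1 = 0.5 * rng.standard_normal((1, hidden))
    b1 = np.zeros(hidden)
    W2 = 0.5 * rng.standard_normal((hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            # Fresh noise per presentation -- the mechanism analyzed
            # in the abstract (sigma = 0 recovers plain backprop).
            x = X[i] + sigma * rng.standard_normal(1)
            h = np.tanh(x @ W1 + b1)          # hidden activations
            out = h @ W2 + b2                 # network output
            err = out - y[i]                  # squared-error gradient
            # Backpropagate and take an SGD step.
            gW2, gb2 = np.outer(h, err), err
            dh = (err @ W2.T) * (1 - h ** 2)
            gW1, gb1 = np.outer(x, dh), dh
            W2 -= lr * gW2; b2 -= lr * gb2
            W1 -= lr * gW1; b1 -= lr * gb1
    return W1, b1, W2, b2

def mse(params, Xe, ye):
    W1, b1, W2, b2 = params
    pred = np.tanh(Xe @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - ye) ** 2))
```

In the weak-noise limit the abstract shows that this noisy-input objective adds smoothness penalty terms to the clean squared-error loss, which is why it is compared against Tikhonov-style regularization; the sketch only demonstrates the training mechanism, not that derivation.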