Relationships between clustering, description length, and regularization are pointed out, motivating the introduction of a cost function with a description length interpretation and the unusual and useful property of having its minimum approximated by the densest mode of a distribution. A simple inverse kinematics example is used to demonstrate that this property can be used to select and learn one branch of a multivalued mapping. This property is also used to develop a method for setting regularization parameters according to the scale on which structure is exhibited in the training data. The regularization technique is demonstrated on two real data sets, a classification problem and a regression problem.
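The paper's actual cost function is not reproduced above, but the key property it claims, a minimum that tracks the densest mode rather than the mean, can be illustrated with a hedged toy sketch. Assume a multivalued target where one branch is sampled more densely than the other: a squared-error fit lands between the branches, while a mode-seeking fit (approximated here with a simple histogram, a stand-in for the paper's cost) selects the denser branch. All numbers and the histogram approach are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivalued mapping: for a fixed input, the target y has two
# valid branches, y = +1 (denser, 70% of samples) and y = -1 (30%).
y = np.concatenate([
    rng.normal(+1.0, 0.1, 700),   # denser branch
    rng.normal(-1.0, 0.1, 300),   # sparser branch
])

# Squared-error minimum: the mean, which falls between the branches
# and is not a valid output of either one.
mean_fit = y.mean()

# Mode-seeking fit: locate the densest histogram bin (an illustrative
# stand-in for a cost whose minimum approximates the densest mode).
counts, edges = np.histogram(y, bins=50)
k = np.argmax(counts)
mode_fit = 0.5 * (edges[k] + edges[k + 1])

print(f"mean (L2) fit: {mean_fit:+.2f}")  # near +0.4, between branches
print(f"mode fit:      {mode_fit:+.2f}")  # near +1.0, on the denser branch
```

The contrast mirrors the inverse-kinematics demonstration described in the abstract: averaging over branches of a multivalued mapping yields an invalid solution, whereas a densest-mode criterion commits to a single valid branch.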