IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
Under the framework of the Kullback-Leibler (KL) distance, we show that a particular case of the Gaussian probability function for feedforward neural networks (NNs) reduces to the first-order Tikhonov regularizer. The smoothing parameter in kernel density estimation plays the role of the regularization parameter. Under certain approximations, a formula is derived for estimating the regularization parameter from the training data. The similarities and differences between the obtained results and related work are discussed. Experimental results show that the estimation formula works well in sparse and small training-sample cases.
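As a loose illustration of the connection stated above (the paper's own estimation formula is not reproduced here), the sketch below picks a kernel-density smoothing parameter with Silverman's rule of thumb, a generic textbook choice, and lets a value derived from that bandwidth stand in for the regularization parameter of a first-order Tikhonov penalty on a discretized 1-D fit. The model, the tie between `h` and `lam`, and all constants are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small noisy sample of a smooth target (the sparse-data regime the
# abstract targets).
n = 25
x = np.sort(rng.uniform(-2.0, 2.0, n))
y = np.sin(x) + 0.2 * rng.normal(size=n)

# KDE smoothing parameter via Silverman's rule of thumb -- a generic
# stand-in for the paper's data-driven estimate.
h = 1.06 * x.std(ddof=1) * n ** (-1 / 5)

# Discretize f on a grid and fit it with a first-order Tikhonov penalty
# lam * ||D f||^2, where D is the first-difference (derivative) operator.
m = 200
grid = np.linspace(-2.0, 2.0, m)
dx = grid[1] - grid[0]

# S maps grid values to the data sites (nearest-neighbor interpolation).
S = np.zeros((n, m))
S[np.arange(n), np.abs(grid[None, :] - x[:, None]).argmin(axis=1)] = 1.0

# Discrete first-derivative operator on the grid.
D = (np.eye(m - 1, m, 1) - np.eye(m - 1, m)) / dx

# Tie the regularization parameter to the bandwidth (illustration only);
# the dx factor makes the penalty approximate the integral of f'(t)^2.
lam = h ** 2 * dx

# Minimize ||S f - y||^2 + lam * ||D f||^2 via the normal equations.
f = np.linalg.solve(S.T @ S + lam * D.T @ D, S.T @ y)

rmse = np.sqrt(np.mean((f - np.sin(grid)) ** 2))
print(f"h = {h:.3f}, lambda = {lam:.5f}, RMSE vs. sin = {rmse:.3f}")
```

Larger `h` (heavier density smoothing) yields a larger `lam` and hence a smoother fit, mirroring the abstract's point that the smoothing parameter acts as the regularization parameter.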