This review provides a comprehensive treatment of regularization theory from several perspectives, emphasizing the principles of smoothness and simplicity. Using the tools of operator theory and Fourier analysis, it is shown that the solution of the classical Tikhonov regularization problem can be derived from a regularized functional defined by a linear differential operator in the spatial domain (equivalently, an integral operator in the Fourier domain). State-of-the-art research relevant to regularization theory is reviewed, covering Occam's razor, minimum description length, Bayesian theory, pruning algorithms, information (entropy) theory, statistical learning theory, and equivalent regularization. The universal principle of regularization in terms of Kolmogorov complexity is discussed. Finally, some prospective directions for research on regularization theory and beyond are suggested.
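As a concrete illustration of the Tikhonov problem described above, the following is a minimal numerical sketch (not taken from the review itself): the regularized functional J(w) = ||Xw − y||² + λ||Lw||² with a linear differential operator L is minimized in closed form by w = (XᵀX + λLᵀL)⁻¹Xᵀy. Here L is chosen, purely for concreteness, as a discrete second-difference operator, so the penalty enforces smoothness of the solution.

```python
import numpy as np

def second_difference_operator(n):
    """Discrete second-derivative (smoothness) operator of shape (n-2, n)."""
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def tikhonov_solve(X, y, lam):
    """Closed-form minimizer of ||Xw - y||^2 + lam * ||Lw||^2,
    i.e. w = (X^T X + lam * L^T L)^{-1} X^T y."""
    n = X.shape[1]
    L = second_difference_operator(n)
    A = X.T @ X + lam * (L.T @ L)
    return np.linalg.solve(A, X.T @ y)

# Example use: denoising (X = identity). The smoothness penalty
# suppresses high-frequency noise while leaving the slowly varying
# signal almost untouched.
rng = np.random.default_rng(0)
n = 50
t = np.linspace(0.0, 1.0, n)
truth = np.sin(2.0 * np.pi * t)
y = truth + 0.3 * rng.standard_normal(n)
w = tikhonov_solve(np.eye(n), y, lam=10.0)
```

In the Fourier view emphasized by the review, this smoother attenuates each frequency component of y by a factor that grows with the eigenvalue of LᵀL at that frequency, which is the spectral counterpart of the differential-operator penalty in the spatial domain.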