We propose a convex optimization approach to solving the nonparametric regression estimation problem when the underlying regression function is Lipschitz continuous. This approach is based on the minimization of the sum of empirical squared errors, subject to the constraints implied by Lipschitz continuity. The resulting optimization problem has a convex objective function and linear constraints and is therefore efficiently solvable. The estimated function computed by this technique is proven to converge to the underlying regression function uniformly and almost surely as the sample size grows to infinity, thus providing a very strong form of consistency. We also propose a convex optimization approach to the maximum likelihood estimation of unknown parameters in statistical models where the parameters depend continuously on some observable input variables. For a number of classical distributional forms, the objective function in the underlying optimization problem is convex and the constraints are linear; these problems are therefore also efficiently solvable.
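The sketch below illustrates (it is not the authors' implementation) the constrained least-squares formulation described above: the fitted values at the sample points minimize the sum of empirical squared errors subject to the linear constraints implied by Lipschitz continuity. The use of cvxpy, the function name lipschitz_regression, and the choice of Lipschitz constant L are assumptions made for illustration only.

```python
# Minimal sketch of Lipschitz-constrained least-squares regression.
# Not the paper's code; cvxpy and the constant L are illustrative assumptions.
import numpy as np
import cvxpy as cp

def lipschitz_regression(X, y, L):
    """Fit values f_i = f(x_i) at the n sample points X (n x d) with
    responses y (length n), under an assumed Lipschitz constant L."""
    n = len(y)
    f = cp.Variable(n)  # fitted values at the sample points
    # Linear constraints implied by Lipschitz continuity:
    # |f_i - f_j| <= L * ||x_i - x_j|| for every pair of points.
    constraints = [
        cp.abs(f[i] - f[j]) <= L * np.linalg.norm(X[i] - X[j])
        for i in range(n) for j in range(i + 1, n)
    ]
    # Convex objective: sum of empirical squared errors.
    problem = cp.Problem(cp.Minimize(cp.sum_squares(f - y)), constraints)
    problem.solve()
    return f.value

# Example usage on noisy samples of a Lipschitz function.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(30)
f_hat = lipschitz_regression(X, y, L=2 * np.pi)
```

Because the objective is a convex quadratic and the constraints are linear, the problem is a quadratic program and can be solved efficiently by standard convex solvers, which is the computational point made in the abstract.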