Estimation of Time-Varying Parameters in Statistical Models: An Optimization Approach

  • Authors:
  • Dimitris Bertsimas; David Gamarnik; John N. Tsitsiklis

  • Affiliations:
  • Sloan School of Management and Operations Research Center, MIT, Cambridge, MA 02139. dbertsim@mit.edu; Operations Research Center, MIT, Cambridge, MA 02139. gamarnik@watson.ibm.com; Laboratory for Information and Decision Systems and Operations Research Center, MIT, Cambridge, MA 02139. jnt@mit.edu

  • Venue:
  • Machine Learning - Special issue: computational learning theory, COLT '97
  • Year:
  • 1999

Abstract

We propose a convex optimization approach to the nonparametric regression estimation problem when the underlying regression function is Lipschitz continuous. The approach minimizes the sum of empirical squared errors subject to the constraints implied by Lipschitz continuity. The resulting optimization problem has a convex objective function and linear constraints and is therefore efficiently solvable. The estimator computed by this technique is proven to converge to the underlying regression function uniformly and almost surely as the sample size grows to infinity, thus providing a very strong form of consistency. We also propose a convex optimization approach to the maximum likelihood estimation of unknown parameters in statistical models where the parameters depend continuously on some observable input variables. For a number of classical distributional forms, the objective function in the underlying optimization problem is convex and the constraints are linear, so these problems are also efficiently solvable.
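The constrained least-squares formulation described above can be sketched as a small quadratic program: the decision variables are the fitted values at the sample points, and the Lipschitz condition becomes a set of linear inequality constraints between every pair of points. The toy data, the assumed Lipschitz constant `L`, and the choice of SciPy's SLSQP solver are illustrative assumptions, not details from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: noisy samples of a 1-Lipschitz function (illustrative assumption).
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 15))
y = np.abs(x - 0.5) + rng.normal(0.0, 0.05, x.size)

L = 1.0  # assumed Lipschitz constant of the underlying function
n = x.size

def sse(f):
    """Sum of empirical squared errors between fitted values and observations."""
    return np.sum((f - y) ** 2)

# Lipschitz continuity at the sample points: |f_i - f_j| <= L |x_i - x_j|,
# split into two linear inequalities per pair (i, j).
cons = []
for i in range(n):
    for j in range(i + 1, n):
        d = L * abs(x[i] - x[j])
        cons.append({"type": "ineq",
                     "fun": lambda f, i=i, j=j, d=d: d - (f[i] - f[j])})
        cons.append({"type": "ineq",
                     "fun": lambda f, i=i, j=j, d=d: d - (f[j] - f[i])})

# Convex quadratic objective, linear constraints: efficiently solvable.
res = minimize(sse, y.copy(), constraints=cons, method="SLSQP")
f_hat = res.x  # Lipschitz-consistent fitted values at the sample points
```

Between sample points, `f_hat` can be extended to a function on the whole interval by any Lipschitz interpolation (e.g. piecewise linear), which preserves the constraint.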