Algebraic Analysis for Nonidentifiable Learning Machines

  • Authors: Sumio Watanabe
  • Affiliation: P&I Laboratory, Tokyo Institute of Technology, Yokohama 226-8503, Japan
  • Venue: Neural Computation
  • Year: 2001

Abstract

This article clarifies the relation between the learning curve and the algebraic geometrical structure of a nonidentifiable learning machine, such as a multilayer neural network, whose true parameter set is an analytic set with singular points. Using a concept from algebraic analysis, we rigorously prove that the Bayesian stochastic complexity, or free energy, is asymptotically equal to λ1 log n - (m1 - 1) log log n + constant, where n is the number of training samples, and λ1 and m1 are a rational number and a natural number determined as birational invariants of the singularities in the parameter space. We also present an algorithm for calculating λ1 and m1 based on resolution of singularities in algebraic geometry. In regular statistical models, 2λ1 is equal to the number of parameters and m1 = 1, whereas in nonregular models, such as multilayer networks, 2λ1 is not larger than the number of parameters and m1 ≥ 1. Since the increase of the stochastic complexity is equal to the learning curve, that is, the generalization error, nonidentifiable learning machines are better models than regular ones when Bayesian ensemble learning is applied.
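The asymptotic formula above can be illustrated numerically. The sketch below evaluates F(n) ≈ λ1 log n - (m1 - 1) log log n for two hypothetical models with the same number of parameters d: a regular model (2λ1 = d, m1 = 1) and a singular model whose invariants are chosen for illustration only (λ1 = d/4, m1 = 2; actual values would have to be computed via resolution of singularities for a concrete machine). It shows the smaller free-energy growth of the singular model that underlies the paper's conclusion.

```python
import math

def stochastic_complexity(n, lam, m, const=0.0):
    """Asymptotic Bayesian stochastic complexity (free energy):
    F(n) ~ lam * log n - (m - 1) * log log n + const."""
    return lam * math.log(n) - (m - 1) * math.log(math.log(n)) + const

d = 10          # number of parameters (illustrative choice)
n = 10_000      # number of training samples

# Regular model: 2*lam1 equals the number of parameters, m1 = 1.
regular = stochastic_complexity(n, lam=d / 2, m=1)

# Singular model: 2*lam1 <= d and m1 >= 1 (values here are hypothetical).
singular = stochastic_complexity(n, lam=d / 4, m=2)

# The singular model's free energy grows more slowly with n,
# hence a smaller generalization error under Bayesian learning.
assert singular < regular
```

Since the generalization error is the increase F(n + 1) - F(n), a smaller λ1 directly translates into a faster-decaying learning curve of order λ1/n.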