The Key Theorem of Learning Theory on Uncertainty Space

  • Authors:
  • Shujing Yan, Minghu Ha, Xiankun Zhang, Chao Wang

  • Affiliations:
  • College of Mathematical and Computer Sciences, Hebei University, Baoding 071002, China (all authors)

  • Venue:
  • ISNN '09: Proceedings of the 6th International Symposium on Neural Networks (Advances in Neural Networks)
  • Year:
  • 2009

Abstract

Statistical Learning Theory is commonly regarded as a sound framework for handling a variety of learning problems in the presence of small-sample data. However, because the theory is built on probability space, it can hardly handle statistical learning problems on uncertainty space. In this paper, Statistical Learning Theory on uncertainty space is investigated. The Khintchine law of large numbers on uncertainty space is proved. The definitions of the empirical risk functional, the expected risk functional, and the empirical risk minimization principle on uncertainty space are introduced. On the basis of these concepts, the key theorem of learning theory on uncertainty space is stated and proved.
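
For orientation, the classical probability-space forms of these functionals, which the paper adapts to uncertainty space, can be sketched as follows. The notation is the standard one from Vapnik's learning theory and is an assumption here, not taken from the paper itself:

```latex
% Expected risk of a function indexed by parameter \alpha,
% for loss Q(z,\alpha) and (unknown) distribution F(z):
R(\alpha) = \int Q(z,\alpha)\, dF(z)

% Empirical risk over an observed sample z_1, \dots, z_l:
R_{\mathrm{emp}}(\alpha) = \frac{1}{l} \sum_{i=1}^{l} Q(z_i,\alpha)

% ERM principle: approximate the minimizer of R(\alpha) by the
% minimizer of R_{\mathrm{emp}}(\alpha). The classical key theorem
% states that ERM is consistent iff the empirical risks converge
% to the expected risks uniformly (one-sided) over the class \Lambda:
\lim_{l \to \infty}
  P\Bigl\{ \sup_{\alpha \in \Lambda}
    \bigl( R(\alpha) - R_{\mathrm{emp}}(\alpha) \bigr) > \varepsilon \Bigr\} = 0,
\qquad \forall\, \varepsilon > 0.
```

The paper's contribution, per the abstract, is to restate and prove an analogue of this theorem with the probability measure $P$ replaced by an uncertain measure, supported by a Khintchine-type law of large numbers on uncertainty space.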