First-Order Tree-Type Dependence between Variables and Classification Performance

  • Authors:
  • Sarunas Raudys; Ausra Saudargiene

  • Affiliations:
  • Institute of Mathematics and Informatics, Akademijos, Lithuania; Institute of Mathematics and Informatics, Akademijos, Lithuania

  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 2001

Abstract

Structuralization of the covariance matrix reduces the number of parameters to be estimated from the training data and, asymptotically, does not increase the generalization error as both the number of dimensions and the training-sample size grow. We propose a method that benefits from approximately correct assumptions about first-order tree-type dependence between the components of the feature vector: a structured estimate of the covariance matrix is used to decorrelate and scale the data, and a single-layer perceptron is then trained in the transformed feature space. We show that training the perceptron can reduce the negative effects of inexact a priori information. Experiments with 13 artificial and 10 real-world data sets show that the first-order tree-type dependence model is the most preferable of the two dozen covariance matrix structures investigated.
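
For intuition, here is a minimal, hypothetical sketch of the pipeline the abstract describes; it is not the authors' implementation. It assumes the first-order dependence tree is taken as the maximum-weight spanning tree on absolute sample correlations (the Gaussian Chow-Liu tree), whitens the data with the Cholesky factor of the resulting tree-structured precision matrix, and substitutes a plain gradient-trained logistic unit for the single-layer perceptron; all function names are illustrative.

```python
# Sketch only: tree-structured covariance estimate -> decorrelate/scale -> single-layer perceptron.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree


def tree_structured_precision(X):
    """Precision matrix of a Gaussian whose dependence graph is a first-order tree."""
    S = np.cov(X, rowvar=False)                      # sample covariance
    d = np.sqrt(np.diag(S))
    R = np.abs(S / np.outer(d, d))                   # absolute sample correlations
    np.fill_diagonal(R, 0.0)
    # maximum-weight spanning tree = minimum spanning tree on negated weights
    mst = minimum_spanning_tree(-R).toarray()
    edges = np.argwhere(mst != 0)

    p = S.shape[0]
    K = np.zeros((p, p))
    degree = np.zeros(p, dtype=int)
    for i, j in edges:                               # add pairwise precision blocks
        Sij = S[np.ix_([i, j], [i, j])]
        K[np.ix_([i, j], [i, j])] += np.linalg.inv(Sij)
        degree[i] += 1
        degree[j] += 1
    # correct for over-counted marginals: p(x) = prod_edges p(xi,xj) / prod_i p(xi)^(deg_i - 1)
    K[np.diag_indices(p)] -= (degree - 1) / np.diag(S)
    return K


def fit_perceptron(Z, y, lr=0.01, epochs=200):
    """Single-layer logistic unit trained by batch gradient descent (stand-in for the SLP)."""
    Zb = np.hstack([Z, np.ones((Z.shape[0], 1))])    # append bias column
    w = np.zeros(Zb.shape[1])
    for _ in range(epochs):
        p_hat = 1.0 / (1.0 + np.exp(-Zb @ w))
        w -= lr * Zb.T @ (p_hat - y) / len(y)
    return w


# Usage on synthetic data with a chain (tree) dependence between features.
rng = np.random.default_rng(0)
n, p = 200, 8
X = np.zeros((n, p))
X[:, 0] = rng.standard_normal(n)
for j in range(1, p):
    X[:, j] = 0.7 * X[:, j - 1] + rng.standard_normal(n)
y = (X[:, 0] + X[:, -1] > 0).astype(float)

Xc = X - X.mean(axis=0)
K = tree_structured_precision(Xc)
L = np.linalg.cholesky(K)                            # K = L @ L.T
Z = Xc @ L                                           # approximately decorrelated, unit-scaled
w = fit_perceptron(Z, y)
```

Because the tree-structured precision factorizes as K = L Lᵀ, the transform Z = X L has approximately identity covariance whenever the tree assumption is approximately correct; this is the sense in which the perceptron is trained on decorrelated and scaled data in the sketch above.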