Resolving Minsky's Paradox: The d-Dimensional Normal Distribution Case

  • Authors:
  • Luís G. Rueda; B. John Oommen

  • Venue:
  • AI '01 Proceedings of the 14th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
  • Year:
  • 2001

Abstract

We consider the well-studied Pattern Recognition (PR) problem of designing linear classifiers. When dealing with normally distributed classes, it is well known that the optimal Bayes classifier is linear only when the covariance matrices are equal. This was the only known condition for discriminant linearity. In a previous work, we presented the theoretical framework for optimal pairwise linear classifiers for two-dimensional normally distributed random vectors, and derived the necessary and sufficient conditions that the distributions have to satisfy so as to yield the optimal linear classifier as a pair of straight lines.

In this paper we extend the previous work to d-dimensional normally distributed random vectors. We provide the necessary and sufficient conditions under which the optimal Bayes classifier is a pair of hyperplanes. Various scenarios are considered, including one that resolves the multi-dimensional Minsky's paradox for the perceptron. We also provide three-dimensional examples for all the cases, and test the classification accuracy of the relevant pairwise linear classifiers. In all the cases, these linear classifiers achieve very good performance.
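To make the linearity condition concrete, the following minimal sketch (our illustration, not code from the paper; all variable names are ours) verifies numerically that when two Gaussian class-conditional densities share a covariance matrix, the Bayes log-likelihood-ratio discriminant reduces to an affine function of x, so the decision boundary is a hyperplane:

    import numpy as np

    def bayes_discriminant(x, m1, S1, m2, S2):
        """Log-likelihood ratio of two Gaussian densities at x (equal priors)."""
        def log_gauss(x, m, S):
            d = x - m
            _, logdet = np.linalg.slogdet(S)
            # The constant -(dim/2) * log(2*pi) cancels in the ratio.
            return -0.5 * (d @ np.linalg.solve(S, d) + logdet)
        return log_gauss(x, m1, S1) - log_gauss(x, m2, S2)

    # Equal covariances: the quadratic terms x' S^{-1} x cancel, leaving
    # g(x) = w'x + b with w = S^{-1}(m1 - m2) -- a hyperplane boundary.
    m1, m2 = np.array([0.0, 0.0, 0.0]), np.array([2.0, 1.0, -1.0])
    S = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.0],
                  [0.0, 0.0, 1.5]])
    w = np.linalg.solve(S, m1 - m2)
    b = -0.5 * (m1 @ np.linalg.solve(S, m1) - m2 @ np.linalg.solve(S, m2))

    x = np.random.default_rng(0).normal(size=3)
    assert np.isclose(bayes_discriminant(x, m1, S, m2, S), w @ x + b)

With unequal covariance matrices the quadratic term survives and the boundary is in general a quadric; the paper characterizes exactly when that quadric degenerates into a pair of hyperplanes.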