Boosting the differences: A fast Bayesian classifier neural network

  • Authors:
  • Ninan Sajeeth Philip; K. Babu Joseph

  • Affiliations:
  • Department of Physics, Cochin University of Science and Technology, Kochi-682 022, India. E-mail: nsp@stthom.ernet.in, smanager@giasmd01.vsnl.net.in

  • Venue:
  • Intelligent Data Analysis
  • Year:
  • 2000

Abstract

A new classifier based on Bayes' principle is presented that assumes clustering of attribute values while boosting the attribute differences. The method considers the error produced by each example in the training set in turn and updates the connection weights associated with the probability P(U_m | C_k) of each attribute of that example. In this process the probability density of identical attribute values flattens out and the differences get boosted. Some of the characteristic features of the network are illustrated using four popular datasets from the UCI repository. The network is found to have optimal generalization ability on all the datasets. For a given topology, the network converges to the same classification accuracy, and its training time is shorter than that of comparable networks. One of the examples indicates that the optimization of the network may be done in parallel.
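The training scheme sketched in the abstract (per-example error checking with weight updates on the class-conditional attribute probabilities) can be illustrated with a minimal toy implementation. This is only a sketch under assumptions, not the authors' exact formulation: the bin count, the learning rate `alpha`, the multiplicative update rule, and all class/method names here are illustrative choices.

```python
# Hedged sketch of a difference-boosting Bayesian classifier in the
# spirit of the abstract.  Hyperparameters and the update rule are
# assumptions for illustration, not the published algorithm.
import math
from collections import defaultdict

class DiffBoostBayes:
    def __init__(self, n_bins=5, alpha=0.2, epochs=20):
        self.n_bins = n_bins   # attribute values are discretised into bins
        self.alpha = alpha     # multiplicative weight-update step (assumed)
        self.epochs = epochs

    def _bin(self, x, m):
        # Map attribute m's value x onto one of n_bins equal-width bins.
        lo, hi = self.ranges[m]
        if hi == lo:
            return 0
        return min(int((x - lo) / (hi - lo) * self.n_bins), self.n_bins - 1)

    def fit(self, X, y):
        n_attr = len(X[0])
        self.classes = sorted(set(y))
        self.ranges = [(min(r[m] for r in X), max(r[m] for r in X))
                       for m in range(n_attr)]
        # Count-based estimate of P(U_m | C_k), with Laplace smoothing.
        counts = defaultdict(lambda: 1.0)
        class_counts = defaultdict(float)
        for row, c in zip(X, y):
            class_counts[c] += 1
            for m, x in enumerate(row):
                counts[(c, m, self._bin(x, m))] += 1
        self.prob = {k: v / (class_counts[k[0]] + self.n_bins)
                     for k, v in counts.items()}
        self.prior = {c: class_counts[c] / len(y) for c in self.classes}
        # One weight per (class, attribute, bin); all start equal, so the
        # initial model is an ordinary naive Bayes classifier.
        self.w = defaultdict(lambda: 1.0)
        # Boosting pass: walk the training set and, on every error, raise
        # the weights attached to the true class's attribute probabilities,
        # so the discriminating attribute values get boosted.
        for _ in range(self.epochs):
            errors = 0
            for row, c in zip(X, y):
                if self.predict(row) != c:
                    errors += 1
                    for m, x in enumerate(row):
                        self.w[(c, m, self._bin(x, m))] *= (1 + self.alpha)
            if errors == 0:
                break
        return self

    def predict(self, row):
        # Class score: log prior plus weighted log-likelihood per attribute.
        def score(c):
            s = math.log(self.prior[c])
            for m, x in enumerate(row):
                key = (c, m, self._bin(x, m))
                p = self.prob.get(key, 1.0 / self.n_bins)  # unseen-bin fallback
                s += math.log(self.w[key] * p)
            return s
        return max(self.classes, key=score)
```

Because the weights multiply the probabilities rather than replace them, attribute values shared by several classes keep roughly equal weighted likelihoods, while values that separate the classes accumulate weight on the correct side, which mirrors the "flattening identical values, boosting differences" behaviour the abstract describes.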