A Naive (or Idiot) Bayes network is a network with a single hypothesis node and several observation nodes that are conditionally independent given the hypothesis. We recently surveyed a number of members of the UAI community and discovered a general lack of understanding of the implications of the Naive Bayes assumption for the kinds of problems these networks can solve. It has long been recognized [Minsky 61] that if the observations are binary, the decision surfaces of these networks are hyperplanes. We extend this result (hyperplane separability) to Naive Bayes networks with m-ary observations. In addition, we illustrate the effect of observation-observation dependencies on decision surfaces. Finally, we discuss the implications of these results for knowledge acquisition and research in learning.
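The hyperplane result for binary observations can be verified directly: the log posterior odds of a two-hypothesis Naive Bayes network are an affine function of the observation vector, so the decision surface (where the odds are zero) is a hyperplane. The sketch below checks this numerically for a small network with illustrative, arbitrarily chosen parameters (the priors and conditional probabilities are assumptions, not taken from the paper).

```python
import itertools
import math

# Hypothetical parameters: binary hypothesis H, three binary
# observations X1..X3 (illustrative values only).
prior = {0: 0.4, 1: 0.6}           # P(H)
p = {0: [0.2, 0.5, 0.7],           # P(X_i = 1 | H = 0)
     1: [0.8, 0.3, 0.9]}           # P(X_i = 1 | H = 1)

def log_odds(x):
    """Log posterior odds log[P(H=1|x) / P(H=0|x)], computed directly
    from the Naive Bayes factorization."""
    lo = math.log(prior[1] / prior[0])
    for i, xi in enumerate(x):
        lik = {h: (p[h][i] if xi else 1 - p[h][i]) for h in (0, 1)}
        lo += math.log(lik[1] / lik[0])
    return lo

# The same quantity rewritten as an affine function w.x + b: the
# decision surface {x : log_odds(x) = 0} is therefore a hyperplane.
b = math.log(prior[1] / prior[0]) + sum(
    math.log((1 - p[1][i]) / (1 - p[0][i])) for i in range(3))
w = [math.log(p[1][i] / p[0][i]) - math.log((1 - p[1][i]) / (1 - p[0][i]))
     for i in range(3)]

# Both computations agree on every point of the binary cube.
for x in itertools.product([0, 1], repeat=3):
    linear = sum(wi * xi for wi, xi in zip(w, x)) + b
    assert abs(log_odds(x) - linear) < 1e-12
```

The weight on each observation is its log likelihood ratio for X_i = 1 minus that for X_i = 0, and the bias collects the prior odds plus the X_i = 0 terms; with m-ary observations, one indicator coordinate per observation value yields the same affine form.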