A new discriminative kernel from probabilistic models
Neural Computation
Learning equivalence classes of Bayesian-network structures
The Journal of Machine Learning Research
Limitations of learning via embeddings in Euclidean half spaces
The Journal of Machine Learning Research
Asymptotic properties of the Fisher kernel
Neural Computation
Diffusion Kernels on Statistical Manifolds
The Journal of Machine Learning Research
Inner Product Spaces for Bayesian Networks
The Journal of Machine Learning Research
Kernels on Prolog Proof Trees: Statistical Learning in the ILP Setting
The Journal of Machine Learning Research
Compression-Based Averaging of Selective Naive Bayes Classifiers
The Journal of Machine Learning Research
Max-margin Classification of Data with Absent Features
The Journal of Machine Learning Research
Discriminative Learning of Max-Sum Classifiers
The Journal of Machine Learning Research
Operations for inference in continuous Bayesian networks with linear deterministic variables
International Journal of Approximate Reasoning
Learning Bayesian network parameters under order constraints
International Journal of Approximate Reasoning
A transformational characterization of equivalent Bayesian network structures
Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence (UAI'95)
Proceedings of the Third International Conference on Artificial Intelligence and Computational Intelligence (AICI'11), Part III
On the properties of concept classes induced by multivalued Bayesian networks
Information Sciences: an International Journal
Bayesian networks are graphical tools used to represent a high-dimensional probability distribution. They are used frequently in machine learning and in many applications such as medical science. This paper studies whether the concept classes induced by a Bayesian network can be embedded into a low-dimensional inner product space. We focus on two-label classification tasks over the Boolean domain. For full Bayesian networks and almost-full Bayesian networks with n variables, we show that the VC dimension and the minimum dimension of the inner product space induced by them are 2^n - 1. Also, for each Bayesian network N we show that VCdim(N) = Edim(N) = 2^(n-1) + 2^i if the network N' constructed from N by removing X_n satisfies either (i) N' is a full Bayesian network with n-1 variables, i is the number of parents of X_n, and i
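As a small arithmetic illustration of the dimension bounds stated above, the following sketch evaluates both formulas for small networks. The reading of the exponents (2^n - 1 for full networks, 2^(n-1) + 2^i for the reduced case) is an assumption recovered from the garbled superscripts, and the function names are hypothetical:

```python
def edim_full(n: int) -> int:
    """Claimed VC dimension / Euclidean dimension for a full Bayesian
    network over n Boolean variables (assumed reading: 2^n - 1)."""
    return 2 ** n - 1

def edim_reduced(n: int, i: int) -> int:
    """Claimed dimension when removing X_n from N leaves a full network
    on n - 1 variables and X_n has i parents, with i < n - 1
    (assumed reading: 2^(n-1) + 2^i)."""
    assert i < n - 1
    return 2 ** (n - 1) + 2 ** i

# A full network on 3 variables: 2^3 - 1 = 7.
print(edim_full(3))
# n = 4 variables, X_4 with i = 2 parents: 2^3 + 2^2 = 12.
print(edim_reduced(4, 2))
```

Note how the second bound exceeds the first for the reduced (n-1)-variable full network (2^(n-1) - 1) by the extra 2^i + 1 terms contributed by X_n and its i parents.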