The structure of a Bayesian network (BN) encodes variable independence. Learning the structure of a BN, however, typically has high computational complexity. In this paper, we explore and represent variable independence in learning conditional probability tables (CPTs) rather than in learning structure. A full Bayesian network is used as the structure, and a decision tree is learned for each CPT. The resulting model is called a full Bayesian network classifier (FBC). In learning an FBC, learning the decision trees for the CPTs captures both variable independence and context-specific independence. We present a novel, efficient decision-tree learning algorithm that is also effective in the context of FBC learning. In our experiments, the FBC learning algorithm achieves better classification and ranking performance than other state-of-the-art learning algorithms. In addition, because little effort is spent on structure learning, its time complexity is low as well.
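The classification rule described in the abstract can be sketched in code. This is a minimal, hedged illustration, not the paper's algorithm: it uses a full network over a fixed attribute ordering (each attribute conditioned on the class and all earlier attributes), but for brevity each CPT is a plain smoothed frequency table, whereas the paper learns a decision tree per CPT to capture context-specific independence. All function and parameter names are illustrative.

```python
from collections import Counter
import math

def fit_fbc(X, y, alpha=1.0):
    """Fit a simplified full-network classifier on discrete data.

    Attribute i is conditioned on the class and all earlier attributes
    (a fixed ordering, assumed for illustration). The paper replaces each
    conditional probability table with a learned decision tree; here each
    CPT is an explicit smoothed frequency table for brevity.
    """
    n = len(X)
    classes = sorted(set(y))
    prior = {c: (sum(1 for t in y if t == c) + alpha) / (n + alpha * len(classes))
             for c in classes}
    d = len(X[0])
    values = [sorted({row[i] for row in X}) for i in range(d)]
    cpts = []
    for i in range(d):
        counts, parent_counts = Counter(), Counter()
        for row, c in zip(X, y):
            parent = (c,) + tuple(row[:i])  # class + all earlier attributes
            counts[parent + (row[i],)] += 1
            parent_counts[parent] += 1
        cpts.append((counts, parent_counts, values[i]))
    return classes, prior, cpts

def predict_fbc(model, x):
    """Pick the class maximizing log P(c) + sum_i log P(x_i | c, x_1..x_{i-1})."""
    classes, prior, cpts = model
    best, best_score = None, -math.inf
    for c in classes:
        score = math.log(prior[c])
        for i, (counts, parent_counts, vals) in enumerate(cpts):
            parent = (c,) + tuple(x[:i])
            num = counts[parent + (x[i],)] + 1.0        # Laplace smoothing
            den = parent_counts[parent] + len(vals)
            score += math.log(num / den)
        if score > best_score:
            best, best_score = c, score
    return best
```

Because each attribute may depend on all earlier attributes, this full-network rule can represent interactions (such as XOR-like patterns) that a naive Bayes classifier, which assumes attributes independent given the class, cannot.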