Probabilistic reasoning in intelligent systems: networks of plausible inference
Probabilistic reasoning in intelligent systems: networks of plausible inference
Instance-Based Learning Algorithms
Machine Learning
A Pattern Recognition Approach for Software Engineering Data Analysis
IEEE Transactions on Software Engineering - Special issue on software measurement principles, techniques, and environments
An entropy-based learning algorithm of Bayesian conditional trees
UAI '92 Proceedings of the eighth conference on Uncertainty in Artificial Intelligence
C4.5: programs for machine learning
C4.5: programs for machine learning
Lazy learning
Discovering Patterns in EEG-Signals: Comparative Study of a Few Methods
ECML '93 Proceedings of the European Conference on Machine Learning
Induction of Recursive Bayesian Classifiers
ECML '93 Proceedings of the European Conference on Machine Learning
Classification Learning Using All Rules
ECML '98 Proceedings of the 10th European Conference on Machine Learning
Adjusted Probability Naive Bayesian Induction
AI '98 Selected papers from the 11th Australian Joint Conference on Artificial Intelligence on Advanced Topics in Artificial Intelligence
Improved use of continuous attributes in C4.5
Journal of Artificial Intelligence Research
A study of cross-validation and bootstrap for accuracy estimation and model selection
IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence - Volume 2
Building classifiers using Bayesian networks
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 1
Learning Recursive Bayesian Multinets for Data Clustering by Means of Constructive Induction
Machine Learning - Special issue: Unsupervised learning
Candidate Elimination Criteria for Lazy Bayesian Rules
AI '01 Proceedings of the 14th Australian Joint Conference on Artificial Intelligence: Advances in Artificial Intelligence
SNNB: A Selective Neighborhood Based Naïve Bayes for Lazy Learning
PAKDD '02 Proceedings of the 6th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
Data Mining with Products of Trees
IDA '01 Proceedings of the 4th International Conference on Advances in Intelligent Data Analysis
Not so naive Bayes: aggregating one-dependence estimators
Machine Learning
Learning Instance Greedily Cloning Naive Bayes for Ranking
ICDM '05 Proceedings of the Fifth IEEE International Conference on Data Mining
Efficient lazy elimination for averaged one-dependence estimators
ICML '06 Proceedings of the 23rd international conference on Machine learning
A decision tree-based attribute weighting filter for naive Bayes
Knowledge-Based Systems
Local strategy learning in networked multi-agent team formation
Autonomous Agents and Multi-Agent Systems
Customized classification learning based on query projections
Information Sciences: an International Journal
Toward Exploratory Test-Instance-Centered Diagnosis in High-Dimensional Classification
IEEE Transactions on Knowledge and Data Engineering
A comprehensive review of recursive Naïve Bayes Classifiers
Intelligent Data Analysis
IEEE Transactions on Knowledge and Data Engineering
Machine learning: a review of classification and combining techniques
Artificial Intelligence Review
Survey of Improving Naive Bayes for Classification
ADMA '07 Proceedings of the 3rd international conference on Advanced Data Mining and Applications
Finding the Right Family: Parent and Child Selection for Averaged One-Dependence Estimators
ECML '07 Proceedings of the 18th European conference on Machine Learning
Analysis of Naive Bayes' assumptions on software fault data: An empirical study
Data & Knowledge Engineering
GAODE and HAODE: two proposals based on AODE to deal with continuous variables
ICML '09 Proceedings of the 26th Annual International Conference on Machine Learning
Bayesian clustering for email campaign detection
ICML '09 Proceedings of the 26th Annual International Conference on Machine Learning
Boosting Local Naïve Bayesian Rules
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II
Anytime learning and classification for online applications
Proceedings of the 2006 conference on Advances in Intelligent IT: Active Media Technology 2006
Supervised Machine Learning: A Review of Classification Techniques
Proceedings of the 2007 conference on Emerging Artificial Intelligence Applications in Computer Engineering: Real Word AI Systems with Applications in eHealth, HCI, Information Retrieval and Pervasive Technologies
HODE: Hidden One-Dependence Estimator
ECSQARU '09 Proceedings of the 10th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
MCS '09 Proceedings of the 8th International Workshop on Multiple Classifier Systems
A Multi-Strategy Approach to KNN and LARM on Small and Incrementally Induced Prediction Knowledge
ADMA '09 Proceedings of the 5th International Conference on Advanced Data Mining and Applications
AAAI'05 Proceedings of the 20th national conference on Artificial intelligence - Volume 2
Techniques for evolutionary rule discovery in data mining
CEC'09 Proceedings of the Eleventh conference on Congress on Evolutionary Computation
An Extendable Meta-learning Algorithm for Ontology Mapping
FQAS '09 Proceedings of the 8th International Conference on Flexible Query Answering Systems
A hybrid classification method of k nearest neighbor, Bayesian methods and genetic algorithm
Expert Systems with Applications: An International Journal
A new restricted Bayesian network classifier
PAKDD'03 Proceedings of the 7th Pacific-Asia conference on Advances in knowledge discovery and data mining
Apply a rough set-based classifier to dependency parsing
RSKT'08 Proceedings of the 3rd international conference on Rough sets and knowledge technology
The Knowledge Engineering Review
Scaling up the accuracy of Bayesian classifier based on frequent itemsets by m-estimate
AICI'10 Proceedings of the 2010 international conference on Artificial intelligence and computational intelligence: Part I
Data classification using rough sets and naïve Bayes
RSKT'10 Proceedings of the 5th international conference on Rough set and knowledge technology
Learning Instance-Specific Predictive Models
The Journal of Machine Learning Research
NB+: An improved Naïve Bayesian algorithm
Knowledge-Based Systems
A 'non-parametric' version of the naive Bayes classifier
Knowledge-Based Systems
K nearest neighbor reinforced expectation maximization method
Expert Systems with Applications: An International Journal
To select or to weigh: a comparative study of model selection and model weighing for SPODE ensembles
ECML'06 Proceedings of the 17th European conference on Machine Learning
UAI'03 Proceedings of the Nineteenth conference on Uncertainty in Artificial Intelligence
Naïve bayesian tree pruning by local accuracy estimation
ADMA'06 Proceedings of the Second international conference on Advanced Data Mining and Applications
Robust bayesian linear classifier ensembles
ECML'05 Proceedings of the 16th European conference on Machine Learning
Learning k-nearest neighbor naive bayes for ranking
ADMA'05 Proceedings of the First international conference on Advanced Data Mining and Applications
A bayesian metric for evaluating machine learning algorithms
AI'04 Proceedings of the 17th Australian joint conference on Advances in Artificial Intelligence
Instance cloning local naive bayes
AI'05 Proceedings of the 18th Canadian Society conference on Advances in Artificial Intelligence
Enhancing SNNB with local accuracy estimation and ensemble techniques
DASFAA'05 Proceedings of the 10th international conference on Database Systems for Advanced Applications
Lazy averaged one-dependence estimators
AI'06 Proceedings of the 19th international conference on Advances in Artificial Intelligence: Canadian Society for Computational Studies of Intelligence
Double-layer bayesian classifier ensembles based on frequent itemsets
International Journal of Automation and Computing
Non-Disjoint discretization for aggregating one-dependence estimator classifiers
HAIS'12 Proceedings of the 7th international conference on Hybrid Artificial Intelligent Systems - Volume Part II
Techniques for efficient learning without search
PAKDD'12 Proceedings of the 16th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining - Volume Part I
LBR-Meta: an efficient algorithm for lazy Bayesian rules
AusDM '08 Proceedings of the 7th Australasian Data Mining Conference - Volume 87
Improving naive Bayes classifier using conditional probabilities
AusDM '11 Proceedings of the Ninth Australasian Data Mining Conference - Volume 121
Alleviating naive Bayes attribute independence assumption by attribute weighting
The Journal of Machine Learning Research
The naive Bayesian classifier provides a simple and effective approach to classifier learning, but its attribute independence assumption is often violated in the real world. A number of approaches have sought to alleviate this problem. A Bayesian tree learning algorithm builds a decision tree and generates a local naive Bayesian classifier at each leaf. The tests leading to a leaf can alleviate attribute inter-dependencies for the local naive Bayesian classifier. However, Bayesian tree learning still suffers from the small disjunct problem of tree learning: while inferred Bayesian trees demonstrate low average prediction error rates, there is reason to believe that error rates will be higher for those leaves with few training examples.

This paper proposes the application of lazy learning techniques to Bayesian tree induction and presents the resulting lazy Bayesian rule learning algorithm, called LBR. The algorithm can be justified by a variant of Bayes' theorem that supports a weaker conditional attribute independence assumption than naive Bayes requires. For each test example, LBR builds the most appropriate rule with a local naive Bayesian classifier as its consequent.

It is demonstrated that the computational requirements of LBR are reasonable across a wide cross-section of natural domains. Experiments on these domains show that, on average, the new algorithm obtains lower error rates significantly more often than the reverse when compared with a naive Bayesian classifier, C4.5, a Bayesian tree learning algorithm, a constructive Bayesian classifier that eliminates attributes and constructs new attributes using Cartesian products of existing nominal attributes, and a lazy decision tree learning algorithm. It also outperforms a selective naive Bayesian classifier, although that result is not statistically significant.
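The lazy procedure the abstract describes — for each test example, greedily move attribute-value tests taken from that example into a rule antecedent, keeping a local naive Bayesian classifier as the consequent — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes discrete attributes, uses add-one smoothing over an assumed two-value space, retrains from scratch for each leave-one-out fold (the paper's efficiency arguments rely on incremental count updates), and uses a simplified acceptance criterion: a candidate test is kept when the local naive Bayes it induces makes fewer leave-one-out errors on the covered examples than the current classifier makes on those same examples. All function names are hypothetical.

```python
from collections import Counter, defaultdict

def nb_train(rows, labels, attrs):
    """Count-based naive Bayes restricted to the attribute indices in `attrs`."""
    classes = Counter(labels)
    cond = {a: defaultdict(Counter) for a in attrs}
    for row, y in zip(rows, labels):
        for a in attrs:
            cond[a][y][row[a]] += 1
    return classes, cond, tuple(sorted(attrs))

def nb_predict(model, row):
    classes, cond, attrs = model
    n = sum(classes.values())
    best, best_p = None, -1.0
    for y, cy in classes.items():
        p = cy / n
        for a in attrs:
            # add-one smoothing over an assumed binary value space
            p *= (cond[a][y][row[a]] + 1) / (cy + 2)
        if p > best_p:
            best, best_p = y, p
    return best

def loo_mistakes(rows, labels, attrs):
    """Per-example leave-one-out mistake flags for naive Bayes on `attrs`."""
    flags = []
    for i in range(len(rows)):
        m = nb_train(rows[:i] + rows[i + 1:], labels[:i] + labels[i + 1:], attrs)
        flags.append(nb_predict(m, rows[i]) != labels[i])
    return flags

def lbr_classify(rows, labels, n_attrs, test):
    """Lazy Bayesian rule sketch: grow an antecedent of tests drawn from the
    test instance; each accepted test narrows the training set and removes
    that attribute from the local naive Bayes consequent."""
    cand = set(range(n_attrs))
    cur_rows, cur_labels = list(rows), list(labels)
    cur_mist = loo_mistakes(cur_rows, cur_labels, cand)
    improved = True
    while improved and len(cand) > 1:
        improved = False
        for a in sorted(cand):
            keep = [i for i, r in enumerate(cur_rows) if r[a] == test[a]]
            if len(keep) < 2:
                continue  # too few covered examples to estimate error
            r2 = [cur_rows[i] for i in keep]
            l2 = [cur_labels[i] for i in keep]
            new_mist = loo_mistakes(r2, l2, cand - {a})
            # accept the test if the specialised local NB makes fewer
            # LOO errors on the covered examples than the current one
            if sum(new_mist) < sum(cur_mist[i] for i in keep):
                cand = cand - {a}
                cur_rows, cur_labels, cur_mist = r2, l2, new_mist
                improved = True
                break
    return nb_predict(nb_train(cur_rows, cur_labels, cand), test)
```

On an XOR-style dataset, where the two attributes are individually uninformative and global naive Bayes cannot separate the classes, conditioning on one attribute's test value leaves a subset in which the remaining attribute determines the class, which is exactly the kind of inter-dependency the antecedent tests are meant to absorb.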