Logistic Model Trees have been shown to be very accurate and compact classifiers [8]. Their greatest disadvantage is the computational complexity of inducing the logistic regression models in the tree. We address this issue by using the AIC criterion [1] instead of cross-validation to prevent overfitting of these models. In addition, we apply a weight-trimming heuristic that produces a significant further speedup. We compare the training time and accuracy of the new induction process with the original one on various datasets and show that training time often decreases while classification accuracy diminishes only slightly.
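The two ideas in the abstract can be sketched independently of the tree-induction code. Below is a minimal, illustrative Python sketch (the helper names `select_by_aic` and `trim_by_weight` are my own, not from the paper): AIC-based selection replaces cross-validation by scoring each candidate model complexity with 2k - 2 ln(L) and keeping the minimizer, while weight trimming drops the low-weight instances so that each boosting iteration fits on the smallest subset carrying at least a (1 - beta) fraction of the total weight.

```python
import math


def aic(log_likelihood, n_params):
    # Akaike Information Criterion: 2k - 2 ln(L); lower is better.
    return 2 * n_params - 2 * log_likelihood


def select_by_aic(candidates):
    # candidates: list of (n_params, log_likelihood) pairs for models
    # of increasing complexity, e.g. successive boosting iterations.
    # Returns the index of the AIC-optimal candidate, so no
    # cross-validation folds are needed to pick the stopping point.
    scores = [aic(ll, k) for k, ll in candidates]
    return scores.index(min(scores))


def trim_by_weight(weights, beta=0.1):
    # Weight trimming: keep the smallest set of instances whose
    # weights cover at least (1 - beta) of the total weight mass;
    # the remaining low-weight instances are skipped in this round.
    order = sorted(range(len(weights)), key=lambda i: weights[i], reverse=True)
    total = sum(weights)
    kept, acc = [], 0.0
    for i in order:
        kept.append(i)
        acc += weights[i]
        if acc >= (1 - beta) * total:
            break
    return sorted(kept)


# Example: the third model fits slightly better but pays for an extra
# parameter, so AIC prefers the two-parameter model.
best = select_by_aic([(1, -50.0), (2, -45.0), (3, -44.5)])
print(best)  # 1

# Example: with beta = 0.1, the three heaviest instances already
# cover 90% of the weight, so the last two are trimmed.
print(trim_by_weight([0.5, 0.3, 0.1, 0.05, 0.05], beta=0.1))  # [0, 1, 2]
```

The trimming step only changes which instances each iteration sees, so it trades a small amount of fitting precision for a large reduction in per-iteration cost, which matches the speed/accuracy trade-off reported above.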