Recent work in feature-based classification has focused on non-parametric techniques. Decision trees are among the most popular choices for learning and reasoning from feature-based examples, and many machine learning systems have been developed for constructing decision trees from collections of examples. To date, grading of arecanut has been done manually by trained experts; no prior work has attempted automated classification of arecanuts. This paper presents a technique for classifying arecanut based on texture features. Classification is performed using Mean-around features, gray-level co-occurrence matrix (GLCM) features, and combined Mean-around-GLCM features. A decision tree classifier assigns arecanuts to one of six classes. Results obtained with the proposed method are in good agreement with the judgments of agricultural experts. The approach was evaluated on a large data set using cross-validation and achieved approximately 99.05% accuracy.
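The pipeline described above — texture features fed to a decision tree, evaluated with cross-validation — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the GLCM is computed by hand for horizontal neighbor pairs, the "Mean-around" features are approximated by simple patch statistics, and the six-class data set is synthetic stand-in noise rather than arecanut images.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def glcm(patch, levels=8):
    """Symmetric, normalized gray-level co-occurrence matrix for
    horizontal neighbors at distance 1 (a simplified GLCM)."""
    q = (patch.astype(int) * levels) // 256          # quantize to `levels` bins
    m = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[a, b] += 1
    m += m.T                                          # make symmetric
    return m / m.sum()

def texture_features(patch):
    """Patch statistics (stand-in for Mean-around) plus GLCM properties."""
    p = glcm(patch)
    i, j = np.indices(p.shape)
    contrast    = ((i - j) ** 2 * p).sum()
    energy      = (p ** 2).sum()
    homogeneity = (p / (1 + (i - j) ** 2)).sum()
    return [patch.mean(), patch.std(), contrast, energy, homogeneity]

# Synthetic stand-in for arecanut patches: six classes with
# different intensity means and noise levels (hypothetical data).
rng = np.random.default_rng(0)
X, y = [], []
for cls in range(6):
    for _ in range(20):
        patch = rng.normal(30 + 35 * cls, 4 + 3 * cls, size=(32, 32))
        patch = patch.clip(0, 255).astype(np.uint8)
        X.append(texture_features(patch))
        y.append(cls)
X, y = np.array(X), np.array(y)

# Decision tree classifier, scored with 5-fold cross-validation.
clf = DecisionTreeClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

On real data, the feature set and tree parameters would of course need tuning; the sketch only shows how GLCM-style texture descriptors plug into a cross-validated decision tree.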