This paper proposes a new neural tree (NT) architecture, the balanced neural tree (BNT), which reduces tree size and improves classification accuracy with respect to classical NTs. Two main innovations are introduced: (a) perceptron substitution and (b) pattern removal. The first aims to balance the structure of the tree: if the last-trained perceptron misclassifies the given training set into a small number of classes, it is replaced with a new perceptron. The second introduces a new criterion for removing tough training patterns, which are a source of over-fitting. Finally, a new error function based on the depth of the tree is introduced to reduce perceptron training time. The proposed BNT has been tested on various synthetic and real datasets. The experimental results show that the BNT yields satisfactory results in terms of both tree-depth reduction and classification accuracy.
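The two innovations can be illustrated with a minimal sketch. The exact criteria, thresholds, and function names below (`is_degenerate_split`, `remove_tough_patterns`, the 0.9 routing threshold, the miss-count limit) are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def split_fractions(assignments, n_children):
    # Fraction of training patterns routed to each child node
    # by the perceptron at the current tree node.
    counts = np.bincount(assignments, minlength=n_children)
    return counts / len(assignments)

def is_degenerate_split(assignments, n_children, threshold=0.9):
    # Sketch of innovation (a): if the last-trained perceptron routes
    # almost all patterns into one branch, the split is degenerate and
    # the perceptron is a candidate for substitution. The 0.9 threshold
    # is an assumed value for illustration.
    return bool(split_fractions(assignments, n_children).max() >= threshold)

def remove_tough_patterns(X, y, misclass_counts, max_misses=3):
    # Sketch of innovation (b): drop patterns that keep being
    # misclassified across training rounds, since such "tough" patterns
    # drive over-fitting deeper in the tree. The count-based criterion
    # and max_misses=3 are assumptions.
    keep = misclass_counts < max_misses
    return X[keep], y[keep]
```

For example, with two child nodes, an assignment vector sending 9 of 10 patterns to child 0 triggers substitution, while an even split does not.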