In this paper, a new neural tree (NT) model, the generalized NT (GNT), is presented. The main novelty of the GNT is a new training rule that performs an overall optimization of the tree: each time the tree grows by a new level, the whole tree is reevaluated. The training rule uses a weight-correction strategy that takes the entire tree structure into account, and it applies a normalization procedure to the activation values of each node so that these values can be interpreted as probabilities. The weight-connection update is computed by minimizing a cost function that represents a measure of the overall probability of correct classification. Significant results on both synthetic and real data have been obtained by comparing the classification performance of multilayer perceptrons (MLPs), NTs, and GNTs. In particular, the GNT model shows good classification performance on training sets with complex distributions. Moreover, its structure provides an easy probabilistic interpretation of the pattern classification task and allows small neural trees with good generalization properties to be grown.
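The core idea described above, normalizing each node's activations so they behave as branch probabilities, and treating the probability of a root-to-leaf path as the product of activations along it, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `Node` class, the softmax normalization, and the path-product function are assumptions introduced here for clarity.

```python
import numpy as np

def softmax(z):
    """Normalize raw activations into values that sum to 1 (probabilities)."""
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

class Node:
    """A perceptron node whose normalized activations act as split probabilities.

    This structure is an illustrative assumption; the paper's nodes may differ.
    """
    def __init__(self, n_in, n_out, rng):
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))

    def activate(self, x):
        # Normalized activations: non-negative and summing to 1,
        # so each component can be read as the probability of taking
        # the corresponding branch.
        return softmax(self.W @ x)

def path_probability(nodes, branches, x):
    """Overall probability of a root-to-leaf path.

    Because each node's activations are normalized, the product of the
    chosen branch's activation at every node along the path is itself a
    probability, which is what makes a global cost function (probability
    of correct classification) well defined over the whole tree.
    """
    p = 1.0
    for node, branch in zip(nodes, branches):
        p *= node.activate(x)[branch]
    return p
```

A cost function of the kind the abstract mentions would then sum, over the training set, the negative log of `path_probability` for each pattern's correct leaf, and the weight update would follow its gradient through every node on the path, which is why growing a new level triggers a reevaluation of the whole tree.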