Unlike conventional Neural Network (NN) algorithms, which require the NN architecture to be defined before learning starts, Constructive Neural Network (CoNN) algorithms build the network architecture along with the learning process. This paper presents and discusses the results of an empirical evaluation of seven two-class CoNN algorithms, namely Tower, Pyramid, Tiling, Upstart, Shift, Perceptron Cascade (PC) and Partial Target Inversion (PTI), in 12 knowledge domains; the algorithms differ mainly in how each one grows the network. The paper also presents and analyses empirical results, in five knowledge domains, of five multiclass CoNN algorithms, namely MTower, MPyramid, MTiling, MUpstart and MPerceptron Cascade, which can be considered extensions of their two-class counterparts. For comparison, results obtained with the Pocket with the Ratchet Modification (PRM) algorithm, with its multiclass version, the PRMWTA algorithm, and with the backpropagation algorithm are also presented.
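To make the idea of constructive learning concrete, below is a minimal Python sketch of a Tower-style scheme: each new threshold unit receives the raw inputs plus the (frozen) output of the previous unit, and is trained with a pocket-style rule that keeps the best weight vector seen so far. The function names (`pocket_train`, `tower`) and the warm-start detail, which initialises each new unit so it can at least replicate its predecessor, are illustrative choices for this sketch, not taken from the paper; the warm start mirrors Tower's guarantee that training-set accuracy cannot decrease as units are added.

```python
import random


def predict(w, x):
    """Threshold unit: w holds the input weights plus a trailing bias."""
    s = w[-1] + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s >= 0 else 0


def accuracy(w, X, y):
    return sum(predict(w, x) == t for x, t in zip(X, y)) / len(y)


def pocket_train(X, y, init_w=None, iters=2000, seed=0):
    """Perceptron updates on randomly sampled examples; the 'pocket'
    keeps the best weight vector observed during training."""
    rng = random.Random(seed)
    n = len(X[0])
    w = list(init_w) if init_w else [0.0] * (n + 1)
    best_w, best_acc = list(w), accuracy(w, X, y)
    for _ in range(iters):
        i = rng.randrange(len(X))
        if predict(w, X[i]) != y[i]:
            step = 1.0 if y[i] == 1 else -1.0
            for j in range(n):
                w[j] += step * X[i][j]
            w[-1] += step  # bias update
            acc = accuracy(w, X, y)
            if acc > best_acc:
                best_acc, best_w = acc, list(w)
    return best_w, best_acc


def tower(X, y, max_units=4):
    """Tower-style construction: each new unit sees the original inputs
    plus the previous unit's output; earlier units stay frozen."""
    feats = [list(x) for x in X]
    hist = []  # pocket accuracy of each added unit
    first = True
    for _ in range(max_units):
        if first:
            init = None
            first = False
        else:
            # Warm start: weight 2 on the appended previous-output column
            # and bias -1 reproduce the previous unit exactly, so the
            # pocket accuracy can only stay equal or improve.
            init = [0.0] * (len(feats[0]) + 1)
            init[len(feats[0]) - 1] = 2.0
            init[-1] = -1.0
        w, acc = pocket_train(feats, y, init_w=init)
        hist.append(acc)
        if acc == 1.0:
            break
        # Freeze this unit and expose its output to the next one.
        feats = [f + [float(predict(w, f))] for f in feats]
    return hist
```

Run on XOR, for instance, the first unit can reach at most 75% accuracy, and the construction then adds units whose extra input lifts what a single perceptron can represent, with the recorded accuracies never decreasing.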