The circular back-propagation neural network (CBP), put forward by Sandro Ridella and Stefano Rovetta, is a generalized model of the multi-layer perceptron (MLP) with strong generalization and adaptation to unknown inputs; vector quantization (VQ) and radial basis function (RBF) networks can also be constructed flexibly under the CBP framework. In this Letter, keeping the original CBP structure unchanged, we design a more generalized model, the Improved Circular Back-Propagation neural network (ICBP), by adding an extra node with a quadratic form to the original CBP inputs and fixing the weights between this node and all the hidden nodes. An interesting property of ICBP is that, although it has fewer adaptable weights, it generalizes and adapts better than CBP. Moreover, to partially alleviate the problem of local minima, we add controlled noise to the desired outputs during training. Finally, experiments show that ICBP outperforms CBP in forecasting and function approximation.
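The architectural idea can be illustrated with a minimal sketch: CBP augments the MLP input with the squared norm of the input vector, while ICBP keeps this quadratic node but fixes its weights to the hidden layer instead of training them. The function names, the shared fixed weight `c`, and the tanh activation below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cbp_augment(x):
    """CBP-style input augmentation (sketch): append the squared norm
    of the input as one extra input node. With trainable weights from
    this node, hidden units can realize circular decision boundaries."""
    return np.append(x, np.sum(x ** 2))

def icbp_hidden(x, W, b, c=1.0):
    """One ICBP-style hidden layer (illustrative sketch).

    W, b: trainable weights and biases for the ordinary inputs.
    c:    the FIXED weight shared between the quadratic node and every
          hidden node -- it is not adapted during training, which is
          why ICBP has fewer adaptable weights than CBP.
    """
    quad = np.sum(x ** 2)          # value of the quadratic extra node
    z = W @ x + b + c * quad       # fixed contribution of that node
    return np.tanh(z)              # hidden activations

rng = np.random.default_rng(0)
x = rng.normal(size=4)             # one input sample
W = rng.normal(size=(3, 4))        # 3 hidden nodes, 4 ordinary inputs
b = np.zeros(3)
h = icbp_hidden(x, W, b, c=-1.0)   # e.g. every fixed weight set to -1
print(h.shape)                     # (3,)
```

During training, the noise trick mentioned above would correspond to perturbing the desired outputs with small controlled noise at each epoch before computing the error, so gradient descent can escape shallow local minima.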