The class of mapping networks is a general family of tools for performing a wide variety of tasks. This paper presents a standardized, uniform representation for this class of networks and introduces a simple modification of the multilayer perceptron with interesting practical properties, especially well suited to pattern classification tasks. The proposed model unifies the two main representation paradigms found in the class of mapping networks for classification, namely the surface-based and the prototype-based schemes, while remaining trainable by backpropagation. The enhancements in representation ability and generalization performance are assessed through results on the worst-case requirement in terms of hidden units and on the Vapnik-Chervonenkis dimension and cover capacity. The theoretical properties of the network also suggest that the proposed modification to the multilayer perceptron is in many senses optimal. A number of experimental verifications confirm the theoretical results on the model's improved performance, as compared with the multilayer perceptron and the Gaussian radial basis function network.
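The modification in question is the circular back-propagation (CBP) scheme: the input vector is augmented with the sum of its squared components, so a single linear unit on the augmented input can realize either a hyperplane boundary (the surface-based scheme of the plain MLP) or a hyperspherical one (the prototype-based scheme of RBF-like units), with all weights still learnable by ordinary backpropagation. A minimal illustrative sketch follows; the function names `cbp_augment` and `cbp_unit` are ours, not from the paper:

```python
def cbp_augment(x):
    """Append the squared norm of the input vector as an extra feature.

    A linear unit acting on the augmented vector computes
        w . x + w_extra * ||x||^2 + b,
    which is a hyperplane when w_extra == 0 (surface-based, as in a
    plain MLP unit) and a hypersphere otherwise (prototype-based, as
    in an RBF centre), so one trainable unit covers both schemes.
    """
    return x + [sum(v * v for v in x)]


def cbp_unit(x, w, w_extra, b):
    """Pre-activation of one CBP-style unit on the augmented input."""
    z = cbp_augment(x)
    weights = w + [w_extra]
    return sum(wi * zi for wi, zi in zip(weights, z)) + b
```

For a sphere of centre `c` and radius `r`, choosing `w = 2*c`, `w_extra = -1`, and `b = r**2 - ||c||**2` makes the sign test `cbp_unit(x, ...) > 0` equivalent to "x lies inside the sphere", while `w_extra = 0` recovers an ordinary hyperplane test.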