This article presents a general theoretical basis for the construction of mapping neural networks. The theory rests on the Parzen window estimator for joint probability density functions. From this density estimator, a consistent estimator for continuous conditional expectation functions is derived. The latter estimator is taken as the general outline of a variety of mapping neural networks: networks developed by other authors appear as special cases, and a novel strategy for training such networks is presented.
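The conditional-expectation estimator described above takes the familiar kernel-weighted-average (Nadaraya-Watson) form when the joint density is estimated with Gaussian Parzen windows. The following is a minimal sketch under assumptions not stated in the abstract (a one-dimensional input, a Gaussian kernel, and a hand-picked bandwidth); the function name and toy data are illustrative only.

```python
import numpy as np

def parzen_conditional_mean(x_train, y_train, x_query, bandwidth=0.3):
    """Estimate E[y | x] at each query point as a kernel-weighted average
    of the training targets, the form obtained by plugging a Gaussian
    Parzen density estimate into the conditional expectation integral."""
    # Pairwise differences between query points and training points
    diffs = x_query[:, None] - x_train[None, :]
    # Gaussian kernel weights; the bandwidth is a smoothing parameter
    w = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    # Weighted average of targets, normalized per query point
    return (w @ y_train) / w.sum(axis=1)

# Toy demonstration: recover sin(x) from noisy samples
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + 0.1 * rng.normal(size=200)
estimate = parzen_conditional_mean(x, y, np.array([np.pi / 2]))
print(estimate)  # close to sin(pi/2) = 1
```

The consistency claimed in the abstract corresponds to shrinking the bandwidth toward zero as the sample size grows, at a rate slow enough that each window still contains many samples.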