Bayesian ARTMAP (BA) is a recently introduced neural architecture that combines Fuzzy ARTMAP competitive learning with Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories and infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance from it (best approximation); (iii) we experimentally compare the online-trained BAR with several neural models on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, for both theoretical and practical reasons.
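The prediction step described above can be sketched as a marginalization over Gaussian categories: the posterior of a class given an input is proportional to the sum, over categories, of the category likelihood times the category prior times the category-to-class conditional probability. The following is a minimal illustrative sketch of that idea, not the BA/BAR implementation; all parameter values, names, and the 1-D restriction are assumptions made here for clarity.

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def class_posterior(x, categories):
    """Approximate p(class | x) by marginalizing over Gaussian categories:
    p(class | x) proportional to sum_j p(x | j) * P(j) * P(class | j).

    `categories` is a list of dicts with keys: mean, var, prior, and
    class_probs (a mapping from class label to P(class | category)).
    These field names are hypothetical, chosen for this sketch.
    """
    scores = {}
    for cat in categories:
        weight = gaussian_pdf(x, cat["mean"], cat["var"]) * cat["prior"]
        for label, p in cat["class_probs"].items():
            scores[label] = scores.get(label, 0.0) + weight * p
    total = sum(scores.values())
    return {label: s / total for label, s in scores.items()}

# Two illustrative Gaussian categories, each favoring a different class.
cats = [
    {"mean": 0.0, "var": 1.0, "prior": 0.5, "class_probs": {"A": 0.9, "B": 0.1}},
    {"mean": 4.0, "var": 1.0, "prior": 0.5, "class_probs": {"A": 0.1, "B": 0.9}},
]
posterior = class_posterior(0.2, cats)  # near the first category's mean
```

For regression, the same marginalization idea would replace the per-category class distribution with a per-category output estimate, yielding a likelihood-weighted average of category outputs; the paper's BAR formulation is the authoritative version of this step.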