This paper presents a probabilistic-entropy-based neural network (PENN) model for tackling online data regression problems. The network learns online with an incrementally growing network structure and performs regression in a noisy environment. The training samples presented to the model are clustered into hyperellipsoidal Gaussian kernels in the joint space of the input and output domains, using the principles of Bayesian classification and entropy minimization. The joint probability distribution is established by applying the Parzen density estimator to the kernels. Prediction is carried out by evaluating the expected conditional mean of the output space given the input vector. The PENN model is shown to be able to remove symmetrically distributed noise embedded in the training samples. The performance of the model was evaluated on three benchmark problems with noisy data (i.e., Ozone, Friedman#1, and Santa Fe Series E). The results show that the PENN model statistically outperforms other artificial neural network models. The PENN model is also applied to a fire safety engineering problem: predicting the height of the thermal interface, one of the indicators of the fire safety level of a fire compartment. The data samples are collected from a real experiment and are noisy in nature. The results confirm the superior performance of the PENN model in a noisy environment and are found to be acceptable according to industrial requirements.
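The prediction step described above — taking the expected conditional mean of the output under a Parzen joint-density estimate — can be sketched in a few lines. This is not the full PENN model (which grows hyperellipsoidal kernels online via Bayesian classification and entropy minimization); it is a minimal illustration assuming a fixed set of training samples, an isotropic Gaussian kernel, and a single bandwidth parameter, in which case the conditional mean reduces to a kernel-weighted average of the training outputs (the Nadaraya–Watson form underlying GRNN-style regression).

```python
import numpy as np

def parzen_conditional_mean(X_train, y_train, x_query, bandwidth=1.0):
    """Estimate E[y | x = x_query] from a Parzen (Gaussian kernel)
    joint-density estimate over the training samples.

    With isotropic Gaussian kernels of a common bandwidth, the
    conditional mean simplifies to a weighted average of y_train,
    with weights given by the kernel evaluated at the input distances
    (the density normalizing constants cancel in the ratio).
    """
    # Squared Euclidean distances from the query to every training input
    d2 = np.sum((X_train - x_query) ** 2, axis=1)
    # Gaussian kernel weights; smaller bandwidth -> more local estimate
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    # Kernel-weighted average of the training outputs
    return np.dot(w, y_train) / np.sum(w)
```

Because each prediction averages over many neighboring samples, symmetric zero-mean noise in `y_train` tends to cancel, which is the intuition behind the noise-removal property claimed for the full model.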