Introduction to the theory of neural computation
Maximum Likelihood Competitive Learning. Advances in Neural Information Processing Systems 2.
Hierarchical Mixtures of Experts and the EM Algorithm. Neural Computation.
Adaptive Mixtures of Local Experts. Neural Computation.
A Bayesian Multiple Models Combination Method for Time Series Prediction. Journal of Intelligent and Robotic Systems.
Multilevel Genetic Algorithm for the Complete Development of ANN. IWANN '01: Proceedings of the 6th International Work-Conference on Artificial and Natural Neural Networks: Connectionist Models of Neurons, Learning Processes and Artificial Intelligence, Part I.
Time-Series Segmentation Using Predictive Modular Neural Networks. Neural Computation.
Evolving Neurocomputing Systems for Horticulture Applications. Applied Soft Computing.
An incremental credit assignment (ICRA) scheme is introduced and applied to time series classification. The scheme is inspired by Bayes' rule, but the Bayesian connection is necessary neither for its development nor for the proof of its convergence properties. The ICRA scheme is implemented by a recurrent, hierarchical, modular neural network consisting of a bank of predictive modules at the lower level and a decision module at the higher level. A credit function is computed for each predictive module; the module that best predicts the observed time series behavior receives the highest credit. We prove that the credit functions converge (with probability one) to the correct values. Simulation results are also presented.
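The abstract describes a bank of predictive modules whose credits are updated in a Bayes-like fashion as observations arrive. The following minimal Python sketch illustrates one plausible form of such an update (the exact credit function of the paper is not given here): each module's credit is multiplied by a Gaussian likelihood-style factor of its prediction error and then renormalized, so credit concentrates on the module that predicts best. The function name `update_credits`, the Gaussian kernel, and the `sigma` parameter are illustrative assumptions, not the paper's definitions.

```python
import math

def update_credits(credits, errors, sigma=1.0):
    """One incremental credit-assignment step (Bayes-like sketch).

    credits: current credit of each predictive module (sums to 1).
    errors:  each module's prediction error on the latest observation.
    The module with the smallest error receives the largest boost.
    """
    # Likelihood-like factor: small error -> large weight (Gaussian kernel,
    # an assumed choice for this sketch).
    weights = [math.exp(-e * e / (2.0 * sigma * sigma)) for e in errors]
    unnorm = [c * w for c, w in zip(credits, weights)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two hypothetical modules: module 0 tracks the series well, module 1 does not.
credits = [0.5, 0.5]
for _ in range(20):
    credits = update_credits(credits, errors=[0.1, 1.0])
```

After repeated observations favoring module 0, its credit approaches one, which is the convergence behavior (with probability one, under the paper's conditions) that the abstract claims for the true credit values.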