In this paper we propose a neural network ensemble model, consisting of a number of MLPs, that deals with an imperfect learning supervisor which occasionally produces incorrect teacher signals. A conventional single neural network is known not to learn optimally from such a supervisor. We regard the imperfect supervisor as generating two kinds of input-output relations: the correct relation and an incorrect one. The learning characteristics of the proposed model allow the ensemble to automatically train one of its members only on the correct input-output relation, yielding a neural network that tolerates, to some extent, the imperfection of the supervisor.
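The specialization mechanism can be loosely illustrated with a mixture-of-experts-style sketch. This is a toy construction of ours, not the authors' exact model: the "supervisor" emits a correct teacher signal (here, y = 2x + 1) most of the time and an incorrect one (y = -x) otherwise, and an EM loop over two linear experts lets one expert take responsibility for, and fit, only the correct relation. All data, expert counts, and hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Imperfect supervisor (hypothetical toy data, not from the paper):
# the correct relation is y = 2x + 1, but ~20% of the teacher signals
# follow an incorrect relation y = -x.
n = 400
x = rng.uniform(-1.0, 1.0, n)
correct = rng.random(n) < 0.8
y = np.where(correct, 2 * x + 1, -x) + 0.05 * rng.normal(size=n)

# Two linear "experts" (slope, intercept) and mixing proportions,
# fitted by an EM loop in the spirit of adaptive mixtures of experts.
# Deliberately distinct initial slopes so the experts separate.
X = np.stack([x, np.ones_like(x)], axis=1)
params = np.array([[1.0, 0.0], [-1.0, 0.0]])
pi = np.array([0.5, 0.5])
sigma = 0.3  # assumed teacher-noise scale

for _ in range(50):
    # E-step: responsibility of each expert for each example,
    # computed in log space to avoid underflow.
    pred = X @ params.T                                   # (n, 2)
    logp = np.log(pi) - 0.5 * ((y[:, None] - pred) / sigma) ** 2
    resp = np.exp(logp - logp.max(axis=1, keepdims=True))
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted least squares per expert,
    # so each expert trains only on the examples it "owns".
    for k in range(2):
        A = X * resp[:, k][:, None]
        params[k] = np.linalg.solve(X.T @ A, A.T @ y)
    pi = resp.mean(axis=0)

# The dominant expert specializes on the correct relation y = 2x + 1,
# despite the incorrect teacher signals mixed into the training set.
best = int(np.argmax(pi))
print("best expert:", params[best], "mixing weight:", pi[best])
```

The key design point mirrors the abstract: nothing tells the ensemble which signals are wrong; the responsibility weighting alone drives one member to absorb the correct relation while the other soaks up the supervisor's errors.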