Learning internal representations by error propagation. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1.
Neural networks and analog computation: beyond the Turing limit.
Modeling word perception using the Elman network. Neurocomputing; also in ICONIP'06 Proceedings of the 13th International Conference on Neural Information Processing, Part I.
Autoencoder for polysemous word. In: IScIDE'12 Proceedings of the Third Sino-foreign-interchange Conference on Intelligent Science and Intelligent Data Engineering.
We present a novel method for training the Elman network to learn literary works, and report findings and results from the training process. With this method, both the word codes and the network weights are trained. The training error can be greatly reduced by iteratively re-encoding all words.
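The abstract's idea of training both the word codes and the network weights, with re-encoding reducing the training error, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual method: a next-word prediction task on a toy index sequence, an Elman (simple recurrent) network, truncated backpropagation of length 1, and gradient updates applied to the word-code table as well as the weight matrices. All names, sizes, and the corpus are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D, H = 5, 4, 8                     # vocabulary, code (embedding), hidden sizes
codes = rng.normal(0, 0.1, (V, D))    # trainable word codes (re-encoded each pass)
Wxh = rng.normal(0, 0.1, (D, H))      # input-to-hidden weights
Whh = rng.normal(0, 0.1, (H, H))      # hidden-to-hidden (context) weights
Why = rng.normal(0, 0.1, (H, V))      # hidden-to-output weights

corpus = [0, 1, 2, 3, 4] * 10         # toy deterministic word-index sequence

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_epoch(lr=0.1):
    """One pass: Elman step per word, backprop into the weights AND the codes."""
    global codes, Wxh, Whh, Why
    h = np.zeros(H)
    total_loss = 0.0
    for t in range(len(corpus) - 1):
        x_id, y_id = corpus[t], corpus[t + 1]
        x = codes[x_id]
        h_new = np.tanh(x @ Wxh + h @ Whh)     # Elman hidden-state update
        p = softmax(h_new @ Why)               # next-word distribution
        total_loss += -np.log(p[y_id] + 1e-12)
        # Gradients (truncated BPTT of length 1, for brevity).
        dy = p.copy()
        dy[y_id] -= 1.0
        dh = (Why @ dy) * (1.0 - h_new ** 2)
        Why -= lr * np.outer(h_new, dy)
        Wxh -= lr * np.outer(x, dh)
        Whh -= lr * np.outer(h, dh)
        codes[x_id] -= lr * (Wxh @ dh)         # re-encode the current word
        h = h_new
    return total_loss / (len(corpus) - 1)

losses = [train_epoch() for _ in range(30)]
print(f"mean loss: epoch 1 = {losses[0]:.3f}, epoch 30 = {losses[-1]:.3f}")
```

On this toy sequence the per-epoch loss drops steadily, illustrating the abstract's claim that jointly updating codes and weights reduces the training error; the actual encoding scheme and corpus in the paper are not specified here.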