Neural Computation
Long Short-Term Memory Learns Context Free and Context Sensitive Languages
Clause identification with long short-term memory
CoNLL '01 Proceedings of the 2001 Workshop on Computational Natural Language Learning - Volume 7
Introduction to the CoNLL-2003 shared task: language-independent named entity recognition
CoNLL '03 Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003 - Volume 4
In this approach to named entity recognition, a recurrent neural network known as Long Short-Term Memory (LSTM) is applied. The network is trained to perform two passes over each sentence, outputting its decisions only on the second pass: the first pass is used to acquire information that aids disambiguation during the second. SARDNET, a self-organising map for sequences, is used to generate representations of the lexical items presented to the LSTM network, whilst orthogonal representations are used for the part-of-speech and chunk tags.
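A minimal sketch of the two-pass scheme described above, not the paper's exact setup: the sentence is fed through the same LSTM twice, the hidden state from the first pass primes the second, and label decisions are read only from the second pass. SARDNET lexical vectors are stood in for by random embeddings, and the dimensions (`lex_dim`, tag-set sizes, hidden size, label count) are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn


class TwoPassNER(nn.Module):
    def __init__(self, lex_dim=64, pos_size=45, chunk_size=18, n_labels=9):
        super().__init__()
        # Orthogonal (one-hot) POS and chunk tags are concatenated with the
        # lexical vector, so the input width is the sum of the three parts.
        in_dim = lex_dim + pos_size + chunk_size
        self.lstm = nn.LSTM(in_dim, hidden_size=128, batch_first=True)
        self.out = nn.Linear(128, n_labels)

    def forward(self, x):
        # First pass: gather disambiguating context; keep the final state.
        _, state = self.lstm(x)
        # Second pass over the same sentence, primed with that state;
        # only these outputs feed the label decisions.
        h, _ = self.lstm(x, state)
        return self.out(h)


# Toy usage with one 10-token sentence (all inputs are hypothetical stand-ins).
words = torch.randn(1, 10, 64)  # placeholder for SARDNET lexical vectors
pos = nn.functional.one_hot(torch.randint(45, (1, 10)), 45).float()
chunk = nn.functional.one_hot(torch.randint(18, (1, 10)), 18).float()
logits = TwoPassNER()(torch.cat([words, pos, chunk], dim=-1))
print(logits.shape)  # torch.Size([1, 10, 9]): one label distribution per token
```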