Hybrid thematic role processor: symbolic linguistic relations revised by connectionist learning
IJCAI'99: Proceedings of the 16th International Joint Conference on Artificial Intelligence, Volume 2
In recent years, Natural Language Processing has seen steadily growing interest in connectionist modeling. The main appeal of this approach is that the grammar rules do not have to be specified in advance: the learning abilities of such systems capture the regularities in the input. Better and faster learning can be obtained with a symbolic-connectionist hybrid system, which combines the advantages of symbolic approaches, by encoding symbolic rules as network connection weights, with the advantages of connectionism. In a hybrid system called HTRP, the words within a sentence are represented by semantic features. The features for verbs are arranged along semantic dimensions and are mutually exclusive within each dimension. One may infer that the system's behavior follows from the semantic features encoded in the network inputs.
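The verb representation described above, with features that are mutually exclusive within each semantic dimension, can be sketched as a vector built from one one-hot group per dimension. The dimension names and feature values below are illustrative assumptions, not HTRP's actual feature set.

```python
# Sketch of a semantic-feature encoding for verbs: one one-hot group per
# semantic dimension, so features within a dimension are mutually exclusive.
# Dimension names and values are hypothetical, not taken from HTRP.

DIMENSIONS = {
    "intensity": ["high", "low"],
    "cause": ["physical", "nonphysical"],
    "contact": ["contact", "no_contact"],
}

def encode_verb(features: dict) -> list:
    """Concatenate one one-hot group per semantic dimension."""
    vector = []
    for dim, values in DIMENSIONS.items():
        chosen = features[dim]
        if chosen not in values:
            raise ValueError(f"unknown value {chosen!r} for dimension {dim!r}")
        # Exactly one unit per dimension is active, enforcing mutual exclusion.
        vector.extend(1 if v == chosen else 0 for v in values)
    return vector

vec = encode_verb({"intensity": "high", "cause": "physical", "contact": "contact"})
# Each dimension contributes exactly one active unit to the input vector.
```

Such a vector would then serve as (part of) the network input; the mutual exclusion within each dimension is what makes the groups interpretable as discrete semantic choices.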