The development of neural network models has greatly advanced our understanding of cognitive phenomena. Here we show that models using multiplicative processing of inputs are both powerful and simple to train and to analyze, making them valuable tools for cognitive exploration. Our model can be viewed as a subclass of networks built on sigma-pi units, and we show how to derive the Kronecker product representation from the classical sigma-pi unit. We also show how the connectivity requirements of the Kronecker product can be relaxed on statistical grounds. We use the multiplicative network to implement what we call an Elman topology, that is, a simple recurrent network (SRN) that supports aspects of language processing. As an application, we model the appearance of hallucinated voices after network damage, reproducing results previously obtained with SRNs concerning the pathology of schizophrenia.
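
To make the sigma-pi/Kronecker correspondence concrete, the following NumPy sketch (our own illustration, not the paper's code; the names a, b, W and all dimensions are assumptions) shows that a unit acting on the Kronecker product of two input vectors computes exactly a classical sigma-pi sum over all pairwise products of their components.

    # A minimal sketch: a Kronecker-product unit is a sigma-pi unit whose
    # product terms range over every pair of components of the two inputs.
    import numpy as np

    rng = np.random.default_rng(0)

    m, n, p = 3, 4, 2                 # sizes of the two inputs and the output
    a = rng.normal(size=m)            # first input vector (e.g. an item)
    b = rng.normal(size=n)            # second input vector (e.g. a context)
    W = rng.normal(size=(p, m * n))   # one weight per product term

    # Kronecker-product form: y = W (a (x) b)
    y_kron = W @ np.kron(a, b)

    # Classical sigma-pi form: y_i = sum_{j,k} W[i, j*n + k] * a[j] * b[k]
    y_sigma_pi = np.array([
        sum(W[i, j * n + k] * a[j] * b[k]
            for j in range(m) for k in range(n))
        for i in range(p)
    ])

    assert np.allclose(y_kron, y_sigma_pi)

The assertion holds because np.kron places the product a[j] * b[k] at index j*n + k, so the matrix-vector product over the Kronecker vector enumerates the same terms as the sigma-pi double sum.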
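
Likewise, a minimal sketch of the multiplicative Elman topology and the damage experiment, assuming a hidden layer that acts on the Kronecker product of the current input and the previous hidden state, and modeling "damage" as random pruning of the hidden weights; all names, sizes, and the pruning fraction are illustrative stand-ins, not taken from the paper.

    # A toy multiplicative SRN: hidden state feeds back multiplicatively,
    # and pruning a fraction of the hidden weights mimics network damage.
    import numpy as np

    rng = np.random.default_rng(1)

    n_in, n_hid, n_out = 5, 8, 5
    W_hid = rng.normal(scale=0.1, size=(n_hid, n_in * n_hid))
    W_out = rng.normal(scale=0.1, size=(n_out, n_hid))

    def run(xs, W_hid, W_out):
        # The initial context must be nonzero: with a multiplicative
        # recurrence, an all-zero hidden state would silence the network.
        h = np.ones(n_hid)
        ys = []
        for x in xs:
            h = np.tanh(W_hid @ np.kron(x, h))  # multiplicative recurrence
            ys.append(W_out @ h)                # readout (e.g. next-item scores)
        return np.array(ys)

    def prune(W, fraction):
        # Zero out a random fraction of the connections ("damage").
        return W * (rng.random(W.shape) >= fraction)

    xs = rng.normal(size=(6, n_in))             # a toy input sequence
    intact = run(xs, W_hid, W_out)
    lesioned = run(xs, prune(W_hid, 0.3), W_out)
    print(np.abs(intact - lesioned).mean())     # output drift after damage

In this toy setting the printed quantity only measures how far the damaged network's outputs drift from the intact ones; the paper's actual experiments on hallucinated voices involve trained networks and language-processing tasks beyond this sketch.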