Rigorous, formal treatments of neural network fundamentals (i.e., treatments whose arguments consist primarily of theorems and proofs) have by now focused on a number of aspects. The convergence properties (stability) and capacity of neural nets of various types have been analyzed in this manner to one degree or another (e.g., [1-3]), and their expressive power has also been the subject of a number of formal analyses (e.g., [4]). Though not necessarily perfectly rigorous in the sense just mentioned, formal treatments of network dynamics also exist. For reasons which will become clear below, the remarkable paper of Amari and Maginu [5] is especially worthy of note. In that paper, the authors demonstrate that the size of the "basin" of an equilibrium (learned, if you will) state in an autocorrelation associative memory stands in a complex (but predictable) relationship with the distance between that state and the input vector. This paper, along with others by Amari and his coworkers, illustrates clearly that an in-depth analysis of somewhat mundane properties of neural networks may yield surprising results of considerable significance.
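The basin-of-attraction behavior described above can be illustrated concretely. The following is a minimal sketch (not the construction analyzed by Amari and Maginu, whose treatment is analytic) of a standard autocorrelation associative memory: patterns are stored via a Hebbian outer-product weight matrix with zero diagonal, and recall proceeds by iterated sign updates. Starting the network a fixed Hamming distance from a stored pattern lets one probe empirically whether the input falls inside that pattern's basin. All names and parameter values here (`train`, `recall`, network size 200, 5 stored patterns, 10 flipped bits) are illustrative choices, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(patterns):
    # Autocorrelation (Hebbian outer-product) weight matrix, zero diagonal.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, x, steps=50):
    # Synchronous sign updates until a fixed point (or the step budget) is reached.
    for _ in range(steps):
        nxt = np.where(w @ x >= 0, 1, -1)
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

def overlap(a, b):
    # Normalized inner product: 1.0 means perfect recall of the stored state.
    return float(a @ b) / a.size

n, p = 200, 5
patterns = rng.choice([-1, 1], size=(p, n))
w = train(patterns)

# Probe the basin: start 10 bits (Hamming distance 10) away from a stored state.
probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
recovered = recall(w, probe)
```

At this low memory load (5 patterns in 200 units), a probe this close to a stored state is expected to relax back to it; sweeping the number of flipped bits upward traces out the basin boundary, which is where the nontrivial distance dependence noted in the text appears.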