Multiple-layer feedforward neural networks are commonly viewed as black boxes, since the knowledge embedded in their connection weights is generally considered incomprehensible. This paper addresses that deficiency by proposing a mapping procedure that converts the weights of a neuron into a symbolic representation, and by demonstrating its use in understanding both the internal representation and the input-output mapping learned by a feedforward neural network. Several examples illustrate the proposed symbolic mapping of neurons.
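The abstract does not spell out the mapping procedure itself, but the general idea of converting a neuron's weights into a symbolic form can be sketched for the simple case of a threshold unit over binary inputs: enumerate the input patterns that activate the unit and express each as a Boolean conjunction. The function name, the example weights, and the input names below are illustrative assumptions, not the paper's actual procedure.

```python
from itertools import product

def neuron_to_rules(weights, bias, names):
    """Enumerate the binary input patterns that activate a threshold
    neuron (weighted sum plus bias > 0) and return each pattern as a
    symbolic conjunction. Illustrative sketch only."""
    rules = []
    for pattern in product([0, 1], repeat=len(weights)):
        if sum(w * x for w, x in zip(weights, pattern)) + bias > 0:
            terms = [n if x else f"NOT {n}" for n, x in zip(names, pattern)]
            rules.append(" AND ".join(terms))
    return rules

# Hypothetical neuron: large positive weights on A and B, a strongly
# negative weight on C, and a bias that demands both A and B be active.
rules = neuron_to_rules([2.0, 2.0, -3.0], -3.0, ["A", "B", "C"])
for r in rules:
    print(r)  # the neuron reduces to: A AND B AND NOT C
```

Exhaustive enumeration is exponential in the number of inputs, so a practical rule-extraction method must prune the search; the sketch only conveys what a "symbolic representation of a neuron" means.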