On pattern, categories, and alternate realities
Pattern Recognition Letters
In 1957 the psychologist Frank Rosenblatt proposed "The Perceptron: a perceiving and recognizing automaton" as a class of artificial nerve nets, embodying aspects of the brain and the receptors of biological systems. Fig. 1 shows the network of the Mark 1 Perceptron. Rosenblatt later protested that the term perceptron, originally intended as a generic name for a variety of theoretical nerve nets, had become associated with one very specific piece of hardware (Rosenblatt, 1962). The basic building block of a perceptron is an element that accepts a number of inputs xi, i = 1, ..., N, and computes a weighted sum of these inputs, where each input has a fixed weight ωi that can only be +1 or −1. The sum is then compared with a threshold θ, and an output y of either 0 or 1 is produced, depending on whether or not the sum exceeds the threshold. In other words
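The threshold element just described can be sketched as a short function. This is a minimal illustration of the text's description, not Rosenblatt's hardware: the function name and argument names are our own, and we assume the output is 1 exactly when the weighted sum strictly exceeds the threshold.

```python
def perceptron_unit(inputs, weights, theta):
    """Basic perceptron building block as described in the text.

    inputs  : sequence of input values x_i, i = 1..N
    weights : fixed weights, each +1 or -1
    theta   : threshold the weighted sum is compared against
    Returns 1 if the weighted sum exceeds theta, else 0.
    """
    if any(w not in (+1, -1) for w in weights):
        raise ValueError("each weight must be +1 or -1")
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s > theta else 0

# Example: inputs (1, 0, 1) with weights (+1, -1, +1) give a sum of 2,
# which exceeds a threshold of 1.5, so the unit fires.
y = perceptron_unit([1, 0, 1], [+1, -1, +1], 1.5)  # y == 1
```

Note that the weights here are fixed rather than learned; the learning rule that adjusts weights from examples came with Rosenblatt's later work on the perceptron.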