Universal perceptron (UP), a generalization of Rosenblatt's perceptron, is considered in this paper; it is capable of implementing all Boolean functions (BFs). BFs fall into three classes: 1) the linearly separable Boolean function (LSBF) class, 2) the parity Boolean function (PBF) class, and 3) the non-LSBF and non-PBF class. To implement these functions, the UP takes different simple topological structures, each containing at most one hidden layer with the smallest possible number of hidden neurons. Inspired by the concept of DNA sequences in biological systems, a novel learning algorithm named DNA-like learning is developed, which can quickly train a network to realize any prescribed BF. The focus is on implementing LSBFs and PBFs by a single-layer perceptron (SLP) with the new algorithm. Criteria for LSBFs and PBFs are proposed, respectively, and a new measure for a BF, named the nonlinearly separable degree (NLSD), is introduced; in the sense of this measure, the PBF is the most complex. The new algorithm has many advantages, including fast running speed, good robustness, and no need to consider convergence. For example, the numbers of iterations and computations needed to implement the basic 2-bit logic operations AND, OR, and XOR with the new algorithm are far smaller than those required by existing algorithms such as the error-correction (EC) and backpropagation (BP) algorithms. Moreover, the synaptic weights and threshold values derived from the UP can be used directly in designing the templates of cellular neural networks (CNNs), which have been proposed as a new spatiotemporal sensory computing paradigm.
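The LSBF/PBF distinction above can be made concrete with a small sketch (this is purely illustrative and is not the paper's DNA-like learning algorithm): a single linear threshold gate with hand-picked weights realizes the linearly separable functions AND and OR, while a brute-force search over a small integer grid confirms that no single gate realizes XOR, the 2-bit parity function.

```python
# A single linear threshold gate: y = 1 if w.x >= theta else 0.
def threshold_gate(weights, theta, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# AND is linearly separable: weights (1, 1), threshold 2.
and_out = [threshold_gate((1, 1), 2, x) for x in inputs]  # [0, 0, 0, 1]

# OR is linearly separable: weights (1, 1), threshold 1.
or_out = [threshold_gate((1, 1), 1, x) for x in inputs]   # [0, 1, 1, 1]

# XOR (2-bit parity) is not: exhaustive search over a small integer
# grid of weights and thresholds finds no single gate that matches it.
xor_target = [0, 1, 1, 0]
found = any(
    [threshold_gate((w1, w2), t, x) for x in inputs] == xor_target
    for w1 in range(-3, 4)
    for w2 in range(-3, 4)
    for t in range(-3, 4)
)
print(and_out, or_out, found)  # [0, 0, 0, 1] [0, 1, 1, 1] False
```

This is why implementing a PBF with an SLP requires going beyond a plain linear threshold unit, which is the gap the paper's UP structures address.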