Linear function neurons: structure and training. Biological Cybernetics.
Communications of the ACM.
Cooling schedules for optimal annealing. Mathematics of Operations Research.
On the complexity of loading shallow neural networks. Journal of Complexity, Special Issue on Neural Computation.
Simulated annealing: theory and applications.
Simulated annealing and Boltzmann machines: a stochastic approach to combinatorial optimization and neural computing.
Neural Computers.
Explorations in parallel distributed processing: a handbook of models, programs, and exercises.
Neural network design and the complexity of learning.
Training a 3-node neural network is NP-complete. Advances in Neural Information Processing Systems 1.
Learning capabilities of Boolean networks. Neural Computing Architectures.
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations.
Creating artificial neural networks that generalize. Neural Networks.
The design of intelligent robots as a federation of geometric machines. An Introduction to Neural and Electronic Networks.
Brain style computation: learning and generalization. An Introduction to Neural and Electronic Networks.
Computers and Intractability: A Guide to the Theory of NP-Completeness.
Learning translation invariant recognition in massively parallel networks. Proceedings of Parallel Architectures and Languages Europe (PARLE), Volume I: Parallel Architectures.
Complexity of connectionist learning with various node functions.
IEEE Transactions on Parallel and Distributed Systems.
Open issues in genetic programming. Genetic Programming and Evolvable Machines.
The authors discuss the requirements of learning for generalization, a setting in which traditional gradient-descent methods have had limited success. A stochastic learning algorithm based on simulated annealing in weight space is presented. The authors verify the convergence properties and feasibility of the algorithm, and describe an implementation together with validation experiments.
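To make the idea concrete, the core of such an algorithm can be sketched as a Metropolis-style random walk over a network's weights: propose a small perturbation, always accept it if the training error drops, occasionally accept it if the error rises, and gradually lower the temperature. The sketch below is illustrative only, not the authors' implementation; the single-neuron model, the toy AND task, and all parameter values (step count, cooling rate, perturbation scale) are assumptions chosen for brevity.

```python
import math
import random

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def loss(weights, data):
    # Sum of squared errors of a single sigmoid neuron over the training set.
    # weights[0] is the bias; the rest multiply the inputs.
    total = 0.0
    for x, target in data:
        z = weights[0] + sum(w * xi for w, xi in zip(weights[1:], x))
        total += (sigmoid(z) - target) ** 2
    return total

def anneal(data, n_weights, steps=20000, t0=1.0, cooling=0.9995, seed=0):
    # Simulated annealing in weight space (illustrative parameters).
    rng = random.Random(seed)
    w = [rng.uniform(-1.0, 1.0) for _ in range(n_weights)]
    e = loss(w, data)
    t = t0
    for _ in range(steps):
        # Propose a small Gaussian perturbation of one randomly chosen weight.
        cand = list(w)
        i = rng.randrange(n_weights)
        cand[i] += rng.gauss(0.0, 0.5)
        e_cand = loss(cand, data)
        # Metropolis acceptance: take improvements always; take uphill
        # moves with probability exp(-dE / T), which shrinks as T cools.
        de = e_cand - e
        if de <= 0 or rng.random() < math.exp(-de / t):
            w, e = cand, e_cand
        t *= cooling  # geometric cooling schedule
    return w, e

# Toy task: logical AND, which a single neuron can represent.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, e = anneal(data, n_weights=3)
```

Because acceptance of uphill moves lets the walk escape shallow local minima, this kind of search trades the speed of gradient descent for robustness; the cooling schedule governs that trade-off.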