We present results of computational experiments with an extension of the Perceptron algorithm by a special type of simulated annealing. The simulated annealing procedure employs a logarithmic cooling schedule c(k) = Γ/ln(k+2), where Γ is a parameter that depends on the underlying configuration space. For sample sets S of n-dimensional vectors generated by randomly chosen polynomial threshold functions w1·x1^a1 + ··· + wn·xn^an ≥ ϑ, we approximate the positive and negative examples by linear threshold functions. The approximations are computed both by the classical Perceptron algorithm and by our extension with logarithmic cooling schedules. For n = 256, …, 1024 and ai = 3, …, 7, the extension outperforms the classical Perceptron algorithm by about 15% when the sample size is sufficiently large. The parameter Γ was chosen according to estimates of the maximum escape depth from local minima of the associated energy landscape.
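To illustrate the idea, here is a minimal Python sketch of a Perceptron whose update steps are accepted or rejected by a Metropolis rule under the logarithmic cooling schedule c(k) = Γ/ln(k+2). This is not the authors' implementation: the function name, the choice of the misclassification count as the energy, and the move proposal (a standard Perceptron update on a randomly drawn misclassified example) are illustrative assumptions.

```python
import math
import random

def sa_perceptron(samples, labels, gamma, steps=5000, seed=1):
    """Hypothetical sketch: Perceptron updates filtered through simulated
    annealing with the logarithmic cooling schedule c(k) = gamma / ln(k+2).
    Energy of a weight vector = number of misclassified examples.
    labels are in {+1, -1}; the threshold is fixed at 0 for simplicity."""
    rng = random.Random(seed)
    n = len(samples[0])
    w = [0.0] * n  # weight vector of the linear threshold function

    def energy(wv):
        # count examples on the wrong side of the hyperplane
        return sum(1 for x, y in zip(samples, labels)
                   if y * sum(wi * xi for wi, xi in zip(wv, x)) <= 0)

    e = energy(w)
    best_w, best_e = list(w), e
    for k in range(steps):
        c = gamma / math.log(k + 2)       # logarithmic cooling schedule
        i = rng.randrange(len(samples))   # draw a random example
        x, y = samples[i], labels[i]
        if y * sum(wi * xi for wi, xi in zip(w, x)) > 0:
            continue                      # correctly classified: no move
        # candidate move: the classical Perceptron update w + y*x
        cand = [wi + y * xi for wi, xi in zip(w, x)]
        e_cand = energy(cand)
        # Metropolis acceptance: always take improvements; accept a
        # worsening move with probability exp(-(e_cand - e) / c)
        if e_cand <= e or rng.random() < math.exp(-(e_cand - e) / c):
            w, e = cand, e_cand
            if e < best_e:
                best_w, best_e = list(w), e
    return best_w, best_e
```

Because c(k) decays only logarithmically, worsening moves remain possible for a long time, which is what lets the chain climb out of local minima whose depth is bounded by Γ.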