The problem of binary optimization of a quadratic functional is discussed. By analyzing the generalized Hopfield model, we obtain expressions relating the depth of a local minimum to the size of its basin of attraction. Based on this, we derive the probability of finding a local minimum as a function of its depth. Such a relation is useful in optimization applications: given a series of already-found minima, it allows one to estimate the probability of finding a deeper minimum and to decide for or against running the program further. We also propose an iterative algorithm that represents any symmetric N×N matrix, to a given accuracy, as a weighted Hebbian series of bipolar vectors. It follows that all conclusions about neural networks and optimization algorithms based on Hebbian matrices hold for any other type of matrix. The theory is in good agreement with experimental results.
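The local-minimum search described above can be illustrated with a minimal Hopfield-style descent. This is only a sketch under illustrative assumptions: the random symmetric test matrix, its size, the zero-diagonal convention, and the number of restarts are all hypothetical choices, not the authors' exact setup. The sketch minimizes the quadratic energy E(s) = -½ sᵀWs over bipolar vectors s by single-spin flips, then tallies how often each local-minimum energy is reached from random starts, which is the kind of empirical data the depth-versus-probability relation would be fitted to.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
N = 20

# Hypothetical test instance: random symmetric matrix with zero diagonal.
A = rng.normal(size=(N, N))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)

def energy(s, W):
    """Quadratic functional E(s) = -1/2 s^T W s over bipolar s."""
    return -0.5 * s @ W @ s

def descend(s, W):
    """Asynchronous single-flip descent to a local minimum.

    Flipping s[i] changes the energy by 2 * s[i] * h_i with
    h_i = sum_j W[i, j] * s[j], so a flip helps iff s[i] * h_i < 0.
    """
    improved = True
    while improved:
        improved = False
        for i in range(len(s)):
            h = W[i] @ s
            if s[i] * h < 0:   # flipping s[i] strictly lowers the energy
                s[i] = -s[i]
                improved = True
    return s

# Estimate how often each local-minimum energy is found from random starts.
energies = []
for _ in range(200):
    s = descend(rng.choice([-1.0, 1.0], size=N), W)
    energies.append(round(energy(s, W), 6))

freq = Counter(energies)  # minimum energy -> number of random starts reaching it
```

In this toy run, the histogram `freq` plays the role of the "series of already found minima": comparing the frequencies of shallow and deep minima is what the abstract's depth-dependent probability estimate formalizes.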