We propose a Lagrangian object relaxation technique that obtains near-optimal solutions for the traveling salesman problem (TSP). It proceeds in two stages: first, a feasible solution is computed; second, that solution is refined toward optimality by a Hopfield neural network (HNN). The Lagrangian object relaxation technique helps the HNN escape local minima by correcting the Lagrangian multipliers. The Lagrangian object relaxation neural network is analyzed theoretically and evaluated experimentally by simulation on the TSP. Simulation results on several TSPLIB benchmark problems show that the proposed method finds valid solutions in 100% of runs, and that these solutions are near-optimal.
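The two-stage structure described above can be sketched as follows. This is only an illustrative skeleton, not the paper's algorithm: nearest-neighbour construction stands in for the first (feasible-solution) stage, and 2-opt local search stands in for the second (refinement) stage, where the paper instead uses a Hopfield network with Lagrangian-multiplier corrections.

```python
# Sketch of the two-stage idea: (1) build a feasible TSP tour,
# (2) refine it without ever leaving the feasible set.
# Stand-ins (NOT the paper's method): nearest-neighbour for stage 1,
# 2-opt local search for stage 2.
import math
import random


def tour_length(cities, tour):
    """Total length of a closed tour over 2-D city coordinates."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))


def nearest_neighbour(cities):
    """Stage 1: always returns a feasible tour (a permutation of all cities)."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(cities[last], cities[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour


def two_opt(cities, tour):
    """Stage 2: improve the feasible tour; every accepted move shortens it,
    so the result is still a valid permutation and never longer."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cities, cand) < tour_length(cities, tour):
                    tour, improved = cand, True
    return tour
```

Because stage 1 guarantees feasibility and stage 2 only moves between feasible tours, every run yields a valid solution, mirroring the 100%-validity property reported for the proposed method.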