In this paper, we study a gradient-based continuous method for large-scale optimization problems. By converting the optimization problem into an ordinary differential equation (ODE), we show that the solution trajectory of this ODE converges to the set of stationary points of the original problem. We test the continuous method on large-scale problems from the literature, and the simulation results are promising.
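The core idea can be illustrated with a minimal sketch. The paper's exact ODE formulation and integrator are not given here, so the following assumes the standard gradient-flow ODE dx/dt = -∇f(x), integrated by forward Euler until the gradient norm is small (an approximate stationary point); the function `gradient_flow` and the quadratic test problem are illustrative choices, not the authors' code.

```python
import numpy as np

def gradient_flow(grad, x0, h=1e-3, steps=20000, tol=1e-8):
    """Follow the trajectory of dx/dt = -grad f(x) by forward Euler.

    Stops when the gradient norm falls below tol, i.e. when the
    trajectory has approached a stationary point of f.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - h * g  # one Euler step along the negative gradient
    return x

# Convex quadratic f(x) = 0.5 x^T A x - b^T x; its unique stationary
# point solves A x = b, so we can check the trajectory's limit.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_flow(lambda x: A @ x - b, x0=np.zeros(2))
```

For this quadratic, the trajectory converges to the solution of A x = b, here (0.2, 0.4). In practice a stiff ODE solver with adaptive step size would replace forward Euler for large-scale problems.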