On linear convergence of iterative methods for the variational inequality problem. Proceedings of the International Meeting on Linear/Nonlinear Iterative Methods and Verification of Solution.
A modified forward-backward splitting method for maximal monotone mappings. SIAM Journal on Control and Optimization.
Dual extrapolation and its applications to solving variational inequalities and related problems. Mathematical Programming, Series A and B.
On the complexity of the hybrid proximal extragradient method for the iterates and the ergodic mean. SIAM Journal on Optimization.
An extraresolvent method for monotone mixed variational inequalities. Mathematical and Computer Modelling.
In this paper, we consider both a variant of Tseng's modified forward-backward splitting method and an extension of Korpelevich's method for solving hemivariational inequalities with Lipschitz continuous operators. By showing that these methods are special cases of the hybrid proximal extragradient method introduced by Solodov and Svaiter, we derive iteration-complexity bounds for them to obtain several types of approximate solutions. In the context of saddle-point problems, we also derive complexity bounds for these methods to obtain another type of approximate solution, namely an approximate saddle point. Finally, we illustrate the usefulness of these results by applying them to a broad class of linearly constrained convex programming problems, including, for example, cone programming and problems whose objective functions diverge to infinity as the boundaries of their effective domains are approached.
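To give a concrete sense of the extragradient scheme underlying these methods, the following is a minimal sketch of Korpelevich's method for a monotone variational inequality. It is an illustration only, not the variants analyzed in the paper: the operator `F` (a skew-symmetric linear map, i.e. a bilinear saddle-point operator), the box feasible set, and the step size are assumptions chosen for the toy problem.

```python
import numpy as np

def extragradient(F, proj, z0, step, iters=1000):
    """Korpelevich's extragradient method (sketch).

    Each iteration takes a trial step to z_bar, then re-evaluates the
    operator at z_bar to produce the actual update:
        z_bar = P(z - step * F(z))
        z_new = P(z - step * F(z_bar))
    where P is the projection onto the feasible set.
    """
    z = z0
    for _ in range(iters):
        z_bar = proj(z - step * F(z))
        z = proj(z - step * F(z_bar))
    return z

# Toy monotone VI: F(z) = A z with A skew-symmetric (the operator of a
# bilinear saddle-point problem), feasible set = the box [-1, 1]^2.
# The unique solution is z* = 0, since F(0) = 0 lies in the interior.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
F = lambda z: A @ z
proj = lambda z: np.clip(z, -1.0, 1.0)  # projection onto the box

# Step size below 1/L, where L = ||A|| = 1 is the Lipschitz constant.
z_star = extragradient(F, proj, z0=np.array([0.9, -0.7]), step=0.2)
```

Note that a plain projected-gradient step diverges on this skew-symmetric example; the second operator evaluation at the trial point `z_bar` is what makes the iteration contract toward the solution.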