We consider the forward-backward splitting method for finding a zero of the sum of two maximal monotone mappings. This method is known to converge when the inverse of the forward mapping is strongly monotone. We propose a modification to this method, in the spirit of the extragradient method for monotone variational inequalities, under which the method converges assuming only the forward mapping is (Lipschitz) continuous on some closed convex subset of its domain. The modification entails an additional forward step and a projection step at each iteration. Applications of the modified method to decomposition in convex programming and monotone variational inequalities are discussed.
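The modification described above can be sketched in a few lines. The sketch below is illustrative, not the paper's exact formulation: it specializes to a monotone variational inequality over the nonnegative orthant, so the backward mapping B is the normal cone of that orthant and its resolvent reduces to a Euclidean projection; the operator F, the step size, and the starting point are assumed example data.

```python
import numpy as np

def project(x):
    """Projection onto the nonnegative orthant (the resolvent of B here)."""
    return np.maximum(x, 0.0)

def modified_forward_backward(F, x0, lam, iters):
    """Forward-backward step, then an extra forward step and a projection.

    Converges for monotone, L-Lipschitz F whenever lam < 1/L.
    """
    x = x0.astype(float)
    for _ in range(iters):
        Fx = F(x)
        y = project(x - lam * Fx)            # forward step, then backward step
        x = project(y - lam * (F(y) - Fx))   # extra forward step + projection
    return x

# Illustrative rotation-type monotone operator: F(x) = M x + q, M skew, L = 1.
# The unique solution of the VI over the orthant is x* = (1, 1), where F vanishes.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
q = np.array([-1.0, 1.0])
F = lambda x: M @ x + q

x = modified_forward_backward(F, np.array([2.0, 0.5]), lam=0.5, iters=300)
print(x)
```

Because M is skew-symmetric, F is monotone but its inverse is not strongly monotone, so the plain forward-backward iteration `x = project(x - lam * F(x))` fails to converge here with a fixed step; the extra forward step is exactly what restores convergence under only Lipschitz continuity.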