Parallel Optimization: Theory, Algorithms and Applications
Vector Space Projections: A Numerical Approach to Signal and Image Processing, Neural Nets, and Optics
A Weak-to-Strong Convergence Principle for Fejér-Monotone Methods in Hilbert Spaces
Mathematics of Operations Research
An efficient robust adaptive filtering algorithm based on parallel subgradient projection techniques
IEEE Transactions on Signal Processing
Convex set theoretic image recovery by extrapolated iterations of parallel subgradient projections
IEEE Transactions on Image Processing
This paper presents a closed-form solution to the problem of constructing the best lower bound of a convex function under certain conditions. The function is assumed to be (I) bounded below by -ρ, and (II) differentiable with a derivative that is Lipschitz continuous with Lipschitz constant L. To construct the lower bound, it is also assumed that the values ρ and L can be used together with the values of the function and its derivative at one specified point. Using the proposed lower bound, we derive a computationally efficient deep monotone approximation operator to the level set of the function. This operator realizes a better approximation than the subgradient projection, which has been utilized as a monotone approximation operator to level sets of differentiable convex functions as well as nonsmooth convex functions. Therefore, by using the proposed operator, we can improve many signal processing algorithms that are essentially based on the subgradient projection.
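The baseline that the proposed operator improves on, the subgradient projection onto a level set, has a simple closed form: for a convex function f with f(x) > 0, step from x along the negative gradient by f(x)/||∇f(x)||². Below is a minimal sketch of that baseline, assuming NumPy; the function names and the toy example (the unit ball as a level set) are illustrative choices, not taken from the paper, and the paper's own deep monotone approximation operator is not reproduced here.

```python
import numpy as np

def subgradient_projection(x, f, grad_f):
    """Subgradient projection toward the level set {y : f(y) <= 0}.

    If f(x) > 0, move from x along -grad_f(x) by f(x) / ||grad_f(x)||^2;
    if x already satisfies f(x) <= 0, it is left unchanged.
    """
    fx = f(x)
    if fx <= 0:
        return x
    g = grad_f(x)
    return x - (fx / np.dot(g, g)) * g

# Toy smooth convex function (hypothetical example):
# f(y) = ||y||^2 - 1, whose level set {f <= 0} is the unit ball.
f = lambda y: np.dot(y, y) - 1.0
grad_f = lambda y: 2.0 * y

x = np.array([2.0, 0.0])
p = subgradient_projection(x, f, grad_f)
# The step decreases f but generally does not reach the level set in
# one iteration; a deeper (better) approximation is what the paper's
# proposed operator provides.
assert f(p) < f(x)
```

Note that the single step lands at p = [1.25, 0.0] with f(p) = 0.5625 > 0: the iterate moved monotonically toward the unit ball but did not reach it, which illustrates why a deeper monotone approximation to the level set is valuable.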