Superiorization vs. Accelerated Convex Optimization: The Superiorized/Regularized Least-Squares Case

11/13/2019
by   Yair Censor, et al.

In this paper we conduct a study of both superiorization and optimization approaches for the reconstruction problem of superiorized/regularized solutions to underdetermined systems of linear equations with nonnegativity variable bounds. Specifically, we study a (smoothed) total variation regularized least-squares problem with nonnegativity constraints. We consider two approaches: (a) a superiorization approach that, in contrast to the classic gradient-based superiorization methodology, employs proximal mappings and is structurally similar to a standard forward-backward optimization approach, and (b) an (inexact) accelerated optimization approach that mimics superiorization. Namely, a basic algorithm for nonnegative least squares that is enforced by inexact proximal points is perturbed by negative gradients of the total variation term. Our numerical findings suggest that, after appropriate parameter tuning, superiorization can approach the solution of the optimization problem and leads to comparable results at significantly lower costs. Reversing the roles of the terms treated by accelerated forward-backward optimization, on the other hand, slightly outperforms superiorization, which suggests that optimization can approach superiorization too, using a suitable problem splitting. Extensive numerical results substantiate our discussion of these aspects.
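To make the superiorization pattern described in the abstract concrete, the following is a minimal toy sketch, not the authors' algorithm: a basic projected-gradient step for nonnegative least squares, perturbed each iteration by a bounded, geometrically shrinking step in the negative gradient direction of a smoothed 1D total variation term. The problem sizes, the smoothing parameter `eps`, the perturbation size `beta`, and the `0.99**k` decay schedule are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                                   # underdetermined system A x = b
A = rng.standard_normal((m, n))
x_true = np.clip(rng.standard_normal(n).cumsum(), 0, None)  # nonnegative, piecewise-smooth
b = A @ x_true

def smoothed_tv_grad(x, eps=1e-3):
    """Gradient of the smoothed 1D TV term: sum_i sqrt((x[i+1]-x[i])**2 + eps**2)."""
    d = np.diff(x)
    w = d / np.sqrt(d**2 + eps**2)
    g = np.zeros_like(x)
    g[:-1] -= w
    g[1:] += w
    return g

x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2          # step size for the least-squares term
beta = 1.0                                      # initial perturbation size (hypothetical)
for k in range(500):
    # Superiorization: bounded, summable perturbation toward lower TV.
    g = smoothed_tv_grad(x)
    gnorm = np.linalg.norm(g)
    if gnorm > 0:
        x = x - beta * (0.99 ** k) * g / gnorm
    # Basic algorithm: projected gradient step for nonnegative least squares.
    x = np.clip(x - step * A.T @ (A @ x - b), 0, None)

residual = np.linalg.norm(A @ x - b)
```

Because the perturbations are summable, they do not destroy the convergence behavior of the basic iteration; they only steer it toward iterates with lower total variation, which is the essential idea contrasted with explicit regularized optimization in the paper.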
