Rescaling and Stepsize Selection in Proximal Methods Using Separable Generalized Distances

Paulo J. S. Silva, Jonathan Eckstein, and Carlos Humes Jr. SIAM Journal on Optimization, 2001.

Abstract

This paper presents a convergence proof technique for a broad class of proximal algorithms in which the perturbation term is separable and may contain barriers enforcing interval constraints. There are two key ingredients in the analysis: a mild regularity condition on the differential behavior of the barrier as one approaches an interval boundary, and a lower stepsize limit that takes into account the curvature of the proximal term. We give two applications of our approach. First, we prove subsequential convergence of a very broad class of proximal minimization algorithms for convex optimization, in which a different stepsize may be used for each coordinate. Applying these methods to the dual of a convex program, we obtain a wide class of multiplier methods with subsequential convergence of both primal and dual iterates and independent adjustment of the penalty parameter for each constraint. The adjustment rules for the penalty parameters generalize a well-established scheme for the exponential method of multipliers. The results may also be viewed as a generalization of recent work by Ben-Tal and Zibulevsky [SIAM J. Optim., 7 (1997), pp. 347–366] and by Auslender, Teboulle, and Ben-Tiba [Comput. Optim. Appl., 12 (1999), pp. 31–40; Math. Oper. Res., 24 (1999), pp. 645–668] on methods derived from $\varphi$-divergences. The second application establishes full convergence, under a novel stepsize condition, of Bregman-function-based proximal methods for general monotone operator problems over a box. Prior results in this area required restrictive assumptions on the monotone operator.
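To illustrate the kind of iteration the first application covers, the following is a minimal sketch in notation not taken from the abstract (the convex objective $f$, the separable distance components $d_j$, and the per-coordinate stepsizes $c_j^k$ are assumed names): each step minimizes the objective plus a separable generalized-distance term, with its own stepsize for every coordinate,
$$ x^{k+1} \in \operatorname*{argmin}_{x \in \mathbb{R}^n} \left\{ f(x) + \sum_{j=1}^{n} \frac{1}{c_j^k}\, d_j\!\left(x_j, x_j^k\right) \right\}, $$
where each $d_j$ may incorporate a barrier enforcing an interval constraint on $x_j$, and each $c_j^k$ is assumed to stay above a lower limit reflecting the curvature of $d_j$, in the spirit of the stepsize condition described above.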