Accelerated Bregman Proximal Gradient Methods for Relatively Smooth Convex Optimization

  • Filip Hanzely,
  • Peter Richtárik,
  • Lin Xiao

MSR-TR-2018-22

Published by Microsoft

Revised April 23, 2020.

We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. We investigate a triangle scaling property of the Bregman distance generated by the reference convex function and present accelerated Bregman proximal gradient (ABPG) methods that attain an \(O(k^{-\gamma})\) convergence rate, where \(\gamma\in(0,2]\) is the triangle scaling exponent (TSE) of the Bregman distance. For the Euclidean distance, we have \(\gamma=2\) and recover the convergence rate of Nesterov’s accelerated gradient methods. For non-Euclidean Bregman distances, the TSE can be much smaller (say \(\gamma\leq 1\)), but we show that under a relaxed definition, the intrinsic TSE is always equal to 2. We exploit the intrinsic TSE to develop adaptive ABPG methods that converge much faster in practice. Although theoretical guarantees for these faster rates appear out of reach in general, our methods attain empirical \(O(k^{-2})\) rates in numerical experiments on several applications, and they provide posterior numerical certificates for the fast rates.
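To make the abstract’s key objects concrete, the displays below sketch the standard definitions we take these statements to rest on, in our own notation (\(h\) is the reference convex function, \(f\) the differentiable part); they are a paraphrase, not a verbatim excerpt from the report. The Bregman distance generated by \(h\) is

\[ D_h(x,y) = h(x) - h(y) - \langle \nabla h(y),\, x-y \rangle, \]

\(f\) is \(L\)-smooth relative to \(h\) when

\[ f(x) \le f(y) + \langle \nabla f(y),\, x-y \rangle + L\, D_h(x,y) \quad \text{for all } x, y, \]

and a triangle scaling property with exponent \(\gamma\) requires, for all \(\theta\in[0,1]\),

\[ D_h\bigl((1-\theta)x + \theta z,\; (1-\theta)x + \theta \tilde z\bigr) \le \theta^{\gamma}\, D_h(z, \tilde z). \]

As a sanity check on the Euclidean claim \(\gamma=2\): with \(h(x)=\tfrac12\|x\|^2\) we get \(D_h(x,y)=\tfrac12\|x-y\|^2\), so the left-hand side above equals \(\tfrac12\|\theta(z-\tilde z)\|^2 = \theta^2 D_h(z,\tilde z)\).

To illustrate the basic (non-accelerated) Bregman proximal gradient step in code, here is a minimal Python sketch under an assumed setup: negative entropy as the reference function on the probability simplex, whose Bregman distance is the KL divergence, so the step has a closed-form multiplicative update. The function name `bregman_prox_grad_step` and the toy least-squares problem are our own illustration, not code from the authors’ repository, and the sketch omits the acceleration and adaptive-TSE machinery the abstract describes.

```python
import numpy as np

def bregman_prox_grad_step(x, grad_f, L):
    """One Bregman proximal gradient step on the probability simplex.

    Hypothetical sketch: with reference function h(x) = sum_i x_i log x_i
    (negative entropy), D_h is the KL divergence, and the step
        x+ = argmin_z { <grad_f(x), z> + L * D_h(z, x) : z in simplex }
    reduces to an exponentiated-gradient (multiplicative) update.
    """
    g = grad_f(x)
    z = x * np.exp(-g / L)  # multiplicative update from the KL geometry
    return z / z.sum()      # renormalize onto the simplex

# Toy usage: minimize f(x) = 0.5 * ||A x - b||^2 over the simplex.
# By Pinsker's inequality, KL(z, x) >= 0.5 * ||z - x||_2^2 on the simplex,
# so the Euclidean smoothness constant ||A||_2^2 also serves as a relative
# smoothness constant with respect to negative entropy.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)

x = np.ones(5) / 5
L = np.linalg.norm(A, 2) ** 2  # spectral norm squared
for _ in range(200):
    x = bregman_prox_grad_step(x, grad_f, L)
print("x =", x, " f(x) =", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```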
