Design and Analysis of Algorithms for Composite Optimization
Author(s)
Liang, Jiaming
Abstract
In this thesis, we study first-order methods (FOMs) for solving three types of composite optimization problems: convex nonsmooth, convex hybrid, and nonconvex smooth. We revisit three representative FOMs: the proximal point method, the proximal bundle method, and the accelerated composite gradient (ACG) method. We use them to design fast algorithms for the three problem classes above, obtaining either improved (and sometimes optimal) complexity results or remarkable practical performance.
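The common primitive behind all three of these FOMs is the proximal operator. As a concrete illustration (a standard fact about composite problems with an ℓ1 term, not a construction specific to this thesis), the proximal operator of the ℓ1 norm reduces to coordinatewise soft-thresholding:

```python
import numpy as np

def prox_l1(x, lam):
    # Proximal operator of lam * ||.||_1, i.e.
    #   argmin_u  lam * ||u||_1 + 0.5 * ||u - x||^2,
    # which has the closed form "soft-thresholding":
    # shrink every entry toward zero by lam, zeroing small entries.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Entries with magnitude below lam are zeroed; larger ones shrink by lam.
x = np.array([3.0, -0.5, 1.2])
shrunk = prox_l1(x, 1.0)
```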
We first study convex nonsmooth composite optimization and develop a novel proximal bundle method. We show that the proposed method has O(ε⁻²) complexity for obtaining an ε-solution, and we establish a matching lower complexity bound for the problem class. The proposed proximal bundle method is therefore optimal, and it is the first optimal method of its type for convex nonsmooth optimization.
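For context, the O(ε⁻²) rate can be compared with the classical dimension-free complexity of nonsmooth convex optimization; the statement below is the standard textbook form with generic constants, not the thesis's exact theorem:

```latex
% For a convex function $f$ that is $M$-Lipschitz on a ball of radius $D$
% containing a minimizer, any first-order method needs on the order of
\[
  N(\varepsilon) \;=\; \Theta\!\left(\frac{M^2 D^2}{\varepsilon^2}\right)
\]
% (sub)gradient evaluations to guarantee $f(x_N) - f_* \le \varepsilon$;
% an $\mathcal{O}(\varepsilon^{-2})$ upper bound therefore matches this
% lower bound up to constants.
```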
We further investigate convex hybrid composite optimization, which includes both smooth and nonsmooth optimization as special cases. To solve this more challenging problem, we propose a generic framework containing many proximal bundle methods under the same umbrella, and present a unified complexity analysis for all methods in the framework. Moreover, we develop an adaptive one-cut proximal bundle method, which does not require any problem parameters as input, and hence can be deemed a universal method.
The second half of the thesis is devoted to studying nonconvex smooth composite optimization. Algorithms designed for solving this problem fall into two categories: indirect methods based on the inexact proximal point method, and direct methods based on the ACG method.
Following the idea of the proximal point method, indirect methods solve nonconvex smooth composite optimization problems by approximately solving a sequence of proximal subproblems, which are strongly convex by construction. In particular, we develop a doubly accelerated inexact proximal point method by applying an ACG method to solve the proximal subproblems and by updating the subproblems in a manner similar to the accelerated method. The proposed method has the best known iteration complexity for solving nonconvex smooth composite optimization problems.
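A minimal sketch of the indirect (inexact proximal point) template, with plain gradient descent standing in for the ACG inner solver; the prox parameter `lam`, the step sizes, and the test function are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def inexact_prox_point(f_grad, x0, lam, outer_iters, inner_iters, inner_lr):
    # Each outer step approximately solves the proximal subproblem
    #   min_y  f(y) + (1 / (2 * lam)) * ||y - x_c||^2,
    # which is strongly convex whenever 1/lam exceeds the curvature of f.
    # Here the subproblem is solved by plain gradient descent; the thesis
    # instead uses an (accelerated) ACG inner solver.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_iters):
        xc = x.copy()
        y = xc.copy()
        for _ in range(inner_iters):
            g = f_grad(y) + (y - xc) / lam  # gradient of the subproblem
            y = y - inner_lr * g
        x = y  # accept the approximate subproblem solution
    return x

# Example on the nonconvex scalar function f(x) = x**4/4 - x**2/2,
# whose gradient is x**3 - x and whose local minima sit at x = -1 and x = 1.
f_grad = lambda x: x**3 - x
x_star = inexact_prox_point(f_grad, np.array([2.0]), lam=0.05,
                            outer_iters=50, inner_iters=50, inner_lr=0.02)
# starting from x = 2, the iterates approach the stationary point x = 1
```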
Based on the ACG method, we propose three direct methods distinguished by their stepsize rules: constant, backtracking, and average curvature. The first two rules are widely used in convex optimization, so the methods based on them are direct extensions of convex ACG methods to the nonconvex setting. The novel average curvature rule explores the local landscape of the nonconvex objective function and provides a way to adjust stepsizes adaptively without backtracking. Applying the average curvature rule to the well-known ACG variant FISTA, together with a restart scheme, we develop an algorithm that is highly efficient compared with other nonconvex ACG variants for solving nonconvex smooth composite optimization problems.
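The restart idea can be illustrated with the classical convex FISTA loop plus the gradient-based adaptive restart of O'Donoghue and Candès; this is a convex baseline sketch under a constant stepsize, not the thesis's average-curvature nonconvex variant:

```python
import numpy as np

def fista_restart(grad, prox, x0, step, iters=500):
    # FISTA (accelerated proximal gradient) with a gradient-based
    # adaptive restart: momentum is reset whenever the latest update
    # direction opposes the momentum direction.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = prox(y - step * grad(y), step)               # prox-gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))     # momentum schedule
        y_new = x_new + ((t - 1.0) / t_new) * (x_new - x)    # extrapolation
        if np.dot(y - x_new, x_new - x) > 0:                 # restart test
            t_new, y_new = 1.0, x_new.copy()
        x, y, t = x_new, y_new, t_new
    return x

# Small LASSO instance: min_x 0.5 * ||A x - b||^2 + lam_reg * ||x||_1
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, 2.0])
lam_reg = 0.5
grad = lambda x: A.T @ (A @ x - b)
prox = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - lam_reg * s, 0.0)
x_hat = fista_restart(grad, prox, np.zeros(2), step=0.25)  # step = 1 / ||A^T A||
```

Because the example is separable, the exact solution is available coordinatewise (x₁ = 0.5, x₂ = 0.875), which makes the sketch easy to sanity-check.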
Date
2022-04-28
Resource Type
Text
Resource Subtype
Dissertation