Title:
Faster Conditional Gradient Algorithms for Machine Learning
Authors
Carderera De Diego, Alejandro Agustin
Advisors
Pokutta, Sebastian
Abstract
In this thesis, we focus on Frank-Wolfe (a.k.a. Conditional Gradient) algorithms, a family of iterative algorithms for constrained convex optimization that work under the assumption that projections onto the feasible region are prohibitively expensive, while linear optimization problems over the feasible region can be solved efficiently. We present several algorithms that either locally or globally improve upon existing convergence guarantees. In Chapters 2-4 we focus on the case where the objective function is smooth and strongly convex and the feasible region is a polytope, and in Chapter 5 we focus on the case where the function is generalized self-concordant and the feasible region is a compact convex set.
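For illustration only, the sketch below shows the vanilla Frank-Wolfe method in the setting the abstract describes: each iteration calls only a linear minimization oracle over the feasible region and never a projection. This is a minimal Python sketch of the classical algorithm, not of the improved variants developed in the thesis; the names frank_wolfe, grad_f, and lmo, as well as the simplex example, are assumptions chosen for the illustration.

import numpy as np

def frank_wolfe(grad_f, lmo, x0, num_iters=100):
    """Vanilla Frank-Wolfe with the standard 2/(t+2) open-loop step size."""
    x = np.asarray(x0, dtype=float)
    for t in range(num_iters):
        g = grad_f(x)
        v = lmo(g)                  # linear minimization oracle: argmin over the feasible region of <g, v>
        gamma = 2.0 / (t + 2.0)     # step size; no projection is ever needed
        x = x + gamma * (v - x)     # convex combination of feasible points stays feasible
    return x

# Hypothetical usage: minimize ||x - b||^2 over the probability simplex,
# whose LMO returns the vertex with the most negative gradient coordinate.
b = np.array([0.2, 0.5, 0.9])
grad_f = lambda x: 2.0 * (x - b)
lmo = lambda g: np.eye(len(g))[np.argmin(g)]
x_approx = frank_wolfe(grad_f, lmo, x0=np.array([1.0, 0.0, 0.0]))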
Date Issued
2021-12-09
Resource Type
Text
Resource Subtype
Dissertation