Person: Yezzi, Anthony


Publication Search Results

  • Item
    Accelerated Optimization in the PDE Framework
    (Georgia Institute of Technology, 2018-10-24) Yezzi, Anthony
    Following the seminal work of Nesterov, accelerated optimization methods (sometimes referred to as momentum methods) have been used to powerfully boost the performance of first-order, gradient-based parameter estimation in scenarios where second-order optimization strategies are either inapplicable or impractical. Not only does accelerated gradient descent converge considerably faster than traditional gradient descent, but it performs a more robust local search of the parameter space by initially overshooting and then oscillating back as it settles into a final configuration, thereby selecting only local minimizers with an attraction basin large enough to accommodate the initial overshoot. This behavior has made accelerated search methods particularly popular within the machine learning community, where stochastic variants have been proposed as well. So far, however, accelerated optimization methods have been applied only to searches over finite parameter spaces. We show how a variational setting for these finite-dimensional methods (recently formulated by Wibisono, Wilson, and Jordan) can be extended to the infinite-dimensional setting, both in linear functional spaces and on the more complicated manifolds of 2D curves and 3D surfaces. Moreover, we show how extremely simple explicit discretization schemes can be used to efficiently solve the resulting class of high-dimensional optimization problems.
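
As background for the acceleration contrast described in the abstract, the sketch below compares plain gradient descent with a Nesterov-style momentum update on a toy ill-conditioned quadratic. It is only an illustration of the finite-dimensional scheme the work starts from, not the paper's PDE framework; the objective, step size, and momentum coefficient are illustrative assumptions. (In the variational view of Wibisono, Wilson, and Jordan, such momentum updates correspond to second-order flows rather than first-order gradient flows, which is the structure the paper carries over to curves and surfaces.)

    import numpy as np

    # Toy ill-conditioned quadratic f(x) = 0.5 * x^T A x (illustrative assumption)
    A = np.diag([1.0, 50.0])

    def grad_f(x):
        return A @ x

    def gradient_descent(x0, lr=0.02, steps=300):
        # Plain first-order descent: small steps straight down the gradient.
        x = x0.copy()
        for _ in range(steps):
            x = x - lr * grad_f(x)
        return x

    def nesterov_descent(x0, lr=0.02, momentum=0.9, steps=300):
        # Accelerated (momentum) descent: the gradient is evaluated at a
        # look-ahead point, producing the overshoot-and-settle behavior
        # described in the abstract.
        x = x0.copy()
        v = np.zeros_like(x)
        for _ in range(steps):
            v = momentum * v - lr * grad_f(x + momentum * v)
            x = x + v
        return x

    x0 = np.array([5.0, 5.0])
    print("plain gradient descent:", gradient_descent(x0))
    print("Nesterov momentum:     ", nesterov_descent(x0))

With these settings the momentum iterate overshoots along the shallow direction and oscillates back while converging markedly faster, whereas plain gradient descent creeps toward the minimum, mirroring the behavior described in the abstract.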