
Adaptive Curvature for Stochastic Optimization

Abstract This thesis presents a family of adaptive curvature methods for gradient-based stochastic optimization. In particular, a general algorithmic framework is introduced along with a practical implementation that yields an efficient, adaptive curvature gradient descent algorithm. To this end, a theoretical and practical link between curvature matrix estimation and shrinkage methods for covariance matrices is established. The use of shrinkage improves estimation accuracy of the curvature matrix when data samples are scarce. This thesis also introduces several insights that result in data- and computation-efficient update equations. Empirical results suggest that the proposed method compares favorably with existing second-order techniques based on ...
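The abstract's central idea, shrinking a noisy matrix estimate toward a structured target when samples are scarce, can be illustrated with a minimal sketch. This is not the thesis's algorithm; it is a generic linear shrinkage of a sample covariance toward a scaled identity, with a hypothetical fixed intensity `lam` (in practice the intensity would be chosen adaptively, e.g. by a Ledoit-Wolf-style rule).

```python
import numpy as np

def shrink_covariance(samples, lam=0.3):
    """Linear shrinkage of the sample covariance toward a scaled identity.

    lam is a hypothetical shrinkage intensity in [0, 1]; an adaptive
    estimator would pick it from the data rather than fixing it.
    """
    n, d = samples.shape
    centered = samples - samples.mean(axis=0)
    S = centered.T @ centered / n           # sample covariance (biased)
    mu = np.trace(S) / d                    # scale of the identity target
    return (1.0 - lam) * S + lam * mu * np.eye(d)

# With fewer samples than dimensions (n < d) the raw sample covariance
# is singular; the shrunk estimate stays positive definite, hence
# invertible -- the property a curvature-based update needs.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 10))            # 5 samples, 10 dimensions
sigma = shrink_covariance(X)
print(np.linalg.eigvalsh(sigma).min() > 0)  # smallest eigenvalue is positive
```

The same trade-off motivates the thesis: shrinkage accepts a small bias toward the target in exchange for a large variance reduction when data are scarce.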
Created Date 2019
Contributor Barron, Trevor Paul (Author) / Ben Amor, Heni (Advisor) / He, Jingrui (Committee member) / Levihn, Martin (Committee member) / Arizona State University (Publisher)
Subject Statistics / Robotics / Natural gradient descent / Policy gradient methods / Truncated Newton methods
Type Masters Thesis
Extent 64 pages
Language English
Note Masters Thesis Computer Science 2019
Collaborating Institutions Graduate College / ASU Library

Full Text 1.2 MB application/pdf

Description Dissertation/Thesis