Full metadata
Title
Adaptive Curvature for Stochastic Optimization
Description
This thesis presents a family of adaptive curvature methods for gradient-based stochastic optimization. In particular, a general algorithmic framework is introduced along with a practical implementation that yields an efficient, adaptive-curvature gradient descent algorithm. To this end, a theoretical and practical link is established between curvature matrix estimation and shrinkage methods for covariance matrices; shrinkage improves the estimation accuracy of the curvature matrix when data samples are scarce. The thesis also introduces several insights that result in data- and computation-efficient update equations. Empirical results suggest that the proposed method compares favorably with existing second-order techniques based on the Fisher information or Gauss-Newton matrices, and with adaptive stochastic gradient descent methods, on both supervised and reinforcement learning tasks.
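The shrinkage idea referred to in the abstract can be illustrated with a minimal sketch. The following is not the thesis's actual algorithm: the function name, the empirical second-moment curvature proxy, the scaled-identity shrinkage target, and the heuristic shrinkage intensity are all illustrative assumptions; it only shows how a curvature estimate built from few gradient samples can be blended with a simple target to improve conditioning.

```python
# Illustrative sketch only: names, the identity target, and the
# heuristic intensity are assumptions, not the thesis's method.
import numpy as np

def shrunk_curvature(grads, shrinkage=None):
    """Estimate a curvature matrix from per-sample gradients with shrinkage.

    grads: (n_samples, n_params) array of per-sample gradients.
    Returns a convex combination of the empirical second-moment matrix
    and a scaled-identity target, which is better conditioned when
    n_samples is small relative to n_params.
    """
    n, d = grads.shape
    # Empirical (uncentered) second moment of the gradients -- one common
    # stand-in for a Fisher/Gauss-Newton-style curvature matrix.
    C = grads.T @ grads / n
    # Shrinkage target: identity scaled to match the average eigenvalue of C.
    mu = np.trace(C) / d
    target = mu * np.eye(d)
    if shrinkage is None:
        # Simple heuristic intensity; Ledoit-Wolf-style estimators
        # choose this value from the data instead.
        shrinkage = min(1.0, d / (n + d))
    return (1.0 - shrinkage) * C + shrinkage * target

# Usage: precondition an averaged stochastic gradient with the shrunk estimate.
rng = np.random.default_rng(0)
G = rng.normal(size=(8, 32))          # 8 sample gradients, 32 parameters
C_hat = shrunk_curvature(G)
step = np.linalg.solve(C_hat + 1e-8 * np.eye(32), G.mean(axis=0))
```

With only 8 samples in 32 dimensions the raw second-moment matrix is rank-deficient; the shrinkage term is what makes the solve above well posed.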
Date Created
2019
Contributors
- Barron, Trevor (Author)
- Ben Amor, Heni (Thesis advisor)
- He, Jingrui (Committee member)
- Levihn, Martin (Committee member)
- Arizona State University (Publisher)
Topical Subject
Resource Type
Extent
vi, 57 pages : color illustrations
Language
eng
Copyright Statement
In Copyright
Primary Member of
Peer-reviewed
No
Open Access
No
Handle
https://hdl.handle.net/2286/R.I.53675
Statement of Responsibility
by Trevor Barron
Description Source
Viewed on November 25, 2019
Level of coding
full
Note
- Thesis: Partial requirement for M.S., Arizona State University, 2019
- Bibliography: Includes bibliographical references (pages 35-39)
- Field of study: Computer science
System Created
- 2019-05-15 12:29:37
System Modified
- 2021-08-26 09:47:01