Description
The performance of modern machine learning algorithms depends on the selection
of a set of hyperparameters. Common examples of hyperparameters are the learning
rate and the number of layers in a dense neural network. AutoML is a branch
of optimization that has produced important contributions in this area. Within
AutoML, multi-fidelity approaches, which eliminate poorly performing configurations
after evaluating them at low budgets, are among the most effective. However, the
performance of these algorithms strongly depends on how effectively they allocate
the computational budget to the various hyperparameter configurations. We first present
Parameter Optimization with Conscious Allocation 1.0 (POCA 1.0), a Hyperband-
based algorithm for hyperparameter optimization that adaptively allocates its input
budget to the hyperparameter configurations it generates, following a Bayesian sampling
scheme. We then present its successor, Parameter Optimization with Conscious
Allocation 2.0 (POCA 2.0), which follows POCA 1.0's successful philosophy while
utilizing a time-series model to reduce wasted computational cost and providing a
more flexible framework. We compare POCA 1.0 and 2.0 to their nearest competitor, BOHB,
at optimizing the hyperparameters of a multilayer perceptron and find that both
POCA algorithms outperform BOHB in low-budget hyperparameter optimization while
performing similarly in high-budget scenarios.
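To make the multi-fidelity idea above concrete, the following is a minimal, hypothetical sketch of generic successive halving, the allocation pattern underlying Hyperband-style methods such as the one described here. It is not POCA's adaptive Bayesian allocation scheme, and the sample_config and evaluate helpers are illustrative placeholders standing in for a real search space and training run.

import random


def sample_config():
    """Hypothetical sampler: draw one random hyperparameter configuration."""
    return {
        "learning_rate": 10 ** random.uniform(-4, -1),
        "num_layers": random.randint(1, 5),
    }


def evaluate(config, budget):
    """Hypothetical objective: train `config` for `budget` units and return a loss.

    A stand-in for actually training, e.g., a multilayer perceptron.
    """
    return random.random() / (budget * config["learning_rate"] * config["num_layers"])


def successive_halving(n_configs=27, min_budget=1, eta=3):
    """Evaluate many configurations cheaply, keep the best 1/eta fraction,
    and repeat with eta-times larger budgets until one configuration remains."""
    configs = [sample_config() for _ in range(n_configs)]
    budget = min_budget
    while len(configs) > 1:
        losses = [evaluate(c, budget) for c in configs]
        ranked = sorted(zip(losses, range(len(configs))))
        keep = max(1, len(configs) // eta)
        configs = [configs[i] for _, i in ranked[:keep]]
        budget *= eta
    return configs[0]


if __name__ == "__main__":
    print("Best configuration found:", successive_halving())

The contrast with this fixed geometric schedule is the point of the abstract: rather than always promoting a fixed fraction of configurations at fixed budgets, POCA decides how much budget each sampled configuration receives.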
Details
Title
- Parameter Optimization with Conscious Allocation (POCA): Efficient Bayesian Hyperparameter Optimization with Adaptive Budget Assignment
Contributors
- Inman, Joshua (Author)
- Sankar, Lalitha (Thesis director)
- Pedrielli, Giulia (Committee member)
- Barrett, The Honors College (Contributor)
- School of Mathematical and Statistical Sciences (Contributor)
- Computer Science and Engineering Program (Contributor)
Date Created
2024-05