
Better and Faster Hyperparameter Optimization with Dask

Hosted By
Alyssa A. and Pablo C.

Details

WEBINAR: Our Meetup will be hosted online via Blackboard Collaborate. Link to join -> https://us.bbcollab.com/collab/ui/session/guest/437d5f21a9e64a78b83dfe144c5e5d46

ABSTRACT: Nearly every machine learning model requires hyperparameters, parameters that the user must specify before training begins and that influence model performance. Finding the optimal set of hyperparameters is often a time- and resource-consuming process. A recent breakthrough hyperparameter optimization algorithm, Hyperband, finds high-performing hyperparameters with minimal training via a principled early-stopping scheme for random hyperparameter selection (Li et al., 2016). This talk will provide an intuitive introduction to Hyperband and explain its implementation in Dask, a Python library that scales Python to larger datasets and more computational resources. The implementation makes adjustments to the Hyperband algorithm to exploit Dask's capabilities and parallel processing. In experiments, the Dask implementation of Hyperband rapidly finds high-performing hyperparameters for deep learning models.
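To make the "principled early stopping" idea concrete, below is a minimal sketch of successive halving, the subroutine at the heart of Hyperband: train many random configurations briefly, then repeatedly keep only the best fraction and give the survivors more training budget. The `train_step` callback and the budget numbers here are hypothetical stand-ins, not part of the Dask implementation itself.

```python
import math

def successive_halving(configs, train_step, max_iter=81, eta=3):
    """Return the surviving config after successive halving.

    configs:    candidate hyperparameter settings (any hashable values)
    train_step: hypothetical callback (config, n_iters) -> score,
                where a higher score is better
    max_iter:   total training iterations granted to a finalist
    eta:        keep the top 1/eta of configs at each rung
    """
    rungs = int(math.log(len(configs), eta))      # number of halving rounds
    budget = max_iter // eta ** rungs             # initial (small) budget
    for _ in range(rungs + 1):
        # score every surviving config after `budget` training iterations
        scores = {c: train_step(c, budget) for c in configs}
        # keep the best 1/eta performers; they get eta x more budget next round
        k = max(1, len(configs) // eta)
        configs = sorted(configs, key=scores.get, reverse=True)[:k]
        budget *= eta
    return configs[0]
```

Hyperband itself runs several such brackets, trading off "many configs, little training each" against "few configs, lots of training each." In Dask-ML this is exposed as `HyperbandSearchCV`, which schedules the `partial_fit` calls of each bracket as parallel Dask tasks.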

SPEAKER BIO: Scott Sievert is a graduate student at UW–Madison, formally studying Electrical & Computer Engineering and informally studying machine learning. He focuses on accelerating machine learning, distributed computation, and optimization. After graduation, he will work for the Air Force Research Lab.

PyData Madison
Online event