This paper introduces Optuna, a Python package for hyperparameter optimization and pruning for machine learning algorithms. It was created by Preferred Networks, who have kindly released what had been an internal project as open source. This bodes well for Optuna being supported going forward.

For hyperparameter search, Optuna currently supports only random search and the tree-structured Parzen estimator (TPE) algorithm. For pruning (the early stopping of unpromising trials), it supports the following methods: median stopping, percentile stopping, Asynchronous Successive Halving (ASHA), and Hyperband.
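To make the pruning idea concrete, the median stopping rule prunes a trial whose intermediate value at a given step is worse than the median of the intermediate values that completed trials reported at the same step. Below is a minimal self-contained sketch of that rule, not Optuna's actual implementation, assuming lower values are better:

```python
from statistics import median

def should_prune(step, value, completed_histories):
    """Median stopping rule: prune if `value` at `step` is worse than the
    median intermediate value of completed trials at that step.
    Assumes minimization (lower is better)."""
    peers = [h[step] for h in completed_histories if step in h]
    if not peers:
        return False  # nothing to compare against yet
    return value > median(peers)

# Histories of two completed trials: {step: intermediate loss}
completed = [
    {0: 1.0, 1: 0.8, 2: 0.6},
    {0: 0.9, 1: 0.5, 2: 0.3},
]

print(should_prune(1, 0.9, completed))  # worse than the median (0.65) at step 1 -> True
print(should_prune(1, 0.4, completed))  # better than the median -> False
```

Percentile stopping generalizes the same comparison to an arbitrary percentile instead of the median.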

Since TPE is itself a form of Bayesian optimization, we hope that Gaussian-process-based Bayesian optimization will be added as well.

There are many examples showing how to integrate Optuna with popular machine learning packages, and more will follow in the coming week.

Below is the abstract of Optuna: A Next-generation Hyperparameter Optimization Framework.

The purpose of this study is to introduce new design-criteria for next-generation hyperparameter optimization software. The criteria we propose include (1) define-by-run API that allows users to construct the parameter search space dynamically, (2) efficient implementation of both searching and pruning strategies, and (3) easy-to-setup, versatile architecture that can be deployed for various purposes, ranging from scalable distributed computing to light-weight experiment conducted via interactive interface. In order to prove our point, we will introduce Optuna, an optimization software which is a culmination of our effort in the development of a next generation optimization software. As an optimization software designed with define-by-run principle, Optuna is particularly the first of its kind. We will present the design-techniques that became necessary in the development of the software that meets the above criteria, and demonstrate the power of our new design through experimental results and real world applications. Our software is available under the MIT license.
