Optuna: A hyperparameter optimization framework
Website | Docs | Install Guide | Tutorial
Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct the search spaces for the hyperparameters.
Key Features
Optuna has modern functionalities as follows:
- Parallel distributed optimization
- Pruning of unpromising trials
- Lightweight, versatile, and platform agnostic architecture
Basic Concepts
We use the terms study and trial as follows:
- Study: optimization based on an objective function
- Trial: a single execution of the objective function
Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., `classifier` and `svr_c`) through multiple trials (e.g., `n_trials=100`). Optuna is a framework designed for the automation and the acceleration of optimization studies.
```python
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

import optuna

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_loguniform('svr_c', 1e-10, 1e10)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.load_boston(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```
Installation
To install Optuna, use `pip` as follows:

$ pip install optuna
Optuna supports Python 2.7 and Python 3.5 or newer.
Contribution
Any contributions to Optuna are welcome! When you send a pull request, please follow the contribution guide.
License
MIT License (see LICENSE).
Reference
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
Download files
Source Distribution
File details
Details for the file optuna-0.18.1.tar.gz.
File metadata
- Download URL: optuna-0.18.1.tar.gz
- Upload date:
- Size: 122.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.1 setuptools/41.0.1 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.5.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7f7d92d8d1a3e0c5c3b92b4272a29b110a4a3d53e140a55e564c8756225a0c99
MD5 | 39eb725155214799e6283c143fc42889
BLAKE2b-256 | 2d323141c8aaf40a076079f075c2b66240c4bb61fc37b22bc8e6898475f0236e