Optuna: A hyperparameter optimization framework
Website | Docs | Install Guide | Tutorial
Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to this define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct hyperparameter search spaces.
News
- 2022-02-14: Pre-releases of Optuna 3.0 are available! Early adopters may want to upgrade and provide feedback for a smoother transition to the coming full release. You can install a pre-release version with `pip install -U --pre optuna`. Find the latest one here.
- 2021-10-11: The Optuna 3.0 Roadmap has been published for review. Please take a look at the planned improvements to Optuna, and share your feedback in the GitHub issues. PR contributions are also welcome!
Key Features
Optuna has the following modern functionalities:
- Lightweight, versatile, and platform agnostic architecture
- Handle a wide variety of tasks with a simple installation that has few requirements.
- Pythonic search spaces
- Define search spaces using familiar Python syntax, including conditionals and loops (see the sketch after this list).
- Efficient optimization algorithms
- Adopt state-of-the-art algorithms for sampling hyperparameters and efficiently pruning unpromising trials.
- Easy parallelization
- Scale studies to tens or hundreds of workers with little or no changes to the code (a sketch follows the Basic Concepts example below).
- Quick visualization
- Inspect optimization histories from a variety of plotting functions.
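As a minimal sketch of the Pythonic search spaces feature (assuming scikit-learn is installed), the following objective uses an ordinary Python loop to decide, per trial, how many hidden layers a small MLPRegressor has. The parameter names n_layers and n_units_l{i} and the model choice are illustrative, not part of Optuna's API:

    import optuna
    import sklearn.datasets
    import sklearn.model_selection
    import sklearn.neural_network

    def objective(trial):
        # The number of layers is itself a hyperparameter, so the search space
        # for the layer widths is constructed dynamically at run time.
        n_layers = trial.suggest_int('n_layers', 1, 3)
        layers = [trial.suggest_int(f'n_units_l{i}', 4, 128, log=True) for i in range(n_layers)]

        X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
        X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

        model = sklearn.neural_network.MLPRegressor(hidden_layer_sizes=tuple(layers), max_iter=100)
        model.fit(X_train, y_train)
        return 1.0 - model.score(X_val, y_val)  # Minimize 1 - R^2 on the validation split.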
Basic Concepts
We use the terms study and trial as follows:
- Study: optimization based on an objective function
- Trial: a single execution of the objective function
Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., regressor and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate optimization studies.
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):
    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
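To illustrate the easy parallelization feature mentioned above, here is a minimal sketch. It assumes a storage URL shared by all workers (an illustrative SQLite file here; a dedicated RDB server is usually preferred for many workers); each worker process attaches to the same study by name and reuses the objective defined above. The study name and file path are made up for the example, and the last line shows the quick visualization feature:

    import optuna

    # Run this same script in several processes; they coordinate through the
    # shared storage, so trials are distributed with no other code changes.
    study = optuna.create_study(
        study_name='distributed-example',  # Illustrative name.
        storage='sqlite:///example.db',    # Illustrative shared storage URL.
        load_if_exists=True,
    )
    study.optimize(objective, n_trials=100)  # Reuses the objective defined above.

    # Inspect the optimization history as an interactive Plotly figure.
    optuna.visualization.plot_optimization_history(study).show()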
Examples
Examples can be found in optuna/optuna-examples.
Integrations
Integration modules, which allow pruning (early stopping) of unpromising trials, are available for the following libraries (a sketch follows the list):
- AllenNLP
- Catalyst
- CatBoost
- Chainer
- FastAI (V1, V2)
- Keras
- LightGBM
- MXNet
- PyTorch
- PyTorch Ignite
- PyTorch Lightning
- TensorFlow
- tf.keras
- XGBoost
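As a minimal sketch of how such an integration module is used (assuming LightGBM and scikit-learn are installed; the dataset, parameters, and pruner choice are illustrative), the LightGBM callback reports intermediate validation scores to Optuna so that unpromising trials can be stopped early:

    import lightgbm as lgb
    import optuna
    import sklearn.datasets
    import sklearn.model_selection

    def objective(trial):
        X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)
        X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)
        dtrain = lgb.Dataset(X_train, label=y_train)
        dvalid = lgb.Dataset(X_val, label=y_val)

        params = {
            'objective': 'binary',
            'metric': 'auc',
            'verbosity': -1,
            'num_leaves': trial.suggest_int('num_leaves', 2, 256),
        }

        # The callback raises optuna.TrialPruned when the trial should stop early.
        pruning_callback = optuna.integration.LightGBMPruningCallback(trial, 'auc')
        booster = lgb.train(params, dtrain, valid_sets=[dvalid], callbacks=[pruning_callback])
        return booster.best_score['valid_0']['auc']

    study = optuna.create_study(direction='maximize', pruner=optuna.pruners.MedianPruner())
    study.optimize(objective, n_trials=20)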
Web Dashboard (experimental)
The new web dashboard is under development at optuna-dashboard. It is still experimental, but already much better in many regards. Feature requests and bug reports are welcome!
(Screenshots: manage studies; visualize with interactive graphs.)
Install optuna-dashboard via pip:
$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
Installation
Optuna is available at the Python Package Index and on Anaconda Cloud.
# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna
Optuna supports Python 3.6 or newer.
We also provide Optuna Docker images on DockerHub.
Communication
- GitHub Discussions for questions.
- GitHub Issues for bug reports and feature requests.
- Gitter for interactive chat with developers.
- Stack Overflow for questions.
Contribution
Any contributions to Optuna are more than welcome!
If you are new to Optuna, please check the good first issues. They are relatively simple and well defined, and are often good starting points for getting familiar with the contribution workflow and other developers.
If you have already contributed to Optuna, we recommend the other contribution-welcome issues.
For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.
Reference
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file optuna-3.0.0.tar.gz.
File metadata
- Download URL: optuna-3.0.0.tar.gz
- Upload date:
- Size: 258.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.7.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 9a80f7aa08e0ecccf24d93b761bc4ece14fc59686cef6143009605729889d3c9
MD5 | 4cf617f5762a84e51b9e9bfb5b5d55c3
BLAKE2b-256 | 6e1a4a647bbdca282efcc4387f065ca9658b464fb707fd3ca8d59b2da0258bf4
File details
Details for the file optuna-3.0.0-py3-none-any.whl.
File metadata
- Download URL: optuna-3.0.0-py3-none-any.whl
- Upload date:
- Size: 348.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.7.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | e05df69542e2ddd3dea3a57c18ad092e26c423c923ceb24ce031729796e2eaad
MD5 | 88f72cfce9cc0c138f1ff8af0509d50c
BLAKE2b-256 | db97d9c4a186b0aa7fb7acab1b64164fdb4f76495159836305c32dc6cb160f2c