Optuna: A hyperparameter optimization framework
Website | Docs | Install Guide | Tutorial
Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna is highly modular, and users can dynamically construct the search spaces for the hyperparameters.
News
- 2022-02-14 Pre-releases of Optuna 3.0 are available! Early adopters may want to upgrade and provide feedback for a smoother transition to the coming full release. You can install a pre-release version with `pip install -U --pre optuna`. Find the latest one here.
- 2021-10-11 The Optuna 3.0 roadmap has been published for review. Please take a look at the planned improvements to Optuna, and share your feedback in the GitHub issues. PR contributions are also welcome!
Key Features
Optuna offers the following modern functionalities:
- Lightweight, versatile, and platform agnostic architecture
- Handle a wide variety of tasks with a simple installation that has few requirements.
- Pythonic search spaces
- Define search spaces using familiar Python syntax including conditionals and loops.
- Efficient optimization algorithms
- Adopt state-of-the-art algorithms for sampling hyperparameters and efficiently pruning unpromising trials.
- Easy parallelization
- Scale studies to tens or hundreds of workers with little or no change to the code.
- Quick visualization
- Inspect optimization histories from a variety of plotting functions.
Basic Concepts
We use the terms study and trial as follows:
- Study: optimization based on an objective function
- Trial: a single execution of the objective function
Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., `regressor` and `svr_c`) through multiple trials (e.g., `n_trials=100`). Optuna is a framework designed to automate and accelerate optimization studies.
```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):
    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```
Examples
Examples can be found in optuna/optuna-examples.
Integrations
Integration modules, which allow pruning (early stopping) of unpromising trials, are available for the following libraries:
- AllenNLP
- Catalyst
- CatBoost
- Chainer
- FastAI (V1, V2)
- Keras
- LightGBM
- MXNet
- PyTorch
- PyTorch Ignite
- PyTorch Lightning
- TensorFlow
- tf.keras
- XGBoost
Web Dashboard (experimental)
The new web dashboard is under development at optuna-dashboard. It is still experimental, but already much better in many regards. Feature requests and bug reports are welcome!

The dashboard lets you manage studies and visualize results with interactive graphs.
Install `optuna-dashboard` via pip:
```
$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
```
Installation
Optuna is available at the Python Package Index and on Anaconda Cloud.
```
# PyPI
$ pip install optuna

# Anaconda Cloud
$ conda install -c conda-forge optuna
```
Optuna supports Python 3.6 or newer.
We also provide Optuna Docker images on DockerHub.
Communication
- GitHub Discussions for questions.
- GitHub Issues for bug reports and feature requests.
- Gitter for interactive chat with developers.
- Stack Overflow for questions.
Contribution
Any contributions to Optuna are more than welcome!
If you are new to Optuna, please check the good first issues. They are relatively simple, well defined, and often good starting points for getting familiar with the contribution workflow and the other developers.
If you have already contributed to Optuna, we recommend the other contribution-welcome issues.
For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.
Reference
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).
Project details
Download files
File details
Details for the file optuna-3.0.4.tar.gz.
File metadata
- Download URL: optuna-3.0.4.tar.gz
- Upload date:
- Size: 259.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.7.15
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | f3ce3126c40a2d8ec0a290861a622a91d0a73698942157a2589bbe648225134d |
| MD5 | e3b8e046efebf6f61b68287844e1ced1 |
| BLAKE2b-256 | 5581a6dfd94ebb8b2b0def38b00e7d31ca6306fd115a06cb76e1b8df527f09dd |
File details
Details for the file optuna-3.0.4-py3-none-any.whl.
File metadata
- Download URL: optuna-3.0.4-py3-none-any.whl
- Upload date:
- Size: 348.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.7.15
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 76164b57375734330d0efe23da67ac2dc58a641bc4abb0b26d02719115828f03 |
| MD5 | 033d9dca79283105347fd278df35fff6 |
| BLAKE2b-256 | cadaeaf2b38f5693eac792ecde27785c0b5fc9919430a0a62b007a358cc74dbe |