Optuna: A hyperparameter optimization framework

Website | Docs | Install Guide | Tutorial

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna is highly modular, and users can dynamically construct hyperparameter search spaces.
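
Because search spaces are built at runtime, ordinary Python control flow can shape them. The snippet below is a minimal sketch (the parameter names are invented for illustration): the number of suggested parameters is itself a hyperparameter.

import optuna

def objective(trial):
    # The number of layers is itself a hyperparameter; per-layer widths are
    # only suggested for trials that sample that many layers.
    n_layers = trial.suggest_int("n_layers", 1, 3)
    units = [trial.suggest_int(f"units_l{i}", 4, 128, log=True) for i in range(n_layers)]
    # Toy objective standing in for real model training: prefer small networks.
    return sum(units)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)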

News

  • 2021-12-06 First alpha version of Optuna 3.0 is released! Early adopters may want to upgrade and provide feedback for a smoother transition to the coming major release. Try pip install optuna==3.0.0a0.

  • 2021-10-11 Optuna 3.0 Roadmap published for review. Please take a look at the planned improvements to Optuna, and share your feedback in the GitHub issues. PR contributions are also welcome!

  • 2021-07-14 Please take a few minutes to fill in this survey and let us know how you use Optuna now and what improvements you'd like. 🤔 All questions are optional. 🙇‍♂️ https://forms.gle/mCAttqxVg5oUifKV8

Key Features

Optuna has modern functionalities as follows:

  • Lightweight, versatile, and platform-agnostic architecture: handle a wide variety of tasks with a simple installation that has few requirements.
  • Pythonic search spaces: define search spaces using familiar Python syntax, including conditionals and loops.
  • Efficient optimization algorithms: state-of-the-art algorithms for sampling hyperparameters and pruning unpromising trials.
  • Easy parallelization: scale studies to tens or hundreds of workers with little or no change to the code (see the sketch after this list).
  • Quick visualization: inspect optimization histories with a variety of plotting functions.
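
As a sketch of the parallelization model (the study name and storage URL below are illustrative, not prescribed by Optuna): multiple worker processes share one study through a relational-database storage backend, and each optimize call pulls new trials from the shared study.

import optuna

# Run this same script in several processes; the workers coordinate through
# the shared RDB storage. load_if_exists=True lets every worker attach to the
# same study instead of failing when it already exists.
study = optuna.create_study(
    study_name="shared-example",     # illustrative name
    storage="sqlite:///example.db",  # illustrative storage URL; any RDB URL works
    load_if_exists=True,
)
study.optimize(lambda trial: (trial.suggest_float("x", -10, 10) - 2) ** 2, n_trials=50)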

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., classifier and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate such optimization studies.

import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
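
Once optimization finishes, the best trial can be read back from the study object (best_params and best_value are standard Study attributes):

print(study.best_params)  # Best hyperparameter values found, e.g. which regressor was chosen.
print(study.best_value)   # Lowest mean squared error observed across the trials.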

Examples

Examples can be found in optuna/optuna-examples.

Integrations

Integration modules, which allow pruning (early stopping) of unpromising trials, are available as callbacks for libraries such as PyTorch, TensorFlow/Keras, XGBoost, and LightGBM; see the documentation for the full list.
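
These callbacks are thin wrappers around Optuna's built-in pruning interface. A minimal sketch of that interface, with a decaying toy loss standing in for a real training loop:

import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    loss = 1.0
    for step in range(100):
        loss *= 1.0 - lr  # Toy "training" step; a real objective would train a model here.
        trial.report(loss, step)  # Report the intermediate value at this step.
        if trial.should_prune():  # The study's pruner decides from the reported values.
            raise optuna.TrialPruned()
    return loss

study = optuna.create_study(direction="minimize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)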

Web Dashboard (experimental)

The new web dashboard is under development at optuna-dashboard. It is still experimental, but much better in many regards. Feature requests and bug reports are welcome!

(Screenshots: managing studies; visualizing results with interactive graphs.)

Install optuna-dashboard via pip:

$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.

Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.6 or newer.

We also provide Optuna Docker images on DockerHub.

Communication

  • GitHub Issues for bug reports, feature requests, and questions.
  • Gitter for interactive chat with developers.

Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple and well defined, and are good starting points for getting familiar with the contribution workflow and with the other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.

Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).

