

Project description

Optuna: A hyperparameter optimization framework


Website | Docs | Install Guide | Tutorial

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to the define-by-run API, code written with Optuna enjoys high modularity, and users can dynamically construct the search spaces for hyperparameters.

News

  • 2021-12-06 The first alpha version of Optuna 3.0 has been released! Early adopters may want to upgrade and provide feedback for a smoother transition to the coming major release. Try pip install optuna==3.0.0a0.

  • 2021-10-11 The Optuna 3.0 Roadmap has been published for review. Please take a look at the planned improvements to Optuna and share your feedback in the GitHub issues. PR contributions are also welcome!

  • 2021-07-14 Please take a few minutes to fill in this survey, and let us know how you use Optuna now and what improvements you'd like.🤔 All questions optional. 🙇‍♂️ https://forms.gle/mCAttqxVg5oUifKV8

Key Features

Optuna has modern functionalities as follows:

  • Lightweight, versatile, and platform-agnostic architecture: handle a wide variety of tasks with a simple installation that has few requirements.
  • Pythonic search spaces: define search spaces using familiar Python syntax, including conditionals and loops.
  • Efficient optimization algorithms: adopt state-of-the-art algorithms for sampling hyperparameters and efficiently pruning unpromising trials.
  • Easy parallelization: scale studies to tens or hundreds of workers with little or no change to the code.
  • Quick visualization: inspect optimization histories from a variety of plotting functions.

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., classifier and svm_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate optimization studies.

Open in Colab

import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
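
Once the study finishes, its results can be read off the study object. A minimal follow-up to the snippet above, using attributes of Optuna's Study class:

best_params = study.best_params  # Best hyperparameter values found, e.g. {'classifier': 'SVR', 'svr_c': ...}
best_value = study.best_value    # Best (lowest) objective value observed across all trials.
best_trial = study.best_trial    # The trial that produced the best value.
print(best_params, best_value)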

Examples

Examples can be found in optuna/optuna-examples.

Integrations

Integration modules, which enable pruning (early stopping) of unpromising trials, are available for many popular libraries, including LightGBM, XGBoost, and PyTorch Lightning; see the documentation for the full list.
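
These integrations build on Optuna's core pruning API: an objective reports intermediate values and asks whether the trial should be stopped. The following is a rough, self-contained sketch of that pattern (the inner loop is only a stand-in for real training steps):

import optuna

def objective(trial):
    lr = trial.suggest_float('lr', 1e-5, 1e-1, log=True)
    error = 1.0
    for step in range(100):
        error *= 1.0 - lr              # placeholder for one training step
        trial.report(error, step)      # report an intermediate objective value
        if trial.should_prune():       # let the pruner stop an unpromising trial early
            raise optuna.TrialPruned()
    return error

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)

Library-specific callbacks in optuna.integration wrap these same report-and-prune calls around each framework's own training loop.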

Web Dashboard (experimental)

A new web dashboard is under development at optuna-dashboard. It is still experimental, but already much better in many regards. Feature requests and bug reports are welcome!

[Screenshots: manage studies and visualize results with interactive graphs]

Install optuna-dashboard via pip:

$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
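
The dashboard reads trials from an Optuna storage. For instance, a study persisted to the SQLite file referenced above can be created as follows (the study name here is only an illustration):

import optuna

# Store trials in the same SQLite file that optuna-dashboard points at.
study = optuna.create_study(
    study_name='example-study',        # hypothetical name, pick your own
    storage='sqlite:///db.sqlite3',
)
study.optimize(objective, n_trials=100)  # objective defined as in the sample code above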

Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.6 or newer.

We also provide Optuna Docker images on DockerHub.

Communication

  • GitHub Discussions for questions.
  • GitHub Issues for bug reports and feature requests.
  • Gitter for interactive chat with developers.

Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple, well defined, and often good starting points for getting familiar with the contribution workflow and other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.

Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optuna-3.0.0a1.tar.gz (241.3 kB)

Uploaded Source

Built Distribution

optuna-3.0.0a1-py3-none-any.whl (320.4 kB)

Uploaded Python 3

File details

Details for the file optuna-3.0.0a1.tar.gz.

File metadata

  • Download URL: optuna-3.0.0a1.tar.gz
  • Upload date:
  • Size: 241.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.7.12

File hashes

Hashes for optuna-3.0.0a1.tar.gz

  • SHA256: 599a8afeaaca15130ae0bc50e50e30cd6b8e90efe9f46a1729a136415927ed4b
  • MD5: 9b15d07611ccc0830c4c84cbff87b70c
  • BLAKE2b-256: feb66d906e443a407634aeaf4c09dac46e3363deae4aef37de2ab29474e452ca

See more details on using hashes here.

File details

Details for the file optuna-3.0.0a1-py3-none-any.whl.

File metadata

  • Download URL: optuna-3.0.0a1-py3-none-any.whl
  • Upload date:
  • Size: 320.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.10.1 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.7.12

File hashes

Hashes for optuna-3.0.0a1-py3-none-any.whl

  • SHA256: 36d00b71d6474d45d36131cb31f7e7f68960ffde36d428daf5a0e3bc65925ed3
  • MD5: a76ca9da22104c600cfd2a4196b351bd
  • BLAKE2b-256: f2cbe54dd6fea4f25091fc9ec62524bacdea57baaed66ce4fec610c6ec4ff2f0

See more details on using hashes here.
