
Optuna: A hyperparameter optimization framework


Website | Docs | Install Guide | Tutorial | Examples

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to our define-by-run API, the code written with Optuna enjoys high modularity, and the user of Optuna can dynamically construct the search spaces for the hyperparameters.

Key Features

Optuna has modern functionalities as follows:

  • Lightweight, versatile, and platform-agnostic architecture
  • Pythonic search spaces
  • Efficient optimization algorithms
  • Easy parallelization
  • Quick visualization

Basic Concepts

We use the terms study and trial as follows:

  • Study: optimization based on an objective function
  • Trial: a single execution of the objective function

Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., regressor and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate such optimization studies.


import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):

    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('regressor', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_float('svr_c', 1e-10, 1e10, log=True)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.fetch_california_housing(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
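
After the optimization finishes, the best result can be read back from the study object; for example:

print(study.best_params)        # Best hyperparameter values, e.g. {'regressor': 'SVR', 'svr_c': ...}
print(study.best_value)         # Lowest mean squared error observed across the trials.
print(study.best_trial.number)  # Index of the trial that achieved it.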

Installation

Optuna is available at the Python Package Index and on Anaconda Cloud.

# PyPI
$ pip install optuna
# Anaconda Cloud
$ conda install -c conda-forge optuna

Optuna supports Python 3.7 or newer.

We also provide Optuna docker images on DockerHub.
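
Regardless of the installation method, a quick sanity check is to import the package and print its version:

import optuna

# Confirm the installation by printing the installed Optuna version.
print(optuna.__version__)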

Examples

Examples can be found in optuna/optuna-examples.

Integrations

Optuna has integration features with various third-party libraries. Integrations can be found in optuna/optuna-integration, and their documentation is available here. Supported libraries include, for example, LightGBM, XGBoost, PyTorch, TensorFlow/Keras, MLflow, and Weights & Biases.
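
As an illustration only (not taken from this page), here is a sketch of pruning unpromising trials with the LightGBM integration; it assumes LightGBM and the integration package are installed, and depending on your versions the callback may need to be imported from optuna_integration instead of optuna.integration:

import lightgbm as lgb
import optuna
import sklearn.datasets
import sklearn.metrics
import sklearn.model_selection
from optuna.integration import LightGBMPruningCallback


def objective(trial):
    X, y = sklearn.datasets.load_breast_cancer(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)
    dtrain = lgb.Dataset(X_train, label=y_train)
    dval = lgb.Dataset(X_val, label=y_val, reference=dtrain)

    params = {
        "objective": "binary",
        "metric": "binary_logloss",
        "verbosity": -1,
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "num_leaves": trial.suggest_int("num_leaves", 8, 256, log=True),
    }

    # The callback reports intermediate validation scores to the trial and
    # stops (prunes) the trial early when it looks unpromising.
    pruning_callback = LightGBMPruningCallback(trial, "binary_logloss")
    booster = lgb.train(params, dtrain, valid_sets=[dval], callbacks=[pruning_callback])

    y_pred = booster.predict(X_val)
    return sklearn.metrics.log_loss(y_val, y_pred)


study = optuna.create_study(direction="minimize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=50)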

Web Dashboard

Optuna Dashboard is a real-time web dashboard for Optuna. You can check the optimization history, hyperparameter importances, etc. in graphs and tables. You don't need to create a Python script to call Optuna's visualization functions. Feature requests and bug reports are welcome!


Install optuna-dashboard via pip:

$ pip install optuna-dashboard
$ optuna-dashboard sqlite:///db.sqlite3
...
Listening on http://localhost:8080/
Hit Ctrl-C to quit.
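
The sqlite:///db.sqlite3 URL above refers to a study persisted with Optuna's RDB storage. A minimal sketch of creating such a study (the study name and toy objective here are arbitrary examples):

import optuna

# Persist trials to a local SQLite file so that optuna-dashboard can read them.
study = optuna.create_study(
    study_name="example-study",      # arbitrary example name
    storage="sqlite:///db.sqlite3",
)
study.optimize(lambda trial: (trial.suggest_float("x", -10, 10) - 2) ** 2, n_trials=20)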

Communication

  • GitHub Discussions for questions.
  • GitHub Issues for bug reports and feature requests.

Contribution

Any contributions to Optuna are more than welcome!

If you are new to Optuna, please check the good first issues. They are relatively simple, well defined, and often good starting points for getting familiar with the contribution workflow and with other developers.

If you have already contributed to Optuna, we recommend the other contribution-welcome issues.

For general guidelines on how to contribute to the project, take a look at CONTRIBUTING.md.

Reference

Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optuna-3.5.0.tar.gz (323.5 kB)

Uploaded Source

Built Distribution

optuna-3.5.0-py3-none-any.whl (413.4 kB)

Uploaded Python 3

File details

Details for the file optuna-3.5.0.tar.gz.

File metadata

  • Download URL: optuna-3.5.0.tar.gz
  • Size: 323.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for optuna-3.5.0.tar.gz

  • SHA256: ca9e1ce16aa6c6a5af0e1cc1d0cbcd98eb1c75b6a2f06be6bd9c0c5ab0698724
  • MD5: 3bcc236279698a5de47d905b38f3cf0e
  • BLAKE2b-256: b6216f7e1253afad069d3266f746c01cb2e87f83abec6ae66e3756abdb2fa8a8

See more details on using hashes here.
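
If you want to verify a downloaded archive against the digests above programmatically, one straightforward way (assuming the archive sits in the current directory) is:

import hashlib

# Compare the SHA256 digest of the downloaded archive with the published value.
expected = "ca9e1ce16aa6c6a5af0e1cc1d0cbcd98eb1c75b6a2f06be6bd9c0c5ab0698724"
with open("optuna-3.5.0.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
print("OK" if actual == expected else "Hash mismatch!")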

File details

Details for the file optuna-3.5.0-py3-none-any.whl.

File metadata

  • Download URL: optuna-3.5.0-py3-none-any.whl
  • Size: 413.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.7

File hashes

Hashes for optuna-3.5.0-py3-none-any.whl

  • SHA256: 4c86bbcaeff9ad5b0758e87537793f66df8f9352246315d7121ea465724a44e8
  • MD5: d85a7d2703877583e156ecb8d4962a2b
  • BLAKE2b-256: 4c6a219a431aaf81b3eb3070fd2d58116baa366d3072f43bbcc87dc3495b7546

See more details on using hashes here.
