
Sequential model-based optimization toolbox.

Project description


Scikit-Optimize

Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt aims to be accessible and easy to use in many contexts.

The library is built on top of NumPy, SciPy and Scikit-Learn.

scikit-optimize does not perform gradient-based optimization. For gradient-based optimization algorithms, see scipy.optimize.

Figure: approximated objective function after 50 iterations of gp_minimize, plotted using skopt.plots.plot_objective.

Install

The latest released version of scikit-optimize is v0.5, which you can install with:

pip install numpy # explicitly install this first
pip install scikit-optimize

In addition there is a conda-forge package for version 0.3 of scikit-optimize:

conda install -c conda-forge scikit-optimize

Using conda-forge is probably the easiest way to install scikit-optimize on Windows.

Getting started

Find the minimum of the noisy function f(x) over the range -2 < x < 2 with skopt:

import numpy as np
from skopt import gp_minimize

def f(x):
    return (np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2)) +
            np.random.randn() * 0.1)

res = gp_minimize(f, [(-2.0, 2.0)])
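
For orientation, here is a dependency-free sketch of what such a minimization returns conceptually: a best input and its objective value (skopt exposes these as res.x and res.fun on the result object). This sketch uses plain random search as a stand-in, not the Gaussian-process surrogate that gp_minimize actually uses, and the helper random_search_minimize is hypothetical:

```python
import math
import random

random.seed(42)

def f(x):
    # the same noisy 1-D objective as above, in pure Python
    return (math.sin(5 * x[0]) * (1 - math.tanh(x[0] ** 2))
            + random.gauss(0, 0.1))

def random_search_minimize(func, bounds, n_calls=100):
    # evaluate the objective at random points and keep the best one seen
    best_x, best_y = None, math.inf
    for _ in range(n_calls):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        y = func(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

best_x, best_y = random_search_minimize(f, [(-2.0, 2.0)])
print(best_x, best_y)
```

Model-based methods such as gp_minimize aim to find a comparably good point in far fewer objective evaluations, which is what makes them attractive for expensive functions.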

For more control over the optimization loop you can use the skopt.Optimizer class:

from skopt import Optimizer

opt = Optimizer([(-2.0, 2.0)])

for i in range(20):
    suggested = opt.ask()
    y = f(suggested)
    opt.tell(suggested, y)
    print('iteration:', i, suggested, y)
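
The ask/tell protocol itself is independent of the surrogate model: ask() proposes a point, tell() reports the observed objective value back. A minimal stand-in class that honors the same interface (random proposals instead of skopt's model-based ones, and a hypothetical x^2 objective) can illustrate the contract:

```python
import random

random.seed(0)

class RandomOptimizer:
    # minimal ask/tell interface: propose random points, remember the best
    def __init__(self, bounds):
        self.bounds = bounds
        self.best_x, self.best_y = None, float("inf")

    def ask(self):
        # propose a point uniformly at random within the bounds
        return [random.uniform(lo, hi) for lo, hi in self.bounds]

    def tell(self, x, y):
        # record the observation, keeping track of the best point so far
        if y < self.best_y:
            self.best_x, self.best_y = x, y

opt = RandomOptimizer([(-2.0, 2.0)])
for i in range(20):
    suggested = opt.ask()
    y = suggested[0] ** 2          # stand-in objective: minimize x^2
    opt.tell(suggested, y)

print(opt.best_x, opt.best_y)
```

Because the loop owns both the proposal and the evaluation steps, this pattern makes it easy to add custom stopping rules, parallel evaluation, or logging between ask() and tell().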

Read our introduction to Bayesian optimization and the other examples.

Development

The library is still experimental and under heavy development. Check out the next milestone to see the plans for the next release, or look at some easy issues to get started contributing.

The development version can be installed through:

git clone https://github.com/scikit-optimize/scikit-optimize.git
cd scikit-optimize
pip install -e .

Run all tests by executing pytest in the top level directory.

To run only the subset of tests with short run times, use pytest -m 'fast_test' (pytest -m 'slow_test' is also possible). To exclude all slow-running tests, use pytest -m 'not slow_test'.

This is implemented using pytest markers. If a test runs longer than 1 second, it is marked as slow; otherwise it is marked as fast.
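
The classification rule above is simple enough to sketch directly; the test names and durations below are hypothetical, chosen only to illustrate the 1-second threshold:

```python
# hypothetical per-test durations, in seconds
durations = {
    "test_gp_minimize": 3.2,
    "test_space_api": 0.05,
    "test_optimizer_ask_tell": 0.4,
}

def marker(seconds):
    # tests running longer than 1 second are marked slow, the rest fast
    return "slow_test" if seconds > 1.0 else "fast_test"

slow = sorted(name for name, t in durations.items() if marker(t) == "slow_test")
print(slow)  # ['test_gp_minimize']
```

Running pytest -m 'not slow_test' would then skip only the tests in the slow group.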

All contributors are welcome!

Commercial support

Feel free to get in touch if you need commercial support or would like to sponsor development. Resources go towards paying for additional work by seasoned engineers and researchers.

Made possible by

The scikit-optimize project was made possible with the support of

Wild Tree Tech NYU Center for Data Science NSF Northrop Grumman

If your employer allows you to work on scikit-optimize during the day and would like recognition, feel free to add them to the “Made possible by” list.

Project details


Download files

Download the file for your platform.

Source Distribution

scikit-optimize-0.5.tar.gz (62.7 kB)

Uploaded as a source distribution.

Built Distribution

scikit_optimize-0.5-py2.py3-none-any.whl (73.2 kB)

Uploaded for Python 2 and Python 3.

File details

Details for the file scikit-optimize-0.5.tar.gz.


File hashes

Hashes for scikit-optimize-0.5.tar.gz:

SHA256: c20f9ae162698e16c2f8476863780b198ad8d5d7f67d7dbf153be2338fdd7712
MD5: 6fc5a9af50b0b424fc2334beac2b4b16
BLAKE2b-256: 335badbb49e726a45df1b8980b51b351e0842ec8b83c98173f4c43020af79084


File details

Details for the file scikit_optimize-0.5-py2.py3-none-any.whl.


File hashes

Hashes for scikit_optimize-0.5-py2.py3-none-any.whl:

SHA256: 4b47873cf3d29e920367dd31b3192ee15c182eef44f7d3704b88e6a33954e51d
MD5: a45f39a20ae1c81431a6df79258e1a84
BLAKE2b-256: e811ed9e7ae1da513158fab2bfffaf8753fc0dda48de3206bdb3949570bd8db9
