
Bayesian Optimization in PyTorch


BoTorch is a library for Bayesian Optimization built on PyTorch.

BoTorch is currently in beta and under active development!

Why BoTorch?

BoTorch

  • Provides a modular and easily extensible interface for composing Bayesian optimization primitives, including probabilistic models, acquisition functions, and optimizers.
  • Harnesses the power of PyTorch, including auto-differentiation, native support for highly parallelized modern hardware (e.g. GPUs) using device-agnostic code, and a dynamic computation graph.
  • Supports Monte Carlo-based acquisition functions via the reparameterization trick, which makes it straightforward to implement new ideas without having to impose restrictive assumptions about the underlying model.
  • Enables seamless integration with deep and/or convolutional architectures in PyTorch.
  • Has first-class support for state-of-the-art probabilistic models in GPyTorch, including support for multi-task Gaussian Processes (GPs), deep kernel learning, deep GPs, and approximate inference.
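The reparameterization trick mentioned above can be illustrated in plain PyTorch (a minimal sketch, independent of BoTorch's actual MC acquisition machinery): samples from N(mu, sigma^2) are written as mu + sigma * eps with eps drawn from a fixed standard normal, so gradients of a Monte Carlo estimate flow through the samples to the distribution's parameters.

```python
import torch

# Illustrative sketch of the reparameterization trick (not BoTorch's actual API):
# samples from N(mu, sigma^2) are written as mu + sigma * eps with eps ~ N(0, 1),
# keeping a Monte Carlo estimate differentiable w.r.t. mu and sigma.
mu = torch.tensor([0.5], requires_grad=True)
log_sigma = torch.tensor([0.0], requires_grad=True)

eps = torch.randn(4096, 1)              # base samples, fixed w.r.t. the parameters
samples = mu + log_sigma.exp() * eps    # differentiable pathwise samples

# MC estimate of an expected-improvement-style utility E[max(f(x) - best_f, 0)]
mc_value = (samples - 0.3).clamp_min(0.0).mean()
mc_value.backward()                     # gradients flow through the samples
```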

Target Audience

The primary audience for hands-on use of BoTorch is researchers and sophisticated practitioners in Bayesian optimization and AI. We recommend using BoTorch as a low-level API for implementing new algorithms for Ax. Ax is designed to be an easy-to-use platform for end users, while remaining flexible enough for Bayesian optimization researchers to plug into; it handles feature transformations, (meta-)data management, storage, and so on. We recommend that end users who are not actively doing research on Bayesian optimization simply use Ax.

Installation

Installation Requirements

  • Python >= 3.8
  • PyTorch >= 1.11
  • gpytorch == 1.9.0
  • linear_operator == 0.2.0
  • pyro-ppl >= 1.8.2
  • scipy
  • multipledispatch
Installing the latest release

The latest release of BoTorch is easily installed either via Anaconda (recommended):

conda install botorch -c pytorch -c gpytorch -c conda-forge

or via pip:

pip install botorch

You can customize your PyTorch installation (e.g., CUDA version or a CPU-only build) by following the PyTorch installation instructions.

Important note for macOS users:

  • Make sure your PyTorch build is linked against MKL (a non-MKL build of BoTorch can be up to an order of magnitude slower in some settings). Setting this up manually on macOS can be tricky; to ensure it works properly, please follow the PyTorch installation instructions.
  • If you need CUDA on macOS, you will need to build PyTorch from source. Please consult the PyTorch installation instructions above.
Installing from latest main branch

If you would like to try our bleeding-edge features (and don't mind potentially running into the occasional bug here or there), you can install the latest development version directly from GitHub. If you also want to install the current gpytorch and linear_operator development versions, you will need to ensure that the ALLOW_LATEST_GPYTORCH_LINOP environment variable is set:

pip install --upgrade git+https://github.com/cornellius-gp/linear_operator.git
pip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git
export ALLOW_LATEST_GPYTORCH_LINOP=true
pip install --upgrade git+https://github.com/pytorch/botorch.git

Manual / Dev install

Alternatively, you can do a manual install. For a basic install, run:

git clone https://github.com/pytorch/botorch.git
cd botorch
pip install -e .

To customize the installation, you can also run the following variants of the above:

  • pip install -e .[dev]: Also installs all tools necessary for development (testing, linting, docs building; see Contributing below).
  • pip install -e .[tutorials]: Also installs all packages necessary for running the tutorial notebooks.

Getting Started

Here's a quick rundown of the main components of a Bayesian optimization loop. For more details, see our Documentation and the Tutorials.

  1. Fit a Gaussian Process model to data
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(10, 2)
Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)  # explicit output dimension
Y += 0.1 * torch.rand_like(Y)
train_Y = (Y - Y.mean()) / Y.std()

gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)
  2. Construct an acquisition function
from botorch.acquisition import UpperConfidenceBound

UCB = UpperConfidenceBound(gp, beta=0.1)
  3. Optimize the acquisition function
from botorch.optim import optimize_acqf

bounds = torch.stack([torch.zeros(2), torch.ones(2)])
candidate, acq_value = optimize_acqf(
    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)

Citing BoTorch

If you use BoTorch, please cite the following paper:

M. Balandat, B. Karrer, D. R. Jiang, S. Daulton, B. Letham, A. G. Wilson, and E. Bakshy. BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. Advances in Neural Information Processing Systems 33, 2020.

@inproceedings{balandat2020botorch,
  title={{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},
  author={Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year={2020},
  url = {http://arxiv.org/abs/1910.06403}
}

See here for an incomplete selection of peer-reviewed papers that build off of BoTorch.

Contributing

See the CONTRIBUTING file for how to help out.

License

BoTorch is MIT licensed, as found in the LICENSE file.
