An implementation of Gaussian Processes in PyTorch
GPyTorch (Beta Release)
News!
- The Beta release is currently out! Note that it requires PyTorch >= 1.3
- If you need the alpha release (though we recommend using the latest version!), check out the alpha release.
GPyTorch is a Gaussian process library implemented using PyTorch. GPyTorch is designed for creating scalable, flexible, and modular Gaussian process models with ease.
Internally, GPyTorch differs from many existing approaches to GP inference by performing all inference operations using modern numerical linear algebra techniques like preconditioned conjugate gradients. Implementing a scalable GP method is as simple as providing a matrix multiplication routine for the kernel matrix and its derivative via our LazyTensor interface, or by composing many of our existing LazyTensors. This allows not only for easy implementation of popular scalable GP techniques, but often also for significantly improved utilization of GPU computing compared to solvers based on the Cholesky decomposition.
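To illustrate the core idea, here is a plain-Python sketch of the conjugate gradients method, which solves K x = y using only matrix-vector products with K (the routine a LazyTensor supplies) rather than a Cholesky factorization. This is illustrative only, not GPyTorch's actual implementation, which is batched, preconditioned, and GPU-aware:

```python
# Illustrative sketch: conjugate gradients needs only matrix-vector
# products with K, never an explicit factorization of K.

def matvec(K, v):
    """Matrix-vector product K v; the one operation a LazyTensor must provide."""
    return [sum(K[i][j] * v[j] for j in range(len(v))) for i in range(len(K))]

def conjugate_gradients(K, y, tol=1e-10, max_iter=100):
    """Solve K x = y for symmetric positive-definite K."""
    n = len(y)
    x = [0.0] * n
    r = y[:]                       # residual r = y - K x (x starts at 0)
    p = r[:]                       # search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Kp = matvec(K, p)
        alpha = rs_old / sum(p[i] * Kp[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Kp[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:           # residual small enough: converged
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Solve with a small symmetric positive-definite "kernel" matrix.
K = [[4.0, 1.0], [1.0, 3.0]]
y = [1.0, 2.0]
x = conjugate_gradients(K, y)      # x ≈ [1/11, 7/11]
```

Because only `matvec` touches K, the same solver works whether K is stored densely, built from structured (e.g. Kronecker or Toeplitz) pieces, or never materialized at all.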
GPyTorch provides (1) significant GPU acceleration (through MVM based inference); (2) state-of-the-art implementations of the latest algorithmic advances for scalability and flexibility (SKI/KISS-GP, stochastic Lanczos expansions, LOVE, SKIP, stochastic variational deep kernel learning, ...); (3) easy integration with deep learning frameworks.
Examples and Tutorials
See our numerous examples and tutorials on how to construct all sorts of models in GPyTorch. These example notebooks and a walkthrough of GPyTorch are also available at our ReadTheDocs page here.
Installation
Requirements:
- Python >= 3.6
- PyTorch >= 1.3
N.B. GPyTorch will not run on PyTorch 0.4.1 or earlier versions.
First make sure that you have PyTorch (`>= 1.3`) installed using the appropriate command from here.
Then install GPyTorch using pip or conda:
pip install gpytorch
conda install gpytorch -c gpytorch
To use packages globally but install GPyTorch as a user-only package, use `pip install --user` above.
Latest (unstable) version
To get the latest (unstable) version, run
pip install git+https://github.com/cornellius-gp/gpytorch.git
Citing Us
If you use GPyTorch, please cite the following paper:
@inproceedings{gardner2018gpytorch,
title={GPyTorch: Blackbox Matrix-Matrix Gaussian Process Inference with GPU Acceleration},
author={Gardner, Jacob R and Pleiss, Geoff and Bindel, David and Weinberger, Kilian Q and Wilson, Andrew Gordon},
booktitle={Advances in Neural Information Processing Systems},
year={2018}
}
Documentation
- For tutorials and examples, check out the examples folder.
- For in-depth documentation, check out our read the docs.
Development
To run the unit tests:
python -m unittest
By default, the random seeds are locked down for some of the tests. If you want to run the tests without locking down the seed, run
UNLOCK_SEED=true python -m unittest
Please lint the code with `flake8`:
pip install flake8 # if not already installed
flake8
The Team
GPyTorch is primarily maintained by:
- Jake Gardner (Uber AI Labs)
- Geoff Pleiss (Cornell University)
- Kilian Weinberger (Cornell University)
- Andrew Gordon Wilson (Cornell University)
- Max Balandat (Facebook)
Acknowledgements
Development of GPyTorch is supported by funding from the Bill and Melinda Gates Foundation, the National Science Foundation, and SAP.
Source Distribution
File details
Details for the file gpytorch-0.3.6.tar.gz.
File metadata
- Download URL: gpytorch-0.3.6.tar.gz
- Upload date:
- Size: 208.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.7.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2819eabc121e1e91cc0d30c941debf45ded4a92f85bbefa86d4c346cbd7d23fc
MD5 | 77f1a252201a11be3fbe0d2eb4384cf5
BLAKE2b-256 | 7ac7d8ec508a68c42bfc4fd68ef450b5b9379370b2b3679fab2381e3eea695c0