
sbi 0.22.0: Simulation-based inference.

Project description


sbi: simulation-based inference

Getting Started | Documentation

sbi is a PyTorch package for simulation-based inference: the task of inferring the parameters of a simulator from observed data.

sbi takes a Bayesian approach and returns a full posterior distribution over the parameters of the simulator, conditional on the observations. The package implements a variety of inference algorithms, including amortized and sequential methods. Amortized methods return a posterior that can be applied to many different observations without retraining; sequential methods focus the inference on one particular observation to be more simulation-efficient. See below for an overview of implemented methods.

sbi offers a simple interface for one-line posterior inference:

from sbi.inference import infer
# import your simulator, define your prior over the parameters
parameter_posterior = infer(simulator, prior, method='SNPE', num_simulations=100)
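The returned posterior can then be conditioned on data and sampled. Below is a minimal sketch, assuming parameter_posterior was built as above and that observation is a hypothetical tensor with the shape of a single simulator output:

import torch

# Hypothetical observation; its shape must match a single simulator output.
observation = torch.zeros(3)

# Draw 1000 posterior samples conditioned on this observation ...
samples = parameter_posterior.sample((1000,), x=observation)

# ... and evaluate their posterior log-probability.
log_probability = parameter_posterior.log_prob(samples, x=observation)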

Installation

sbi requires Python 3.6 or higher. We recommend using a conda virtual environment (Miniconda installation instructions). If conda is installed on the system, an environment for installing sbi can be created as follows:

# Create an environment for sbi (indicate Python 3.6 or higher); activate it
$ conda create -n sbi_env python=3.7 && conda activate sbi_env

Whether or not you use conda, sbi can be installed with pip:

pip install sbi

To test the installation, drop into a Python prompt and run:

from sbi.examples.minimal import simple
# Runs a built-in toy example end-to-end and returns the trained posterior.
posterior = simple()
print(posterior)

Inference Algorithms

The following algorithms are currently available. You can find a tutorial on how to run each of these methods here.

Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)

Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)

Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)

Neural Variational Inference: amortized (NVI) and sequential (SNVI)

Mixed Neural Likelihood Estimation (MNLE)
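Each of these methods can also be run through its own class in sbi.inference, which gives more control than the one-line infer interface. The snippet below is a sketch of this flexible interface using (S)NPE on a hypothetical toy simulator; the prior, simulator, and simulation budget are illustrative only, not part of the sbi API:

import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Hypothetical toy problem: three parameters, a box-uniform prior, and a
# simulator that adds Gaussian noise to the parameters.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Simulate a training set of (parameter, data) pairs.
theta = prior.sample((500,))
x = simulator(theta)

# Train the neural posterior estimator and build a posterior object.
inference = SNPE(prior=prior)
density_estimator = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior(density_estimator)

# Condition on a (hypothetical) observation and draw samples.
samples = posterior.sample((1000,), x=torch.zeros(3))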

Feedback and Contributions

We welcome any feedback on how sbi is working for your inference problems (see Discussions) and are happy to receive bug reports, pull requests, and other contributions (see contribute). We wish to maintain a positive community; please read our Code of Conduct.

Acknowledgements

sbi is the successor (using PyTorch) of the delfi package. It was started as a fork of Conor M. Durkan's lfi. sbi runs as a community project; development is coordinated at the mackelab. See also credits.

Support

sbi has been supported by the German Federal Ministry of Education and Research (BMBF) through the project ADIMEM (FKZ 01IS18052 A-D). ADIMEM is a collaborative project between the groups of Jakob Macke (Uni Tübingen), Philipp Berens (Uni Tübingen), Philipp Hennig (Uni Tübingen), and Marcel Oberlaender (caesar Bonn), which aims to develop inference methods for mechanistic models.

License

Affero General Public License v3 (AGPLv3)

Citation

If you use sbi, consider citing the sbi software paper in addition to the original research articles describing the specific sbi algorithm(s) you are using.

@article{tejero-cantero2020sbi,
  doi = {10.21105/joss.02505},
  url = {https://doi.org/10.21105/joss.02505},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {52},
  pages = {2505},
  author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
  title = {sbi: A toolkit for simulation-based inference},
  journal = {Journal of Open Source Software}
}

The above citation refers to the original version of the sbi project and has a persistent DOI. Additionally, new releases of sbi are citable via Zenodo, where we create a new DOI for every release.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

sbi-0.22.0.tar.gz (417.4 kB, Source)

Built Distribution

sbi-0.22.0-py2.py3-none-any.whl (272.3 kB, Python 2/3)

File details

Details for the file sbi-0.22.0.tar.gz.

File metadata

  • Download URL: sbi-0.22.0.tar.gz
  • Upload date:
  • Size: 417.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for sbi-0.22.0.tar.gz

  • SHA256: e632994c0bcfbc63c110d6eb04c6e1bdf6ae4ca42211d67ab4946d9b394dc360
  • MD5: 89625ebcf087814a26a4fe903f4b7091
  • BLAKE2b-256: 46add6494f2d3c38d9f5834027fe7f002930188776b7ac9abd14e5417e326dcd

See more details on using hashes here.
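To verify a downloaded archive against the digests listed above, one option (not specific to sbi) is Python's standard-library hashlib; the file name and expected SHA256 value below are taken from the listing above:

import hashlib

# Compute the SHA256 digest of the downloaded source archive and compare
# it with the value published on this page.
with open("sbi-0.22.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "e632994c0bcfbc63c110d6eb04c6e1bdf6ae4ca42211d67ab4946d9b394dc360"
print("OK" if digest == expected else "hash mismatch")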

File details

Details for the file sbi-0.22.0-py2.py3-none-any.whl.

File metadata

  • Download URL: sbi-0.22.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 272.3 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for sbi-0.22.0-py2.py3-none-any.whl

  • SHA256: 26dd81d3e1220c4ca16a33fc0779e18e35c296f97e7fae588712f17a412c058e
  • MD5: 01da769270ef4d79363ef8575729348a
  • BLAKE2b-256: 21e5e43c1b47be6ec16cf0fd43c6819ec3e7e52dfbfe0561f8b9a88b3f120f0d

See more details on using hashes here.
