sbi: simulation-based inference
Getting Started | Documentation
sbi is a PyTorch package for simulation-based inference. Simulation-based inference is the process of finding the parameters of a simulator from observations. sbi takes a Bayesian approach and returns a full posterior distribution over the parameters of the simulator, conditional on the observations.
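Stated in standard Bayesian terms (this framing is general, not specific to sbi): given an observation x_o, the goal is to approximate the posterior over simulator parameters θ,

$$ p(\theta \mid x_o) \propto p(x_o \mid \theta)\, p(\theta), $$

where the prior p(θ) encodes what is assumed about the parameters, and the likelihood p(x_o | θ) is defined only implicitly by the simulator: it can be sampled from by running the simulator, but not evaluated.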
The package implements a variety of inference algorithms, including amortized and sequential methods.
Amortized methods return a posterior that can be applied to many different observations without retraining; sequential methods focus the inference on one particular observation to be more simulation-efficient.
See below for an overview of implemented methods.
sbi offers a simple interface for one-line posterior inference:
from sbi.inference import infer
# import your simulator, define your prior over the parameters
parameter_posterior = infer(simulator, prior, method='SNPE', num_simulations=100)
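The returned posterior can then be conditioned on an observation and sampled from. Below is a hedged sketch that fleshes out the one-liner with a toy Gaussian simulator and a uniform prior (both stand-ins for your own model); the sample and log_prob calls follow the posterior interface as of sbi 0.22:

```python
import torch

from sbi.inference import infer
from sbi.utils import BoxUniform

# Toy stand-ins for a real simulator and prior: three parameters,
# uniform prior on [-2, 2]^3, Gaussian observation noise.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Train an amortized posterior with a single call.
posterior = infer(simulator, prior, method="SNPE", num_simulations=1000)

# Condition on an observation and draw posterior samples.
x_o = torch.zeros(3)
samples = posterior.sample((1000,), x=x_o)
log_probs = posterior.log_prob(samples, x=x_o)
```

Because the posterior is amortized, the same trained posterior can be conditioned on a different observation without retraining.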
Installation
sbi requires Python 3.6 or higher. We recommend using a conda virtual environment (see the Miniconda installation instructions). If conda is installed on the system, an environment for installing sbi can be created as follows:
# Create an environment for sbi (indicate Python 3.6 or higher); activate it
$ conda create -n sbi_env python=3.7 && conda activate sbi_env
Independent of whether you are using conda or not, sbi can be installed using pip:
pip install sbi
To test the installation, drop into a Python prompt and run
from sbi.examples.minimal import simple
posterior = simple()
print(posterior)
Inference Algorithms
The following algorithms are currently available. You can find a tutorial on how to run each of these methods here.
Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)
- SNPE_A (including amortized single-round NPE) from Papamakarios G and Murray I, Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation (NeurIPS 2016).
- SNPE_C or APT from Greenberg D, Nonnenmacher M, and Macke J, Automatic Posterior Transformation for likelihood-free inference (ICML 2019).
- TSNPE from Deistler M, Goncalves P, and Macke J, Truncated proposals for scalable and hassle-free simulation-based inference (NeurIPS 2022).
Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)
- SNLE_A or just SNL from Papamakarios G, Sterratt DC, and Murray I, Sequential Neural Likelihood (AISTATS 2019).
Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)
- (S)NRE_A or AALR from Hermans J, Begy V, and Louppe G, Likelihood-free Inference with Amortized Approximate Likelihood Ratios (ICML 2020).
- (S)NRE_B or SRE from Durkan C, Murray I, and Papamakarios G, On Contrastive Learning for Likelihood-free Inference (ICML 2020).
- BNRE from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G, Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation (NeurIPS 2022).
- (S)NRE_C or NRE-C from Miller BK, Weniger C, and Forré P, Contrastive Neural Ratio Estimation (NeurIPS 2022).
Neural Variational Inference, amortized (NVI) and sequential (SNVI)
- SNVI from Glöckler M, Deistler M, and Macke J, Variational methods for simulation-based inference (ICLR 2022).
Mixed Neural Likelihood Estimation (MNLE)
- MNLE from Boelts J, Lueckmann JM, Gao R, and Macke J, Flexible and efficient simulation-based inference for models of decision-making (eLife 2022).
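Each of these methods can also be run through sbi's flexible interface, in which simulation and training are separated. The following is a hedged sketch of multi-round (sequential) SNPE that focuses the posterior on a single observation x_o; the prior, simulator, and x_o are placeholders, and the calls follow the flexible interface as of sbi 0.22:

```python
import torch

from sbi.inference import SNPE, prepare_for_sbi, simulate_for_sbi
from sbi.utils import BoxUniform

# Placeholder prior and simulator (substitute your own).
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

simulator, prior = prepare_for_sbi(simulator, prior)
x_o = torch.zeros(3)  # the observation to focus inference on

inference = SNPE(prior=prior)
proposal = prior
for _ in range(2):  # round 1 simulates from the prior, round 2 from the focused proposal
    theta, x = simulate_for_sbi(simulator, proposal, num_simulations=500)
    density_estimator = inference.append_simulations(theta, x, proposal=proposal).train()
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_o)

samples = posterior.sample((1000,), x=x_o)
```

The other method families (e.g. SNLE, SNRE) follow a similar simulate/append/train pattern; see the tutorials for method-specific details.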
Feedback and Contributions
We welcome any feedback on how sbi is working for your inference problems (see Discussions) and are happy to receive bug reports, pull requests, and other feedback (see contribute). We wish to maintain a positive community; please read our Code of Conduct.
Acknowledgements
sbi is the successor (using PyTorch) of the delfi package. It was started as a fork of Conor M. Durkan's lfi. sbi runs as a community project; development is coordinated at the mackelab. See also credits.
Support
sbi has been supported by the German Federal Ministry of Education and Research (BMBF) through the project ADIMEM (FKZ 01IS18052 A-D). ADIMEM is a collaborative project between the groups of Jakob Macke (Uni Tübingen), Philipp Berens (Uni Tübingen), Philipp Hennig (Uni Tübingen), and Marcel Oberlaender (caesar Bonn), which aims to develop inference methods for mechanistic models.
License
Affero General Public License v3 (AGPLv3)
Citation
If you use sbi, consider citing the sbi software paper, in addition to the original research articles describing the specific sbi algorithm(s) you are using.
@article{tejero-cantero2020sbi,
doi = {10.21105/joss.02505},
url = {https://doi.org/10.21105/joss.02505},
year = {2020},
publisher = {The Open Journal},
volume = {5},
number = {52},
pages = {2505},
author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
title = {sbi: A toolkit for simulation-based inference},
journal = {Journal of Open Source Software}
}
The above citation refers to the original version of the sbi project and has a persistent DOI. Additionally, new releases of sbi are citable via Zenodo, where we create a new DOI for every release.