sbi: Simulation-Based Inference
Getting Started | Documentation
sbi is a Python package for simulation-based inference, designed to meet the needs of both researchers and practitioners. Whether you need fine-grained control or an easy-to-use interface, sbi has you covered.
With sbi, you can perform Bayesian parameter inference: given a simulator that models a real-world process, sbi estimates the full posterior distribution over the simulator's parameters from observed data. This distribution indicates the most likely parameter values while also quantifying uncertainty and revealing potential interactions between parameters.
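For example, a toy setup could look like the following sketch. The uniform prior and Gaussian-noise simulator here are illustrative assumptions, not part of sbi itself (BoxUniform is a convenience distribution shipped with sbi):

import torch
from sbi.utils import BoxUniform

# Illustrative prior over three parameters.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# Toy stand-in for a real-world simulator: observation = parameter + noise.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

theta = prior.sample((1000,))
x = simulator(theta)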
Key Features of sbi
sbi offers a blend of flexibility and ease of use:
- Low-Level Interfaces: For those who require maximum control over the inference process, sbi provides low-level interfaces that allow you to fine-tune many aspects of your workflow.
- High-Level Interfaces: If you prefer simplicity and efficiency, sbi also offers high-level interfaces that enable quick and easy implementation of complex inference tasks.
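As one illustration of the low-level control, a custom density estimator can be built with sbi's network factories and handed to the inference class. A minimal sketch, assuming the posterior_nn factory from sbi.neural_nets (available in recent sbi versions) and the prior defined above; the flow type and hyperparameters are arbitrary choices:

from sbi.inference import NPE
from sbi.neural_nets import posterior_nn

# Build a neural-spline-flow density estimator and pass it to NPE.
density_estimator_builder = posterior_nn(model="nsf", hidden_features=60, num_transforms=3)
inference = NPE(prior=prior, density_estimator=density_estimator_builder)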
In addition, sbi supports a wide range of state-of-the-art inference algorithms (see below for a list of implemented methods):
- Amortized Methods: These methods enable the reuse of posterior estimators across multiple observations without the need to retrain.
- Sequential Methods: These methods focus on individual observations, optimizing the number of simulations required.
Beyond inference, sbi also provides:
- Validation Tools: Built-in methods to validate and verify the accuracy of your inferred posteriors.
- Plotting and Analysis Tools: Comprehensive functions for visualizing and analyzing results, helping you interpret the posterior distributions with ease.
Getting started with sbi is straightforward, requiring only a few lines of code:
from sbi.inference import NPE
# Given: parameters theta and corresponding simulations x
inference = NPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()
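Once trained, the posterior can be conditioned on an observation and inspected with the built-in analysis tools. A minimal sketch, assuming an observation x_o with the same shape as a single simulation output:

from sbi.analysis import pairplot

# Draw posterior samples conditioned on the observation x_o.
samples = posterior.sample((1000,), x=x_o)

# Visualize marginals and pairwise marginals of the posterior samples.
fig, axes = pairplot(samples)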
Installation
sbi requires Python 3.9 or higher. While a GPU isn't necessary, it can improve performance in some cases. We recommend using a virtual environment with conda for an easy setup.
To install sbi, follow these steps:
- Create a Conda Environment (if using Conda):
conda create -n sbi_env python=3.9 && conda activate sbi_env
- Install sbi: Whether or not you are using conda, sbi can be installed using pip:
pip install sbi
- Test the installation: Open a Python prompt and run
from sbi.examples.minimal import simple
posterior = simple()
print(posterior)
Tutorials
If you're new to sbi, we recommend starting with our Getting Started tutorial.
You can also access and run these tutorials directly in your browser by opening a Codespace. To do so, click the green “Code” button on the GitHub repository and select “Open with Codespaces.” This provides a fully functional environment where you can explore sbi through Jupyter notebooks.
Inference Algorithms
The following inference algorithms are currently available. You can find instructions on how to run each of these methods here.
Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)
- (S)NPE_A (including amortized single-round NPE) from Papamakarios G and Murray I, Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation (NeurIPS 2016).
- (S)NPE_C or APT from Greenberg D, Nonnenmacher M, and Macke J, Automatic Posterior Transformation for likelihood-free inference (ICML 2019).
- TSNPE from Deistler M, Goncalves P, and Macke J, Truncated proposals for scalable and hassle-free simulation-based inference (NeurIPS 2022).
- FMPE from Wildberger J, Dax M, Buchholz S, Green S, Macke JH, and Schölkopf B, Flow matching for scalable simulation-based inference (NeurIPS 2023).
- NPSE from Geffner T, Papamakarios G, and Mnih A, Compositional score modeling for simulation-based inference (ICML 2023).
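The sequential variants above focus the simulation budget on a single observation by re-training over several rounds. A minimal multi-round sketch, assuming a prior, a callable simulator, and an observation x_o (none of which are provided by sbi itself):

from sbi.inference import NPE

inference = NPE(prior=prior)
proposal = prior
for _ in range(2):  # two rounds as an example
    theta = proposal.sample((500,))
    x = simulator(theta)
    # Passing the proposal lets later rounds correct for the non-prior proposal.
    inference.append_simulations(theta, x, proposal=proposal).train()
    posterior = inference.build_posterior()
    # Focus the next round's simulations on the observation of interest.
    proposal = posterior.set_default_x(x_o)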
Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)
- (S)NLE or just SNL from Papamakarios G, Sterratt DC, and Murray I, Sequential Neural Likelihood (AISTATS 2019).
Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)
- (S)NRE_A or AALR from Hermans J, Begy V, and Louppe G, Likelihood-free Inference with Amortized Approximate Likelihood Ratios (ICML 2020).
- (S)NRE_B or SRE from Durkan C, Murray I, and Papamakarios G, On Contrastive Learning for Likelihood-free Inference (ICML 2020).
- (S)NRE_C or NRE-C from Miller BK, Weniger C, and Forré P, Contrastive Neural Ratio Estimation (NeurIPS 2022).
- BNRE from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G, Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation (NeurIPS 2022).
Neural Variational Inference: amortized (NVI) and sequential (SNVI)
- SNVI from Glöckler M, Deistler M, and Macke J, Variational methods for simulation-based inference (ICLR 2022).
Mixed Neural Likelihood Estimation (MNLE)
- MNLE from Boelts J, Lueckmann JM, Gao R, and Macke J, Flexible and efficient simulation-based inference for models of decision-making (eLife 2022).
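All of the estimators listed above share the same basic interface, so switching methods usually amounts to swapping the import. A minimal sketch for a likelihood-based method, reusing the prior, theta, and x from the getting-started example; note that, unlike NPE, the posterior built from NLE is sampled with MCMC under the hood:

from sbi.inference import NLE

# Same workflow as NPE: append simulations, train, build the posterior.
inference = NLE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()  # sampling uses MCMC by default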
Feedback and Contributions
We welcome any feedback on how sbi is working for your inference problems (see Discussions) and are happy to receive bug reports, pull requests, and other contributions (see contribute). We wish to maintain a positive community; please read our Code of Conduct.
Acknowledgments
sbi is the successor (using PyTorch) of the delfi package. It started as a fork of Conor M. Durkan's lfi. sbi runs as a community project; see also credits.
Support
sbi has been supported by the German Federal Ministry of Education and Research (BMBF) through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A), and the Tübingen AI Center (FKZ 01IS18039A). Since 2024, sbi is supported by the appliedAI Institute for Europe.
License
Apache License Version 2.0 (Apache-2.0)
Citation
If you use sbi, consider citing the sbi software paper, in addition to the original research articles describing the specific sbi algorithm(s) you are using.
@article{tejero-cantero2020sbi,
doi = {10.21105/joss.02505},
url = {https://doi.org/10.21105/joss.02505},
year = {2020},
publisher = {The Open Journal},
volume = {5},
number = {52},
pages = {2505},
author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
title = {sbi: A toolkit for simulation-based inference},
journal = {Journal of Open Source Software}
}
The above citation refers to the original version of the sbi project and has a persistent DOI. Additionally, new releases of sbi are citable via Zenodo, where we create a new DOI for every release.
File details
Details for the file sbi-0.23.2.tar.gz.
File metadata
- Download URL: sbi-0.23.2.tar.gz
- Upload date:
- Size: 497.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 96b6dea2acb410d871bb2ea5b6c0b487aa6d270ac35900b4859981af4a2eab7c
MD5 | 67bbbf0a85590bc1efa35b3bbfecd484
BLAKE2b-256 | 45398cb92c60202bb225f2ace04a2d8b4fca8595c6af5dc7379e52422be4d1bc
File details
Details for the file sbi-0.23.2-py3-none-any.whl.
File metadata
- Download URL: sbi-0.23.2-py3-none-any.whl
- Upload date:
- Size: 364.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4ac0d75513ff569fb0e55a4ae236a2a1c829b8215de3d31037ad1db1e69696c2
MD5 | 1b37263d25e4a3c699742bc5f21286a4
BLAKE2b-256 | 5c0cf2cedca72693f063ddd4aa504431696a3bacd0ead763ca42065a35cd0180