Flexible and fast sampling in Python

BlackJAX

[Animation: sampling the BlackJAX logo with BlackJAX]

What is BlackJAX?

BlackJAX is a library of samplers for JAX that works on CPU as well as GPU.

It is not a probabilistic programming library (PPL). However, it integrates really well with PPLs, as long as they can provide a (potentially unnormalized) log-probability density function compatible with JAX.

Who should use BlackJAX?

BlackJAX should appeal to those who:

  • Have a logpdf and just need a sampler;
  • Need more than a general-purpose sampler;
  • Want to sample on GPU;
  • Want to build upon robust elementary blocks for their research;
  • Are building a probabilistic programming language;
  • Want to learn how sampling algorithms work.

Quickstart

Installation

You can install BlackJAX using pip:

pip install blackjax

or via conda-forge:

conda install -c conda-forge blackjax

Nightly builds (bleeding edge) of BlackJAX can also be installed using pip:

pip install blackjax-nightly

BlackJAX is written in pure Python but depends on XLA via JAX. By default, the version of JAX installed along with BlackJAX runs your code on CPU only. If you want to use BlackJAX on GPU/TPU, we recommend you follow the JAX installation instructions to install JAX with the relevant hardware acceleration support.
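
For example, on a Linux machine with CUDA 12, installing a GPU-enabled JAX before BlackJAX typically looks like the sketch below; the exact extra depends on your CUDA version and JAX release, so check the JAX documentation for your platform first:

pip install -U "jax[cuda12]"
pip install blackjax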

Example

Let us look at a simple self-contained example sampling with NUTS:

import jax
import jax.numpy as jnp
import jax.scipy.stats as stats
import numpy as np

import blackjax

observed = np.random.normal(10, 20, size=1_000)
def logdensity_fn(x):
    logpdf = stats.norm.logpdf(observed, x["loc"], x["scale"])
    return jnp.sum(logpdf)

# Build the kernel
step_size = 1e-3
inverse_mass_matrix = jnp.array([1., 1.])
nuts = blackjax.nuts(logdensity_fn, step_size, inverse_mass_matrix)

# Initialize the state
initial_position = {"loc": 1., "scale": 2.}
state = nuts.init(initial_position)

# Iterate
rng_key = jax.random.key(0)
for step in range(100):
    nuts_key = jax.random.fold_in(rng_key, step)
    state, _ = nuts.step(nuts_key, state)

See the documentation for more examples of how to use the library: how to write inference loops for one or several chains, how to use the Stan warmup, etc.
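
For instance, a compiled single-chain inference loop can be written with jax.lax.scan, and Stan-style warmup is exposed as blackjax.window_adaptation. The sketch below reuses logdensity_fn, initial_position, state and rng_key from the example above; it reflects the API of recent BlackJAX releases, but double-check the documentation for the version you have installed:

import jax

import blackjax

def inference_loop(rng_key, kernel, initial_state, num_samples):
    # One transition per key; jax.lax.scan compiles the whole loop.
    def one_step(state, rng_key):
        state, info = kernel(rng_key, state)
        return state, (state, info)

    keys = jax.random.split(rng_key, num_samples)
    _, (states, infos) = jax.lax.scan(one_step, initial_state, keys)
    return states, infos

warmup_key, sample_key = jax.random.split(rng_key)

# Tune the step size and inverse mass matrix with Stan's window adaptation
warmup = blackjax.window_adaptation(blackjax.nuts, logdensity_fn)
(warm_state, parameters), _ = warmup.run(warmup_key, initial_position, num_steps=1_000)

# Sample with the tuned kernel
nuts = blackjax.nuts(logdensity_fn, **parameters)
states, infos = inference_loop(sample_key, nuts.step, warm_state, 1_000)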

Philosophy

What is BlackJAX?

BlackJAX bridges the gap between "one-liner" frameworks and modular, customizable libraries.

Users can import the library and interact with robust, well-tested and performant samplers with a few lines of code. These samplers are aimed at PPL developers, or people who have a logpdf and just need a sampler that works.

But the true strength of BlackJAX lies in its internals and how they can be used to experiment quickly with existing or new sampling schemes. This lower level exposes the building blocks of inference algorithms: integrators, proposals, momentum generators, etc., and makes it easy to combine them to build new algorithms. It offers an opportunity to accelerate research on sampling algorithms by providing robust, performant and reusable code.
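
As a concrete (and hedged) illustration: recent BlackJAX releases expose their symplectic integrators under blackjax.mcmc.integrators, and the HMC constructor accepts an integrator argument. Exact module paths and signatures have varied between versions, so treat the following as a sketch rather than a stable API reference; logdensity_fn and inverse_mass_matrix are reused from the quickstart example above.

import blackjax
from blackjax.mcmc import integrators

# The same HMC algorithm, but integrating Hamilton's equations with
# Yoshida's fourth-order scheme instead of the default velocity Verlet.
hmc = blackjax.hmc(
    logdensity_fn,
    step_size=1e-3,
    inverse_mass_matrix=inverse_mass_matrix,
    num_integration_steps=10,
    integrator=integrators.yoshida,
)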

Why BlackJAX?

Sampling algorithms are too often tightly integrated into PPLs rather than decoupled from the rest of the framework, which makes them hard to use for people who do not need the modeling language to build their logpdf. Their implementations are, most of the time, monolithic, making it impossible to reuse parts of an algorithm to build custom kernels. BlackJAX solves both problems.

How does it work?

BlackJAX makes it possible to build arbitrarily complex algorithms because it is built around a very general pattern: everything that takes a state and returns a state is a transition kernel, and is implemented as:

new_state, info = kernel(rng_key, state)

Kernels are stateless functions that all follow the same API; the state and the information related to the transition are returned separately, so kernels can easily be composed and exchanged. We specialize these kernels by closure instead of passing parameters.
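
To make the pattern concrete, here is a minimal toy kernel written in this style: a hypothetical Gaussian random-walk Metropolis kernel (not part of the library), with its parameters baked in by closure.

import jax
import jax.numpy as jnp

def build_rw_kernel(logdensity_fn, sigma):
    # logdensity_fn and sigma are captured by closure: the returned
    # kernel only ever sees (rng_key, state), like every other kernel.
    def kernel(rng_key, state):
        position, logdensity = state
        key_proposal, key_accept = jax.random.split(rng_key)
        proposal = position + sigma * jax.random.normal(key_proposal, position.shape)
        proposal_logdensity = logdensity_fn(proposal)
        # Metropolis accept/reject step
        do_accept = jnp.log(jax.random.uniform(key_accept)) < proposal_logdensity - logdensity
        new_position = jnp.where(do_accept, proposal, position)
        new_logdensity = jnp.where(do_accept, proposal_logdensity, logdensity)
        return (new_position, new_logdensity), {"is_accepted": do_accept}

    return kernel

logdensity_fn = lambda x: -0.5 * jnp.sum(x**2)  # standard normal target
position = jnp.zeros(2)
kernel = build_rw_kernel(logdensity_fn, sigma=0.5)
new_state, info = kernel(jax.random.key(0), (position, logdensity_fn(position)))

Because the kernel is a pure function of (rng_key, state), it can be jitted, vmapped over chains, or dropped into an inference loop like the one sketched in the Quickstart, unchanged.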

Contributions

Please follow our short guide.

Citing BlackJAX

To cite this repository:

@misc{cabezas2024blackjax,
      title={BlackJAX: Composable {B}ayesian inference in {JAX}},
      author={Alberto Cabezas and Adrien Corenflos and Junpeng Lao and Rémi Louf},
      year={2024},
      eprint={2402.10797},
      archivePrefix={arXiv},
      primaryClass={cs.MS}
}

In the above BibTeX entry, names are in alphabetical order; the version number should be the last tag on the main branch.

Acknowledgements

Some details of the NUTS implementation were largely inspired by NumPyro's.
