Miscellaneous MCMC samplers written in Aesara
AeMCMC is a Python library that automates the construction of samplers for Aesara graphs that represent statistical models.
Features
This project is currently in an alpha state, but its basic features and objectives are as follows:
Provide utilities that simplify the process of constructing Aesara graphs/functions for posterior and posterior predictive sampling
Host a wide array of “exact” posterior sampling steps (e.g. Gibbs steps, scale-mixture/decomposition-based conditional samplers, etc.)
Build a framework for identifying and composing said sampler steps and enumerating the possible samplers for an arbitrary model
Overall, we would like this project to serve as a hub for community-sourced specialized samplers and facilitate their general use.
Getting started
Using AeMCMC, one can construct sampling steps from a graph containing Aesara RandomVariables. AeMCMC analyzes the model graph, rewriting it if necessary, in order to find the most suitable sampler.
AeMCMC can recognize closed-form posteriors; for instance, the posterior of the following Beta-Binomial model amounts to sampling from a Beta distribution:
import aemcmc
import aesara
import aesara.tensor as at
srng = at.random.RandomStream(0)
p_rv = srng.beta(1., 1., name="p")
Y_rv = srng.binomial(10, p_rv, name="Y")
y_vv = Y_rv.clone()
y_vv.name = "y"
sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)
p_posterior_step = sampler.sample_steps[p_rv]
aesara.dprint(p_posterior_step)
# beta_rv{0, (0, 0), floatX, False}.1 [id A]
# |RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F77B2831200>) [id B]
# |TensorConstant{[]} [id C]
# |TensorConstant{11} [id D]
# |Elemwise{add,no_inplace} [id E]
# | |TensorConstant{1.0} [id F]
# | |y [id G]
# |Elemwise{sub,no_inplace} [id H]
# |Elemwise{add,no_inplace} [id I]
# | |TensorConstant{1.0} [id F]
# | |TensorConstant{10} [id J]
# |y [id G]
sample_fn = aesara.function([y_vv], p_posterior_step)
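Given an observed value of y, each call to sample_fn then returns a fresh draw from the closed-form posterior. The observed count below is a hypothetical value used only for illustration:
# With y = 4 successes out of 10 trials, this draws from the conjugate
# posterior Beta(1 + 4, 1 + 10 - 4) = Beta(5, 7); the value 4 is
# a hypothetical observation chosen for illustration
print(sample_fn(4))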
AeMCMC also contains a database of Gibbs samplers that can be used to sample some models more efficiently than a general-purpose sampler like NUTS would:
import aemcmc
import aesara
import aesara.tensor as at
srng = at.random.RandomStream(0)
X = at.matrix("X")
# Horseshoe prior for `beta_rv`
tau_rv = srng.halfcauchy(0, 1, name="tau")
lmbda_rv = srng.halfcauchy(0, 1, size=X.shape[1], name="lambda")
beta_rv = srng.normal(0, lmbda_rv * tau_rv, size=X.shape[1], name="beta")
a = at.scalar("a")
b = at.scalar("b")
h_rv = srng.gamma(a, b, name="h")
# Negative-binomial regression
eta = X @ beta_rv
p = at.sigmoid(-eta)
Y_rv = srng.nbinom(h_rv, p, name="Y")
y_vv = Y_rv.clone()
y_vv.name = "y"
sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)
# `sampler.sample_steps` contains the sample step for each random variable
print(sampler.sample_steps[h_rv])
# h_posterior
# `sampler.stages` contains the sampling kernels sorted by scan order
print(sampler.stages)
# {HorseshoeGibbsKernel: [tau, lambda], NBRegressionGibbsKernel: [beta], DispersionGibbsKernel: [h]}
# Build a function that returns new samples
to_sample_rvs = [tau_rv, lmbda_rv, beta_rv, h_rv]
inputs = [a, b, X, y_vv] + [initial_values[rv] for rv in to_sample_rvs]
outputs = [sampler.sample_steps[rv] for rv in to_sample_rvs]
sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
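A chain can then be simulated by repeatedly calling the compiled function and feeding each draw back in as the next state. The data and initial values below are hypothetical placeholders meant only to illustrate the call signature:
import numpy as np

# Hypothetical data: 100 observations and 2 predictors (illustration only)
rng = np.random.default_rng(0)
X_val = rng.normal(size=(100, 2))
y_val = rng.poisson(1.0, size=100)

# Initial values for tau, lambda, beta, and h, in the order of `to_sample_rvs`
tau_val, lmbda_val, beta_val, h_val = 1.0, np.ones(2), np.zeros(2), 1.0
for _ in range(100):
    tau_val, lmbda_val, beta_val, h_val = sample_fn(
        1.0, 1.0, X_val, y_val, tau_val, lmbda_val, beta_val, h_val
    )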
When no specialized sampler is found, AeMCMC assigns the NUTS sampler to the remaining variables, automatically reparametrizing the model to improve sampling if needed:
import aemcmc
import aesara
import aesara.tensor as at
srng = at.random.RandomStream(0)
mu_rv = srng.normal(0, 1, name="mu")
sigma_rv = srng.halfnormal(0.0, 1.0, name="sigma")
Y_rv = srng.normal(mu_rv, sigma_rv, name="Y")
y_vv = Y_rv.clone()
sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)
print(sampler.sample_steps.keys())
# dict_keys([sigma, mu])
print(sampler.stages)
# {NUTSKernel: [sigma, mu]}
print(sampler.parameters)
# {NUTSKernel: (step_size, inverse_mass_matrix)}
# Build a function that returns new samples
step_size, inverse_mass_matrix = list(sampler.parameters.values())[0]
inputs = [
initial_values[mu_rv],
initial_values[sigma_rv],
y_vv,
step_size,
inverse_mass_matrix
]
outputs = [sampler.sample_steps[mu_rv], sampler.sample_steps[sigma_rv]]
sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
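As before, the compiled function can be iterated to simulate a chain. The observed value, initial state, and NUTS tuning parameters below are hypothetical; in practice the step size and inverse mass matrix would come from a warmup/adaptation phase:
import numpy as np

# Hypothetical observation, initial state, and tuning values (illustration only)
y_val = 0.5
mu_val, sigma_val = 0.0, 1.0
for _ in range(100):
    mu_val, sigma_val = sample_fn(mu_val, sigma_val, y_val, 1e-2, np.ones(2))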
Installation
The latest release of AeMCMC can be installed from PyPI using pip:
pip install aemcmc
Or via conda-forge:
conda install -c conda-forge aemcmc
The current development branch of AeMCMC can be installed from GitHub, also using pip:
pip install git+https://github.com/aesara-devs/aemcmc