
Miscellaneous MCMC samplers written in Aesara

Project description


AeMCMC is a Python library that automates the construction of samplers for Aesara graphs that represent statistical models.

Features

This project is currently in an alpha state, but its basic features and objectives are as follows:

  • Provide utilities that simplify the process of constructing Aesara graphs/functions for posterior and posterior predictive sampling

  • Host a wide array of “exact” posterior sampling steps (e.g. Gibbs steps, scale-mixture/decomposition-based conditional samplers, etc.)

  • Build a framework for identifying and composing said sampler steps and enumerating the possible samplers for an arbitrary model

Overall, we would like this project to serve as a hub for community-sourced specialized samplers and facilitate their general use.

Getting started

Using AeMCMC, one can construct sampling steps from a graph containing Aesara RandomVariables. AeMCMC analyzes the model graph and, if necessary, rewrites it to find the most suitable sampler.

AeMCMC can recognize closed-form posteriors; for instance, sampling from the posterior of the following Beta-Binomial model amounts to sampling from a Beta distribution:

import aemcmc
import aesara
import aesara.tensor as at

srng = at.random.RandomStream(0)

p_rv = srng.beta(1., 1., name="p")
Y_rv = srng.binomial(10, p_rv, name="Y")

y_vv = Y_rv.clone()
y_vv.name = "y"

sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

p_posterior_step = sampler.sample_steps[p_rv]
aesara.dprint(p_posterior_step)
# beta_rv{0, (0, 0), floatX, False}.1 [id A]
#  |RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F77B2831200>) [id B]
#  |TensorConstant{[]} [id C]
#  |TensorConstant{11} [id D]
#  |Elemwise{add,no_inplace} [id E]
#  | |TensorConstant{1.0} [id F]
#  | |y [id G]
#  |Elemwise{sub,no_inplace} [id H]
#    |Elemwise{add,no_inplace} [id I]
#    | |TensorConstant{1.0} [id F]
#    | |TensorConstant{10} [id J]
#    |y [id G]

sample_fn = aesara.function([y_vv], p_posterior_step)
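
The graph above encodes the standard Beta-Binomial conjugate update, p | y ~ Beta(1 + y, 11 - y). As a minimal usage sketch (the observed count y = 6 is an arbitrary placeholder, and the function is recompiled with sampler.updates on the assumption that the RNG state must be advanced explicitly between calls):

import numpy as np

# Recompile with the sampler's RNG updates so repeated calls yield fresh draws
sample_fn = aesara.function([y_vv], p_posterior_step, updates=sampler.updates)

# Draw posterior samples of `p` given y = 6 successes out of n = 10 trials
samples = np.array([sample_fn(6) for _ in range(1000)])
print(samples.mean())  # should be close to the exact posterior mean 7/12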

AeMCMC also contains a database of Gibbs samplers that can be used to sample some models more efficiently than a general-purpose sampler like NUTS would:

import aemcmc
import aesara
import aesara.tensor as at

srng = at.random.RandomStream(0)

X = at.matrix("X")

# Horseshoe prior for `beta_rv`
tau_rv = srng.halfcauchy(0, 1, name="tau")
lmbda_rv = srng.halfcauchy(0, 1, size=X.shape[1], name="lambda")
beta_rv = srng.normal(0, lmbda_rv * tau_rv, size=X.shape[1], name="beta")

a = at.scalar("a")
b = at.scalar("b")
h_rv = srng.gamma(a, b, name="h")

# Negative-binomial regression
eta = X @ beta_rv
p = at.sigmoid(-eta)
Y_rv = srng.nbinom(h_rv, p, name="Y")

y_vv = Y_rv.clone()
y_vv.name = "y"

sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

# `sampler.sample_steps` contains the sample step for each random variable
print(sampler.sample_steps[h_rv])
# h_posterior

# `sampler.stages` contains the sampling kernels sorted by scan order
print(sampler.stages)
# {HorseshoeGibbsKernel: [tau, lambda], NBRegressionGibbsKernel: [beta], DispersionGibbsKernel: [h]}

# Build a function that returns new samples
to_sample_rvs = [tau_rv, lmbda_rv, beta_rv, h_rv]
inputs = [a, b, X, y_vv] + [initial_values[rv] for rv in to_sample_rvs]
outputs = [sampler.sample_steps[rv] for rv in to_sample_rvs]
sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
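
To run the Gibbs sampler, call the compiled function in a loop, feeding each iteration's output back in as the next iteration's state. Below is a minimal sketch; the simulated design matrix, observations, hyperparameter values, and initial chain state are all placeholder choices, not tuned for this model:

import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 5
X_val = rng.normal(size=(N, M))
y_val = rng.poisson(5.0, size=N)  # placeholder counts, not drawn from the model

# Placeholder initial values for (tau, lambda, beta, h), in `to_sample_rvs` order
state = [1.0, np.ones(M), np.zeros(M), 1.0]

samples = []
for _ in range(1000):
    state = sample_fn(1.0, 1.0, X_val, y_val, *state)  # a = b = 1.0
    samples.append(state)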

When no specialized sampler is found, AeMCMC assigns the NUTS sampler to the remaining variables, automatically reparametrizing the model to improve sampling if needed:

import aemcmc
import aesara
import aesara.tensor as at

srng = at.random.RandomStream(0)
mu_rv = srng.normal(0, 1, name="mu")
sigma_rv = srng.halfnormal(0.0, 1.0, name="sigma")
Y_rv = srng.normal(mu_rv, sigma_rv, name="Y")

y_vv = Y_rv.clone()

sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)

print(sampler.sample_steps.keys())
# dict_keys([sigma, mu])
print(sampler.stages)
# {NUTSKernel: [sigma, mu]}
print(sampler.parameters)
# {NUTSKernel: (step_size, inverse_mass_matrix)}

# Build a function that returns new samples
step_size, inverse_mass_matrix = list(sampler.parameters.values())[0]
inputs = [
    initial_values[mu_rv],
    initial_values[sigma_rv],
    y_vv,
    step_size,
    inverse_mass_matrix
]
outputs = [sampler.sample_steps[mu_rv], sampler.sample_steps[sigma_rv]]
sample_fn = aesara.function(inputs, outputs, updates=sampler.updates)
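
As before, the compiled NUTS kernel is iterated by feeding each draw back in as the next initial state. A minimal sketch follows; the observed value, initial state, step size, and identity inverse mass matrix are placeholder values that would normally be tuned during an adaptation phase:

import numpy as np

y_val = 0.5                       # placeholder observation
mu_val, sigma_val = 0.0, 1.0      # placeholder initial chain state
step_size_val = 1e-2              # untuned placeholder step size
imm_val = np.ones(2)              # one entry per free variable

for _ in range(1000):
    mu_val, sigma_val = sample_fn(
        mu_val, sigma_val, y_val, step_size_val, imm_val
    )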

Installation

The latest release of AeMCMC can be installed from PyPI using pip:

pip install aemcmc

Or via conda-forge:

conda install -c conda-forge aemcmc

The current development branch of AeMCMC can be installed from GitHub, also using pip:

pip install git+https://github.com/aesara-devs/aemcmc


Download files

Source distribution: aemcmc-nightly-0.0.7.dev20230106.tar.gz (43.3 kB)

File details

Hashes for aemcmc-nightly-0.0.7.dev20230106.tar.gz:

Algorithm    Hash digest
SHA256       f68cb0507ce0c0d41c908b33e260c32844e4cc72f223f5c3e18e6300901473ac
MD5          bb02ec1a75c1cfb0f94b67f0d5cc9910
BLAKE2b-256  e1c1d44cb5d082e4503ea625f5423a997dd387663217e9da1b2cbc9d9916c92d


