
Project description


sampled

Decorator for reusable models in PyMC3

Provides syntactic sugar for reusable models in PyMC3, letting you separate defining a generative model from using it.

Here is an example of creating a model:

import numpy as np
import pymc3 as pm
from sampled import sampled

@sampled
def linear_model(X, y):
    shape = X.shape
    # X is used both as data (to set shapes and empirical moments) and as the
    # name of a variable; the values supplied when the model is used become
    # observations on the variables with matching names ('X' and 'y' here).
    X = pm.Normal('X', mu=np.mean(X, axis=0), sd=np.std(X, axis=0), shape=shape)
    coefs = pm.Normal('coefs', mu=np.zeros(shape[1]), sd=np.ones(shape[1]), shape=shape[1])
    pm.Normal('y', mu=np.dot(X, coefs), sd=np.ones(shape[0]), shape=shape[0])

Now here is how to use the model:

X = np.random.normal(size=(1000, 10))
w = np.random.normal(size=10)
y = X.dot(w) + np.random.normal(scale=0.1, size=1000)

with linear_model(X=X, y=y):
    sampled_coefs = pm.sample(draws=1000, tune=500)

np.allclose(sampled_coefs.get_values('coefs').mean(axis=0), w, atol=0.1) # True
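Because the model body reads shapes and empirical moments from the data it is given, the same decorated function can be reused on a new dataset without redefining anything. The second dataset below (X2, y2) is purely illustrative:

X2 = np.random.normal(size=(500, 10))
y2 = X2.dot(w) + np.random.normal(scale=0.1, size=500)

with linear_model(X=X2, y=y2):
    more_coefs = pm.sample(draws=1000, tune=500)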

You can also use this to build graphical networks – here is a continuous version of the STUDENT example from Koller and Friedman’s “Probabilistic Graphical Models”, chapter 3:

import theano.tensor as tt  # needed for tt.sqrt below

@sampled
def student():
    difficulty = pm.Beta('difficulty', alpha=5, beta=5)
    intelligence = pm.Beta('intelligence', alpha=5, beta=5)
    SAT = pm.Beta('SAT', alpha=20 * intelligence, beta=20 * (1 - intelligence))
    grade_avg = 0.5 + 0.5 * tt.sqrt((1 - difficulty) * intelligence)
    grade = pm.Beta('grade', alpha=20 * grade_avg, beta=20 * (1 - grade_avg))
    recommendation = pm.Binomial('recommendation', n=1, p=0.7 * grade)

Observations may be passed into any node, and we can observe how that changes posterior expectations:

# no prior knowledge
with student():
    prior = pm.sample(draws=1000, tune=500)

prior.get_values('recommendation').mean()  # 0.502

# 99th percentile SAT score --> higher chance of a recommendation
with student(SAT=0.99):
    good_sats = pm.sample(draws=1000, tune=500)

good_sats.get_values('recommendation').mean()  # 0.543

# A good grade in a hard class --> very high chance of recommendation
with student(difficulty=0.99, grade=0.99):
    hard_class_good_grade = pm.sample(draws=1000, tune=500)

hard_class_good_grade.get_values('recommendation').mean()  # 0.705
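All of the examples above use the same calling convention: calling the decorated function builds a fresh model to sample in, with keyword arguments acting as observations on the variables of the same name. As a rough sketch only (not the package's actual implementation, and leaving out the observed-data machinery), a decorator with this shape could be written as:

import functools
import inspect
import pymc3 as pm

def sampled_sketch(model_fn):
    """Illustrative only -- not the package's real code.  Calling the
    decorated function builds a fresh pm.Model, runs the model body inside
    it, and returns the model for use as a `with` block.  The real decorator
    additionally attaches keyword arguments as observed data on the
    variables of the same name, which this sketch omits."""
    @functools.wraps(model_fn)
    def wrapper(**kwargs):
        model = pm.Model()
        # Forward only the keywords the model function actually declares.
        accepted = set(inspect.signature(model_fn).parameters)
        with model:
            model_fn(**{k: v for k, v in kwargs.items() if k in accepted})
        return model
    return wrapper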

References

  • Koller, Daphne, and Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. MIT Press, 2009.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
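If you just want to use the package, it can normally be installed straight from PyPI (for example with pip install sampled) rather than by downloading these files by hand.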

Source Distribution

sampled-0.1.2.tar.gz (3.4 kB)

Built Distribution

sampled-0.1.2-py2.py3-none-any.whl (4.9 kB)

File details

Details for the file sampled-0.1.2.tar.gz.

File metadata

  • Download URL: sampled-0.1.2.tar.gz
  • Upload date:
  • Size: 3.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for sampled-0.1.2.tar.gz

  • SHA256: aeedde577c4e787a91d5ae3510d2d0733c9d72f381e75fee1c8d98fb365ad4c9
  • MD5: 32453efcc964198e5eb6ade8cabc00f4
  • BLAKE2b-256: 9aff9ba6fce1125f1cd2b9a9daeb49d8a0e0c868661c652e8e560ae46eb614fa

See more details on using hashes here.
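For example, the SHA256 digest above can be checked against a downloaded copy with Python's hashlib (the local file path is assumed):

import hashlib

expected = "aeedde577c4e787a91d5ae3510d2d0733c9d72f381e75fee1c8d98fb365ad4c9"
with open("sampled-0.1.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == expected)  # True if the download is intact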

File details

Details for the file sampled-0.1.2-py2.py3-none-any.whl.

File hashes

Hashes for sampled-0.1.2-py2.py3-none-any.whl

  • SHA256: 9aa5c6a96f63862d344f65c9fd1fbbe34cf61057cde4c3401da21f22e1468ebb
  • MD5: f1e6901fb10e0e304ff30d51d8443d38
  • BLAKE2b-256: 53f12f9de51aee389c3932d682b5a9cfdd3acc55d5d1cb8069449b68340755bb

See more details on using hashes here.
