
Lightweight pipelining: using Python functions as pipeline jobs.

Project description

Joblib is a set of tools to provide lightweight pipelining in Python. In particular, joblib offers:

  1. transparent disk-caching of the output values and lazy re-evaluation (memoize pattern)

  2. easy, simple parallel computing

  3. logging and tracing of the execution

Joblib is optimized to be fast and robust, in particular on large data, and has specific optimizations for numpy arrays. It is BSD-licensed.

User documentation:

http://packages.python.org/joblib

Download packages:

http://pypi.python.org/pypi/joblib#downloads

Source code:

http://github.com/joblib/joblib

Report issues:

http://github.com/joblib/joblib/issues

Vision

The vision is to provide tools that make it easy to achieve better performance and reproducibility when working with long-running jobs. In addition, Joblib can be used as a lightweight make replacement or caching solution.

  • Avoid computing the same thing twice: code is rerun over and over, for instance when prototyping computationally heavy jobs (as in scientific development), but hand-crafted solutions to alleviate this issue are error-prone and often lead to unreproducible results

  • Persist to disk transparently: efficiently persisting arbitrary objects containing large data is hard. In addition, hand-written persistence does not easily link the file on disk to the execution context of the original Python object. As a result, it is challenging to resume an application's state or a computational job, e.g. after a crash (see the sketch below)

Joblib strives to address these problems while leaving your code and your flow control as unmodified as possible (no framework, no new paradigms).
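
As a minimal sketch of this kind of transparent persistence, assuming the joblib.dump and joblib.load helpers (which joblib exposes for efficient pickling of objects containing large numpy arrays; the file path and the job state below are purely illustrative):

    >>> import numpy as np
    >>> import joblib
    >>> state = {'weights': np.ones((1000, 10)), 'iteration': 42}  # hypothetical job state
    >>> _ = joblib.dump(state, '/tmp/state.pkl')   # arrays are stored efficiently on disk
    >>> restored = joblib.load('/tmp/state.pkl')   # resume after a crash or a restart
    >>> restored['iteration']
    42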

Main features

  1. Transparent and fast disk-caching of output values: a memoize or make-like functionality for Python functions that works well for arbitrary Python objects, including very large numpy arrays. Separate persistence and flow-execution logic from domain logic or algorithmic code by writing the operations as a set of steps with well-defined inputs and outputs: Python functions. Joblib can save their computation to disk and rerun it only if necessary:

    >>> from joblib import Memory
    >>> mem = Memory(cachedir='/tmp/joblib')
    >>> import numpy as np
    >>> a = np.vander(np.arange(3))
    >>> square = mem.cache(np.square)
    >>> b = square(a)                                   # doctest: +ELLIPSIS
    ________________________________________________________________________________
    [Memory] Calling square...
    square(array([[0, 0, 1],
           [1, 1, 1],
           [4, 2, 1]]))
    ___________________________________________________________square - 0...s, 0.0min
    
    >>> c = square(a)
    >>> # The above call did not trigger an evaluation
  2. Embarrassingly parallel helper: to make it easy to write readable parallel code and to debug it quickly:

    >>> from joblib import Parallel, delayed
    >>> from math import sqrt
    >>> Parallel(n_jobs=1)(delayed(sqrt)(i**2) for i in range(10))
    [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
    
  3. Logging/tracing: the different functionalities will progressively acquire better logging mechanisms to help track what has been run and to capture I/O easily. In addition, Joblib will provide a few I/O primitives to easily define logging and display streams, and a way of compiling a report. We want to be able to quickly inspect what has been run (a short sketch of the verbosity controls already available follows below).
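
Today, both Memory and Parallel accept a verbose argument that controls how much of the execution is traced. The sketch below assumes these keyword arguments; the exact messages printed vary between versions:

    >>> from joblib import Memory, Parallel, delayed
    >>> from math import sqrt
    >>> import numpy as np
    >>> mem = Memory(cachedir='/tmp/joblib', verbose=2)   # higher verbosity reports cache activity
    >>> b = mem.cache(np.square)(np.arange(3))            # doctest: +SKIP
    >>> Parallel(n_jobs=2, verbose=5)(delayed(sqrt)(i**2) for i in range(10))  # doctest: +SKIP
    [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]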


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

joblib-0.5.7a.dev.tar.gz (210.2 kB)

Built Distributions

joblib-0.5.7a.dev-py2.7.egg (105.0 kB)

joblib-0.5.7a.dev-py2.6.egg (105.1 kB)

File details

Details for the file joblib-0.5.7a.dev.tar.gz.

File metadata

  • Download URL: joblib-0.5.7a.dev.tar.gz
  • Upload date:
  • Size: 210.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for joblib-0.5.7a.dev.tar.gz

  Algorithm     Hash digest
  SHA256        df3f06707c3beeeff6dcd190360980d400d91cedd727caa0f8e1e71969e4c21d
  MD5           42cb4660b80d9f0f18c6a2dccac31dac
  BLAKE2b-256   1b107b7e4094b9bd3d21a5415874a7d854cef9bb7529449d8ba3f316a0bd2c11


File details

Details for the file joblib-0.5.7a.dev-py2.7.egg.

File metadata

File hashes

Hashes for joblib-0.5.7a.dev-py2.7.egg

  Algorithm     Hash digest
  SHA256        32dd8fbed7808837aa6572dc2f4ea0b09189c20b0f5ba64a8f774696f3bd8d11
  MD5           70857f49516d548ec03934f4c68dc5c9
  BLAKE2b-256   9d313ae1dee098fd491541d10f290610a74bbde5e4b9d39957b70ba8baee109d


File details

Details for the file joblib-0.5.7a.dev-py2.6.egg.

File metadata

File hashes

Hashes for joblib-0.5.7a.dev-py2.6.egg

  Algorithm     Hash digest
  SHA256        7f49fd0522d35ab190a7c77e3c2d135d95fe25311032c2ed1d6d9f7a8573da2c
  MD5           707838dad9abe2f09a740c47a148c526
  BLAKE2b-256   159c631d51a06a3691829a4a1b74ab5591995a9dfe001b6e294fb2f46fd25851

