
Appendable key-value storage

Project description


Key-value byte store with appendable values

Partd stores key-value pairs. Values are raw bytes. New values are appended onto existing values.

Partd excels at shuffle-style workloads, where many processes append small pieces of data to many keys and each key's accumulated bytes are later read back in one piece.

Operations

Partd has two main operations: append and get.

Example

  1. Create a Partd backed by a directory:

    >>> import partd
    >>> p = partd.File('/path/to/new/dataset/')
  2. Append key-byte pairs to the dataset:

    >>> p.append({'x': b'Hello ', 'y': b'123'})
    >>> p.append({'x': b'world!', 'y': b'456'})
  3. Get the bytes associated with keys:

    >>> p.get('x')         # One key
    b'Hello world!'
    
    >>> p.get(['y', 'x'])  # List of keys
    [b'123456', b'Hello world!']
  4. Destroy the partd dataset:

    >>> p.drop()

That’s it.

Implementations

We can back a partd with an in-memory dictionary:

>>> p = Dict()

For larger amounts of data, or to share data between processes, we back a partd with a directory of files. This uses file-based locks for consistency:

>>> p = File('/path/to/dataset/')

However, this approach handles many small writes poorly. In these cases you may wish to buffer one partd with another, keeping a fixed maximum amount of data in the buffering partd. The buffer writes the larger elements of the first partd to the second partd when space runs low:

>>> p = Buffer(Dict(), File(), available_memory=2e9)  # 2GB memory buffer
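Beyond construction, a Buffer is used through the same append/get interface as any other partd. The sketch below is illustrative only: the 1e6 threshold is an arbitrary value chosen for the example, and File() with no path is assumed to manage its own temporary directory, as in the snippet above.

>>> import partd
>>> p = partd.Buffer(partd.Dict(), partd.File(), available_memory=1e6)  # illustrative 1MB threshold
>>> p.append({'x': b'Hello ', 'y': b'123'})
>>> p.append({'x': b'world!', 'y': b'456'})
>>> p.get('x')   # served from memory or disk, whichever holds the key
b'Hello world!'
>>> p.drop()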

You might also want to have many distributed processes write to a single partd consistently. This can be done with a server:

  • Server Process:

    >>> p = Buffer(Dict(), File(), available_memory=2e9)  # 2GB memory buffer
    >>> s = Server(p, address='ipc://server')
  • Worker processes:

    >>> p = Client('ipc://server')  # Client machine talks to remote server

Encodings and Compression

Once we can robustly and efficiently append bytes to a partd, we consider compression and encodings. This is generally available with the Encode partd, which accepts three functions: one to apply to bytes as they are written, one to apply to bytes as they are read, and one to join bytestreams. Common configurations already exist for common data and compression formats.
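As a rough sketch of that idea, the snippet below wires zlib's compress and decompress into an Encode wrapper around a File. The argument order shown (write function, read function, join function, then the underlying partd) and the top-level import of Encode are assumptions for illustration rather than a documented signature.

>>> import zlib
>>> from partd import Encode, File    # import location assumed
>>> p = Encode(zlib.compress,          # applied to bytes as they are written
...            zlib.decompress,        # applied to bytes as they are read
...            b''.join,               # joins the decoded bytestreams on get
...            File('/path/to/dataset/'))
>>> p.append({'x': b'Hello '})
>>> p.append({'x': b'world!'})
>>> p.get('x')
b'Hello world!'

The ZLib wrapper described below presumably packages up a configuration much like this one.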

We may wish to compress and decompress data transparently as we interact with a partd. Objects like BZ2, Blosc, ZLib, and Snappy exist and take another partd as an argument:

>>> p = File(...)
>>> p = ZLib(p)

These work exactly as before; the compression and decompression happen automatically.

Common data formats like Python lists, NumPy arrays, and pandas dataframes are also supported out of the box:

>>> p = File(...)
>>> p = NumPy(p)
>>> p.append({'x': np.array([...])})

This lets us forget about bytes and think instead in our normal data types.
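To make that concrete, here is a small sketch of the round trip; it assumes the import location of NumPy shown below and that get concatenates the appended arrays back into a single array, mirroring the raw-bytes example above.

>>> import numpy as np
>>> from partd import File, NumPy     # import location assumed
>>> p = NumPy(File('/path/to/dataset/'))
>>> p.append({'x': np.array([1, 2, 3])})
>>> p.append({'x': np.array([4, 5])})
>>> p.get('x')                         # appended arrays come back concatenated
array([1, 2, 3, 4, 5])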

Composition

In principle we want to compose all of these choices together:

  1. Write policy: Dict, File, Buffer, Client

  2. Encoding: Pickle, Numpy, Pandas, …

  3. Compression: Blosc, Snappy, …

Partd objects compose by nesting. Here we make a partd that writes pickle-encoded, BZ2-compressed bytes directly to disk:

>>> p = Pickle(BZ2(File('foo')))
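Using the composed partd is no different from using a plain File. The sketch below continues from the line above and assumes that the pickle layer joins lists appended under the same key by concatenation, analogous to the byte-level behaviour shown earlier.

>>> p.append({'x': [1, 2, 3]})
>>> p.append({'x': [4, 5]})
>>> p.get('x')            # appended lists assumed to come back joined
[1, 2, 3, 4, 5]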

We could construct more complex systems that include compression, serialization, buffering, and remote access:

>>> server = Server(Buffer(Dict(), File(), available_memory=2e9))  # 2GB memory buffer

>>> client = Pickle(Snappy(Client(server.address)))
>>> client.append({'x': [1, 2, 3]})

Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide for help installing packages.

Source Distribution

  • partd-0.3.5.tar.gz (16.0 kB, Source)

Built Distribution

  • partd-0.3.5-py2.py3-none-any.whl (19.6 kB, Python 2, Python 3)

File details

Details for the file partd-0.3.5.tar.gz.

File metadata

  • Download URL: partd-0.3.5.tar.gz
  • Size: 16.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for partd-0.3.5.tar.gz:

  • SHA256: 83f4bf4973e2e3ea1b15af344766d3b20184bf669777e8e37129fcb496a6d2df
  • MD5: 0adf919103e6cc8c7a6d72f3027cefb3
  • BLAKE2b-256: e0126cbc058ce0e6d91def9786d4015289b7a59d6e670ebf27bb9e1268a0f484
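To check a downloaded archive against these digests, a short standard-library sketch is shown below; the local filename is an assumption about where the file was saved.

>>> import hashlib
>>> with open('partd-0.3.5.tar.gz', 'rb') as f:   # path to the downloaded file (assumed)
...     digest = hashlib.sha256(f.read()).hexdigest()
>>> digest == '83f4bf4973e2e3ea1b15af344766d3b20184bf669777e8e37129fcb496a6d2df'
True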



File details

Details for the file partd-0.3.5-py2.py3-none-any.whl.


File hashes

Hashes for partd-0.3.5-py2.py3-none-any.whl:

  • SHA256: e07a00b221d469e6809e8eb57931d9fa9b6a6efa687aeb27704bc36bc03656fc
  • MD5: 413d0255aedf953feaa0bff2faf8a4b4
  • BLAKE2b-256: 2887f021d5d1454ec6f61b4c29e7c0add130f899c46811ed193fb766e14b4e79


