Transparent optimized reading of n-dimensional Blosc2 slices for h5py

Project description

b2h5py provides h5py with transparent, automatic, optimized reading of n-dimensional slices of Blosc2-compressed datasets. This optimized slicing leverages direct chunk access (bypassing the slow HDF5 filter pipeline) and two-level partitioning into chunks and then smaller blocks, so that less data needs to be decompressed.

Benchmarks of this technique show 2x-5x speed-ups over normal filter-based access. A similar technique in PyTables yields comparable results; see Optimized Hyper-slicing in PyTables with Blosc2 NDim.

[Figure: doc/benchmark.png, speed-up of optimized Blosc2 slicing vs. normal filter-based access]

Usage

This optimized access works for slices with step 1 on Blosc2-compressed datasets using the native byte order. It is enabled by monkey-patching the h5py.Dataset class to extend the slicing operation. This is done on module import, so the only thing you need to do is:

import b2h5py

After that, optimization will be attempted for any slicing of a dataset (of the form dataset[...] or dataset.__getitem__(...)). If the optimization is not possible in a particular case, normal h5py slicing code will be used (which performs HDF5 filter-based access, backed by hdf5plugin to support Blosc2).
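For instance, here is a minimal round trip sketch (the file name, dataset name, shape, and chunk size are made up for illustration; hdf5plugin.Blosc2 is assumed to be available for writing Blosc2-compressed data, which requires a recent hdf5plugin):

    import b2h5py  # patches h5py.Dataset on import
    import h5py
    import hdf5plugin  # provides the Blosc2 filter for writing
    import numpy as np

    data = np.arange(1_000_000).reshape(1000, 1000)
    with h5py.File("example.h5", "w") as f:
        # Write a chunked, Blosc2-compressed dataset.
        f.create_dataset("data", data=data, chunks=(100, 100),
                         **hdf5plugin.Blosc2())

    with h5py.File("example.h5", "r") as f:
        # A step-1 slice like this is eligible for the optimized read path.
        part = f["data"][100:300, 200:400]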

You may globally disable the optimization after importing b2h5py by calling b2h5py.unpatch_dataset_class(), and enable it again with b2h5py.patch_dataset_class(). You may also patch the class temporarily with the patching_dataset_class() context manager, as sketched below.
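For example (assuming patching_dataset_class() is exported at the package top level like the other two functions):

    import b2h5py

    b2h5py.unpatch_dataset_class()  # globally disable the optimization

    with b2h5py.patching_dataset_class():
        # The Dataset class is patched (optimization enabled) only here.
        ...

    b2h5py.patch_dataset_class()  # re-enable it globally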

Even if the module is imported and the Dataset class is patched, you may still force-disable the optimization by setting BLOSC2_FILTER=1 in the environment.
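For example, in a POSIX shell (my_script.py standing in for any script that slices Blosc2-compressed datasets):

    BLOSC2_FILTER=1 python my_script.py  # forces filter-based access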

Building

Just install PyPA build (e.g. pip install build), enter the source code directory and run pyproject-build to get a source tarball and a wheel under the dist directory.
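That is, something along these lines (the checkout directory name may differ):

    pip install build
    cd b2h5py        # the source code directory
    pyproject-build  # leaves a source tarball and a wheel under dist/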

Installing

To install as a wheel from PyPI, run pip install b2h5py.

You may also install the wheel that you built in the previous section, or enter the source code directory and run pip install . from there.
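For example, any of:

    pip install b2h5py                              # wheel from PyPI
    pip install dist/b2h5py-0.2.0-py3-none-any.whl  # wheel built above
    pip install .                                   # from the source directory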

Running tests

If you have installed b2h5py, just run python -m unittest discover b2h5py.tests.

Otherwise, enter the source code directory and run python -m unittest.

You can also run the h5py tests with the patched Dataset class to check that patching does not break anything. To do so, install the h5py-test extra (e.g. pip install b2h5py[h5py-test]) and run python -m b2h5py.tests.test_patched_h5py.
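In summary:

    python -m unittest discover b2h5py.tests   # with b2h5py installed
    python -m unittest                         # from the source directory
    pip install b2h5py[h5py-test]              # then, for the h5py tests:
    python -m b2h5py.tests.test_patched_h5py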

Download files

Download the file for your platform.

Source Distribution

b2h5py-0.2.0.tar.gz (12.9 kB)

Uploaded Source

Built Distribution

b2h5py-0.2.0-py3-none-any.whl (12.9 kB)

Uploaded Python 3

File details

Details for the file b2h5py-0.2.0.tar.gz.

File metadata

  • Download URL: b2h5py-0.2.0.tar.gz
  • Upload date:
  • Size: 12.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.9.6 requests/2.28.1 setuptools/63.2.0 requests-toolbelt/0.9.1 tqdm/4.64.1 CPython/3.10.7

File hashes

Hashes for b2h5py-0.2.0.tar.gz
Algorithm Hash digest
SHA256 36a7c59611fc2761ebaa6321be99be943ccabd73769521440488c28c16a3a118
MD5 403a569721f2f383faa0075a5e1f7cd8
BLAKE2b-256 74060e010c96e6971421144d733294dfa738e21c3e4a63f5cbffac944f26b410


File details

Details for the file b2h5py-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: b2h5py-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 12.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.9.6 requests/2.28.1 setuptools/63.2.0 requests-toolbelt/0.9.1 tqdm/4.64.1 CPython/3.10.7

File hashes

Hashes for b2h5py-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 5d91877e2db7c6332e9fae3021348553cc84c74602aa6ff02f29d5ae67b0c825
MD5 f72c418a1e3921f2d3b0da12c52f1a08
BLAKE2b-256 05a51953c5a179ffe768468e109a5237160842bae01855d024537d842996025c
