remfile

Provides a file-like object for reading a remote file over HTTP, optimized for use with h5py.

Example usage:

# See examples/example1.py

import h5py
import remfile

url = 'https://dandiarchive.s3.amazonaws.com/blobs/d86/055/d8605573-4639-4b99-a6d9-e0ac13f9a7df'

file = remfile.File(url)

with h5py.File(file, 'r') as f:
    print(f['/'].keys())

See examples/example1.py for a more complete example.

Note: url can either be a string or an object that has a get_url() method. The latter is useful if the url is a presigned AWS URL that expires after a certain amount of time. However, if you implement your own get_url() method, make sure it renews the signed URL only when necessary!
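
For illustration, here is a minimal sketch of such an object. The RenewingUrl class and its fetch_presigned_url() helper are hypothetical (not part of remfile); the only requirement remfile imposes is that the object have a get_url() method.

import time
import remfile

def fetch_presigned_url():
    # Hypothetical helper: in practice this would request a fresh
    # presigned URL from your API (e.g., via boto3).
    return 'https://example.com/presigned-url'

class RenewingUrl:
    def __init__(self, lifetime_sec=3600):
        self._lifetime_sec = lifetime_sec
        self._url = None
        self._fetched_at = 0

    def get_url(self):
        # Renew only when the cached URL is close to expiring,
        # as recommended above.
        age = time.time() - self._fetched_at
        if self._url is None or age > self._lifetime_sec - 60:
            self._url = fetch_presigned_url()
            self._fetched_at = time.time()
        return self._url

file = remfile.File(RenewingUrl())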

Installation

pip install remfile

Why?

The conventional way of reading a remote hdf5 file is to use the fsspec library, as in examples/example1_compare_fsspec.py. However, that approach is empirically much slower than using remfile. I am not familiar with the inner workings of fsspec, but it does not seem to be optimized for reading hdf5 files. Efficient access to a remote hdf5 file requires reading small chunks of data to obtain meta information, followed by large, parallelized chunk reads to obtain the larger data arrays.
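
For reference, the conventional fsspec approach looks roughly like this (a sketch only; see examples/example1_compare_fsspec.py for the actual comparison script):

import fsspec
import h5py

url = 'https://dandiarchive.s3.amazonaws.com/blobs/d86/055/d8605573-4639-4b99-a6d9-e0ac13f9a7df'

with fsspec.open(url, 'rb') as f:
    with h5py.File(f, 'r') as h5f:
        print(h5f['/'].keys())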

See a timing comparison between remfile and fsspec in the examples directory.

Furthermore, since the url can be an object with a get_url() method, it is possible to use remfile in a context where presigned URLs need to be renewed. As mentioned above, if you implement your own get_url() method, make sure it renews the signed URL only when necessary!

How?

A file-like object is created that reads the remote file in chunks using the requests library. A relatively small default chunk size is used, but when the system detects that a large data array is being accessed, it switches to a larger chunk size. For very large data arrays, the system will use multiple threads to read the data in parallel.
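
To illustrate the underlying mechanism, a single chunk read amounts to an HTTP range request. The following is a minimal sketch of the idea, not remfile's actual internals:

import requests

def read_chunk(url, start, size):
    # Fetch bytes [start, start + size) of the remote file
    # using an HTTP Range request.
    headers = {'Range': f'bytes={start}-{start + size - 1}'}
    response = requests.get(url, headers=headers)
    response.raise_for_status()
    return response.content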

Disk caching

The following example shows how to use disk caching. It is important to note that this is not an LRU cache, so there is no cleanup operation. The cache will grow until the disk is full. Therefore, you are responsible for deleting the directory when you are done with it.

import h5py
import remfile

url = 'https://dandiarchive.s3.amazonaws.com/blobs/d86/055/d8605573-4639-4b99-a6d9-e0ac13f9a7df'

cache_dirname = '/tmp/remfile_test_cache'
disk_cache = remfile.DiskCache(cache_dirname)

file = remfile.File(url, disk_cache=disk_cache)

with h5py.File(file, 'r') as f:
    print(f['/'].keys())
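
Since there is no automatic cleanup, delete the cache directory yourself when you are done with it, for example:

import shutil

shutil.rmtree(cache_dirname)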

Caveats

This library is not intended to be a general-purpose library for reading remote files; it is optimized for reading hdf5 files.

License

Apache 2.0

Author

Jeremy Magland, Center for Computational Mathematics, Flatiron Institute

