
Cross-compatible API for accessing Posix and OBS storage systems

stor provides a cross-compatible CLI and Python API for accessing block and object storage. stor was created so you could write one piece of code that works with local or remote files, without needing specialized code to handle failure modes, retries, or temporary system unavailability. The functional API (e.g., stor.copytree, stor.rmtree, stor.remove, stor.listdir) works with the same semantics across all storage backends. This makes it easy to develop and test code locally against files and then take advantage of robust and cheaper object storage when you push to remote.

View full docs for stor at https://counsyl.github.io/stor/ .
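The cross-compatibility comes from dispatching on the path's scheme: the prefix of a path string selects the storage backend. A simplified sketch of the idea (not stor's actual implementation, which returns full path objects rather than names):

```python
def backend(path):
    # stor-style dispatch: the scheme prefix selects the path class.
    if path.startswith('s3://'):
        return 'S3Path'
    if path.startswith('swift://'):
        return 'SwiftPath'
    return 'PosixPath'

print(backend('s3://bestbucket/a.txt'))  # -> S3Path
print(backend('/tmp/a.txt'))             # -> PosixPath
```

Because every backend exposes the same operations, the caller never branches on the path type itself.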

Quickstart

pip install stor

stor provides both a CLI and a Python library for manipulating Posix and OBS with a single, cross-compatible API.

Quickstart - CLI

usage: stor [-h] [-c CONFIG_FILE] [--version]
            {list,ls,cp,rm,walkfiles,cat,cd,pwd,clear,url,convert-swiftstack}
            ...

A command line interface for stor.

positional arguments:
  {list,ls,cp,rm,walkfiles,cat,cd,pwd,clear,url,convert-swiftstack}
    list                List contents using the path as a prefix.
    ls                  List path as a directory.
    cp                  Copy a source to a destination path.
    rm                  Remove file at a path.
    walkfiles           List all files under a path that match an optional
                        pattern.
    cat                 Output file contents to stdout.
    cd                  Change directory to a given OBS path.
    pwd                 Get the present working directory of a service or all
                        current directories.
    clear               Clear current directories of a specified service.
    url                 generate URI for path
    convert-swiftstack  convert swiftstack paths

optional arguments:
  -h, --help            show this help message and exit
  -c CONFIG_FILE, --config CONFIG_FILE
                        File containing configuration settings.
  --version             Print version

You can ls local and remote directories

›› stor ls s3://stor-test-bucket
s3://stor-test-bucket/b.txt
s3://stor-test-bucket/counsyl-storage-utils
s3://stor-test-bucket/file_test.txt
s3://stor-test-bucket/counsyl-storage-utils/
s3://stor-test-bucket/empty/
s3://stor-test-bucket/lots_of_files/
s3://stor-test-bucket/small_test/

Copy files locally or remotely or upload from stdin

›› echo "HELLO WORLD" | stor cp - swift://AUTH_stor_test/hello_world.txt
starting upload of 1 objects
upload complete - 1/1   0:00:00 0.00 MB 0.00 MB/s
›› stor cat swift://AUTH_stor_test/hello_world.txt
HELLO WORLD
›› stor cp swift://AUTH_stor_test/hello_world.txt hello_world.txt
›› stor cat hello_world.txt
HELLO WORLD

Quickstart - Python

List files in a directory, taking advantage of delimiters

>>> stor.listdir('s3://bestbucket')
[S3Path('s3://bestbucket/a/'),
 S3Path('s3://bestbucket/b/')]

List all objects in a bucket

>>> stor.list('s3://bestbucket')
[S3Path('s3://bestbucket/a/1.txt'),
 S3Path('s3://bestbucket/a/2.txt'),
 S3Path('s3://bestbucket/a/3.txt'),
 S3Path('s3://bestbucket/b/1.txt')]

Or in a local path

>>> stor.list('stor')
[PosixPath('stor/__init__.py'),
 PosixPath('stor/exceptions.pyc'),
 PosixPath('stor/tests/test_s3.py'),
 PosixPath('stor/tests/test_swift.py'),
 PosixPath('stor/tests/test_integration_swift.py'),
 PosixPath('stor/tests/test_utils.py'),
 PosixPath('stor/posix.pyc'),
 PosixPath('stor/base.py'),
 ...]
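The difference between list and listdir above comes from S3's flat keyspace: buckets hold flat key strings, and "directories" are just a grouping of keys at a delimiter. A plain-Python sketch of that grouping (no stor required; key names taken from the example above):

```python
# Keys in a hypothetical bucket, as flat strings (S3 has no real directories).
keys = ['a/1.txt', 'a/2.txt', 'a/3.txt', 'b/1.txt']

def list_all(keys, prefix=''):
    # stor.list-style: every object under the prefix.
    return [k for k in keys if k.startswith(prefix)]

def listdir(keys, prefix=''):
    # stor.listdir-style: collapse keys at the first '/' past the prefix.
    seen = []
    for k in list_all(keys, prefix):
        rest = k[len(prefix):]
        entry = prefix + rest.split('/')[0] + ('/' if '/' in rest else '')
        if entry not in seen:
            seen.append(entry)
    return seen

print(listdir(keys))   # -> ['a/', 'b/']
print(list_all(keys))  # -> ['a/1.txt', 'a/2.txt', 'a/3.txt', 'b/1.txt']
```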

Read and write files on POSIX or OBS using Python file objects.

import json
import stor

with stor.open('/my/exciting.json') as fp:
    data1 = json.load(fp)

data1['read'] = True

with stor.open('s3://bestbucket/exciting.json') as fp:
    json.dump(data1, fp)

Testing code that uses stor

The key design consideration of stor is that your code should be able to transparently use POSIX or any object storage system to read and update files. So, rather than use mocks, we suggest that you structure your test code to point at local filesystem paths and restrict yourself to the functional API. E.g., in your prod settings, you could set DATADIR = 's3://bestbucketever' and when you test, you could use DATADIR = '/somewhat/cool/path/to/test/data', while your actual code just says:

with stor.open(stor.join(DATADIR, experiment)) as fp:
    data = json.load(fp)

Easy! And no mocks required!

Running the Tests

make test

Contributing and Semantic Versioning

We use semantic versioning to communicate when we make API changes to the library. See CONTRIBUTING.md for more details on contributing to stor.

