dataflows-aws

DataFlows processors for working with AWS

Features

  • dump_to_s3 processor
  • change_acl_on_s3 processor

Getting Started

Installation

The package uses semantic versioning, which means that major versions may include breaking changes. It's recommended to specify a version range in your setup/requirements file, e.g. dataflows-aws>=1.0,<2.0.

$ pip install dataflows-aws
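
For illustration, a requirements file pinning the package to a compatible range might look like the following (the exact bounds depend on the release you target):

# requirements.txt - illustrative pin, adjust to the release you use
dataflows-aws>=1.0,<2.0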

Examples

These processors have to be used as part of a data flow. For example:

import os

from dataflows import Flow, load
from dataflows_aws import dump_to_s3

bucket = 'my-bucket'  # the bucket must already exist

flow = Flow(
    load('data/data.csv'),
    dump_to_s3(
        bucket=bucket,
        acl='private',
        path='my/datapackage',
        endpoint_url=os.environ['S3_ENDPOINT_URL'],
    ),
)
flow.process()

Documentation

dump_to_s3

Saves the DataPackage to AWS S3.

Parameters

  • bucket - Name of the bucket where the DataPackage will be stored (the bucket must already exist)
  • acl - ACL to apply to the uploaded files. Defaults to 'public-read' (see the boto3 docs for more info)
  • path - Path (key/prefix) to the DataPackage. May contain format-string placeholders filled from datapackage.json, e.g. my/example/path/{owner}/{name}/{version} (see the sketch after this list)
  • content_type - Content type to use when storing files in S3. Defaults to text/plain (the usual S3 default is binary/octet-stream, but we prefer text/plain)
  • endpoint_url - API endpoint that allows using S3-compatible services (e.g. 'https://ams3.digitaloceanspaces.com')
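
A minimal sketch of the path templating (the bucket name and descriptor values below are placeholders, not part of the library): the placeholders in path are filled from the data package descriptor, which can be set with the standard DataFlows update_package processor.

from dataflows import Flow, load, update_package
from dataflows_aws import dump_to_s3

flow = Flow(
    load('data/data.csv'),
    # Set descriptor fields that fill the placeholders in `path` below
    # ('my-dataset' and '1.0.0' are placeholder values)
    update_package(name='my-dataset', version='1.0.0'),
    dump_to_s3(
        bucket='my-bucket',  # placeholder; the bucket must already exist
        acl='public-read',
        path='datasets/{name}/{version}',  # resolves to 'datasets/my-dataset/1.0.0'
    ),
)
flow.process()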

change_acl_on_s3

Changes the ACL of objects in the given bucket under the given path (key prefix).

Parameters

  • bucket - Name of the bucket where the objects are stored
  • acl - Available options: 'private'|'public-read'|'public-read-write'|'authenticated-read'|'aws-exec-read'|'bucket-owner-read'|'bucket-owner-full-control'
  • path - Path (key/prefix) to the DataPackage (see the sketch after this list)
  • endpoint_url - API endpoint that allows using S3-compatible services (e.g. 'https://ams3.digitaloceanspaces.com')
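
A minimal sketch combining both processors (the bucket name is a placeholder): the package is first dumped privately, and the uploaded objects are then switched to public.

from dataflows import Flow, load
from dataflows_aws import dump_to_s3, change_acl_on_s3

flow = Flow(
    load('data/data.csv'),
    dump_to_s3(
        bucket='my-bucket',  # placeholder; the bucket must already exist
        acl='private',
        path='my/datapackage',
    ),
    # Switch the just-uploaded objects under the same prefix to public
    change_acl_on_s3(
        bucket='my-bucket',
        acl='public-read',
        path='my/datapackage',
    ),
)
flow.process()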

Contributing

The project follows the Open Knowledge International coding standards.

The recommended way to get started is to create and activate a project virtual environment. To install the package and development dependencies into your active environment:

$ make install

To run tests with linting and coverage:

$ make test

For linting, pylama (configured in pylama.ini) is used. It's installed into your environment by make install and can also be used on its own with more fine-grained control, as described in its documentation - https://pylama.readthedocs.io/en/latest/.

For example, to sort results by error type:

$ pylama --sort <path>

For testing, tox (configured in tox.ini) is used. It's installed into your environment by make install and can also be used on its own with more fine-grained control, as described in its documentation - https://testrun.org/tox/latest/.

For example, to check a subset of tests against the Python 3.7 environment with increased verbosity (all positional arguments and options after -- are passed to py.test):

$ tox -e py37 -- -v tests/<path>

Under the hood, tox uses the pytest (configured in pytest.ini), coverage, and mock packages. These packages are available only inside the tox environments.

Changelog

Only breaking and the most important changes are described here. The full changelog and documentation for all released versions can be found in the nicely formatted commit history.

v0.x

  • Initial implementation of the processors
