
# AWS Extensions for datapackage-pipelines

[![Build Status](https://travis-ci.org/frictionlessdata/datapackage-pipelines-aws.svg?branch=master)](https://travis-ci.org/frictionlessdata/datapackage-pipelines-aws)

## Install

```
# clone the repo and install it with pip

git clone https://github.com/frictionlessdata/datapackage-pipelines-aws.git
pip install -e .
```

## Usage

You can use datapackage-pipelines-aws as a plugin for [dpp](https://github.com/frictionlessdata/datapackage-pipelines#datapackage-pipelines). In your `pipeline-spec.yaml` it will look like this:

```yaml
...
- run: aws.dump.to_s3
```

You will need AWS credentials to be set up. See [the guide to setting up credentials](http://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/credentials.html).
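For example, boto3 can pick up credentials from standard AWS environment variables; the values below are placeholders, and any other method from the linked guide (e.g. an `~/.aws/credentials` file) works just as well:

```
# boto3 reads these standard AWS environment variables (placeholder values)
export AWS_ACCESS_KEY_ID=<your-access-key-id>
export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
export AWS_DEFAULT_REGION=us-east-1
```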

### dump.to_s3

Saves the DataPackage to AWS S3.

_Parameters:_

* `bucket` - Name of the bucket where the DataPackage will be stored (it must already exist!)
* `acl` - ACL to apply to the uploaded files. Defaults to `public-read` (see the [boto3 docs](http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object) for more info).
* `path` - Path (key/prefix) under which the DataPackage will be stored. May contain format-string placeholders filled in from `datapackage.json` properties, e.g. `my/example/path/{owner}/{name}/{version}`
* `content_type` - Content type to use when storing files in S3. Defaults to `text/plain` (S3's usual default is `binary/octet-stream`, but we prefer `text/plain`).

_Example:_

```yaml
datahub:
  title: datahub-to-s3
  pipeline:
    -
      run: load_metadata
      parameters:
        url: http://example.com/my-datapackage/datapackage.json
    -
      run: load_resource
      parameters:
        url: http://example.com/my-datapackage/datapackage.json
        resource: my-resource
    -
      run: aws.dump.to_s3
      parameters:
        bucket: my.bucket.name
        path: path/{owner}/{name}/{version}
    -
      run: aws.dump.to_s3
      parameters:
        bucket: my.another.bucket
        path: another/path/{version}
        acl: private
```

Executing the pipeline above will save the DataPackage under the following prefixes on S3:
* my.bucket.name/path/my-name/my-package-name/latest/...
* my.another.bucket/another/path/latest/...
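
Assuming the spec above is saved as `pipeline-spec.yaml` in the current directory, the pipeline can be executed with the `dpp` CLI (the pipeline id `./datahub` comes from the top-level key in the spec):

```
# list the registered pipelines, then run the one defined above
dpp
dpp run ./datahub
```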
