
Project description

aiohttp-s3-client


A simple module for putting objects to and getting objects from Amazon S3 compatible endpoints.

Installation

pip install aiohttp-s3-client

Usage

from http import HTTPStatus

from aiohttp import ClientSession
from aiohttp_s3_client import S3Client


async with ClientSession(raise_for_status=True) as session:
    client = S3Client(
        url="http://s3-url",
        session=session,
        access_key_id="key-id",
        secret_access_key="hackme",
        region="us-east-1"
    )

    # Upload str object to bucket "bucket" and key "str"
    async with client.put("bucket/str", "hello, world") as resp:
        assert resp.status == HTTPStatus.OK

    # Upload bytes object to bucket "bucket" and key "bytes"
    resp = await client.put("bucket/bytes", b"hello, world")
    assert resp.status == HTTPStatus.OK

    # Upload AsyncIterable to bucket "bucket" and key "iterable"
    async def gen():
        yield b'some bytes'

    resp = await client.put("bucket/file", gen())
    assert resp.status == HTTPStatus.OK

    # Upload file to bucket "bucket" and key "file"
    resp = await client.put_file("bucket/file", "/path_to_file")
    assert resp.status == HTTPStatus.OK

    # Check object exists using bucket+key
    resp = await client.head("bucket/key")
    assert resp.status == HTTPStatus.OK

    # Get object by bucket+key
    resp = await client.get("bucket/key")
    data = await resp.read()

    # Delete object using bucket+key
    resp = await client.delete("bucket/key")
    assert resp.status == HTTPStatus.NO_CONTENT

    # List objects by prefix
    async for result in client.list_objects_v2("bucket/", prefix="prefix"):
        # Each result is a list of metadata objects, one per object
        # stored in the bucket.
        do_work(result)
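
Note that "async with" at module level is README shorthand; to actually run the snippet you need an event loop. A minimal runnable sketch (the object name and payload are placeholders):

import asyncio
from http import HTTPStatus

from aiohttp import ClientSession
from aiohttp_s3_client import S3Client


async def main():
    async with ClientSession(raise_for_status=True) as session:
        client = S3Client(
            url="http://s3-url",
            session=session,
            access_key_id="key-id",
            secret_access_key="hackme",
            region="us-east-1",
        )
        # Upload a bytes payload and check the response status.
        resp = await client.put("bucket/key", b"hello, world")
        assert resp.status == HTTPStatus.OK


asyncio.run(main())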

The bucket may be specified as a subdomain or as part of the object name:

client = S3Client(url="http://bucket.your-s3-host", ...)
resp = await client.put("key", gen())

client = S3Client(url="http://your-s3-host", ...)
resp = await client.put("bucket/key", gen())

client = S3Client(url="http://your-s3-host/bucket", ...)
resp = await client.put("key", gen())

Auth may be specified with keyword arguments or in the URL:

client = S3Client(url="http://your-s3-host", access_key_id="key_id",
                  secret_access_key="access_key", ...)

client = S3Client(url="http://key_id:access_key@your-s3-host", ...)

Multipart upload

For uploading large files, multipart upload can be used. It lets you upload multiple parts of a file to S3 asynchronously, in parallel. S3Client handles retries of part uploads and calculates a hash of each part for integrity checks.

client = S3Client()
await client.put_file_multipart(
    "test/bigfile.csv",
    headers={
        "Content-Type": "text/csv",
    },
    workers_count=8,
)
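
S3Client() with no arguments is shorthand here; a fuller sketch, wiring in the session and credentials the same way as in Usage above (the call itself mirrors the example):

import asyncio

from aiohttp import ClientSession
from aiohttp_s3_client import S3Client


async def main():
    async with ClientSession(raise_for_status=True) as session:
        client = S3Client(url="http://s3-url", session=session,
                          access_key_id="key-id",
                          secret_access_key="hackme")
        # Upload the object in parts, with up to 8 part uploads in flight.
        await client.put_file_multipart(
            "test/bigfile.csv",
            headers={
                "Content-Type": "text/csv",
            },
            workers_count=8,
        )


asyncio.run(main())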

Parallel download to file

S3 supports GET requests with the Range header, so an object can be downloaded in parallel over multiple connections for a speedup. S3Client handles retries of the partial requests and uses the ETag header to make sure the file doesn't change during the download. If your system supports the pwrite syscall (Linux, macOS, etc.), it is used to write all parts to a single file simultaneously. Otherwise, each worker writes to its own file, and the parts are concatenated after downloading.

client = S3Client()
await client.get_file_parallel(
    "dump/bigfile.csv",
    "/home/user/bigfile.csv",
    workers_count=8,
)
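
For illustration, here is roughly what that mechanism looks like at the HTTP level: ranged GETs fanned out over several connections, each writing at its own offset of one file via os.pwrite. This is a sketch of the idea, not the library's implementation; the object URL, output path, and part size are placeholders:

import asyncio
import os

from aiohttp import ClientSession

URL = "http://your-s3-host/bucket/key"  # placeholder object URL
PART_SIZE = 1024 * 1024                 # one ranged request per MiB


async def fetch_part(session, fd, start, end):
    # Ask the server for bytes [start, end] of the object only...
    headers = {"Range": f"bytes={start}-{end}"}
    async with session.get(URL, headers=headers) as resp:
        body = await resp.read()
    # ...and write them at the matching offset of the shared file.
    os.pwrite(fd, body, start)


async def main():
    async with ClientSession(raise_for_status=True) as session:
        # HEAD gives the object size, which determines the part ranges.
        async with session.head(URL) as resp:
            size = int(resp.headers["Content-Length"])
        fd = os.open("bigfile.csv", os.O_WRONLY | os.O_CREAT)
        try:
            await asyncio.gather(*(
                fetch_part(session, fd, start,
                           min(start + PART_SIZE, size) - 1)
                for start in range(0, size, PART_SIZE)
            ))
        finally:
            os.close(fd)


asyncio.run(main())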

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiohttp-s3-client-0.7.0.tar.gz (16.9 kB)


Built Distribution

aiohttp_s3_client-0.7.0-py3-none-any.whl (19.1 kB)


File details

Details for the file aiohttp-s3-client-0.7.0.tar.gz.

File metadata

  • Download URL: aiohttp-s3-client-0.7.0.tar.gz
  • Upload date:
  • Size: 16.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for aiohttp-s3-client-0.7.0.tar.gz

  • SHA256: 9c747941994aa65069b593f49cf147940e41fdf525b8e9c83ca81edc63afe4a0
  • MD5: 6bf2df5dd586a8553ef1b8d097293eb2
  • BLAKE2b-256: 83560ff0e7dea7c565829bf8b62f9c52022ee39832414779ec5abf7136e24b8f
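
To check a downloaded archive against the digests above, you can hash it with the standard library, for example (a sketch; the file name assumes the download landed in the current directory):

import hashlib

# Hash the archive and compare against the published SHA256 digest.
expected = "9c747941994aa65069b593f49cf147940e41fdf525b8e9c83ca81edc63afe4a0"
with open("aiohttp-s3-client-0.7.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == expected, "archive does not match the published SHA256"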


File details

Details for the file aiohttp_s3_client-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: aiohttp_s3_client-0.7.0-py3-none-any.whl
  • Upload date:
  • Size: 19.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/32.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.62.3 importlib-metadata/4.11.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for aiohttp_s3_client-0.7.0-py3-none-any.whl

  • SHA256: 63efc77dca4c53d981d8c877017052b6420f78bf7a7af9946e8471f0b7f6c163
  • MD5: fd34d568e9f4ee642574721e1a54afc5
  • BLAKE2b-256: f43cb9b08c34d5d5b04d4feeb9dd38186c0d8c018edda893bfb2dbc1fdacba6d

