
Simple, quick Amazon AWS S3 interface

Project description

simples3 is a fairly simple, decently quick interface to Amazon’s S3 storage service.

It grew out of frustration with other libraries that were either written too pragmatically (and therefore slow), too bloated, or simply half-done.

The module aims for:

  • simplicity,

  • decent speed,

  • non-intrusiveness.

It really is designed to fit into programmer memory: the three basic operations are as easy as working with a dictionary.

Out of that simplicity comes a lack of dependencies: the code relies solely on the Python standard library.

simples3 requires Python 2.5 or later, plus nose if you want to run the tests. Python 3 support is not yet available.
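
As a preview of the three basic operations mentioned above (creating the bucket object s is covered under Usage below; the key and contents here are made up):

>>> s.put("greeting", "hello s3")  # doctest: +SKIP
>>> s["greeting"].read()           # doctest: +SKIP
'hello s3'
>>> del s["greeting"]              # doctest: +SKIP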

IRC

#sendapatch on chat.freenode.net.

Usage

A simple Amazon AWS S3 interface

Access to a bucket is done via the S3Bucket class. It has three required arguments:

>>> s = S3Bucket(bucket,
...              access_key=access_key,
...              secret_key=secret_key)
...
>>> print s  # doctest: +ELLIPSIS
<S3Bucket ... at 'https://s3.amazonaws.com/...'>

or, if you’d like to use the use-any-domain-you-want support, set base_url to something like http://s3.example.com:

>>> s = S3Bucket(bucket,
...              access_key=access_key,
...              secret_key=secret_key,
...              base_url=base_url)
>>> print s  # doctest: +ELLIPSIS
<S3Bucket ... at 'http...'>

Note the missing trailing slash in base_url above; it’s important. Think of base_url as “the prefix to which all calls are made.” The scheme can be https or regular http, or any other urllib2-compatible scheme (as in, you could register your own scheme).

Now, let’s start doing something useful. Start out by putting a simple file into the bucket:

>>> s.put("my file", "my content")

Alright, and fetch it back:

>>> f = s.get("my file")
>>> f.read()
'my content'

Nice and tidy, but what if we want to know more about our fetched file? Easy:

>>> f.s3_info["modify"]  # doctest: +ELLIPSIS
datetime.datetime(...)
>>> f.s3_info["mimetype"]
'application/octet-stream'
>>> f.s3_info.keys()
['mimetype', 'modify', 'headers', 'date', 'size', 'metadata']
>>> f.close()

Note that the type was application/octet-stream. That’s simply because we didn’t specify anything else; do so using the mimetype keyword argument:

>>> s.put("my new file!", "Improved content!\nMultiple lines!",
...       mimetype="text/plain")

Let’s be cool and use the very Pythonic API to fetch it back:

>>> f = s["my new file!"]
>>> print f.read()
Improved content!
Multiple lines!
>>> f.s3_info["mimetype"]
'text/plain'
>>> f.close()

Great job, huh. Now, let’s delete it:

>>> del s["my new file!"]

Could’ve used the delete method instead, but we didn’t.
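
For the record, the method form of that deletion would look something like this (a sketch; assuming delete takes the key as its only required argument):

>>> s.delete("my new file!")  # doctest: +SKIP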

If you just want to know about a key, ask and ye shall receive:

>>> from pprint import pprint
>>> s["This is a testfile."] = S3File("Hi!", metadata={"hairdo": "Secret"})
>>> pprint(s.info("This is a testfile."))  # doctest: +ELLIPSIS
{'date': datetime.datetime(...),
 'headers': {'content-length': '3',
             'content-type': 'application/octet-stream',
             'date': '...',
             'etag': '"..."',
             'last-modified': '...',
             'server': 'AmazonS3',
             'x-amz-id-2': '...',
             'x-amz-meta-hairdo': 'Secret',
             'x-amz-request-id': '...'},
 'metadata': {'hairdo': 'Secret'},
 'mimetype': 'application/octet-stream',
 'modify': datetime.datetime(...),
 'size': 3}

Note that the metadata was parsed out into the metadata key. You might also have noticed how the file was uploaded: with an S3File object assigned through the dict-style API, which is a slightly nicer way to do it.

The S3File simply takes its keyword arguments and passes them on to put later. Other than that, it’s a str subclass.
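
That also means the mimetype, metadata and acl arguments shown elsewhere can ride along on the S3File itself. A sketch (the key name and values here are made up):

>>> s["styled.txt"] = S3File("Hello in plain text!",
...                          mimetype="text/plain",
...                          metadata={"source": "docs"})  # doctest: +SKIP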

And the last piece of dict-like behavior is membership testing:

>>> "This is a testfile." in s
True
>>> del s["This is a testfile."]
>>> "This is a testfile." in s
False

You can also set a canned ACL using put, which is very simple:

>>> s.put("test/foo", "test", acl="public-read")
>>> s.put("test/bar", "rawr", acl="public-read")
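
Since both objects are now publicly readable, a plain unsigned URL is enough to reach them. The make_url and make_url_authed methods mentioned in the 1.0 changes below build such URLs (unsigned and signed, respectively); a sketch, assuming each takes the key as its main argument:

>>> public_url = s.make_url("test/foo")         # doctest: +SKIP
>>> signed_url = s.make_url_authed("test/bar")  # doctest: +SKIP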

Boom. What’s more? Listing the bucket:

>>> for (key, modify, etag, size) in s.listdir(prefix="test/"):
...     print "%r (%r) is size %r, modified %r" % (key, etag, size, modify)
... # doctest: +ELLIPSIS
'test/bar' ('"..."') is size 4, modified datetime.datetime(...)
'test/foo' ('"..."') is size 4, modified datetime.datetime(...)

That about sums the basics up.
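
One last constructor knob worth mentioning: the 1.0 release added an optional timeout argument to S3Bucket (a sketch; treating the value as a socket timeout in seconds is an assumption):

>>> s = S3Bucket(bucket,
...              access_key=access_key,
...              secret_key=secret_key,
...              timeout=10.0)  # doctest: +SKIP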

Changes in simples3 1.0

  • Made simples3 a “flat package”; imports work as usual.

  • Refactored url_for into make_url_authed and make_url.

  • Added an optional timeout argument to the S3Bucket class.

  • Added nose-based testing.

  • Added support for streaming with poster.streaminghttp.

  • Added support for Google App Engine.

Changes in simples3 0.5

  • Added an S3-to-S3 copy method.

Changes in simples3 0.4

  • Minor fixes, released as 0.4 mostly because the previous version naming scheme was a bad idea.

  • 0.4.1: Made the put method retry on HTTP 500.

  • 0.4.1: Fixed a critical error in signature generation when metadata is given.

Changes in simples3 0.3

  • Added a url_for method on buckets, which lets you use expiring URLs. Thanks to Pavel Repin.

  • Much better test coverage.

  • simples3 now works with Python 2.6’s mimetypes module.

  • r1: Better handling of HTTP errors in the exception parser, which had broken the existence test.
