S3 Stat
=======
This Python module uses the excellent `goaccess <http://goaccess.prosoftcorp.com/>`_ utility
to give you an Amazon (S3 and CloudFront) log file analyser that is easy to install and
easy to extend.
Installation
-------------
::

    pip install s3stat

This installs `s3stat.py` on your PYTHONPATH, so you can also run it from the command line. Note that the `goaccess` binary itself is not installed by pip and has to be available separately.
Quickstart
------------
Generating an AWS user
........................
First you should create a user that has appropriate rights to read your log files, and you should have its AWS access keys ready.

#. Log in to the `AWS console <https://console.aws.amazon.com/iam/home?#users>`_
#. Create a new user and select the option to generate an access key for the user
#. Save the access key ID and secret access key, as these will be needed soon
#. Open the *Permissions* tab for the user, and attach a new user policy.
   Select custom policy, and copy the following::

      {
          "Statement": [
              {
                  "Sid": "Stmt1334764540928",
                  "Action": [
                      "s3:GetBucketAcl",
                      "s3:GetBucketLogging",
                      "s3:GetObject",
                      "s3:ListAllMyBuckets",
                      "s3:ListBucket",
                      "s3:PutBucketAcl",
                      "s3:PutBucketLogging",
                      "s3:PutObject",
                      "s3:PutObjectAcl"
                  ],
                  "Effect": "Allow",
                  "Resource": [
                      "arn:aws:s3:::*"
                  ]
              },
              {
                  "Sid": "Stmt1334764631669",
                  "Action": [
                      "cloudfront:GetDistribution",
                      "cloudfront:GetDistributionConfig",
                      "cloudfront:GetStreamingDistribution",
                      "cloudfront:GetStreamingDistributionConfig",
                      "cloudfront:ListDistributions",
                      "cloudfront:ListStreamingDistributions",
                      "cloudfront:UpdateDistribution",
                      "cloudfront:UpdateStreamingDistribution"
                  ],
                  "Effect": "Allow",
                  "Resource": [
                      "*"
                  ]
              }
          ]
      }
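
If you prefer to script this step, the same inline policy can be attached with a few lines of Python. Below is a minimal sketch using boto3 (a different library from the boto that s3stat itself relies on); the user name, policy name, and `policy.json` file are placeholders for your own values::

    import boto3

    iam = boto3.client("iam")

    # Read the policy document shown above from a local file.
    with open("policy.json") as f:
        policy_document = f.read()

    # Attach it to the newly created user as an inline policy.
    iam.put_user_policy(
        UserName="s3stat-user",        # placeholder
        PolicyName="s3stat-access",    # placeholder
        PolicyDocument=policy_document,
    )
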
Set up logging in your buckets
...............................
Before anything can be analysed, ask Amazon to generate access logs for your buckets and CloudFront distributions.
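
This can be done in the AWS console, but with many buckets a short script helps. A minimal sketch using boto3 (not part of s3stat; the bucket names and prefix are placeholders, and the target bucket must already allow S3's log delivery to write to it)::

    import boto3

    s3 = boto3.client("s3")

    # Ask S3 to deliver access logs for "my-bucket" into "my-logs".
    s3.put_bucket_logging(
        Bucket="my-bucket",                    # placeholder
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": "my-logs",     # placeholder
                "TargetPrefix": "logs/my-bucket/",
            }
        },
    )
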
Run this script
................
::

    s3stat.py <aws key> <aws secret> <bucket> <log_path>

This will download all the log files for today, and start a goaccess instance in your console.
For further options you might run::

    s3stat.py -h
Extending
----------
s3stat was designed to be easy to add to your Python workflow. To that end it defines
a single class that you can subclass to process the results, which are provided in JSON format::

    import s3stat

    class MyS3Stat(s3stat.S3Stat):
        def process(self, json):
            print(json)

    mytask = MyS3Stat(bucket, log_path, for_date, (aws_key, aws_secret))
    mytask.run()

The `aws_*` parameters are optional; if they are missing, the credentials are taken from the environment variables used by boto.
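
For example, a subclass could archive each day's results to disk instead of printing them. The following is a hypothetical sketch; in particular, it assumes that `process()` receives the results as a serialized JSON string, and the bucket name and log path are placeholders::

    import datetime

    import s3stat

    class ArchivingS3Stat(s3stat.S3Stat):
        def process(self, json):
            # Assumption: `json` arrives as a serialized string.
            day = datetime.date.today().isoformat()
            with open("stats-%s.json" % day, "w") as f:
                f.write(json)

    # Credentials omitted: boto falls back to its usual
    # AWS environment variables.
    task = ArchivingS3Stat("my-bucket", "logs/", datetime.date.today())
    task.run()
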
ToDo
-----
* provide a command that adds logging to the specified buckets and CloudFront distributions