A package to back up databases and store the dumps in S3.
S3 Dumps
========
|Build Status|
Note: A rewritten fork of `s3-backups <https://github.com/epicserve/s3-backups>`_.
`S3dumps <https://github.com/rakeshgunduka/s3_dumps>`_ provides easy scripts that system administrators can use to back up
data from programs like PostgreSQL, Redis, etc.
.. |Build Status| raw:: html

    <a href="https://travis-ci.org/rakeshgunduka/s3_dumps">
        <img src="https://travis-ci.org/rakeshgunduka/s3_dumps.png?branch=master"/>
    </a>
Installation
------------
To install s3_dumps::
$ sudo pip install s3_dumps
Usage
-----
For Backup
''''''''''
Using the --backup flag, the script creates the dump and stores it in the bucket as-is, without a year/month/date directory structure.
::
--backup
For Archive
'''''''''''
Using the --archive flag, the script takes all the files in the bucket and archives them in a year/month/date directory structure.
::
--archive
For Archive and Backup
''''''''''''''''''''''
Using the --backup and --archive flags together, the script archives all existing files in the bucket into a year/month/date directory structure and creates a fresh dump at the parent directory (inside the bucket).
::
--backup --archive
To dump into Amazon S3
'''''''''''''''''''''''''''''''
Set --SERVICE_NAME to 'amazon'.
::
--SERVICE_NAME='amazon'
To dump into DigitalOcean Spaces
'''''''''''''''''''''''''''''''''
Set --SERVICE_NAME to 'digitalocean'. A short endpoint sketch follows the flag below.
::
--SERVICE_NAME='digitalocean'
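Both services speak the S3 API, so the practical difference is the endpoint the client connects to. As a minimal sketch, assuming ``boto3`` (the package may use a different client internally; region and credentials are placeholders), selecting DigitalOcean Spaces mostly amounts to pointing an S3 client at the Spaces endpoint for your region::

import boto3

REGION = 'nyc3'  # placeholder region used only for this illustration

# Amazon S3: the endpoint is derived from the region automatically.
amazon_s3 = boto3.client(
    's3',
    region_name=REGION,
    aws_access_key_id='xxxxxxxxxxxxxxxxxxxx',
    aws_secret_access_key='xxxxxxxxxxxxxxxxxxxx',
)

# DigitalOcean Spaces: S3-compatible, but the endpoint must be set explicitly.
digitalocean_spaces = boto3.client(
    's3',
    region_name=REGION,
    endpoint_url=f'https://{REGION}.digitaloceanspaces.com',
    aws_access_key_id='xxxxxxxxxxxxxxxxxxxx',
    aws_secret_access_key='xxxxxxxxxxxxxxxxxxxx',
)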
Setting Up S3 Dumps to Run Automatically Using Cron
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PostgreSQL
''''''''''
Add the following to the file ``/etc/cron.d/postgres_to_s3``, then change the command arguments so the command uses your correct AWS credentials, backup bucket, and base S3 key/folder.
Amazon S3
::
0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup
DigitalOcean Spaces
::
0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='digitalocean' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup
To create a dump of a specific database (``my-db-name``).
::
0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --DB_NAME='my-db-name' --FILE_KEY='postgres/my-awesome-server' --backup
To back up and archive at the same time.
::
0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup --archive
Redis
'''''
Add the following to the file ``/etc/cron.d/redis_to_s3``, then change the command arguments so the command uses your correct AWS credentials, backup bucket, and base S3 key/folder.
Amazon S3
::
0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --backup
DigitalOcean Spaces
::
0 */1 * * * root redis_to_s3.py --SERVICE_NAME='digitalocean' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --backup
Optionally, set --REDIS_DUMP_DIR to the directory where Redis writes its dump file, as configured on your system. If it is not provided, the default location is used. A sketch for discovering this directory follows the example below.
::
0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' --backup
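If you are not sure which directory your Redis server dumps to, you can ask the server itself. A small sketch, assuming the ``redis`` Python client is installed (``redis-cli CONFIG GET dir`` gives the same answer from the shell)::

import redis

# decode_responses=True so CONFIG GET returns plain strings instead of bytes.
r = redis.Redis(host='localhost', port=6379, decode_responses=True)

# CONFIG GET reports the directory and filename Redis uses for RDB dumps.
dump_dir = r.config_get('dir')['dir']
dump_file = r.config_get('dbfilename')['dbfilename']

print(f"Pass --REDIS_DUMP_DIR='{dump_dir}' (dump file: {dump_file})")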
To back up and archive at the same time.
::
0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' --REDIS_SAVE_CMD='redis-cli save' --backup --archive
Manually Running Dumps and Archiving
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When running the archive command, S3 Dumps moves backups into a
``year/month/date`` subfolder (technically an S3 key).
The default archive mode will (a sketch of this policy follows the list):
- keep all archives for 7 days
- keep midnight backups for every other day for 30 days
- keep first-of-the-month backups forever
- remove all other files that aren't scheduled to be kept
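The rules above boil down to a per-backup date check. Below is a minimal sketch of that policy in Python; it is not the package's actual implementation, ``should_keep`` is a hypothetical helper invented for this example, and "every other day" is interpreted here as even days of the month::

from datetime import datetime, timedelta

def should_keep(backup_time: datetime, now: datetime) -> bool:
    """Illustrative retention check mirroring the archive rules above."""
    age = now - backup_time

    # Keep everything from the last 7 days.
    if age <= timedelta(days=7):
        return True

    # For 30 days, keep midnight backups taken every other day.
    if age <= timedelta(days=30):
        return backup_time.hour == 0 and backup_time.day % 2 == 0

    # Keep first-of-the-month backups forever; drop everything else.
    return backup_time.day == 1

# Example: a 10-day-old midnight backup from the 2nd of the month is kept.
print(should_keep(datetime(2024, 1, 2, 0, 0), datetime(2024, 1, 12, 12, 0)))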
To back up PostgreSQL, run the following::
$ postgres_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='postgres/my-awesome-server' \
--backup
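Conceptually, the backup step for PostgreSQL is a ``pg_dump`` whose output ends up as an object under ``FILE_KEY`` in the bucket. A rough sketch of that idea, assuming ``boto3`` and a made-up key naming scheme (this is not the package's actual code)::

import subprocess
from datetime import datetime

import boto3

BUCKET = 'my-backup-bucket'
KEY = f"postgres/my-awesome-server/{datetime.now():%Y-%m-%dT%H-%M-%S}.sql"

# Stream pg_dump straight to S3 without writing a local file first.
dump = subprocess.Popen(['pg_dump', 'my-db-name'], stdout=subprocess.PIPE)
boto3.client('s3').upload_fileobj(dump.stdout, BUCKET, KEY)
dump.wait()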
To archive PostgreSQL backups, run the following::
$ postgres_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='postgres/my-awesome-server' \
--archive
To back up Redis, run the following::
$ redis_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='redis/my-awesome-server' \
--REDIS_DUMP_DIR='/Your/Redis/Config/Dir' \
--REDIS_SAVE_CMD='redis-cli save' \
--backup
To archive Redis, run the following::
$ redis_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='redis/my-awesome-server' \
--REDIS_DUMP_DIR='/Your/Redis/Config/Dir' \
--REDIS_SAVE_CMD='redis-cli save' \
--archive
To back up MySQL, run the following::
$ mysql_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='mysql/my-awesome-server' \
--backup
To archive MySQL, run the following::
$ mysql_to_s3.py \
--SERVICE_NAME='amazon' \
--ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
--SECRET='xxxxxxxxxxxxxxxxxxxx' \
--REGION='bucket-region' \
--BUCKET_NAME='my-backup-bucket' \
--FILE_KEY='mysql/my-awesome-server' \
--archive
To-Dos
----------
1. Add tests
Contributors
------------
1. `Brent O'Connor <https://github.com/epicserve>`_
2. `Rakesh Gunduka <https://github.com/rakeshgunduka>`_
3. `Shekhar Tiwatne <https://github.com/shon>`_