A package to back up databases and store the dumps in S3.
S3 Dumps
==========
|Build Status|
Note: A rewritten fork of `s3-backups <https://github.com/epicserve/s3-backups>`_.
`S3 Dumps <https://github.com/rakeshgunduka/s3_dumps>`_ provides easy scripts that system administrators can use to back up
data from programs like PostgreSQL, Redis, etc.
.. |Build Status| raw:: html

   <a href="https://travis-ci.org/rakeshgunduka/s3_dumps">
       <img src="https://travis-ci.org/rakeshgunduka/s3_dumps.png?branch=master"/>
   </a>
Installation
------------
To install S3 Dumps::

    $ sudo pip install s3_dumps
Usage
-----
For Backup
''''''''''
Using the ``--backup`` flag, the script creates the dump and stores it in the bucket as is, without a year/month/date directory structure.

::

    --backup
For Archive
'''''''''''
Using the ``--archive`` flag, the script takes all the files in the bucket and archives them in a year/month/date directory structure.

::

    --archive
For Archive and Backup
''''''''''''''''''''''
Using the ``--backup`` and ``--archive`` flags together, the script archives all existing files in the bucket into the year/month/date directory structure and creates a fresh dump in the parent directory (inside the bucket); an example layout is sketched below.

::

    --backup --archive
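For illustration only, assuming hypothetical dump file names and dates (the actual names are chosen by the script), a bucket keyed under ``postgres/my-awesome-server`` might end up looking like this after a combined run:

::

    my-backup-bucket/
        postgres/my-awesome-server/2018/01/15/dump-2018-01-15.sql.gz   (older dump, archived)
        postgres/my-awesome-server/2018/01/16/dump-2018-01-16.sql.gz   (older dump, archived)
        postgres/my-awesome-server/dump-2018-01-17.sql.gz              (fresh dump from --backup)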
To dump to the Amazon S3 service
''''''''''''''''''''''''''''''''''
Set ``--SERVICE_NAME`` to ``'amazon'``.

::

    --SERVICE_NAME='amazon'
To dump to DigitalOcean Spaces
''''''''''''''''''''''''''''''''
Set ``--SERVICE_NAME`` to ``'digitalocean'``.

::

    --SERVICE_NAME='digitalocean'
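In either case, the service name mainly determines which S3-compatible endpoint the dump is uploaded to. As a rough sketch of that idea (not the package's internal code; it assumes ``boto3`` and a hypothetical ``make_client`` helper)::

    # Illustrative sketch only -- not s3_dumps internals.
    import boto3

    def make_client(service_name, access_key, secret, region):
        """Build an S3-compatible client for 'amazon' or 'digitalocean'."""
        endpoint = None
        if service_name == 'digitalocean':
            # DigitalOcean Spaces exposes an S3-compatible API at a regional endpoint.
            endpoint = 'https://{}.digitaloceanspaces.com'.format(region)
        return boto3.client(
            's3',
            region_name=region,
            endpoint_url=endpoint,  # None falls back to the default AWS endpoint
            aws_access_key_id=access_key,
            aws_secret_access_key=secret,
        )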
Setting Up S3 Dumps to Run Automatically Using Cron
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PostgreSQL
''''''''''
Add the following to the file ``/etc/cron.d/postgres_to_s3``, then change the command arguments so the command uses your correct credentials, backup bucket, and base S3 key/folder.
Amazon S3

::

    0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup
DigitalOcean Spaces

::

    0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='digitalocean' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup
To create a dump of a specific database (``my-db-name``):

::

    0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --DB_NAME='my-db-name' --FILE_KEY='postgres/my-awesome-server' --backup
To back up and archive at the same time:

::

    0 */1 * * * postgres postgres_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='postgres/my-awesome-server' --backup --archive
Redis
'''''
Add the following to the file ``/etc/cron.d/redis_to_s3``, then change the command arguments so the command uses your correct credentials, backup bucket, and base S3 key/folder.
Amazon S3

::

    0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --backup
DigitalOcean Spaces

::

    0 */1 * * * root redis_to_s3.py --SERVICE_NAME='digitalocean' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --backup
Optionally, provide the Redis dump directory according to your system's Redis configuration. If it is not provided, the default location is used.

::

    0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' --backup
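If you are not sure where your Redis instance writes its dump file, one way to check (assuming ``redis-cli`` is installed and can reach the server; the path in the sample output is just an example) is::

    $ redis-cli config get dir
    1) "dir"
    2) "/var/lib/redis"

The second value is the directory Redis saves ``dump.rdb`` into, and is a reasonable candidate for ``--REDIS_DUMP_DIR``.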
To back up and archive at the same time:

::

    0 */1 * * * root redis_to_s3.py --SERVICE_NAME='amazon' --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' --SECRET='xxxxxxxxxxxxxxxxxxxx' --REGION='bucket-region' --BUCKET_NAME='my-backup-bucket' --FILE_KEY='redis/my-awesome-server' --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' --REDIS_SAVE_CMD='redis-cli save' --backup --archive
Manually Running Dumps and Archiving
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
When running the archive command, S3 Dumps moves backups into a
``year/month/date`` sub-folder (technically an S3 key).
The default archive mode will do the following (a sketch of this retention logic appears after the list):
- keep all archives for 7 days
- keep midnight backups for every other day for 30 days
- keep the first day of the month forever
- remove all other files that aren't scheduled to be kept
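As a minimal sketch of that retention rule, assuming each backup's timestamp can be recovered from its key (hypothetical helper, not the package's actual implementation)::

    # Illustrative sketch of the retention policy described above.
    from datetime import datetime, timedelta

    def should_keep(backup_time, now=None):
        """Return True if a backup taken at `backup_time` should be kept."""
        now = now or datetime.utcnow()
        age = now - backup_time

        if backup_time.day == 1:            # first day of the month: keep forever
            return True
        if age <= timedelta(days=7):        # keep all archives for 7 days
            return True
        if age <= timedelta(days=30):       # keep midnight backups every other day for 30 days
            return backup_time.hour == 0 and backup_time.day % 2 == 0
        return False                        # everything else gets removed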
To back up PostgreSQL, run the following::

    $ postgres_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='postgres/my-awesome-server' \
        --backup
To archive PostgreSQL backups, run the following::

    $ postgres_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='postgres/my-awesome-server' \
        --archive
To back up Redis, run the following::

    $ redis_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='redis/my-awesome-server' \
        --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' \
        --REDIS_SAVE_CMD='redis-cli save' \
        --backup
To archive Redis backups, run the following::

    $ redis_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='redis/my-awesome-server' \
        --REDIS_DUMP_DIR='/Your/Redis/Config/Dir' \
        --REDIS_SAVE_CMD='redis-cli save' \
        --archive
To back up MySQL, run the following::

    $ mysql_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='mysql/my-awesome-server' \
        --backup
To archive MySQL backups, run the following::

    $ mysql_to_s3.py \
        --SERVICE_NAME='amazon' \
        --ACCESS_KEY='xxxxxxxxxxxxxxxxxxxx' \
        --SECRET='xxxxxxxxxxxxxxxxxxxx' \
        --REGION='bucket-region' \
        --BUCKET_NAME='my-backup-bucket' \
        --FILE_KEY='mysql/my-awesome-server' \
        --archive
To Do
-----
1. Add tests
Contributors
------------
1. `Brent O'Connor <https://github.com/epicserve>`_
2. `Rakesh Gunduka <https://github.com/rakeshgunduka>`_
3. `Shekhar Tiwatne <https://github.com/shon>`_