Cloud Object Storage utility
Cloud Object Storage file upload and download utility
Table of contents:
- Getting Started
- Listing the content of a Cloud Object Storage bucket
- Uploading files to a Cloud Object Storage bucket
- Downloading files from a Cloud Object Storage bucket
Getting started
The utility requires Python 3.6 or above.
Installation
You can install the utility from PyPI or from source.
Install from pypi.org
$ pip install cos-utils --upgrade
Install from source code
$ git clone https://github.com/CODAIT/cos-utils.git
$ cd cos-utils
$ pip install .
Configuration
Set the AWS_ACCESS_KEY_ID
and AWS_SECRET_ACCESS_KEY
environment variables based on your Cloud Object Storage HMAC credentials.
$ export AWS_ACCESS_KEY_ID=...
$ export AWS_SECRET_ACCESS_KEY=...
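Programmatically, the required check amounts to reading those two environment variables. The helper below is hypothetical (not part of cos-utils); it only illustrates the validation the utilities perform before running:

```python
import os

def get_hmac_credentials():
    """Fetch the HMAC credentials the utilities expect from the environment.

    Raises RuntimeError if either variable is missing, mirroring the
    utilities' requirement that both be defined.
    """
    missing = [name for name in ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
               if name not in os.environ]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return os.environ["AWS_ACCESS_KEY_ID"], os.environ["AWS_SECRET_ACCESS_KEY"]
```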
Listing the content of a Cloud Object Storage bucket
You can run the list utility in a terminal window using the generated console script
$ list_files --help
or explicitly
$ python -m cos_utils.list_files --help
The help lists required and optional parameters.
usage: list_files [-h] bucket
List the content of a Cloud Object Storage bucket.
positional arguments:
bucket Bucket name
optional arguments:
-h, --help show this help message and exit
Environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must be
defined to run the utility.
Example: List the content of <bucket-name>
$ list_files <bucket-name>
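Because the utility speaks the S3-compatible HMAC protocol, the same listing can also be done with any S3-compatible SDK such as boto3. This sketch is an assumption about the underlying protocol, not cos-utils' own code; for IBM Cloud Object Storage you would additionally pass the service's `endpoint_url` when creating the client:

```python
def list_keys(bucket, client=None):
    """Return every object key in the bucket, following pagination."""
    if client is None:
        # Deferred import: boto3 is only needed when no client is supplied.
        import boto3
        client = boto3.client("s3")
    keys = []
    paginator = client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            keys.append(obj["Key"])
    return keys
```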
Uploading files to a Cloud Object Storage bucket
You can run the upload utility in a terminal window using the generated console script
$ upload_files --help
or explicitly
$ python -m cos_utils.upload_files --help
The help lists required and optional parameters. The examples listed below explain them in detail.
usage: upload_files [-h] [-p PREFIX] [-r] [-s] [-w] bucket source
Upload files to a Cloud Object Storage bucket.
positional arguments:
bucket Bucket name
source File or directory spec (supported wildcards: * and ?)
optional arguments:
-h, --help show this help message and exit
-p PREFIX, --prefix PREFIX
Key name prefix
-r, --recursive Include files in subdirectories
-s, --squash Exclude subdirectory name from key name
-w, --wipe Clear bucket prior to upload
Environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must be
defined to run the utility.
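How --prefix and --squash combine to form object keys can be pictured with a small helper. `make_key` is hypothetical, not part of cos-utils; it is a sketch of how the two options plausibly interact:

```python
from pathlib import PurePosixPath

def make_key(relative_path, prefix=None, squash=False):
    """Derive an object key from a file path.

    squash drops the subdirectory portion of the path; prefix is
    prepended to whatever key remains.
    """
    key = PurePosixPath(relative_path).name if squash else relative_path
    return f"{prefix}/{key}" if prefix else key
```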
Example scenario
The local directory </path/to/local/directory>
contains the following directories and files:
file1.png
file2.png
file3.jpg
file4.txt
dir1/file5.gif
dir1/file6.png
dir1/dir2/file7.png
dir1/dir3/file8.jpg
dir1/dir3/file1.png
In the examples given below <bucket-name>
refers to an existing bucket in Cloud Object Storage.
Upload directories
You can upload the content of any directory.
Upload the content of </path/to/local/directory>
to bucket <bucket-name>
$ upload_files <bucket-name> </path/to/local/directory>
Bucket <bucket-name>
contains the following objects:
file1.png
file2.png
file3.jpg
file4.txt
Same as before, but clear the bucket before uploading
Specify the optional --wipe
parameter to clear the bucket before upload.
$ upload_files <bucket-name> </path/to/local/directory> --wipe
Bucket <bucket-name>
contains the following objects:
file1.png
file2.png
file3.jpg
file4.txt
Same as before but include subdirectories
Specify the optional --recursive
parameter to include files in subdirectories.
$ upload_files <bucket-name> </path/to/local/directory> --wipe --recursive
Bucket <bucket-name>
contains the following objects:
file1.png
file2.png
file3.jpg
file4.txt
dir1/file5.gif
dir1/file6.png
dir1/dir2/file7.png
dir1/dir3/file8.jpg
dir1/dir3/file1.png
Same as before but don't use subdirectory names during object key generation
Specify the optional --squash
parameter to ignore subdirectory names during object key generation.
$ upload_files <bucket-name> </path/to/local/directory> --wipe --recursive --squash
Bucket <bucket-name>
contains the following objects. Note that </path/to/local/directory>
contains two files named file1.png: the top-level file1.png is uploaded first and later overwritten with the content of dir1/dir3/file1.png.
file2.png
file3.jpg
file4.txt
file5.gif
file6.png
file7.png
file8.jpg
file1.png
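The overwrite happens because --squash maps both paths to the same key. A few lines of plain Python reproduce the effect (this is an illustrative sketch, not cos-utils code; the file list comes from the example scenario above):

```python
from pathlib import PurePosixPath

# Files found with --recursive; --squash reduces each path to its base name,
# so two files named file1.png collide on the same object key.
files = ["file1.png", "file2.png", "dir1/dir3/file1.png"]

uploaded = {}
for path in files:
    uploaded[PurePosixPath(path).name] = path  # later upload overwrites earlier key
```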
Same as before but include a static key name prefix
Specify the optional --prefix <prefix>
parameter to add <prefix>
to the object key for every file.
$ upload_files <bucket-name> </path/to/local/directory> --wipe --recursive --squash --prefix data
Bucket <bucket-name>
contains the following objects:
data/file2.png
data/file3.jpg
data/file4.txt
data/file5.gif
data/file6.png
data/file7.png
data/file8.jpg
data/file1.png
Upload files
You can upload a single file by specifying </path/to/local/directory/filename>.
$ upload_files <bucket-name> /path/to/local/directory/file1.png --wipe
Bucket <bucket-name>
contains the following object:
file1.png
You can upload multiple files by specifying a pattern </path/to/local/directory/filename-pattern>.
$ upload_files <bucket-name> /path/to/local/directory/*.png --wipe
On Linux, Unix, and macOS, wildcards need to be escaped to prevent shell expansion:
/path/to/local/directory/\*.png
Bucket <bucket-name>
contains the following objects:
file1.png
file2.png
Use the --recursive
parameter to extend the search to subdirectories of /path/to/local/directory/.
$ upload_files <bucket-name> /path/to/local/directory/*.png --wipe --recursive
Bucket <bucket-name>
contains the following objects:
file1.png
file2.png
dir1/file6.png
dir1/dir2/file7.png
dir1/dir3/file1.png
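The effect of --recursive on pattern matching can be sketched with pathlib. `find_files` is a hypothetical helper, not cos-utils' implementation; it only shows how the same pattern is matched in one directory versus the whole tree:

```python
from pathlib import Path

def find_files(directory, pattern, recursive=False):
    """Match files against a pattern, optionally descending into subdirectories."""
    base = Path(directory)
    matches = base.rglob(pattern) if recursive else base.glob(pattern)
    return sorted(p.relative_to(base).as_posix() for p in matches)
```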
Downloading files from a Cloud Object Storage bucket
You can run the download utility in a terminal window using the generated console script
$ download_files --help
or explicitly
$ python -m cos_utils.download_files --help
The help lists required and optional parameters. The examples listed below explain them in detail.
usage: download_files [-h] [-d TARGET_DIR] bucket source
Download objects from a Cloud Object Storage bucket.
positional arguments:
bucket Bucket name
source Object key spec (supported wildcards: * and ?)
optional arguments:
-h, --help show this help message and exit
-d TARGET_DIR, --target_dir TARGET_DIR
Local target directory. Defaults to the current
directory.
Environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must be
defined to run the utility.
Download complete bucket content
You can download the complete content of a bucket to the current directory:
$ download_files <bucket-name> *
On Linux, Unix, and macOS, wildcards need to be escaped to prevent shell expansion:
$ download_files <bucket-name> \*
Same as before but specify a target directory
Use the --target_dir </path/to/local/dir>
parameter to specify an existing directory where the downloaded files will be stored:
$ download_files <bucket-name> * --target_dir /tmp/downloads
Use wildcards to selectively download files
Use the *
(matches any sequence of characters) and ?
(matches exactly one character) wildcards to define a filter condition.
Download only png files
$ download_files <bucket-name> *.png
Download files that contain a certain string in their name
$ download_files <bucket-name> *fil*
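This style of key matching corresponds to Python's standard fnmatch semantics. The helper below is hypothetical (not cos-utils' own code) and illustrates how the * and ? wildcards select object keys:

```python
from fnmatch import fnmatch

def filter_keys(keys, spec):
    """Keep the object keys that match a wildcard spec.

    * matches any sequence of characters; ? matches exactly one character.
    """
    return [key for key in keys if fnmatch(key, spec)]
```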