Upload wheels to any cloud storage supported by Libcloud

Upload and download wheels to and from cloud storage using Apache Libcloud. This helps package maintainers build wheels for their packages on several platforms and upload them to PyPI.

The cloud storage containers are typically populated by Continuous Integration servers that generate and test binary packages on various platforms (Windows and OS X, for several Python versions and architectures). At release time the project maintainer can collect all the generated packages for a specific version of the project and upload them all at once to PyPI.

Installation

pip install wheelhouse-uploader

Usage

The canonical use case is:

  1. Continuous Integration (CI) workers build and test the project packages for various platforms and versions of Python, for instance using the commands:

    pip install wheel
    python setup.py bdist_wheel
  2. CI workers use wheelhouse-uploader to upload the generated artifacts to one or more cloud storage containers (e.g. one container per platform, or one for the master branch and the other for release tags):

    python -m wheelhouse_uploader upload container_name
  3. The project maintainer uses the wheelhouse-uploader distutils extensions to fetch all the generated build artifacts for a specific version number into the local dist folder and upload them all at once to PyPI when making a release.

    python setup.py sdist fetch_artifacts upload_all

Uploading artifacts to a cloud storage container

Use the following command:

python -m wheelhouse_uploader upload \
    --username=mycloudaccountid --secret=xxx \
    --local-folder=dist/ my_wheelhouse

or:

export WHEELHOUSE_UPLOADER_USERNAME=mycloudaccountid
export WHEELHOUSE_UPLOADER_SECRET=xxx
python -m wheelhouse_uploader upload --local-folder dist/ my_wheelhouse

When used in a CI setup such as http://travis-ci.org or http://appveyor.com, the environment variables are typically configured in the CI configuration files such as .travis.yml or appveyor.yml. The secret API key is typically encrypted and exposed with a secure: prefix in those files.
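Credentials can therefore come either from command-line flags or from the environment. The following sketch illustrates one plausible way such a resolution could work; the precedence (explicit flags win over environment variables) is an assumption of this example, not a statement about wheelhouse-uploader's actual internals:

```python
import os

def resolve_credentials(username=None, secret=None, environ=os.environ):
    """Resolve cloud storage credentials.

    Explicit values (e.g. from --username / --secret flags) take
    precedence; otherwise fall back to the WHEELHOUSE_UPLOADER_*
    environment variables documented above. The precedence order is
    an assumption made for illustration.
    """
    username = username or environ.get("WHEELHOUSE_UPLOADER_USERNAME")
    secret = secret or environ.get("WHEELHOUSE_UPLOADER_SECRET")
    if not username or not secret:
        raise SystemExit("missing cloud storage credentials")
    return username, secret
```

In a CI setup, only the environment variables would typically be set, so the fallback branch is the one exercised.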

The files in the dist/ folder will be uploaded to a container named my_wheelhouse on the CLOUDFILES (Rackspace) cloud storage provider.

You can pass a custom --provider option to select a different cloud storage backend from the list of providers supported by Libcloud.

Assuming the container will be published as a static website using the cloud provider CDN options, the upload command also maintains an index.html file with links to all the files previously uploaded to the container.
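Such an index page is just a list of HTML links, one per uploaded file. The following minimal sketch shows the general idea of generating one; it is illustrative only and does not reproduce wheelhouse-uploader's actual template:

```python
import html

def make_index(filenames):
    """Build a minimal index.html body linking to uploaded files.

    Illustrative sketch only: the real index produced by the upload
    command may include extra metadata and styling.
    """
    links = "\n".join(
        '<li><a href="%s">%s</a></li>'
        % (html.escape(name, quote=True), html.escape(name))
        for name in sorted(filenames)
    )
    return "<html><body><ul>\n%s\n</ul></body></html>" % links
```

Keeping the index as plain HTML links is what makes the later fetch step possible with nothing more than an HTTP client and a link parser.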

It is recommended to configure the container CDN cache TTL to a shorter-than-usual duration, such as 15 minutes, so that a release can be performed quickly once all artifacts have been uploaded by the CI servers.

Fetching artifacts manually

The following command downloads artifacts that have been previously published to an index web page with HTML links to the project files:

python -m wheelhouse_uploader fetch \
    --version=X.Y.Z --local-folder=dist/ \
    project-name http://wheelhouse.example.org/
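Conceptually, the fetch step scans the index page for links whose filenames match the requested project and version. The sketch below shows a simplified version of that matching; real wheel and sdist filenames follow normalization rules (PEP 427 / PEP 503) that this hypothetical helper only approximates:

```python
import re

def matching_artifacts(index_html, project, version):
    """Return index-page links whose filenames match project and version.

    Simplified assumption: wheel filenames replace '-' with '_' in the
    project name, sdists may keep the dash. Real filename normalization
    is more involved than this sketch.
    """
    hrefs = re.findall(r'href="([^"]+)"', index_html)
    wheel_prefix = "%s-%s" % (project.replace("-", "_"), version)
    sdist_prefix = "%s-%s" % (project, version)
    return [h for h in hrefs if h.startswith((wheel_prefix, sdist_prefix))]
```

The matching files would then be downloaded into the local folder given by --local-folder.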

Uploading previously archived artifacts to PyPI (deprecated)

DEPRECATION NOTICE: while the following still works, you are advised to use the alternative tool twine, which makes it easy to script uploads of packages to PyPI without messing around with distutils and setup.py.

Ensure that the setup.py file of the project registers the wheelhouse-uploader distutils extensions:

from setuptools import setup

cmdclass = {}

try:
    # Used by the release manager of the project to add support for:
    # python setup.py sdist fetch_artifacts upload_all
    import wheelhouse_uploader.cmd
    cmdclass.update(vars(wheelhouse_uploader.cmd))
except ImportError:
    pass
...

setup(
    ...
    cmdclass=cmdclass,
)

Put the URL of the public artifact repositories populated by the CI workers in the setup.cfg file of the project:

[wheelhouse_uploader]
artifact_indexes=
    http://wheelhouse.site1.org/
    http://wheelhouse.site2.org/

Fetch all the artifacts matching the current version of the project as configured in the local setup.py file and upload them all to PyPI:

python setup.py fetch_artifacts upload_all

Note: this will reuse PyPI credentials stored in $HOME/.pypirc if python setup.py register or upload were called previously.

TODO

  • test on as many cloud storage providers as possible (please send an email to olivier.grisel@ensta.org if you can make it work on a non-Rackspace provider),

  • check that CDN activation works everywhere (it’s failing on Rackspace currently: need to investigate) otherwise the workaround is to enable CDN manually in the management web UI,

  • make it possible to fetch private artifacts using the cloud storage protocol instead of HTML index pages.
