Cookbook plugin for the Pulp Project

pulp_cookbook Plugin

This is the pulp_cookbook plugin for the Pulp Project 3.0+. It implements support for Chef cookbooks.

Currently, it allows importing packaged cookbooks into cookbook repositories. When a specific version of a cookbook repository is published, a universe endpoint is created that allows tools like berkshelf to download cookbooks and resolve cookbook dependencies.

Not supported (yet):

  • Full support of the Supermarket API

  • Cookbook version constraints on a remote (only filtering by cookbook name is supported)

All REST API examples below use httpie to perform the requests. The httpie commands assume that the user executing them has a .netrc file in their home directory. The .netrc should have the following configuration:

machine localhost
login admin
password admin

If you configured the admin user with a different password, adjust the configuration accordingly. If you prefer to specify the username and password with each request, see the httpie documentation on how to do that.

This documentation uses the jq utility to parse the JSON received from requests in order to extract the unique URLs generated when objects are created. To follow this documentation as-is, install the jq library/binary with:

$ sudo dnf install jq

Install pulpcore

Follow the installation instructions provided with pulpcore.

Users should install from either PyPI or source.

Install pulp_cookbook from source

sudo -u pulp -i
source ~/pulpvenv/bin/activate
git clone https://github.com/gmbnomis/pulp_cookbook.git
cd pulp_cookbook
pip install -e .

Install pulp-cookbook From PyPI

sudo -u pulp -i
source ~/pulpvenv/bin/activate
pip install pulp-cookbook

Make and Run Migrations

export DJANGO_SETTINGS_MODULE=pulpcore.app.settings
django-admin migrate pulp_cookbook

Run Services

django-admin runserver 24817
gunicorn pulpcore.content:server --bind 'localhost:24816' --worker-class 'aiohttp.GunicornWebWorker' -w 2
sudo systemctl restart pulp-resource-manager
sudo systemctl restart pulp-worker@1
sudo systemctl restart pulp-worker@2

Example: Import cookbooks and synchronize from remote

Create a repository foo

$ http POST http://localhost:24817/pulp/api/v3/repositories/ name=foo

{
    "_created": "2019-10-03T16:29:25.171311Z",
    "_href": "/pulp/api/v3/repositories/200118d5-dc92-4e2d-b970-df7edec122ea/",
    "_latest_version_href": null,
    "_versions_href": "/pulp/api/v3/repositories/200118d5-dc92-4e2d-b970-df7edec122ea/versions/",
    "description": null,
    "name": "foo",
    "plugin_managed": false
}

$ export REPO_HREF=$(http :24817/pulp/api/v3/repositories/ | jq -r '.results[] | select(.name == "foo") | ._href')

Upload cookbooks to Pulp

As a simple example, let’s download two cookbooks from the Chef Supermarket and upload them into our repository.

Download the ‘ubuntu’ and ‘apt’ cookbooks (the ‘ubuntu’ cookbook depends on the ‘apt’ cookbook):

$ curl -Lo ubuntu-2.0.1.tgz https://supermarket.chef.io:443/api/v1/cookbooks/ubuntu/versions/2.0.1/download
$ curl -Lo apt-7.0.0.tgz https://supermarket.chef.io:443/api/v1/cookbooks/apt/versions/7.0.0/download

Create a content unit for ubuntu 2.0.1:

$ http --form POST http://localhost:24817/pulp/api/v3/content/cookbook/cookbooks/ name="ubuntu" file@ubuntu-2.0.1.tgz

$ export UBUNTU_CONTENT_HREF=$(http :24817/pulp/api/v3/content/cookbook/cookbooks/?name=ubuntu | jq -r '.results[0]._href')

Create a content unit for apt 7.0.0:

$ http --form POST http://localhost:24817/pulp/api/v3/content/cookbook/cookbooks/ name="apt" file@apt-7.0.0.tgz

$ export APT_CONTENT_HREF=$(http :24817/pulp/api/v3/content/cookbook/cookbooks/?name=apt | jq -r '.results[0]._href')

Add content to repository foo

$ http POST :24817$REPO_HREF'versions/' add_content_units:="[\"$UBUNTU_CONTENT_HREF\",\"$APT_CONTENT_HREF\"]"

$ export LATEST_VERSION_HREF=$(http :24817$REPO_HREF | jq -r '._latest_version_href')

Create a Publication

$ http POST http://localhost:24817/pulp/api/v3/publications/cookbook/cookbook/ repository_version=$LATEST_VERSION_HREF

{
    "task": "/pulp/api/v3/tasks/cd37e3dd-fb9b-4fa3-a32b-174bcb860c79/"
}

$ export PUBLICATION_HREF=$(http :24817/pulp/api/v3/publications/cookbook/cookbook/ | jq --arg LVH "$LATEST_VERSION_HREF" -r '.results[] | select(.repository_version == $LVH) | ._href')

Create a Distribution at ‘foo’ for the Publication

$ http POST http://localhost:24817/pulp/api/v3/distributions/cookbook/cookbook/ name='baz' base_path='foo' publication=$PUBLICATION_HREF

You can have a look at the published “universe” metadata now:

$ http http://localhost:24816/pulp_cookbook/content/foo/universe

{
    "apt": {
        "7.0.0": {
            "dependencies": {},
            "download_url": "http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/apt/7_0_0/apt-7.0.0.tar.gz",
            "location_path": "http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/apt/7_0_0/apt-7.0.0.tar.gz",
            "location_type": "uri"
        }
    },
    "ubuntu": {
        "2.0.1": {
            "dependencies": {
                "apt": ">= 0.0.0"
            },
            "download_url": "http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/ubuntu/2_0_1/ubuntu-2.0.1.tar.gz",
            "location_path": "http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/ubuntu/2_0_1/ubuntu-2.0.1.tar.gz",
            "location_type": "uri"
        }
    }
}
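The universe document is plain JSON, so it can also be consumed programmatically. Below is a minimal Python sketch (not part of pulp_cookbook) that lists every cookbook version together with its dependencies; it assumes an abridged copy of the metadata shown above, with the download/location URLs omitted for brevity:

```python
import json

# Abridged universe metadata, as served by the /universe endpoint above
# (download_url/location_path entries omitted for brevity).
UNIVERSE_JSON = """
{
  "apt": {
    "7.0.0": {"dependencies": {}}
  },
  "ubuntu": {
    "2.0.1": {"dependencies": {"apt": ">= 0.0.0"}}
  }
}
"""

def list_versions(universe):
    """Yield (name, version, dependencies) for every cookbook version."""
    for name, versions in sorted(universe.items()):
        for version, meta in sorted(versions.items()):
            yield name, version, meta.get("dependencies", {})

universe = json.loads(UNIVERSE_JSON)
for name, version, deps in list_versions(universe):
    print(f"{name} {version} depends on: {deps}")
```

This is the same structure berkshelf walks when it resolves the `'ubuntu'` cookbook to the `apt >= 0.0.0` dependency shown in the output below.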

Use Berkshelf with the published repo

Create a Berksfile with the following content:

source 'http://localhost:24816/pulp_cookbook/content/foo/'

cookbook 'ubuntu'

$ berks install

Resolving cookbook dependencies...
Fetching cookbook index from http://localhost:24816/pulp_cookbook/content/foo/...
Installing apt (7.0.0) from http://localhost:24816/pulp_cookbook/content/foo/ ([uri] http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/apt/7_0_0/apt-7.0.0.tar.gz)
Installing ubuntu (2.0.1) from http://localhost:24816/pulp_cookbook/content/foo/ ([uri] http://localhost:24816/pulp_cookbook/content/foo/cookbook_files/ubuntu/2_0_1/ubuntu-2.0.1.tar.gz)

Create a new remote foo_remote

In addition to uploading content, pulp_cookbook allows synchronizing a repository with an upstream repository (which has to provide a “universe” endpoint).

Let’s mirror the pulp and qpid cookbooks into our existing repo. First, we have to create a remote:

$ http POST http://localhost:24817/pulp/api/v3/remotes/cookbook/cookbook/ name='foo_remote' url='https://supermarket.chef.io/' cookbooks:='{"pulp": "", "qpid": ""}'

{
    "_created": "2019-10-03T16:37:19.240581Z",
    "_href": "/pulp/api/v3/remotes/cookbook/cookbook/601c0402-30ff-4209-9008-5bc0339419be/",
    "_last_updated": "2019-10-03T16:37:19.240602Z",
    "_type": "cookbook.cookbook",
    "cookbooks": {
        "pulp": "",
        "qpid": ""
    },
    "download_concurrency": 20,
    "name": "foo_remote",
    "policy": "immediate",
    "proxy_url": null,
    "ssl_ca_certificate": null,
    "ssl_client_certificate": null,
    "ssl_client_key": null,
    "ssl_validation": true,
    "url": "https://supermarket.chef.io/"
}

$ export REMOTE_HREF=$(http :24817/pulp/api/v3/remotes/cookbook/cookbook/ | jq -r '.results[] | select(.name == "foo_remote") | ._href')

Sync repository foo using remote foo_remote

We don’t want to delete the apt and ubuntu cookbooks imported previously. Therefore, we sync in ‘additive’ mode by setting mirror to false.

$ http POST :24817$REMOTE_HREF'sync/' repository=$REPO_HREF mirror:=false

Look at the new Repository Version created

$ http GET ':24817'$REPO_HREF'versions/2/'

{
    "_created": "2019-10-03T16:38:18.843201Z",
    "_href": "/pulp/api/v3/repositories/200118d5-dc92-4e2d-b970-df7edec122ea/versions/2/",
    "base_version": null,
    "content_summary": {
        "added": {
            "cookbook.cookbook": {
                "count": 2,
                "href": "/pulp/api/v3/content/cookbook/cookbooks/?repository_version_added=/pulp/api/v3/repositories/200118d5-dc92-4e2d-b970-df7edec122ea/versions/2/"
            }
        },
        "present": {
            "cookbook.cookbook": {
                "count": 4,
                "href": "/pulp/api/v3/content/cookbook/cookbooks/?repository_version=/pulp/api/v3/repositories/200118d5-dc92-4e2d-b970-df7edec122ea/versions/2/"
            }
        },
        "removed": {}
    },
    "number": 2
}

At the time of writing, there was only a single version of the pulp and qpid cookbooks available, respectively. This brings the total count to 4 cookbooks.

Publish the newest version

To publish the version just created, do:

$ http POST http://localhost:24817/pulp/api/v3/publications/cookbook/cookbook/ repository=$REPO_HREF

And update the distribution:

export DISTRIBUTION_HREF=$(http :24817/pulp/api/v3/distributions/cookbook/cookbook/ | jq -r '.results[] | select(.name == "baz") | ._href')
export LATEST_VERSION_HREF=$(http :24817$REPO_HREF | jq -r '._latest_version_href')
export LATEST_PUBLICATION_HREF=$(http :24817/pulp/api/v3/publications/cookbook/cookbook/ | jq --arg LVH "$LATEST_VERSION_HREF" -r '.results[] | select(.repository_version == $LVH) | ._href')
http PATCH :24817$DISTRIBUTION_HREF publication=$LATEST_PUBLICATION_HREF

Now, the universe endpoint http://localhost:24816/pulp_cookbook/content/foo/universe will show the content of the new repo version.

Example: Snapshot of Chef Supermarket

Using the ‘on_demand’ policy on a remote makes it possible to snapshot a large repository like the Chef Supermarket efficiently. In ‘on_demand’ mode, only the metadata is synchronized. Actual cookbooks are not downloaded at sync time, but only when requested from a distribution. After the first successful download, the cookbooks are stored locally for faster retrieval.

Create a repository supermarket

$ http POST http://localhost:24817/pulp/api/v3/repositories/ name=supermarket

{
    "_created": "2019-03-30T22:59:02.569833Z",
    "_href": "/pulp/api/v3/repositories/80f03582-ae58-406d-b456-bbb33e718f8f/",
    "_latest_version_href": null,
    "_versions_href": "/pulp/api/v3/repositories/80f03582-ae58-406d-b456-bbb33e718f8f/versions/",
    "description": "",
    "name": "supermarket"
}

$ export REPO_HREF=$(http :24817/pulp/api/v3/repositories/ | jq -r '.results[] | select(.name == "supermarket") | ._href')

Create a new remote supermarket

$ http POST http://localhost:24817/pulp/api/v3/remotes/cookbook/cookbook/ name='supermarket' url='https://supermarket.chef.io/' policy=on_demand

{
    "_created": "2019-03-30T22:59:35.618466Z",
    "_href": "/pulp/api/v3/remotes/cookbook/cookbook/472c73b9-0132-4c1b-8814-816fd237a40a/",
    "_last_updated": "2019-03-30T22:59:35.618484Z",
    "_type": "cookbook.cookbook",
    "cookbooks": "",
    "download_concurrency": 20,
    "name": "supermarket",
    "policy": "on_demand",
    "proxy_url": "",
    "ssl_validation": true,
    "url": "https://supermarket.chef.io/",
    "validate": true
}

$ export REMOTE_HREF=$(http :24817/pulp/api/v3/remotes/cookbook/cookbook/ | jq -r '.results[] | select(.name == "supermarket") | ._href')

Sync repository supermarket using remote supermarket

$ http POST :24817$REMOTE_HREF'sync/' repository=$REPO_HREF mirror:=true

{
    "task": "/pulp/api/v3/tasks/24990466-6602-4f4f-bb59-6d827bd48130/"
}

This will take a while. You can query the task status using the returned URL. In the example above, use http :24817/pulp/api/v3/tasks/24990466-6602-4f4f-bb59-6d827bd48130/ and inspect the “state” field.
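If you want to script this instead of polling by hand, a small helper can wait for the task to reach a final state. This is a sketch, not part of pulp_cookbook: the `fetch` callable is a stand-in for whatever HTTP client you use (for example `lambda href: requests.get("http://localhost:24817" + href).json()`), and the state names assumed here follow Pulp 3’s task states (“completed”, “failed”, “canceled”):

```python
import time

def wait_for_task(task_href, fetch, interval=2.0, timeout=600.0):
    """Poll a Pulp task until it reaches a final state.

    `fetch` is any callable mapping a task href to the task's JSON dict.
    Returns the final task dict on success; raises RuntimeError if the
    task fails or is canceled, TimeoutError if it never finishes.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        task = fetch(task_href)
        state = task["state"]
        if state == "completed":
            return task
        if state in ("failed", "canceled"):
            raise RuntimeError(f"task {task_href} ended in state {state!r}")
        time.sleep(interval)
    raise TimeoutError(f"task {task_href} not finished after {timeout}s")
```

A sync of the full Supermarket can take many minutes, so a generous timeout and a polling interval of a few seconds are reasonable defaults.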

Create a Publication

$ export LATEST_VERSION_HREF=$(http :24817$REPO_HREF | jq -r '._latest_version_href')

$ http POST http://localhost:24817/pulp/api/v3/publications/cookbook/cookbook/ repository_version=$LATEST_VERSION_HREF

{
    "task": "/pulp/api/v3/tasks/8e9d3faf-695f-4048-a11a-1a7a65bd2f8e/"
}

Again, this may take some time. When the task is finished, get the URL of the publication:

$ export PUBLICATION_HREF=$(http :24817/pulp/api/v3/publications/cookbook/cookbook/ | jq --arg LVH "$LATEST_VERSION_HREF" -r '.results[] | select(.repository_version == $LVH) | ._href')

Create a Distribution at ‘supermarket’ for the Publication

$ http POST http://localhost:24817/pulp/api/v3/distributions/cookbook/cookbook/ name='supermarket' base_path='supermarket' publication=$PUBLICATION_HREF

You can have a look at the published “universe” metadata now:

$ http localhost:24816/pulp_cookbook/content/supermarket/universe

In your Berksfile you can use the following source to access the Supermarket snapshot:

source 'http://localhost:24816/pulp_cookbook/content/supermarket/'

Download files

Source Distribution

pulp-cookbook-0.1.0b2.tar.gz (40.4 kB)

Built Distribution

pulp_cookbook-0.1.0b2-py3-none-any.whl (48.2 kB)

File details

Details for the file pulp-cookbook-0.1.0b2.tar.gz.

File metadata

  • Download URL: pulp-cookbook-0.1.0b2.tar.gz
  • Size: 40.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.4

File hashes

Hashes for pulp-cookbook-0.1.0b2.tar.gz:

  • SHA256: 4944ddf7ab7d8914c8c2998e2d38260092cd508f8b2b5a7fd5cdf1718f25a7db
  • MD5: 179cae15897b57ec16c0cb85ad5ee6cf
  • BLAKE2b-256: cb1bb678b4a1cf3ca4e8f017e9f77499f9c4de31a43ab2c3158c40bad908db81


File details

Details for the file pulp_cookbook-0.1.0b2-py3-none-any.whl.

File metadata

  • Download URL: pulp_cookbook-0.1.0b2-py3-none-any.whl
  • Size: 48.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.21.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.4

File hashes

Hashes for pulp_cookbook-0.1.0b2-py3-none-any.whl:

  • SHA256: 0d566452b6455f80adb5d45ce29d222c2cb3171388caca831621feb121b9212c
  • MD5: 2d7b91e0062021fb217b6c72185f8fac
  • BLAKE2b-256: bbf5fb561cac8803611048fa50fd9dd0e93f393d6ca89fdaa382baf68b7665ec

