
⚙️ CLI helpers for GitHub Actions + reusable workflows


gha-utils CLI + reusable workflows


gha-utils stands for GitHub Actions workflow utilities.

Maintaining a project takes time. This repository contains the code of the gha-utils CLI and a collection of reusable workflows to:

  • maintain a Python project, its CLI, doc, QA, etc.
  • maintain an Awesome List project.

gha-utils CLI

Executables

Standalone executables of gha-utils's latest version are available as direct downloads for several platforms and architectures:

| Platform | x86_64                      | arm64                       |
|----------|-----------------------------|-----------------------------|
| Linux    | `gha-utils-linux-x64.bin`   |                             |
| macOS    | `gha-utils-macos-x64.bin`   | `gha-utils-macos-arm64.bin` |
| Windows  | `gha-utils-windows-x64.exe` |                             |

Run dev version

$ git clone https://github.com/kdeldycke/workflows
$ cd workflows
$ python -m pip install uv
$ uv venv
$ source .venv/bin/activate
$ uv pip install .
$ uv run -- gha-utils

Reusable workflows collection

This repository contains workflows to automate most of the boring tasks.

These workflows are mostly used for Python projects and their documentation, but are not limited to them. They are all reusable GitHub Actions workflows.

Reasons for a centralized workflow repository:

  • reusability, of course: no need to update dozens of repositories where 95% of the workflows are the same
  • centralization of all automation dependencies: think of a point-release of an action triggering a Dependabot upgrade in every repository depending on it

Guidelines

I don't want to copy-paste, keep in sync, and maintain yet another CI/CD file at the root of my repositories.

So my policy is: move every repository-specific config into the pyproject.toml file, or hide the gory details in a reusable workflow.
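As an illustration of that policy, configuration that would otherwise live in standalone dotfiles can move into pyproject.toml tool tables. The tools and values below are hypothetical examples, not this repository's actual settings:

```toml
# pyproject.toml (sketch)

# Settings that would otherwise sit in a mypy.ini at the repository root.
[tool.mypy]
strict = true

# Settings that would otherwise sit in a .ruff.toml dotfile.
[tool.ruff]
line-length = 88
```

Most modern Python tooling reads its configuration from such `[tool.*]` tables, which keeps the repository root free of one-off config files.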

.github/workflows/docs.yaml jobs

  • Autofix typos

  • Optimize images

  • Keep .mailmap up to date

  • Update dependency graph of Python projects

    • Requires:
      • Python package with a pyproject.toml file
  • Build Sphinx-based documentation and publish it to GitHub Pages

    • Requires:
      • Python package with a pyproject.toml file
      • All Sphinx dependencies in a docs extra dependency group:
        [project.optional-dependencies]
        docs = [
            "furo == 2024.1.29",
            "myst-parser ~= 3.0.0",
            "sphinx >= 6",
            ...
        ]
        
      • Sphinx configuration file at docs/conf.py
  • Sync awesome projects from awesome-template repository
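A downstream repository consumes one of these workflows through a thin caller file. A minimal sketch, assuming a @main ref (pinning to a release tag is safer) and a push trigger:

```yaml
# .github/workflows/docs.yaml in the calling repository (hypothetical)
name: Docs

on:
  push:
    branches: [main]

jobs:
  docs:
    # Delegate all jobs to the reusable workflow in this repository.
    uses: kdeldycke/workflows/.github/workflows/docs.yaml@main
    secrets: inherit  # Forward repository secrets, e.g. custom tokens.
```

The caller stays a few lines long; all the logic lives in the central repository.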

Why all these requirements/*.txt files?

Let's look, for example, at the lint-yaml job from .github/workflows/lint.yaml. Here we only need the yamllint CLI, which is distributed on PyPI. So before executing it, we could have simply run the following step:

  - name: Install yamllint
    run: |
      pip install yamllint

Instead, we install it via the requirements/yamllint.txt file.

Why? Because I want the version of yamllint to be pinned. By pinning it, I make the workflow stable, predictable and reproducible.
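With that scheme, the step becomes a plain install of the pinned file. Assuming requirements/yamllint.txt holds the single line `yamllint==1.35.1`, the step could read:

```yaml
- name: Install yamllint
  run: |
    pip install --requirement requirements/yamllint.txt
```

Dependabot then bumps the pin inside the .txt file, and the workflow YAML never has to change.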

So why use a dedicated requirements file? Why not simply pin the version inline, like this:

  - name: Install yamllint
    run: |
      pip install yamllint==1.35.1

That would indeed pin the version. But it requires the maintainer (me) to keep track of new releases and manually update the version string. That's a lot of work. And I'm lazy. So this should be automated.

To automate that, the only practical way I found was to rely on Dependabot. But Dependabot cannot update arbitrary version strings in run: YAML blocks. For Python projects, it only supports requirements.txt and pyproject.toml files.
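To give an idea of the wiring, a minimal Dependabot configuration for pinned requirements files might look like the following. The directory and schedule values are assumptions about the layout, not this repository's actual config:

```yaml
# .github/dependabot.yaml (sketch)
version: 2
updates:
  - package-ecosystem: "pip"
    # Point Dependabot at the folder holding the pinned *.txt files.
    directory: "/requirements"
    schedule:
      interval: "daily"
```

Dependabot scans the configured directory for requirements files and opens a PR whenever a pinned dependency publishes a new release.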

So to keep track of new versions of dependencies while keeping them stable, we hard-code all Python libraries and CLIs in the requirements/*.txt files, all with pinned versions.

And for the cases where we need to install all dependencies in one go, a requirements.txt file at the root references all the files from the requirements/ subfolder.
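Such a root file can be a pure list of includes, one `-r` directive per pinned file. Apart from yamllint.txt, the filenames below are illustrative:

```
# requirements.txt at the repository root (sketch)
-r requirements/yamllint.txt
-r requirements/mypy.txt
```

`pip install -r requirements.txt` then resolves every include and installs the whole pinned set in one go.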

Permissions and token

This repository updates itself via GitHub Actions. In particular, it updates its own YAML files in .github/workflows. That's forbidden by default, so we need extra permissions.

Usually, to grant special permissions to some jobs, you use the permissions parameter in workflow files. It looks like this:

on: (...)

jobs:

  my-job:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write

    steps: (...)

But the contents: write permission doesn't allow write access to the workflow files in the .github subfolder. There is actions: write, but it only covers workflow runs, not their YAML source files. Even permissions: write-all doesn't work. So you cannot use the permissions parameter to let a repository's workflows update their own workflow files.

You will always end up with this kind of error:

   ! [remote rejected] branch_xxx -> branch_xxx (refusing to allow a GitHub App to create or update workflow `.github/workflows/my_workflow.yaml` without `workflows` permission)

  error: failed to push some refs to 'https://github.com/kdeldycke/my-repo'

[!NOTE] That's also why the Settings > Actions > General > Workflow permissions parameter on your repository has no effect on this issue, even with the Read and write permissions option selected.

To bypass this limitation, we rely on a custom access token. By convention, we call it WORKFLOW_UPDATE_GITHUB_PAT. It is used, in place of the default secrets.GITHUB_TOKEN, in the steps that need to change workflow YAML files.

To create this custom WORKFLOW_UPDATE_GITHUB_PAT:

  • From your GitHub user, go to Settings > Developer Settings > Personal Access Tokens > Fine-grained tokens
  • Click on the Generate new token button
  • Choose a good token name like workflow-self-update to make your intention clear
  • Choose Only select repositories and then list the repositories that need their workflow YAML files updated
  • In the Repository permissions drop-down, set:
    • Contents: Access: **Read and Write**
    • Metadata (mandatory): Access: **Read-only**
    • Pull Requests: Access: **Read and Write**
    • Workflows: Access: **Read and Write**

      [!NOTE] This is the only place where I can have control over the Workflows permission, which is not supported by the permissions: parameter in YAML files.

  • Now save these parameters and copy the github_pat_XXXX secret token
  • Go to your repo > Settings > Security > Secrets and variables > Actions > Secrets > Repository secrets and click New repository secret
  • Name your secret WORKFLOW_UPDATE_GITHUB_PAT and paste the github_pat_XXXX token in the Secret field

Now re-run your actions: they should be able to update the workflow files in the .github folder without the refusing to allow a GitHub App to create or update workflow error.
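Concretely, the PAT replaces the default token in the steps that push commits touching workflow files. A hedged sketch of a checkout step wired this way (the fallback to the default token is an assumption, for contexts where the secret is absent):

```yaml
- name: Check out with workflow-update token
  uses: actions/checkout@v4
  with:
    # Use the PAT so subsequent pushes may modify .github/workflows files.
    token: ${{ secrets.WORKFLOW_UPDATE_GITHUB_PAT || secrets.GITHUB_TOKEN }}
```

The token passed to actions/checkout is the one the runner reuses for later `git push` operations in that workspace, which is why setting it here is enough.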

Release management

It turns out Release Engineering is a full-time job, and full of edge-cases.

Rust has cargo-dist. Go has... ? But there is no equivalent for Python.

So I made up a release.yaml workflow, which:

  1. Extracts project metadata from pyproject.toml
  2. Generates a build matrix of all commits / OSes / architectures / CLI entry points
  3. Builds the Python wheel
  4. Compiles binaries of all CLIs with Nuitka
  5. Tags the release commit in Git
  6. Publishes the new version to PyPI with Twine
  7. Publishes a GitHub release
  8. Renames the build artifacts and attaches them to it

Changelog

A detailed changelog is available.

Used in

Check these projects to get real-life examples of usage and inspiration:

Feel free to send a PR to add your project to this list if you rely on these scripts.

Release process

All steps of the release process and version management are automated in the changelog.yaml and release.yaml workflows.

All that's left to do is to:

  • check the open draft prepare-release PR and its changes,
  • click the Ready for review button,
  • click the Rebase and merge button,
  • let the workflows tag the release and set back the main branch into a development state.



Download files

Download the file for your platform.

Source Distribution

gha_utils-4.3.3.tar.gz (31.1 kB)

Uploaded Source

Built Distribution

gha_utils-4.3.3-py3-none-any.whl (29.3 kB)

Uploaded Python 3

File details

Details for the file gha_utils-4.3.3.tar.gz.

File metadata

  • Download URL: gha_utils-4.3.3.tar.gz
  • Upload date:
  • Size: 31.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for gha_utils-4.3.3.tar.gz

| Algorithm   | Hash digest                                                        |
|-------------|--------------------------------------------------------------------|
| SHA256      | a3311289b40adfec38e469522cfc176ab19b1a2a1353e16851114eccb23c0d32   |
| MD5         | 024eb08100d8ac7886e5fc2defa1e6eb                                   |
| BLAKE2b-256 | 78a9371da831699f76c0043f83a28ce8454ee3dcafe5e274d11400add8c9bb32   |


File details

Details for the file gha_utils-4.3.3-py3-none-any.whl.

File metadata

  • Download URL: gha_utils-4.3.3-py3-none-any.whl
  • Upload date:
  • Size: 29.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for gha_utils-4.3.3-py3-none-any.whl

| Algorithm   | Hash digest                                                        |
|-------------|--------------------------------------------------------------------|
| SHA256      | 4687624452138961ff8daa2e5fe6eee080009cba63c899f8735812b1bce528d2   |
| MD5         | 5aacbbeca660c926c28584c920c2d566                                   |
| BLAKE2b-256 | a06aa39cadb800f9eab30b7770c6132ad9a3847a0effeec228b0bcd92e1725bf   |

