
Literate package development with Jupyter

Project description

Literary logo with an orange cursive uppercase L inside black square brackets

Literary

pypi-badge binder-badge wiki-badge

TL;DR

Literary is a Python tool to make Jupyter (IPython) notebooks behave like pure-Python packages. This allows pure-Python packages to be generated from notebooks, and notebooks to be imported at runtime.

This package is an exploration of the literate programming idea pioneered by Donald Knuth and implemented in the nbdev package. Although nbdev looks to be a very mature and comprehensive tool, it is quite opinionated. This package is an investigation into what a smaller nbdev might look like.

Philosophy 📖

  1. Low mental overhead
    Realistically, most Python programmers who wish to write packages need some familiarity with the Python package development model, including the conventional structure of a package. For this reason, I feel it is important to design literary such that these skills translate directly to designing libraries with notebooks.
  2. Minimal downstream impact
    Users of literary packages should not realise that they are consuming notebook-generated code at runtime. This means that a pure-Python package needs to be generated from the notebooks, and it must use the conventional import model. For this reason, literary should only exist as a development dependency of the package.

Differences with nbdev

  • Use of cell tags instead of comments or magics to dictate exports
  • Use of nbconvert machinery to build the pure-Python lib package
  • Use of import hooks to import other notebooks (a minimal sketch of the general technique follows this list)
    • Maintains a similar programming model to conventional module development
    • Reduces the need to modify notebook contents during conversion
  • Minimal runtime overhead
    • Features like patch are removed from the generated module (and from imported notebook source) using AST transformations; see the sketch after this list
  • Currently no documentation generation
    • Loosely, the plan is to build on existing notebook-book tooling so that the wider Jupyter ecosystem can be reused
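
As an illustration of the import-hook idea, here is a minimal, heavily simplified sketch of how a notebook can be made importable with a MetaPathFinder and Loader from importlib. This shows the general technique only, not literary's actual implementation (which also applies its AST transformations and handles packages and relative imports):

    import json
    import sys
    from importlib.abc import Loader, MetaPathFinder
    from importlib.util import spec_from_file_location
    from pathlib import Path


    class NotebookLoader(Loader):
        """Execute the code cells of a .ipynb file as a module body."""

        def __init__(self, path):
            self.path = path

        def create_module(self, spec):
            return None  # fall back to the default module creation

        def exec_module(self, module):
            notebook = json.loads(Path(self.path).read_text())
            for cell in notebook["cells"]:
                if cell["cell_type"] == "code":
                    # NB: plain exec; IPython magics would need extra handling
                    exec("".join(cell["source"]), module.__dict__)


    class NotebookFinder(MetaPathFinder):
        """Resolve `import foo` to ./foo.ipynb when such a file exists."""

        def find_spec(self, fullname, path=None, target=None):
            candidate = Path(fullname.rpartition(".")[2]).with_suffix(".ipynb")
            if candidate.exists():
                return spec_from_file_location(
                    fullname, candidate, loader=NotebookLoader(candidate)
                )
            return None


    # Installing the finder makes `import my_notebook` pick up my_notebook.ipynb
    sys.meta_path.append(NotebookFinder())

And a rough sketch of the AST-transformation mechanism used to strip development-only helpers such as patch from the generated source. The decorator name and the exact rewrite are assumptions for illustration; literary's real transformer may handle patched methods differently:

    import ast


    class StripPatch(ast.NodeTransformer):
        """Drop functions decorated with a (hypothetical) `patch` decorator."""

        def _is_patch(self, decorator):
            # matches both `@patch` and `@patch(SomeClass)`
            if isinstance(decorator, ast.Name):
                return decorator.id == "patch"
            if isinstance(decorator, ast.Call) and isinstance(decorator.func, ast.Name):
                return decorator.func.id == "patch"
            return False

        def visit_FunctionDef(self, node):
            if any(self._is_patch(d) for d in node.decorator_list):
                return None  # remove the whole function from the module
            return node


    source = """\
    @patch(Greeter)
    def greet(self):
        return "hello"

    def keep_me():
        return 42
    """

    tree = StripPatch().visit(ast.parse(source))
    print(ast.unparse(ast.fix_missing_locations(tree)))  # only keep_me survives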

Differences with Knuth

Knuth introduced the tangle and weave programs to produce separate documentation and source code for compilation. Literary differs in treating the notebook as the "ground truth" for documentation and testing, and in generating smaller source code for packaging.

Design 🎨

The plan for this package is:

  1. Notebooks will be written inside <PACKAGE_NAME>/ in the literary project's root directory
  2. Notebooks will respect relative imports and other pure-Python features to minimise the differences between the generated packages and the notebooks
  3. A pure-Python generated lib/<PACKAGE_NAME>/ directory should be built before the packaging tool builds the final wheel/sdist (a rough conversion sketch follows this list). E.g. with Poetry:
    [tool.poetry]
    # ...
    packages = [
      { include = "<PACKAGE_NAME>", from = "lib" },
    ]
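
To make step 3 concrete, here is a rough stand-in that uses stock nbconvert to populate lib/ before the packaging tool runs. literary provides its own build command and exporters, so treat the package name and output layout below as assumptions:

    from pathlib import Path

    from nbconvert import PythonExporter  # literary builds on nbconvert machinery

    PACKAGE = "my_package"  # stand-in for <PACKAGE_NAME>
    exporter = PythonExporter()

    for notebook in Path(PACKAGE).rglob("*.ipynb"):
        body, _resources = exporter.from_filename(str(notebook))
        target = Path("lib") / notebook.with_suffix(".py")
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(body)
        print(f"wrote {target}")

Running something like this (or literary's own build step) before `poetry build` ensures that the lib/<PACKAGE_NAME> directory referenced in pyproject.toml exists when the wheel/sdist is assembled.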
    

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

literary-1.8.0.tar.gz (15.1 kB)

Uploaded Source

Built Distribution

literary-1.8.0-py3-none-any.whl (17.3 kB)

Uploaded Python 3

File details

Details for the file literary-1.8.0.tar.gz.

File metadata

  • Download URL: literary-1.8.0.tar.gz
  • Upload date:
  • Size: 15.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.9.5

File hashes

Hashes for literary-1.8.0.tar.gz
  • SHA256: 403a9676a79df0db4e20be2be340fcc0566688c6ccc8df6a554e58221fbb13e9
  • MD5: e843c7d2e85169e7a6b335ca64eb579d
  • BLAKE2b-256: 24bacd7e345f1f09527cbd89dbe42b7f76dac5acc91e16afa42f68facec89a25


File details

Details for the file literary-1.8.0-py3-none-any.whl.

File metadata

  • Download URL: literary-1.8.0-py3-none-any.whl
  • Upload date:
  • Size: 17.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.9.5

File hashes

Hashes for literary-1.8.0-py3-none-any.whl
  • SHA256: 72ee3dd3fd2d5e4a104b55d6e8fd66da364817dc02fa61e66b1422f8c7001f54
  • MD5: 98c6fbeaff1a2d9818e2b527a79000a6
  • BLAKE2b-256: c20cfa91d3a2ae04c3a2c8c7db3e8f7de525f60c14954231f6e85958bcb83d13

