
Literate package development with Jupyter

Reason this release was yanked: missing conditional dependency

Project description

Literary


This package is an exploration of the literate programming idea pioneered by Donald Knuth and implemented in the nbdev package. Although nbdev appears to be a very mature and comprehensive tool, it is quite opinionated. This package is an investigation into what a smaller nbdev might look like.

Philosophy

  1. Low mental overhead
    Realistically, most Python programmers who wish to write packages need some familiarity with the Python package development model, including the conventional structure of a package. For this reason, I feel it is important to design literary such that these skills translate directly to designing libraries with notebooks.
  2. Minimal downstream impact
    Users of literary packages should not realise that they are consuming notebook-generated code at runtime. This means that a pure-Python package needs to be generated from the notebooks, and it must use the conventional import model. For this reason, literary should only exist as a development dependency of the package.
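
For example, keeping literary out of the runtime dependency tree might look like the following pyproject.toml fragment. This is a sketch for the Poetry 1.1-era syntax used elsewhere on this page; the version constraint is illustrative:

```toml
[tool.poetry.dev-dependencies]
# literary is only needed while developing; consumers of the
# generated pure-Python package never see it
literary = "^1.4"
```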

Differences with nbdev

  • Use of cell tags instead of comments or magics to dictate exports
  • Use of nbconvert machinery to build the pure-Python lib package
  • Use of import hooks to import other notebooks
    • Maintains a similar programming model to conventional module development
    • Reduces the need to modify notebook contents during conversion
  • Minimal runtime overhead
    • Features like patch are removed from the generated module (& imported notebook source) using AST transformations
  • Currently no documentation generation
    • Loosely, the plan is to build on existing notebook-book tooling and thereby re-use the wider Jupyter ecosystem
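
The import-hook bullet above can be sketched as a minimal meta-path finder that executes a notebook's code cells as a module body. This is an illustrative simplification, not literary's actual machinery (which uses nbconvert conversion and AST transforms); NotebookFinder, NotebookLoader, and demo_nb are names invented for the sketch:

```python
import json
import sys
from importlib.abc import Loader, MetaPathFinder
from importlib.machinery import ModuleSpec
from pathlib import Path

class NotebookLoader(Loader):
    """Execute a notebook's code cells as the body of a module."""

    def __init__(self, path):
        self.path = path

    def create_module(self, spec):
        return None  # fall back to default module creation

    def exec_module(self, module):
        nb = json.loads(Path(self.path).read_text())
        # Concatenate only the code cells, ignoring markdown/narrative
        source = "\n".join(
            "".join(cell["source"])
            for cell in nb["cells"]
            if cell["cell_type"] == "code"
        )
        exec(compile(source, str(self.path), "exec"), module.__dict__)

class NotebookFinder(MetaPathFinder):
    """Resolve top-level imports against .ipynb files in a root directory."""

    def __init__(self, root):
        self.root = Path(root)

    def find_spec(self, name, path=None, target=None):
        candidate = self.root / (name.rpartition(".")[2] + ".ipynb")
        if candidate.exists():
            return ModuleSpec(name, NotebookLoader(candidate),
                              origin=str(candidate))
        return None

# Demonstration: write a one-cell notebook and import it as a module
import importlib
import tempfile

root = Path(tempfile.mkdtemp())
notebook = {
    "cells": [{"cell_type": "code", "source": ["ANSWER = 6 * 7\n"]}],
    "metadata": {},
    "nbformat": 4,
    "nbformat_minor": 5,
}
(root / "demo_nb.ipynb").write_text(json.dumps(notebook))

sys.meta_path.append(NotebookFinder(root))
demo = importlib.import_module("demo_nb")
print(demo.ANSWER)  # 42
```

Because the finder sits on sys.meta_path, notebook imports behave like ordinary module imports, which is what keeps the programming model close to conventional module development.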
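
Similarly, the AST-transformation bullet can be illustrated with a toy ast.NodeTransformer that strips a hypothetical @patch decorator during code generation. literary's real transform does more work (e.g. relocating the patched function onto its target class); this sketch only shows the mechanism:

```python
import ast

def _is_patch(decorator):
    """True if the decorator is a bare `patch` name or a `patch(...)` call."""
    target = decorator.func if isinstance(decorator, ast.Call) else decorator
    return isinstance(target, ast.Name) and target.id == "patch"

class PatchStripper(ast.NodeTransformer):
    """Remove the notebook-only @patch decorator from function definitions."""

    def visit_FunctionDef(self, node):
        node.decorator_list = [
            d for d in node.decorator_list if not _is_patch(d)
        ]
        return node

# A notebook cell that patches a method onto an existing class
notebook_source = """\
@patch(Widget)
def area(self):
    return self.w * self.h
"""

tree = PatchStripper().visit(ast.parse(notebook_source))
cleaned = ast.unparse(tree)  # requires Python 3.9+
print(cleaned)
```

Running the transform leaves a plain function definition with no trace of the development-time helper, so the generated module carries no runtime dependency on it.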

Differences with Knuth

Knuth introduced the tangle and weave programs to produce separate documentation and source code for compilation. Literary differs in treating the notebook as the "ground truth" for documentation and testing, and in generating leaner source code for packaging.

Design

The plan for this package is:

  1. Notebooks will be written inside <PACKAGE_NAME>/ in the literary project's root directory
  2. Notebooks will respect relative imports and other pure-Python features to minimise the differences between the generated packages and the notebooks
  3. A generated pure-Python lib/<PACKAGE_NAME>/ directory will be built before Poetry builds the final project.
    E.g.
    [tool.poetry]
    # ...
    packages = [
      { include = "<PACKAGE_NAME>", from = "lib" },
    ]
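    
    Taken together, a project tree might look like the following sketch (module names are placeholders; only <PACKAGE_NAME> comes from the plan above):
    
    ```
    <PACKAGE_NAME>/           # notebooks: the ground truth
        core.ipynb
    lib/
        <PACKAGE_NAME>/       # generated pure-Python package
            core.py
    pyproject.toml
    ```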
    

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

literary-1.4.0.tar.gz (11.9 kB)

Uploaded Source

Built Distribution

literary-1.4.0-py3-none-any.whl (14.5 kB)

Uploaded Python 3

File details

Details for the file literary-1.4.0.tar.gz.

File metadata

  • Download URL: literary-1.4.0.tar.gz
  • Upload date:
  • Size: 11.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.9.0 Linux/5.8.0-43-generic

File hashes

Hashes for literary-1.4.0.tar.gz
  • SHA256: f3869ab97cf252d6a3b26f075cdf7ed282141e7c88472a8e64d90b9964859ca1
  • MD5: e27b8139b4411884481b2fa0ba9c974f
  • BLAKE2b-256: 6443cb87273079a8c3c82f467817489707107d5e29357887e60842da2bf5fea4

See the PyPI documentation for more details on using file hashes.

File details

Details for the file literary-1.4.0-py3-none-any.whl.

File metadata

  • Download URL: literary-1.4.0-py3-none-any.whl
  • Upload date:
  • Size: 14.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.9.0 Linux/5.8.0-43-generic

File hashes

Hashes for literary-1.4.0-py3-none-any.whl
  • SHA256: 073a9dd2c54bccadba30cbb057985444f130def157c1896b96cb3e4583dffc0f
  • MD5: 178eadd89369d4100344f5fe37f9d52c
  • BLAKE2b-256: c0cb57eb9fdb0c965a8f1aa807c1b08fcce0a5272b1075dd71ab6d3aa47c80f5

