
Literate package development with Jupyter

Project description

[Logo: an orange cursive uppercase "L" inside black square brackets]

Literary


This package is an exploration of the literate programming idea pioneered by Donald Knuth and implemented in the nbdev package. Although nbdev looks to be a very mature and comprehensive tool, it is quite opinionated. This package is an investigation into what a smaller nbdev might look like.

Philosophy

  1. Low mental overhead
    Realistically, most Python programmers who wish to write packages need some familiarity with the Python package development model, including the conventional structure of a package. For this reason, I feel it is important to design literary such that these skills translate directly to designing libraries with notebooks.
  2. Minimal downstream impact
    Users of literary packages should not realise that they are consuming notebook-generated code at runtime. This means that a pure-Python package needs to be generated from the notebooks, and it must use the conventional import model. For this reason, literary should only exist as a development dependency of the package.
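
    To make the development-only relationship concrete, here is a sketch of how a Poetry project consuming literary might declare it (the version constraint is illustrative, and the section name follows the Poetry 1.1-era convention used elsewhere on this page):

    ```toml
    # pyproject.toml -- literary is needed to build and develop the package,
    # but never at runtime, so it lives with the development dependencies
    [tool.poetry.dev-dependencies]
    literary = "^1.5"
    ```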

Differences with nbdev

  • Use of cell tags instead of comments or magics to dictate exports
  • Use of nbconvert machinery to build the pure-Python lib package
  • Use of import hooks to import other notebooks
    • Maintains a similar programming model to conventional module development
    • Reduces the need to modify notebook contents during conversion
  • Minimal runtime overhead
    • Features like patch are removed from the generated module (& imported notebook source) using AST transformations
  • Currently no documentation generation
    • Loosely, the plan is to lean on existing notebook-to-book tooling and so re-use the wider Jupyter ecosystem
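
To illustrate the import-hook idea above: a minimal, self-contained sketch of how notebooks could be made importable via `sys.meta_path`. This is not literary's actual implementation; the class names are invented, and a real loader would need proper nbformat handling:

```python
import importlib.abc
import importlib.util
import json
import sys
from pathlib import Path


class NotebookLoader(importlib.abc.Loader):
    """Execute the code cells of a .ipynb file as a module body."""

    def __init__(self, path):
        self.path = path

    def create_module(self, spec):
        return None  # fall back to the default module creation

    def exec_module(self, module):
        nb = json.loads(Path(self.path).read_text())
        for cell in nb["cells"]:
            if cell["cell_type"] == "code":
                # Notebook cell sources are stored as lists of lines
                exec("".join(cell["source"]), module.__dict__)


class NotebookFinder(importlib.abc.MetaPathFinder):
    """Resolve `import name` to a `name.ipynb` in the working directory."""

    def find_spec(self, name, path=None, target=None):
        candidate = Path(name.rpartition(".")[2] + ".ipynb")
        if candidate.exists():
            return importlib.util.spec_from_loader(name, NotebookLoader(candidate))
        return None


sys.meta_path.append(NotebookFinder())
```

With the finder installed, `import my_notebook` behaves like importing an ordinary module, which is what keeps the programming model close to conventional module development.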

Differences with Knuth

Knuth introduced the tangle and weave programs to produce separate documentation and source code for compilation. Literary differs in treating the notebook as the "ground truth" for documentation and testing, and in generating smaller source code for packaging.
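
The "smaller source code" step can be pictured as an AST pass over notebook cells. The sketch below strips a hypothetical `@patch(...)` decorator using Python's `ast` module; literary's real transformation presumably also re-attaches the function to its target class, which this sketch omits:

```python
import ast
import textwrap

# Notebook-style source using a hypothetical @patch decorator, which would
# attach `magnitude` to an existing Vector class during interactive use.
source = textwrap.dedent("""
    @patch(Vector)
    def magnitude(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5
""")


class StripPatch(ast.NodeTransformer):
    """Drop @patch(...) decorators from function definitions."""

    def visit_FunctionDef(self, node):
        node.decorator_list = [
            dec for dec in node.decorator_list
            if not (isinstance(dec, ast.Call)
                    and isinstance(dec.func, ast.Name)
                    and dec.func.id == "patch")
        ]
        return node


tree = ast.fix_missing_locations(StripPatch().visit(ast.parse(source)))
print(ast.unparse(tree))  # the decorator is gone from the emitted source
```

Because the transformation works on the syntax tree rather than on text, the generated module carries no trace of the notebook-only helper at runtime.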

Design

The plan for this package is:

  1. Notebooks will be written inside <PACKAGE_NAME>/ in the literary project's root directory
  2. Notebooks will respect relative imports and other pure-Python features to minimise the differences between the generated packages and the notebooks
  3. A generated pure-Python lib/<PACKAGE_NAME>/ directory will be built before Poetry builds the final project.
    E.g.
    [tool.poetry]
    # ...
    packages = [
      { include = "<PACKAGE_NAME>", from = "lib" },
    ]
    

Project details


Download files

Download the file for your platform.

Source Distribution

literary-1.5.0.tar.gz (12.1 kB, Source)

Built Distribution

literary-1.5.0-py3-none-any.whl (14.7 kB, Python 3)

File details

Details for the file literary-1.5.0.tar.gz.

File metadata

  • Download URL: literary-1.5.0.tar.gz
  • Size: 12.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.5 CPython/3.9.2 Linux/5.4.0-1040-azure

File hashes

Hashes for literary-1.5.0.tar.gz

  • SHA256: 270492d77d9f99db8216a3a7ddc317be8f029919d1cde2b3a89bd4ddb7a7422e
  • MD5: 19f25f0557689c095077d425177e03a8
  • BLAKE2b-256: 8695f075b6a3f67ca1f65913f88029b143f4493a1e888ec6a91af14f0cda9f5d

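The published digests above can be checked against a downloaded file. A small, generic sketch using Python's hashlib (the filename is whatever you downloaded):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Hex SHA-256 digest of a file, read in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the digest published on PyPI, e.g. for the sdist:
expected = "270492d77d9f99db8216a3a7ddc317be8f029919d1cde2b3a89bd4ddb7a7422e"
# assert sha256_of("literary-1.5.0.tar.gz") == expected
```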

File details

Details for the file literary-1.5.0-py3-none-any.whl.

File metadata

  • Download URL: literary-1.5.0-py3-none-any.whl
  • Size: 14.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.5 CPython/3.9.2 Linux/5.4.0-1040-azure

File hashes

Hashes for literary-1.5.0-py3-none-any.whl

  • SHA256: 355dd9720ea8b7563084abdbaf92c3630176a73c81652a9337bc1282b32b0aef
  • MD5: 21c0cdb7c4ff0cc1b6fc0bf291b16a30
  • BLAKE2b-256: 0c7325840d49e8e0fd390cee04eca87e73cd980c9f964b9e97f441bdc3c10134

