
Literate package development with Jupyter

Project description

[Logo: an orange cursive uppercase L inside black square brackets]

Literary


This package is an exploration of the literate programming idea pioneered by Donald Knuth and implemented in the nbdev package. Although nbdev looks to be a very mature and comprehensive tool, it is quite opinionated. This package is an investigation into what a smaller nbdev might look like.

Philosophy

  1. Low mental overhead
    Realistically, most Python programmers who wish to write packages need some familiarity with the Python package development model, including the conventional structure of a package. For this reason, I feel that it is important to design literary such that these skills translate directly to developing libraries with notebooks.
  2. Minimal downstream impact
    Users of literary packages should not realise that they are consuming notebook-generated code at runtime. This means that a pure-Python package needs to be generated from the notebooks, and it must use the conventional import model. For this reason, literary should only exist as a development dependency of the package.

Differences with nbdev

  • Use of cell tags instead of comments or magics to dictate exports
  • Use of nbconvert machinery to build the pure-Python lib package
  • Use of import hooks to import other notebooks (see the import-hook sketch after this list)
    • Maintains a similar programming model to conventional module development
    • Reduces the need to modify notebook contents during conversion
  • Minimal runtime overhead
    • Features like patch are removed from the generated module (and imported notebook source) using AST transformations; see the patch-stripping sketch after this list
  • Currently no documentation generation
    • Loosely, the plan is to use existing notebook-book tooling so that the wider Jupyter ecosystem can be re-used
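
To make the import-hook idea concrete, here is a minimal sketch. It is not literary's actual implementation: the class names and the "execute every code cell" rule are purely illustrative. It registers a meta-path finder that locates a .ipynb file and runs its code cells as the module body.

    # Minimal notebook import hook sketch. literary's real hook is more involved
    # (it applies AST transforms and respects package structure); the class
    # names here are illustrative, not literary's API.
    import sys
    from importlib.abc import Loader, MetaPathFinder
    from importlib.machinery import ModuleSpec
    from pathlib import Path

    import nbformat


    class NotebookLoader(Loader):
        def __init__(self, path):
            self.path = path

        def create_module(self, spec):
            return None  # fall back to the default module creation

        def exec_module(self, module):
            nb = nbformat.read(str(self.path), as_version=4)
            # Concatenate the code cells and execute them as the module body
            source = "\n\n".join(
                cell.source for cell in nb.cells if cell.cell_type == "code"
            )
            exec(compile(source, str(self.path), "exec"), module.__dict__)


    class NotebookFinder(MetaPathFinder):
        def find_spec(self, fullname, path=None, target=None):
            candidate = fullname.rpartition(".")[2] + ".ipynb"
            for directory in (path or sys.path):
                notebook = Path(directory or ".") / candidate
                if notebook.exists():
                    return ModuleSpec(fullname, NotebookLoader(notebook), origin=str(notebook))
            return None


    sys.meta_path.append(NotebookFinder())

Once the finder is on sys.meta_path, `import my_notebook` resolves a neighbouring my_notebook.ipynb, which is what keeps the programming model close to ordinary module development.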
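
The patch-stripping idea can likewise be illustrated with a small ast.NodeTransformer. This is not literary's real transformer; the @patch decorator usage, the Greeter class and the rewrite rule below are assumptions chosen only to show the technique (ast.unparse needs Python 3.9+).

    # Illustration only: an AST pass in the spirit of literary's transform.
    # It strips a hypothetical @patch(SomeClass) decorator and re-expresses the
    # function as a plain method assignment, so the generated source carries
    # no runtime dependency on patch itself.
    import ast


    class StripPatch(ast.NodeTransformer):
        def visit_FunctionDef(self, node):
            kept, targets = [], []
            for decorator in node.decorator_list:
                if (
                    isinstance(decorator, ast.Call)
                    and isinstance(decorator.func, ast.Name)
                    and decorator.func.id == "patch"
                ):
                    targets.extend(decorator.args)  # classes named in @patch(...)
                else:
                    kept.append(decorator)
            node.decorator_list = kept
            if not targets:
                return node
            # Emit `Target.<name> = <name>` after the now-undecorated function
            assignments = [
                ast.Assign(
                    targets=[ast.Attribute(value=target, attr=node.name, ctx=ast.Store())],
                    value=ast.Name(id=node.name, ctx=ast.Load()),
                )
                for target in targets
            ]
            return [node, *assignments]


    source = "@patch(Greeter)\ndef greet(self):\n    return 'hello'\n"
    tree = ast.fix_missing_locations(StripPatch().visit(ast.parse(source)))
    print(ast.unparse(tree))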

Differences with Knuth

Knuth introduced the tangle and weave programs to produce separate documentation and source code for compilation. Literary differs in treating the notebook as the "ground truth" for documentation + testing, and generating smaller source code for packaging.

Design

The plan for this package is:

  1. Notebooks will be written inside <PACKAGE_NAME>/ in the literary project's root directory
  2. Notebooks will respect relative imports and other pure-Python features to minimise the differences between the generated packages and the notebooks
  3. A generated pure-Python lib/<PACKAGE_NAME>/ directory will be built before Poetry builds the final project.
    E.g. in pyproject.toml:
    [tool.poetry]
    # ...
    packages = [
      { include = "<PACKAGE_NAME>", from = "lib" },
    ]
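
    For illustration, the intended layout might then look like the following (core and utils are hypothetical example names):

    <literary project root>/
    ├── pyproject.toml
    ├── <PACKAGE_NAME>/          # notebooks: the ground truth for docs and tests
    │   ├── core.ipynb
    │   └── utils.ipynb
    └── lib/
        └── <PACKAGE_NAME>/      # generated pure-Python package, built before Poetry runs
            ├── core.py
            └── utils.py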
    

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

literary-1.4.1.tar.gz (12.0 kB)

Uploaded Source

Built Distribution

literary-1.4.1-py3-none-any.whl (14.6 kB)

Uploaded Python 3

File details

Details for the file literary-1.4.1.tar.gz.

File metadata

  • Download URL: literary-1.4.1.tar.gz
  • Upload date:
  • Size: 12.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.9.2 Linux/5.4.0-1039-azure

File hashes

Hashes for literary-1.4.1.tar.gz
  • SHA256: b635e5ae813983cee1118dbe33bcd86be3ad525f1fc95adc910aae1cd89b8a93
  • MD5: 2e276179342a1b30531634ccfa2eeddf
  • BLAKE2b-256: c1dbc5c0d2a3dc67dc0bb2729c1484ba92d377e6f07690f06c71c8068e99ee30

See more details on using hashes here.
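
For example, a downloaded sdist can be checked against the published SHA256 with a few lines of Python (a minimal sketch, assuming the file sits in the current directory):

    import hashlib

    # Published SHA256 for literary-1.4.1.tar.gz (from the table above)
    expected = "b635e5ae813983cee1118dbe33bcd86be3ad525f1fc95adc910aae1cd89b8a93"
    with open("literary-1.4.1.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    assert digest == expected, "hash mismatch"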

File details

Details for the file literary-1.4.1-py3-none-any.whl.

File metadata

  • Download URL: literary-1.4.1-py3-none-any.whl
  • Upload date:
  • Size: 14.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.9.2 Linux/5.4.0-1039-azure

File hashes

Hashes for literary-1.4.1-py3-none-any.whl
  • SHA256: b81a6d4abd73b0f3a4866ad43114724ea5f6caa68b3a40c12f513fbda54063ae
  • MD5: ada413bc380383cf84c7997e1487e4c1
  • BLAKE2b-256: de233839e88f7781705ec65b2c0f61396f6b72954bac396e4bb91ab2e8c3fd39

See more details on using hashes here.
