JupyterLab Extension for dependency management and optimization

Project description

jupyterlab-requirements

Dependency management and optimization in JupyterLab.

About

This extension provides management of dependencies for JupyterLab notebooks.

The main goals of the project are the following:

  • manage notebook requirements without leaving the notebook
  • provide a unique and optimized* environment for each notebook

NOTE: The requirements are optimized using the Thoth resolution engine.

Requirements

  • JupyterLab >= 3.0

Installation

You can install this extension with pip:

pip install jupyterlab-requirements

And start using it immediately on JupyterLab:

jupyter lab

Troubleshoot

If you are seeing the frontend extension, but it is not working, check that the server extension is enabled:

jupyter server extension list

If the server extension is installed and enabled, but you are not seeing the frontend extension, check the frontend extension is installed:

jupyter labextension list
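
If the server extension shows up in the list but is disabled, enabling it explicitly may help. The command below mirrors the one used in the development install section; treat it as a suggestion rather than an official troubleshooting step:

jupyter serverextension enable --py jupyterlab-requirements --sys-prefix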

Usage

You can use this extension with each of your notebooks to guarantee they have the correct dependencies and kernel. The extension can add and remove dependencies, lock them, and store them in the notebook metadata. In this way, all the dependency information required to reproduce the environment ships with the notebook.

In particular, in the notebook metadata you can find:

  • requirements (Pipfile)

  • requirements locked with all hashes (Pipfile.lock)

  • dependency resolution engine used (thoth or pipenv)

  • configuration file for the runtime environment (.thoth.yaml if you are using the Thoth resolution engine)

All of this information allows the notebook to be reproduced.

There are 3 ways to interact with this extension:

  • using %horus magic commands directly in your notebook's cells. To learn more about how to use the %horus magic commands, check out the guide here or the video here.
JupyterLab Requirements Horus magic commands
  • using the horus CLI directly from the terminal or integrated in pipelines (check the video).
JupyterLab Requirements Horus CLI
  • using the Manage Dependencies button that appears in the notebook when it is opened:
JupyterLab Requirements UI

Currently this extension supports only Python kernels.

Resolution engines

Currently, Thoth is used by default and pipenv serves as a backup. In the future, the user will be able to select a specific resolution engine.

Using the Thoth resolution engine, you can request an optimized software stack that satisfies your requirements through the Thoth recommender system. You can choose the type of recommendation that best fits your needs:

  • latest
  • performance
  • security
  • stable
  • testing

You can find more information and updates here.

Virtual environment for your dependencies

The virtualenv used to run your notebook according to your dependency requirements is created in:

~/.local/share/thoth/kernels/{kernel_name}

Dependencies installation

Once the lock file is created using any of the available resolution engines, the dependencies are installed in the virtualenv using micropipenv.
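
The installation step is handled for you, but conceptually it is close to invoking micropipenv against the generated lock file. A minimal sketch, assuming the kernel's virtualenv is active and the Pipfile.lock is in the current directory (the kernel name is illustrative):

# activate the virtualenv created for the (illustrative) kernel name my-kernel
source ~/.local/share/thoth/kernels/my-kernel/bin/activate
# install the pinned dependencies from the Pipfile.lock in the current directory
micropipenv install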

Overlays directory

The dependencies stored in the notebook metadata are also stored in an overlays folder (created automatically), named after the kernel by default. If you want to know more about the use of overlays, have a look here.
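
For illustration only, assuming a kernel named my-kernel, the overlays folder would contain something like:

overlays/my-kernel/Pipfile
overlays/my-kernel/Pipfile.lock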

Thoth configuration file

The Thoth resolution engine is able to provide an optimized software stack based on the runtime environment you are using (more inputs are used; if you want to know more, have a look here).

In general, different runtime environments will have different effects on your application (e.g. different performance), therefore we include this information in the notebook metadata so that others can find out what runtime environment was used to run a certain notebook.

Delete kernels

If you have too many kernels, you can handle them directly from the menu.

kernel delete handler menu

%horus magic command

As of v0.10.0, jupyterlab-requirements supports the %horus magic commands directly in notebook cells, so the user can speed up all dependency management tasks while working in one place. The magic commands are loaded automatically when you start a notebook and they automatically identify the notebook you are using.

Check notebook metadata content about dependencies

%horus check

Create/Modify/Remove requirements in Pipfile in notebook metadata

You can add a requirement to the Pipfile in your notebook using the following command:

%horus requirements --add tensorflow

If you want to remove a requirement instead, you can use the following command:

%horus requirements --remove tensorflow

Lock requirements in notebook metadata and install them in the kernel

Adding --kernel-name selects a certain kernel name (defaults to jupyterlab-requirements).

Using the Thoth resolution engine:

%horus lock

Only the Thoth resolution engine can be combined with the options below:

Adding --set-timeout will set the timeout for the request to Thoth.

Adding --force will force a new request to Thoth even if an analysis result already exists.

Adding --recommendation-type lets the user select the type of recommendation:

  • latest [default]
  • stable
  • performance
  • security

Adding --os-name will use the given OS name in the request to Thoth.

Adding --os-version will use the given OS version in the request to Thoth.

Adding --python-version will use the given Python version in the request to Thoth.
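
For example, the options above can be combined in a single call (all values are illustrative):

%horus lock --kernel-name my-kernel --recommendation-type performance --os-name rhel --os-version 8 --python-version 3.8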

Using the Pipenv resolution engine:

%horus lock --pipenv

Once dependencies are locked, they will be automatically installed in the kernel and saved in the notebook metadata.

Convert notebook cells with pip commands to use horus commands in order to allow reproducibility

%horus convert

Have a look at this video to know more about this command.

Discover notebook content about dependencies

This command is used to discover dependencies used in the notebook and create a Pipfile (empty if packages are not identified). NOTE: Please keep in mind this feature is under development and the packages identified need to be checked by humans.

%horus discover

Adding --force will store the file at the desired/default path even if one already exists. If --force is not provided, the command will simply fail.

Extract notebook metadata content about dependencies

This command is used to extract dependencies content from notebook metadata and store it locally.

%horus extract

It can be combined with the options below:

Adding --store-files-path will store the files at the desired path.

Adding --force will store the file at the desired/default path even if one already exists. If --force is not provided, the command will simply fail.

NOTE: Please keep in mind the .thoth.yaml will be stored at the root of the repo.

If you want to extract only a specific parameter, you can consider the following options:

%horus extract --pipfile
%horus extract --pipfile-lock
%horus extract --thoth-config
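
For example, to extract only the Pipfile to a chosen folder, overwriting any existing file (the path and flag combination are shown only as a sketch):

%horus extract --pipfile --store-files-path ./dependencies --force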

Extension Button

This jupyterlab extension provides a button directly in the notebook to manage the dependencies (see image below).

JupyterLab Requirements Extension

How to use it

Start adding dependencies from empty notebook

Clicking the button above, you will initially receive the following dialog form:

Initial Dialog Form

Initially, no dependencies are identified if you start a new notebook, as the related metadata does not exist yet. The extension checks the notebook metadata in order to identify dependencies every time you restart a notebook. Moreover, it verifies that the kernel you are using matches your dependencies; if not, it warns you to use the install button again to avoid unexpected behaviour.

You can start adding your packages using the central add button. Once you have selected the package name and version, remember to add your package using the add button under Action, otherwise it won't be saved (in the future this step will not be necessary thanks to the autocompletion feature):

Add Package

NOTE: The extra button under Action will be removed in the future.

NOTE: Autocompletion is planned for the future so that the user can check which versions are available on PyPI.

Save dependencies added and install them in your customized kernel

After saving, the install button will appear so you can review everything before actually installing the dependencies:

Install

NOTE: You can choose the name of the kernel you want for your notebook.

Finally after using the install button:

Ready to Work

Now all dependencies will be locked (direct and transitive), saved in the notebook metadata, and installed. Moreover, the kernel will be automatically created and set for your notebook without human intervention required.

Now you are ready to work on your project!

Restart notebook

If you restart the notebook and check the dependencies with the button, you will see that they are all installed and ready:

Restarting Notebook

Start notebook without information about dependencies in metadata

If you already have notebooks with code and you want to start using this extension, there is a feature you may find interesting.

Thoth relies on a library called invectio. This library statically analyzes sources and extracts information about called or exported library functions in Python applications.

The jupyterlab-requirements extension uses this information to provide users with a list of packages to be installed if they have never used the extension before.

User with code

Horus: jupyterlab-requirements CLI

As of v0.9.0, jupyterlab-requirements ships with a CLI that can be used in automation processes. It is called Horus, after another Egyptian god from the family of Thoth.

Check notebook metadata content about dependencies

This command is used to verify whether a certain notebook is reproducible, that is, whether it contains all the dependency information required to install and run it. It can be used in CI to verify that notebooks have their dependencies, for example as sketched below.

horus check [YOUR_NOTEBOOK].ipynb
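
As a sketch of CI usage (not an official pipeline), a job could loop over every notebook in a repository and fail on the first one missing dependency metadata:

# fail the build if any notebook lacks dependency information in its metadata
for notebook in $(find . -name '*.ipynb'); do
    horus check "$notebook" || exit 1
done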

Show notebook metadata content about dependencies

This command is used to show dependencies content from notebook metadata.

horus show [YOUR_NOTEBOOK].ipynb

Extract notebook metadata content about dependencies

This command is used to extract dependencies content from notebook metadata and store it locally.

horus extract [YOUR_NOTEBOOK].ipynb

It can be combined with the options below:

Adding --store-files-path will store the files at the desired path.

Adding --force will store the file at the desired/default path even if one already exists. If --force is not provided, the CLI will simply fail.

NOTE: Please keep in mind the .thoth.yaml will be stored at the root of the repo.

If you want to extract only a specific parameter, you can consider the following options:

horus extract [YOUR_NOTEBOOK].ipynb  --pipfile
horus extract [YOUR_NOTEBOOK].ipynb  --pipfile-lock
horus extract [YOUR_NOTEBOOK].ipynb  --thoth-config

Install and create kernel for the notebook dependencies

This command is used to prepare the environment for the notebook to run, just by pointing to the notebook.

horus set-kernel [YOUR_NOTEBOOK].ipynb
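
In an automated setup you might first verify the notebook and then prepare its kernel (the notebook name is illustrative):

horus check my-notebook.ipynb && horus set-kernel my-notebook.ipynb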

Discover notebook content about dependencies

This command is used to discover dependencies used in the notebook and create a Pipfile (empty if packages are not identified). NOTE: Please keep in mind this feature is under development and the packages identified need to be checked by humans.

horus discover [YOUR_NOTEBOOK].ipynb

Adding --show-only won't store the file locally, but will only print it to stdout.

Adding --force will store the file at the desired/default path even if one already exists. If --force is not provided, the CLI will simply fail.

Save content about dependencies in notebook metadata

This command is used to save content in notebook metadata.

horus save [YOUR_NOTEBOOK].ipynb --resolution-engine [RESOLUTION_ENGINE]

RESOLUTION_ENGINE can be thoth or pipenv currently.

It can be combined with the options below:

Adding --save-files-path will take the files to save from the desired path.

Adding --force will store the file at the desired/default path even if one already exists. If --force is not provided, the CLI will simply fail.

Adding --kernel-name can set a certain kernel name (defaults to jupyterlab-requirements).
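
Putting these options together, a sketch (engine, paths, and kernel name are illustrative):

horus save my-notebook.ipynb --resolution-engine thoth --save-files-path ./dependencies --kernel-name my-kernel --force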

If you want to save only a specific parameter, you can consider the following options:

horus save [YOUR_NOTEBOOK].ipynb  --pipfile
horus save [YOUR_NOTEBOOK].ipynb  --pipfile-lock
horus save [YOUR_NOTEBOOK].ipynb  --thoth-config

Create/Modify/Remove requirements in Pipfile in notebook metadata

You can add a requirement to the Pipfile in your notebook using the following command:

horus requirements [YOUR_NOTEBOOK].ipynb  --add tensorflow

If you want to remove a requirement instead, you can use the following command:

horus requirements [YOUR_NOTEBOOK].ipynb  --remove tensorflow

Lock requirements in notebook metadata

Adding --kernel-name selects a certain kernel name (defaults to jupyterlab-requirements).

Using the Thoth resolution engine:

horus lock [YOUR_NOTEBOOK].ipynb

Only the Thoth resolution engine can be combined with the options below:

Adding --set-timeout will set the timeout for the request to Thoth.

Adding --force will force a new request to Thoth even if an analysis result already exists.

Adding --os-name will use the given OS name in the request to Thoth.

Adding --os-version will use the given OS version in the request to Thoth.

Adding --python-version will use the given Python version in the request to Thoth.
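
For example (all values are illustrative):

horus lock my-notebook.ipynb --kernel-name my-kernel --set-timeout 600 --os-name fedora --os-version 34 --python-version 3.8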

Using the Pipenv resolution engine:

horus lock [YOUR_NOTEBOOK].ipynb  --pipenv

Contributing

Development install

Note: You will need NodeJS to build the extension package.

The jlpm command is JupyterLab's pinned version of yarn that is installed with JupyterLab. You may use yarn or npm in lieu of jlpm below.

# Clone the repo to your local environment
# Change directory to the jupyterlab-requirements directory
# Install package in development mode
pip install -ve .
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite

# Enable the server extension
jupyter serverextension enable --py jupyterlab-requirements --sys-prefix
# Rebuild extension Typescript source after making changes
jlpm run build

You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.

# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm run watch
# Run JupyterLab in another terminal
jupyter lab

With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).

By default, the jlpm run build command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:

jupyter lab build --minimize=False

Uninstall

pip uninstall jupyterlab-requirements

Demo development status and new features

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jupyterlab_requirements-0.11.1.tar.gz (5.0 MB)

Built Distribution

jupyterlab_requirements-0.11.1-py3-none-any.whl (209.5 kB)

File details

Details for the file jupyterlab_requirements-0.11.1.tar.gz.

File metadata

  • Download URL: jupyterlab_requirements-0.11.1.tar.gz
  • Upload date:
  • Size: 5.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.9.7

File hashes

Hashes for jupyterlab_requirements-0.11.1.tar.gz

  • SHA256: a2c777a38c2f2c475d83aceb6c7ed304a798106bc8f67173cf1d722999cf9a7a
  • MD5: 0bd67a7720b646d3a899ea3487b1ea6e
  • BLAKE2b-256: 2af1bf8ed2d60c0d9f111504b58da8248a8dc8d5df52bd2df335614da7f13878

See more details on using hashes here.

File details

Details for the file jupyterlab_requirements-0.11.1-py3-none-any.whl.

File metadata

  • Download URL: jupyterlab_requirements-0.11.1-py3-none-any.whl
  • Upload date:
  • Size: 209.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.1 pkginfo/1.7.1 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.2 CPython/3.9.7

File hashes

Hashes for jupyterlab_requirements-0.11.1-py3-none-any.whl

  • SHA256: c2539413ccdab1846e138cd077df97cb5e7c150f8fbe164aa208f78b5aec4840
  • MD5: ea340093ca9752fdba09afd6c4a62ab5
  • BLAKE2b-256: cae834bc49a584aa9ef436f40e4594bcf1a2d074388c66ca28c2b7a4954f56ae

See more details on using hashes here.
