
Jupyter Notebook operator for Kubeflow Pipelines

Project description

KFP-Notebook is an operator that enables running notebooks as part of a Kubeflow Pipeline.

Building kfp-notebook

make clean install
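
Alternatively, a released version can be installed directly from PyPI:

pip install kfp-notebook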

Usage

The example below can easily be added to a Python script or Jupyter notebook for testing purposes.

import kfp
from notebook.pipeline import NotebookOp
from kubernetes.client.models import V1EnvVar

# Kubeflow Pipelines API Endpoint
kfp_url = 'http://dataplatform.ibm.com:32488/pipeline'

# S3 Object Storage
cos_endpoint = 'http://s3.us-south.cloud-object-storage.appdomain.cloud'
cos_bucket = 'test-bucket'
cos_username = 'test'
cos_password = 'test123'
cos_directory = 'test-directory' 
cos_pull_archive = 'test-archive.tar.gz'

# Inputs and Outputs
inputs = []
outputs = []

# Container Image
image = 'tensorflow/tensorflow:latest'

def run_notebook_op(op_name, notebook_path):

    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_pull_archive=cos_pull_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)

    # Pass the object storage credentials into the notebook container
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_ACCESS_KEY_ID', value=cos_username))
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_SECRET_ACCESS_KEY', value=cos_password))
    notebook_op.container.set_image_pull_policy('Always')

    return notebook_op

@kfp.dsl.pipeline(name='demo-pipeline')
def demo_pipeline():
    stats_op = run_notebook_op('stats', 'generate-community-overview')
    contributions_op = run_notebook_op('contributions', 'generate-community-contributions')
    run_notebook_op('overview', 'overview').after(stats_op, contributions_op)

# Compile the new pipeline
kfp.compiler.Compiler().compile(demo_pipeline, 'pipelines/pipeline.tar.gz')

# Upload the compiled pipeline
client = kfp.Client(host=kfp_url)
pipeline_info = client.upload_pipeline('pipelines/pipeline.tar.gz', pipeline_name='pipeline-demo')

# Create a new experiment
experiment = client.create_experiment(name='demo-experiment')

# Create a new run associated with experiment and our uploaded pipeline
run = client.run_pipeline(experiment.id, 'demo-run', pipeline_id=pipeline_info.id)
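
To track the run programmatically, the standard KFP client can block until it completes. A minimal sketch (the 3600-second timeout is an arbitrary choice):

# Optionally wait for the run to finish and report its final status
run_detail = client.wait_for_run_completion(run.id, timeout=3600)
print(run_detail.run.status)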

Generated Kubeflow Pipelines

[Image: example pipeline graph generated by the script above, as rendered in the Kubeflow Pipelines UI]

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kfp-notebook-0.12.0.tar.gz (9.9 kB)

Built Distribution

kfp_notebook-0.12.0-py3-none-any.whl (11.3 kB)

File details

Details for the file kfp-notebook-0.12.0.tar.gz.

File metadata

  • Download URL: kfp-notebook-0.12.0.tar.gz
  • Upload date:
  • Size: 9.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for kfp-notebook-0.12.0.tar.gz
  • SHA256: a616cb98738dfb790f0a175a8fef70b04784d932a599c2d7849d911f2be8b566
  • MD5: 0c9705dfcafb9e590b9bedbde9b5d998
  • BLAKE2b-256: 98a63d6d99cd979fbf85e251d4d70d0d2a2635b0216487b4f8a8a9efe1ff6b53

See the Python packaging documentation for more details on using file hashes.
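
As a quick local check, the SHA256 digest above can be recomputed with Python's standard hashlib module (this sketch assumes the archive was downloaded into the current directory):

import hashlib

# Hash the downloaded source distribution and compare it to the published digest
with open('kfp-notebook-0.12.0.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == 'a616cb98738dfb790f0a175a8fef70b04784d932a599c2d7849d911f2be8b566'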

File details

Details for the file kfp_notebook-0.12.0-py3-none-any.whl.

File metadata

  • Download URL: kfp_notebook-0.12.0-py3-none-any.whl
  • Upload date:
  • Size: 11.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.7.9

File hashes

Hashes for kfp_notebook-0.12.0-py3-none-any.whl
  • SHA256: 1c3f3a68c295814cec2c6d50d0db63431e94ed091f6d9eb5afd9a06c314ea200
  • MD5: 4897c645d25b7598204ffb77e5f567ea
  • BLAKE2b-256: 7f3d919aa3a0c46959d519475b1f7ecffded55434102bfda63d9a28eb140e9b0

See the Python packaging documentation for more details on using file hashes.
