
Jupyter Notebook operator for Kubeflow Pipelines

Project description

KFP-Notebook is an operator that enables running notebooks as part of a Kubeflow Pipeline.

Building kfp-notebook

make clean install
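
Released versions are also published to PyPI, so the operator can presumably be installed without building from source:

pip install kfp-notebook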

Usage

The example below can easily be added to a Python script or Jupyter notebook for testing purposes.

import kfp
from kfp import dsl
from notebook.pipeline import NotebookOp
from kubernetes.client.models import V1EnvVar

# KubeFlow Pipelines API Endpoint
kfp_url = 'http://dataplatform.ibm.com:32488/pipeline'

# S3 Object Storage
cos_endpoint = 'http://s3.us-south.cloud-object-storage.appdomain.cloud'
cos_bucket = 'test-bucket'
cos_username = 'test'
cos_password = 'test123'
cos_directory = 'test-directory' 
cos_pull_archive = 'test-archive.tar.gz'

# Inputs and Outputs
inputs = []
outputs = []

# Container Image
image = 'tensorflow/tensorflow:latest'

def run_notebook_op(op_name, notebook_path):

    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_pull_archive=cos_pull_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)

    # Make the object storage credentials available to the notebook container
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_ACCESS_KEY_ID', value=cos_username))
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_SECRET_ACCESS_KEY', value=cos_password))
    notebook_op.container.set_image_pull_policy('Always')

    return notebook_op

@dsl.pipeline(name='demo-pipeline')
def demo_pipeline():
    stats_op = run_notebook_op('stats', 'generate-community-overview')
    contributions_op = run_notebook_op('contributions', 'generate-community-contributions')
    run_notebook_op('overview', 'overview').after(stats_op, contributions_op)

# Compile the new pipeline
kfp.compiler.Compiler().compile(demo_pipeline, 'pipelines/pipeline.tar.gz')

# Upload the compiled pipeline
client = kfp.Client(host=kfp_url)
pipeline_info = client.upload_pipeline('pipelines/pipeline.tar.gz', pipeline_name='pipeline-demo')

# Create a new experiment
experiment = client.create_experiment(name='demo-experiment')

# Create a new run associated with experiment and our uploaded pipeline
run = client.run_pipeline(experiment.id, 'demo-run', pipeline_id=pipeline_info.id)
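
Hardcoding object storage credentials, as above, is fine for testing but not for shared clusters. Below is a minimal sketch of an alternative, assuming NotebookOp behaves like a standard kfp ContainerOp (so apply() works) and that a Kubernetes secret named aws-secret already exists in the cluster:

from kfp.aws import use_aws_secret

def run_notebook_op_from_secret(op_name, notebook_path):
    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_pull_archive=cos_pull_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)

    # use_aws_secret maps entries of the pre-created 'aws-secret' secret
    # (an assumption; create it in the cluster first) onto the
    # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.
    notebook_op.apply(use_aws_secret('aws-secret'))
    notebook_op.container.set_image_pull_policy('Always')

    return notebook_op

For quick experiments, recent versions of the kfp SDK can also compile, upload, and run a pipeline function in a single call, replacing the compile/upload/run steps above:

run = client.create_run_from_pipeline_func(demo_pipeline, arguments={}, experiment_name='demo-experiment')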

Generated Kubeflow Pipelines

[Image: Kubeflow Pipeline example]


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kfp-notebook-0.10.1.tar.gz (9.0 kB)

Built Distribution

kfp_notebook-0.10.1-py3-none-any.whl (10.3 kB)

File details

Details for the file kfp-notebook-0.10.1.tar.gz.

File metadata

  • Download URL: kfp-notebook-0.10.1.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200616 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.7.6

File hashes

Hashes for kfp-notebook-0.10.1.tar.gz

  • SHA256: f5c8a8eb9cc89946901cfba0e6044af5072a213b87b175a121711b2a3513ad89
  • MD5: f80f5c9da128f7f417dad2202fd4ba50
  • BLAKE2b-256: 38cce956e674c35026d4760a3def6f593929d2c939b52450c7a360dc41da838a

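A digest can be checked against the published values with nothing more than the standard library; a minimal sketch, assuming the sdist was downloaded to the current directory:

import hashlib

expected = 'f5c8a8eb9cc89946901cfba0e6044af5072a213b87b175a121711b2a3513ad89'

with open('kfp-notebook-0.10.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == expected else f'MISMATCH: {digest}')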

File details

Details for the file kfp_notebook-0.10.1-py3-none-any.whl.

File metadata

  • Download URL: kfp_notebook-0.10.1-py3-none-any.whl
  • Upload date:
  • Size: 10.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200616 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.7.6

File hashes

Hashes for kfp_notebook-0.10.1-py3-none-any.whl

  • SHA256: 86da4482082b0acb1fa3bbf3205699f26ef3476ee90f276fbe7c1b63621fe8d2
  • MD5: a4a821387262a270304732c3f37cd22b
  • BLAKE2b-256: 102b0c66cbd555ccd018454f1ec3aef0581da1dfe3b54c681320a4fd85861f2b

