
Jupyter Notebook operator for Kubeflow Pipelines

Project description

KFP-Notebook is an operator that enables running notebooks as part of a Kubeflow Pipeline.

Building kfp-notebook

make clean install
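
If you just want the released package rather than a source build, it can also be installed from PyPI:

pip install kfp-notebook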

Usage

The example below can easily be added to a Python script or Jupyter notebook for testing purposes.

import os
import kfp
from notebook.pipeline import NotebookOp
from kubernetes.client.models import V1EnvVar

# KubeFlow Pipelines API Endpoint
kfp_url = 'http://dataplatform.ibm.com:32488/pipeline'

# S3 Object Storage
cos_endpoint = 'http://s3.us-south.cloud-object-storage.appdomain.cloud'
cos_bucket = 'test-bucket'
cos_username = 'test'
cos_password = 'test123'
cos_directory = 'test-directory' 
cos_pull_archive = 'test-archive.tar.gz'

# Inputs and Outputs
inputs = []
outputs = []

# Container Image
image = 'tensorflow/tensorflow:latest'

def run_notebook_op(op_name, notebook_path):
    # Build a NotebookOp that runs the given notebook in the configured image
    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_pull_archive=cos_pull_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)

    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_ACCESS_KEY_ID', value=cos_username))
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_SECRET_ACCESS_KEY', value=cos_password))
    notebook_op.container.set_image_pull_policy('Always')

    return notebook_op

def demo_pipeline():
    # 'overview' runs only after both 'stats' and 'contributions' complete
    stats_op = run_notebook_op('stats', 'generate-community-overview')
    contributions_op = run_notebook_op('contributions', 'generate-community-contributions')
    run_notebook_op('overview', 'overview').after(stats_op, contributions_op)

# Compile the new pipeline
kfp.compiler.Compiler().compile(demo_pipeline, 'pipelines/pipeline.tar.gz')

# Upload the compiled pipeline
client = kfp.Client(host=kfp_url)
pipeline_info = client.upload_pipeline('pipelines/pipeline.tar.gz', pipeline_name='pipeline-demo')

# Create a new experiment
experiment = client.create_experiment(name='demo-experiment')

# Create a new run associated with experiment and our uploaded pipeline
run = client.run_pipeline(experiment.id, 'demo-run', pipeline_id=pipeline_info.id)
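
Optionally, the script can block until the run finishes. This is a minimal sketch using the KFP client's wait_for_run_completion; the one-hour timeout below is an arbitrary example value:

# Wait for the run to complete and print its final status
result = client.wait_for_run_completion(run.id, timeout=3600)
print(result.run.status)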

Generated Kubeflow Pipelines

[Image: example of the Kubeflow pipeline generated by the code above]


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kfp-notebook-0.10.2.tar.gz (9.0 kB)

Built Distribution

kfp_notebook-0.10.2-py3-none-any.whl (10.3 kB)

File details

Details for the file kfp-notebook-0.10.2.tar.gz.

File metadata

  • Download URL: kfp-notebook-0.10.2.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200616 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.7.6

File hashes

Hashes for kfp-notebook-0.10.2.tar.gz
Algorithm Hash digest
SHA256 cbdb8b661577d7da9d5fc04f5dd48e42bdf4333f78f8fe515d36329e260cb240
MD5 4fb7f8a6979625bb49a61dfa1547a3bc
BLAKE2b-256 0ce0c53db147f74554e1a1ad7f2d663f3b69e47aaa428dfc7c9e966c0080cac1

See more details on using hashes here.
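
For example, a downloaded sdist can be checked against the SHA256 digest above with a few lines of Python; this sketch assumes the file was saved under its original name in the current directory:

import hashlib

# Expected digest, copied from the table above
expected = 'cbdb8b661577d7da9d5fc04f5dd48e42bdf4333f78f8fe515d36329e260cb240'

with open('kfp-notebook-0.10.2.tar.gz', 'rb') as f:
    actual = hashlib.sha256(f.read()).hexdigest()

assert actual == expected, 'checksum mismatch'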

File details

Details for the file kfp_notebook-0.10.2-py3-none-any.whl.

File metadata

  • Download URL: kfp_notebook-0.10.2-py3-none-any.whl
  • Upload date:
  • Size: 10.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/47.3.1.post20200616 requests-toolbelt/0.9.1 tqdm/4.46.1 CPython/3.7.6

File hashes

Hashes for kfp_notebook-0.10.2-py3-none-any.whl
Algorithm Hash digest
SHA256 aab1edc6f36c0b8bfe1fbda254fd21c5bcf2ce0f261f3e54a3d740b644f067ff
MD5 1087ce6e6e2d3bedc17c8de099406f74
BLAKE2b-256 b66f9c684c8c4a08f9fd32c48df19facd50006d2f0b63a421bba6786ed116044

See more details on using hashes here.
