
Jupyter Notebook operator for Kubeflow Pipelines

Project description

KFP-Notebook is an operator that enables running Jupyter notebooks as part of a Kubeflow Pipelines pipeline.

Building kfp-notebook

make clean install
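
Alternatively, released versions are published to PyPI and can be installed with pip:

pip install kfp-notebook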

Usage

The example below can be added to a Python script or Jupyter notebook for testing purposes.

import os
import kfp
from notebook.pipeline import NotebookOp
from kubernetes.client.models import V1EnvVar

# KubeFlow Pipelines API Endpoint
kfp_url = 'http://dataplatform.ibm.com:32488/pipeline'

# S3 Object Storage
cos_endpoint = 'http://s3.us-south.cloud-object-storage.appdomain.cloud'
cos_bucket = 'test-bucket'
cos_username = 'test'
cos_password = 'test123'
cos_directory = 'test-directory' 
cos_pull_archive = 'test-archive.tar.gz'

# Inputs and Outputs
inputs = []
outputs = []

# Container Image
image = 'tensorflow/tensorflow:latest'

def run_notebook_op(op_name, notebook_path):

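    # Build a NotebookOp that runs `notebook` in the given container image,
    # using the object storage settings above to fetch the notebook archive
    # and to exchange pipeline inputs and outputs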
    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_pull_archive=cos_pull_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)

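    # Expose the object storage credentials to the container through the
    # standard AWS environment variables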
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_ACCESS_KEY_ID', value=cos_username))
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_SECRET_ACCESS_KEY', value=cos_password))
    notebook_op.container.set_image_pull_policy('Always')

    return notebook_op

def demo_pipeline():
    stats_op = run_notebook_op('stats', 'generate-community-overview')
    contributions_op = run_notebook_op('contributions', 'generate-community-contributions')
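    # The overview notebook runs only after both upstream notebooks complete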
    run_notebook_op('overview', 'overview').after(stats_op, contributions_op)

# Compile the new pipeline
kfp.compiler.Compiler().compile(demo_pipeline, 'pipelines/pipeline.tar.gz')

# Upload the compiled pipeline
client = kfp.Client(host=kfp_url)
pipeline_info = client.upload_pipeline('pipelines/pipeline.tar.gz', pipeline_name='pipeline-demo')

# Create a new experiment
experiment = client.create_experiment(name='demo-experiment')

# Create a new run associated with experiment and our uploaded pipeline
run = client.run_pipeline(experiment.id, 'demo-run', pipeline_id=pipeline_info.id)
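
If the script should block until the run finishes, the KFP client can poll for completion. A minimal sketch using the SDK's wait_for_run_completion; the one-hour timeout is an arbitrary choice:

# Wait for the run to complete (or the timeout, in seconds, to expire),
# then print the final status reported by Kubeflow Pipelines
run_detail = client.wait_for_run_completion(run.id, timeout=3600)
print(run_detail.run.status)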

Generated Kubeflow Pipelines

[Image: Kubeflow Pipeline Example]

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

kfp-notebook-0.9.1.tar.gz (8.8 kB)

Uploaded Source

Built Distribution

kfp_notebook-0.9.1-py3-none-any.whl (10.1 kB)

Uploaded Python 3

File details

Details for the file kfp-notebook-0.9.1.tar.gz.

File metadata

  • Download URL: kfp-notebook-0.9.1.tar.gz
  • Upload date:
  • Size: 8.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.1.post20200322 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.7.4

File hashes

Hashes for kfp-notebook-0.9.1.tar.gz
Algorithm    Hash digest
SHA256       1851f9d205e7a1ff1b58c46816f50d624fdb17278ce8af2ed86ba716c4f110e4
MD5          3515b5a040c0884941fc226a6695b411
BLAKE2b-256  9db1416a9d598d77e0c76b612f068c9b7bc441eb1633d1a6cbc57ce189f47b57

See more details on using hashes here.
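
To check a downloaded archive against the SHA256 digest above, the hash can be recomputed locally. A minimal sketch using Python's standard hashlib, assuming the file has been downloaded to the current directory:

import hashlib

# Compute the SHA256 digest of the downloaded archive and compare it
# with the published value
with open('kfp-notebook-0.9.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
assert digest == '1851f9d205e7a1ff1b58c46816f50d624fdb17278ce8af2ed86ba716c4f110e4'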

File details

Details for the file kfp_notebook-0.9.1-py3-none-any.whl.

File metadata

  • Download URL: kfp_notebook-0.9.1-py3-none-any.whl
  • Upload date:
  • Size: 10.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/46.1.1.post20200322 requests-toolbelt/0.9.1 tqdm/4.41.1 CPython/3.7.4

File hashes

Hashes for kfp_notebook-0.9.1-py3-none-any.whl
Algorithm    Hash digest
SHA256       3d23f6653d5f9e859f60dea908bf21f404c1d2f20fb38563ea5fac2c5d9f14c9
MD5          575f44eb24fc17531dd7ecc8ef51046f
BLAKE2b-256  2f738d1176a70fb87a131e61594c20240efe2099bf47e39d51d4daef8512442d

See more details on using hashes here.
