
Extend MLFlow with Comet.ml

Project description

Comet-For-MLFlow Extension


The Comet-For-MLFlow extension is a CLI that maps MLFlow experiment runs to Comet experiments. It lets you see your existing experiments in the Comet.ml UI, which provides authenticated access to experiment results, dramatically improved performance for high-volume experiment runs, and richer charting and visualization options.

This extension synchronizes previous MLFlow experiment runs with Comet.ml, alongside any runs tracked with Comet's Python SDK with MLFlow support, for deeper experiment instrumentation and improved logging, visibility, project organization, and access management.

The Comet-For-MLFlow Extension is available as free open-source software, released under GNU General Public License v3. The extension can be used with existing Comet.ml accounts or with a new, free Individual account.

Installation

pip install comet-for-mlflow

If you install comet-for-mlflow in a different Python environment from the one used to generate the MLFlow runs, make sure both environments use the same mlflow version.
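
For example (a minimal sketch; the version number below is only a placeholder), you can print the mlflow version in the environment that produced the runs and pin the same version in the environment where comet-for-mlflow is installed:

# in the environment that generated the runs
python -c "import mlflow; print(mlflow.__version__)"

# in the environment where comet-for-mlflow is installed
# (replace 1.6.0 with the version printed above)
pip install "mlflow==1.6.0"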

Basic usage

To automatically synchronize MLFlow runs in their default storage location (./mlruns) with Comet.ml, run:

comet_for_mlflow --api-key $COMET_API_KEY --rest-api-key $COMET_REST_API_KEY

If you'd like to review the mapping of MLFlow runs in their default storage location without synchronizing them with Comet.ml automatically, you can run:

comet_for_mlflow --no-upload

After review, you can upload the mapped MLFlow runs with:

comet upload /path/to/archive.zip
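
Putting the two steps together, a review-then-upload session might look like this (comet_for_mlflow prints where the prepared data was saved, as in the example below; the archive path here is only a placeholder):

# prepare the runs locally and review the mapping, without uploading
comet_for_mlflow --no-upload

# once satisfied, upload an archive produced by the previous step
comet upload /path/to/archive.zip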

Example

 __   __         ___ ___     ___  __   __                 ___       __
/  ` /  \  |\/| |__   |  __ |__  /  \ |__) __  |\/| |    |__  |    /  \ |  |
\__, \__/  |  | |___  |     |    \__/ |  \     |  | |___ |    |___ \__/ |/\|


Please create a free Comet account with your email.
Email: kristen.stewart@example.com

Please enter a username for your new account.
Username: kstewart

A Comet.ml account has been created for you and an email was sent to you to setup your password later.
Your Comet API Key has been saved to ~/.comet.ini, it is also available on your Comet.ml dashboard.
Starting Comet Extension for MLFlow

Preparing data locally from: '/home/ks/project/mlruns'
You will have an opportunity to review.

# Preparing experiment 1/3: Default

# Preparing experiment 2/3: Keras Experiment
## Preparing run 1/4 [2e02df92025044669701ed6e6dd300ca]
## Preparing run 2/4 [93fb285da7cf4c4a93e279ab7ff19fc5]
## Preparing run 3/4 [2e8a1aed22544549b2b6b6b2c5976ed9]
## Preparing run 4/4 [82f584bad7604289af61bc505935599b]

# Preparing experiment 3/3: Tensorflow Keras Experiment
## Preparing run 1/2 [99550a7ce4c24677aeb6a1ae4e7444cb]
## Preparing run 2/2 [88ca5c4262f44176b576b54e0b24731a]

 MLFlow name:   | Comet.ml name:   |   Prepared count:
----------------+------------------+-------------------
 Experiments    | Projects         |                 3
 Runs           | Experiments      |                 6
 Tags           | Others           |                39
 Parameters     | Parameters       |                51
 Metrics        | Metrics          |                60
 Artifacts      | Assets           |                27

All prepared data has been saved to: /tmp/tmpjj74z8bf

Upload prepared data to Comet.ml? [y/N] y

# Start uploading data to Comet.ml
100%|███████████████████████████████████████████████████████████████████████| 6/6 [01:00<00:00, 15s/it]
Explore your experiment data on Comet.ml with the following links:
	- https://www.comet.ml/kstewart/mlflow-default-2bacc9?loginToken=NjKgD6f9ZuZWeudP76sDPHx9j
	- https://www.comet.ml/kstewart/mlflow-keras-experiment-2bacc9?loginToken=NjKgD6f9ZuZWeudP76sDPHx9j
	- https://www.comet.ml/kstewart/mlflow-tensorflow-keras-experiment-2bacc9?loginToken=NjKgD6f9ZuZWeudP76sDPHx9j
Get deeper instrumentation by adding Comet SDK to your project: https://comet.ml/docs/python-sdk/mlflow/


If you need support, you can contact us at http://chat.comet.ml/ or https://comet.ml/docs/quick-start/#getting-support

Advanced use

Importing MLFlow runs from a database store or the MLFlow server store

If your MLFlow runs are not located in the default local store (./mlruns), you can point the extension at the right store by setting either the --mlflow-store-uri CLI flag or the MLFLOW_TRACKING_URI environment variable.

For example, with a different local store path:

comet_for_mlflow --mlflow-store-uri /data/mlruns/

With a SQL store:

comet_for_mlflow --mlflow-store-uri sqlite:///path/to/file.db

Or with an MLFlow server:

comet_for_mlflow --mlflow-store-uri http://localhost:5000
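
Equivalently, you can set the MLFLOW_TRACKING_URI environment variable instead of passing the flag, for example:

env MLFLOW_TRACKING_URI=http://localhost:5000 comet_for_mlflow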

Importing MLFlow artifacts stored remotely

If your MLFlow runs have artifacts stored remotely (in any of the supported remote artifact stores, see https://www.mlflow.org/docs/latest/tracking.html#artifact-stores), you need to configure your environment the same way as when you ran those experiments. For example, with a local Minio server:

env MLFLOW_S3_ENDPOINT_URL=http://localhost:9001 \
    AWS_ACCESS_KEY_ID=minio \
    AWS_SECRET_ACCESS_KEY=minio123 \
    comet_for_mlflow

FAQ

How can I configure my API Key or Rest API Key?

You can either pass your Comet.ml API Key and REST API Key as command-line flags or provide them through the usual Comet configuration options.
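
For example, you can export the keys as environment variables before running the extension (a sketch: COMET_API_KEY is the standard Comet configuration variable, while the COMET_REST_API_KEY name is assumed here to follow the same convention):

env COMET_API_KEY=your-api-key \
    COMET_REST_API_KEY=your-rest-api-key \
    comet_for_mlflow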

How are MLFlow experiments mapped to Comet.ml projects?

Each MLFlow experiment is mapped to a unique Comet.ml project ID. This way, even if you rename the Comet.ml project or the MLFlow experiment, new runs are still imported into the correct Comet.ml project. Newly created Comet.ml projects are named mlflow-$MLFLOW_EXPERIMENT_NAME, and the original MLFlow experiment name is also saved as an Other field named mlflow.experimentName.

Below is a complete list of MLFlow experiment and run fields mapped to Comet.ml equivalent concepts:

  • MLFlow Experiments are mapped as Comet.ml projects
  • MLFlow Runs are mapped as Comet.ml experiments
  • MLFlow Run fields are imported according to the following table:
 MLFlow Run Field | Comet.ml Experiment Field
------------------+---------------------------
 File name        | File name
 Tags             | Others
 User             | Git User + System User
 Git parent       | Git parent
 Git origin       | Git Origin
 Params           | Params
 Metrics          | Metrics
 Artifacts        | Assets

Do I have to run this for future experiments?

No, the common pattern is to import Comet's Python SDK with MLFlow support in your MLFlow projects, which will keep all future experiment runs synchronized.
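
As a minimal sketch of that pattern (based on the Comet MLFlow documentation linked above, and assuming your Comet API key is already configured), importing comet_ml before mlflow in a training script lets Comet track the run as well:

python - <<'EOF'
import comet_ml  # imported before mlflow so Comet's MLFlow support can hook in
import mlflow

# a toy MLFlow run; with comet_ml imported first and COMET_API_KEY configured,
# the run should also appear in Comet.ml
with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_metric("loss", 0.42)
EOF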

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

Release History

0.1.1 (2020-02-17)

  • Fix package URL in metadata.

0.1.0 (2020-02-12)

  • First release on PyPI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

comet_for_mlflow-0.1.1.tar.gz (30.0 kB)

Built Distribution

comet_for_mlflow-0.1.1-py2.py3-none-any.whl (28.5 kB)

File details

Details for the file comet_for_mlflow-0.1.1.tar.gz.

File metadata

  • Download URL: comet_for_mlflow-0.1.1.tar.gz
  • Size: 30.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for comet_for_mlflow-0.1.1.tar.gz

 Algorithm   | Hash digest
-------------+-----------------------------------------------------------------
 SHA256      | 7c9125c04f6ccc2fd355fd8a6ff867b2b3f471f34dc9acc075fcc47cc0eaeadd
 MD5         | 403bd5b76583feb2f84a0a96ac217027
 BLAKE2b-256 | cfaa3e6ab2f8f27c32dddd72adf02afa00832e500ade2dc6f6fec25de53bbbe9

See more details on using hashes here.

File details

Details for the file comet_for_mlflow-0.1.1-py2.py3-none-any.whl.

File metadata

  • Download URL: comet_for_mlflow-0.1.1-py2.py3-none-any.whl
  • Size: 28.5 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.7.6

File hashes

Hashes for comet_for_mlflow-0.1.1-py2.py3-none-any.whl

 Algorithm   | Hash digest
-------------+-----------------------------------------------------------------
 SHA256      | dc863f96ce5a79b96103fe71f35a19d1c3760ef5a613634dafa831db5cccc4ef
 MD5         | 31a0c6a889ccdfa089bcdc0e8ddcfecf
 BLAKE2b-256 | 946a66de1d3bcae1f63791958f49ce41b0e6fdc507f0a6930d4666bb01d43f12

See more details on using hashes here.
