
MLflow is an open source platform for the complete machine learning lifecycle


MLflow Skinny: A Lightweight Machine Learning Lifecycle Platform Client

MLflow Skinny is a lightweight MLflow package without SQL storage, server, UI, or data science dependencies. MLflow Skinny supports:

  • Tracking operations (logging / loading / searching params, metrics, tags + logging / loading artifacts)

  • Model registration, search, artifact loading, and deployment

  • Execution of GitHub projects within a notebook and against a remote target.

Additional dependencies can be installed to leverage the full feature set of MLflow. For example:

  • To use the mlflow.sklearn component of MLflow Models, install scikit-learn, numpy and pandas.

  • To use SQL-based metadata storage, install sqlalchemy, alembic, and sqlparse.

  • To use serving-based features, install flask and pandas.

MLflow: A Machine Learning Lifecycle Platform

MLflow is a platform to streamline machine learning development, including tracking experiments, packaging code into reproducible runs, and sharing and deploying models. MLflow offers a set of lightweight APIs that can be used with any existing machine learning application or library (TensorFlow, PyTorch, XGBoost, etc.), wherever you currently run ML code (e.g. in notebooks, standalone applications, or the cloud). MLflow’s current components are:

  • MLflow Tracking: An API to log parameters, code, and results in machine learning experiments and compare them using an interactive UI.

  • MLflow Projects: A code packaging format for reproducible runs using Conda and Docker, so you can share your ML code with others.

  • MLflow Models: A model packaging format and tools that let you easily deploy the same model (from any ML library) to batch and real-time scoring on platforms such as Docker, Apache Spark, Azure ML and AWS SageMaker.

  • MLflow Model Registry: A centralized model store, set of APIs, and UI, to collaboratively manage the full lifecycle of MLflow Models.


Packages

  • PyPI: mlflow, mlflow-skinny

  • conda-forge: mlflow, mlflow-skinny

  • CRAN: mlflow

  • Maven Central: mlflow-client, mlflow-parent, mlflow-scoring, mlflow-spark


Installing

Install MLflow from PyPI via pip install mlflow

MLflow requires conda to be on the PATH for the projects feature.

Nightly snapshots of MLflow master are also available here.

Install a lower-dependency subset of MLflow from PyPI via pip install mlflow-skinny. Extra dependencies can be added per desired scenario. For example, pip install mlflow-skinny pandas numpy allows for mlflow.pyfunc.log_model support.

Documentation

Official documentation for MLflow can be found at https://mlflow.org/docs/latest/index.html.

Roadmap

The current MLflow Roadmap is available at https://github.com/mlflow/mlflow/milestone/3. We are seeking contributions to all of our roadmap items with the help wanted label. Please see the Contributing section for more information.

Community

For help or questions about MLflow usage (e.g. “how do I do X?”) see the docs or Stack Overflow.

To report a bug, file a documentation issue, or submit a feature request, please open a GitHub issue.

For release announcements and other discussions, please subscribe to our mailing list (mlflow-users@googlegroups.com) or join us on Slack.

Running a Sample App With the Tracking API

The programs in the examples directory use the MLflow Tracking API. For instance, run:

python examples/quickstart/mlflow_tracking.py

This program uses the MLflow Tracking API, which logs tracking data in ./mlruns. The data can then be viewed with the Tracking UI.

Launching the Tracking UI

The MLflow Tracking UI will show runs logged in ./mlruns at http://localhost:5000. Start it with:

mlflow ui

Note: Running mlflow ui from within a clone of MLflow is not recommended - doing so will run the dev UI from source. We recommend running the UI from a different working directory, specifying a backend store via the --backend-store-uri option. Alternatively, see instructions for running the dev UI in the contributor guide.

Running a Project from a URI

The mlflow run command lets you run a project packaged with an MLproject file from a local path or a Git URI:

mlflow run examples/sklearn_elasticnet_wine -P alpha=0.4

mlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.4

See examples/sklearn_elasticnet_wine for a sample project with an MLproject file.

Saving and Serving Models

To illustrate model management, the mlflow.sklearn package can log scikit-learn models as MLflow artifacts and then load them again for serving. There is an example training application in examples/sklearn_logistic_regression/train.py that you can run as follows:

$ python examples/sklearn_logistic_regression/train.py
Score: 0.666
Model saved in run <run-id>

$ mlflow models serve --model-uri runs:/<run-id>/model

$ curl -d '{"dataframe_split": {"columns":[0],"index":[0,1],"data":[[1],[-1]]}}' -H 'Content-Type: application/json'  localhost:5000/invocations

Note: If using MLflow Skinny (pip install mlflow-skinny) for model serving, the additional dependencies it omits (namely, flask) must be installed for the MLflow server to function.

Official MLflow Docker Image

The official MLflow Docker image is available on GitHub Container Registry at https://ghcr.io/mlflow/mlflow.

export CR_PAT=YOUR_TOKEN
echo $CR_PAT | docker login ghcr.io -u USERNAME --password-stdin
# Pull the latest version
docker pull ghcr.io/mlflow/mlflow
# Pull 2.2.1
docker pull ghcr.io/mlflow/mlflow:v2.2.1

Contributing

We happily welcome contributions to MLflow. We are also seeking contributions to items on the MLflow Roadmap. Please see our contribution guide to learn more about contributing to MLflow.

Core Members

MLflow is currently maintained by the following core members with significant contributions from hundreds of exceptionally talented community members.
