

Project description

A runtime system for NMDC data management and orchestration.

Service Status

http://nmdcstatus.polyneme.xyz/

How It Fits In

  • issues tracks issues related to NMDC, which may necessitate work across multiple repos.

  • nmdc-schema houses the LinkML schema specification, as well as generated artifacts (e.g. JSON Schema).

  • nmdc-server houses code specific to the data portal -- its database, back-end API, and front-end application.

  • workflow_documentation references workflow code, spread across several repositories, that takes source data and produces computed data.

  • This repo (nmdc-runtime)

    • houses code that takes source data and computed data, and transforms it to broadly accommodate downstream applications such as the data portal.
    • manages execution of the above (i.e., lightweight data transformations) and also of computationally- and data-intensive workflows performed at other sites, ensuring that claimed jobs have access to needed configuration and data resources.

Data exports

The NMDC metadata as of 2021-10 is available here:

https://drs.microbiomedata.org/ga4gh/drs/v1/objects/sys086d541

The link returns a GA4GH DRS API bundle object record, with the NMDC metadata collections (study_set, biosample_set, etc.) as contents, each a DRS API blob object.

For example, the blob for the study_set collection export, named "study_set.jsonl.gz", is listed with DRS API ID "sys0xsry70". Thus, it is retrievable via

https://drs.microbiomedata.org/ga4gh/drs/v1/objects/sys0xsry70

The returned blob object record lists https://nmdc-runtime.files.polyneme.xyz/nmdcdb-mongoexport/2021-10-14/study_set.jsonl.gz as the url for an access method.

The 2021-10 exports are currently all accessible at https://nmdc-runtime.files.polyneme.xyz/nmdcdb-mongoexport/2021-10-14/${COLLECTION_NAME}.jsonl.gz, but the DRS API indirection allows these links to change in the future, for mirroring via other URLs, etc. So, the DRS API links should be the links you share.
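
For scripted access, the same two-step lookup can be done with any HTTP client. The following is a minimal sketch in Python, assuming the standard GA4GH DRS 1.x response fields (contents, access_methods, access_url) and using the object IDs listed above:

import requests

DRS_BASE = "https://drs.microbiomedata.org/ga4gh/drs/v1/objects"

# Fetch the bundle record and list its contents (one blob per collection).
bundle = requests.get(f"{DRS_BASE}/sys086d541").json()
for item in bundle["contents"]:
    print(item["name"], item.get("id"))

# Fetch the blob record for the study_set export and follow its access URL.
blob = requests.get(f"{DRS_BASE}/sys0xsry70").json()
access_url = blob["access_methods"][0]["access_url"]["url"]
with requests.get(access_url, stream=True) as resp:
    resp.raise_for_status()
    with open("study_set.jsonl.gz", "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)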

Overview

The runtime features:

  1. Dagster orchestration:

    • dagit - a web UI to monitor and manage the running system.
    • dagster-daemon - a service that triggers pipeline runs based on time or external state.
    • PostgreSQL database - for storing run history, event logs, and scheduler state.
    • workspace code
      • Code to run is loaded into a Dagster workspace from one or more Dagster repositories. Each Dagster repository may run in its own Python virtual environment if need be, and may be loaded from a local Python file or pip-installed from an external source. In our case, each Dagster repository is loaded from a Python file local to the nmdc-runtime GitHub repository, and all code runs in the same Python environment.
      • A Dagster repository consists of solids and pipelines, and optionally schedules and sensors (a minimal code sketch follows this list).
        • solids represent individual units of computation
        • pipelines are built up from solids
        • schedules trigger recurring pipeline runs based on time
        • sensors trigger pipeline runs based on external state
      • Each pipeline can declare dependencies on runtime resources and additional configuration. There are TerminusDB and MongoDB resources defined, as well as preset configuration definitions for both "dev" and "prod" modes. The presets tell Dagster to read resource configuration from a set of known environment variables, depending on the mode.
  2. A TerminusDB database supporting revision control of schema-validated data.

  3. A MongoDB database supporting write-once, high-throughput internal data storage by the nmdc-runtime FastAPI instance.

  4. A FastAPI service to interface with the orchestrator and database, as a hub for data management and workflow automation.
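
As a minimal illustration of the solid and pipeline concepts in item 1 above, the following sketch uses the legacy Dagster API (solids and pipelines); the solid and pipeline names here are hypothetical, not ones defined in this repository:

from dagster import pipeline, solid

@solid
def fetch_source_data(context):
    # Hypothetical solid: pull source records from somewhere upstream.
    context.log.info("fetching source data")
    return [{"id": "example"}]

@solid
def transform_records(context, records):
    # Hypothetical solid: apply a lightweight transformation.
    return [dict(r, transformed=True) for r in records]

@pipeline
def example_etl_pipeline():
    # A pipeline composes solids; one solid's output feeds the next solid's input.
    transform_records(fetch_source_data())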

Local Development

Ensure Docker and Docker Compose are installed, and that the Docker engine is running.

docker --version
docker compose version
docker info

Ensure the permissions of ./mongoKeyFile are such that only the file's owner can read or write the file.

chmod 600 ./mongoKeyFile

Ensure you have a .env file for the Docker services to source from. You may copy .env.example to .env (which is git-ignored) to get started.

cp .env.example .env

Create environment variables in your shell session, based upon the contents of the .env file.

export $(grep -v '^#' .env | xargs)

If you are connecting to resources that require an SSH tunnel—for example, a MongoDB server that is only accessible on the NERSC network—set up the SSH tunnel.

The following command could be useful to you, either directly or as a template (see Makefile).

make nersc-mongo-tunnels

Finally, spin up the Docker Compose stack.

make up-dev

Docker Compose is used to start local MongoDB and PostgreSQL (used by Dagster) instances, as well as a Dagster web server (dagit) and daemon (dagster-daemon).

The Dagit web server is viewable at http://localhost:3000/.

The FastAPI service is viewable at http://localhost:8000/ -- e.g., rendered documentation at http://localhost:8000/redoc/.
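
A quick way to confirm from Python that both local services respond, using the URLs above (a minimal sketch; assumes the requests package is installed):

import requests

# The Dagit UI and the FastAPI service started by `make up-dev`.
for name, url in [("dagit", "http://localhost:3000/"), ("fastapi", "http://localhost:8000/")]:
    resp = requests.get(url, timeout=5)
    print(name, resp.status_code)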

Local Testing

Tests can be found in tests and are run with the following commands:

On an M1 Mac, you may need to export DOCKER_DEFAULT_PLATFORM=linux/amd64.

make up-test
make test

As you create Dagster solids and pipelines, add tests in tests/ to check that your code behaves as desired and does not break over time.

For hints on how to write tests for solids and pipelines in Dagster, see the Dagster documentation's tutorial on Testing.
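
For example, a self-contained solid test using Dagster's execute_solid helper might look like the sketch below; the solid under test is hypothetical, and real tests would import solids from the nmdc_runtime package:

from dagster import execute_solid, solid

@solid
def double_values(context, values):
    # Hypothetical solid under test.
    return [v * 2 for v in values]

def test_double_values():
    result = execute_solid(double_values, input_values={"values": [1, 2, 3]})
    assert result.success
    assert result.output_value() == [2, 4, 6]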

Release to PyPI

rm -rf dist
python -m build
twine upload dist/*

Download files

Download the file for your platform.

Source Distribution

nmdc_runtime-1.0.9.tar.gz (63.7 MB)

Uploaded Source

Built Distribution

nmdc_runtime-1.0.9-py3-none-any.whl (118.8 kB)

Uploaded Python 3

File details

Details for the file nmdc_runtime-1.0.9.tar.gz.

File metadata

  • Download URL: nmdc_runtime-1.0.9.tar.gz
  • Upload date:
  • Size: 63.7 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for nmdc_runtime-1.0.9.tar.gz:

  • SHA256: fcbbc2291e93b34b9c8e8865fd4e6500d3f2475ff4ef22401820098b038978aa
  • MD5: 9b63c5d923ad707f52b4d7c1c9d729f7
  • BLAKE2b-256: b0010d438371437d3059ca4a608b7b594c41dc79fcd58922d26379101a6345ef


File details

Details for the file nmdc_runtime-1.0.9-py3-none-any.whl.

File metadata

  • Download URL: nmdc_runtime-1.0.9-py3-none-any.whl
  • Upload date:
  • Size: 118.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.18

File hashes

Hashes for nmdc_runtime-1.0.9-py3-none-any.whl:

  • SHA256: a149d10b8dae5aebf7e4b1143c79862100c4377b81e673497476c83a2317f8e3
  • MD5: d9c7fc0475f2a373cfc608ab7f059e3d
  • BLAKE2b-256: e537b7e7eb1a426c76c4ab05590350bfce3e258bffe2ca8a069fa253bde96df9

