WDL launcher for Amazon Omics

miniwdl-omics-run

This command-line tool makes it easier to launch WDL workflow runs on Amazon Omics. It uses miniwdl locally to register WDL workflows with the service, validate command-line inputs, and start a run.

pip3 install miniwdl-omics-run

miniwdl-omics-run \
    --role-arn {SERVICE_ROLE_ARN} \
    --output-uri s3://{BUCKET_NAME}/{PREFIX} \
    {MAIN_WDL_FILE} input1=value1 input2=value2 ...

Quick start

Prerequisites: an up-to-date AWS CLI installed locally and configured with full AdministratorAccess to your AWS account.

S3 bucket

Create an S3 bucket with a test input file.

AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
AWS_DEFAULT_REGION=$(aws configure get region)

aws s3 mb --region "$AWS_DEFAULT_REGION" "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics"
echo test | aws s3 cp - s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt

Service role

Create an IAM service role for your Omics workflow runs to use (to access S3, ECR, etc.).

aws iam create-role --role-name poweromics --assume-role-policy-document '{
    "Version":"2012-10-17",
    "Statement":[{
        "Effect":"Allow",
        "Action":"sts:AssumeRole",
        "Principal":{"Service":"omics.amazonaws.com"}
    }]
}'

aws iam attach-role-policy --role-name poweromics \
    --policy-arn arn:aws:iam::aws:policy/PowerUserAccess

WARNING: PowerUserAccess, suggested here only for brevity, is far more powerful than needed. See the Omics docs on service roles for the least privileges necessary, especially if you plan to use third-party WDL and/or Docker images.
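
For example, here is a rough sketch of a narrower inline policy scoped to the quick-start resources; substitute your bucket name (e.g. ${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics). The statement list is illustrative, not the definitive least-privilege set from the Omics docs.

aws iam put-role-policy --role-name poweromics --policy-name omics-quickstart --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::{BUCKET_NAME}",
                "arn:aws:s3:::{BUCKET_NAME}/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "ecr:GetDownloadUrlForLayer",
                "ecr:BatchGetImage",
                "ecr:BatchCheckLayerAvailability"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogStreams"
            ],
            "Resource": "*"
        }
    ]
}'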

ECR repository

Create an ECR repository suitable for Omics to pull Docker images from.

aws ecr create-repository --repository-name omics
aws ecr set-repository-policy --repository-name omics --policy-text '{
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "omics workflow",
        "Effect": "Allow",
        "Principal": {"Service": "omics.amazonaws.com"},
        "Action": [
            "ecr:GetDownloadUrlForLayer",
            "ecr:BatchGetImage",
            "ecr:BatchCheckLayerAvailability"
        ]
    }]
}'

Push a plain Ubuntu image to the repository.

ECR_ENDPT="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"
aws ecr get-login-password | docker login --username AWS --password-stdin "$ECR_ENDPT"

docker pull ubuntu:22.04
docker tag ubuntu:22.04 "${ECR_ENDPT}/omics:ubuntu-22.04"
docker push "${ECR_ENDPT}/omics:ubuntu-22.04"

Run test workflow

pip3 install miniwdl-omics-run

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/out" \
    https://raw.githubusercontent.com/miniwdl-ext/miniwdl-omics-run/main/test/TestFlow.wdl \
    input_txt_file="s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt" \
    docker="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/omics:ubuntu-22.04"

This zips up the specified WDL, registers it as an Omics workflow, validates the given inputs, and starts the workflow run.
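
The run can then be monitored from the Omics console or with the AWS CLI, for example (substitute the run ID reported when the run starts):

aws omics list-runs
aws omics get-run --id {RUN_ID}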

The WDL source may be given as a local filename or a public HTTP(S) URL. The tool automatically bundles any WDL files imported by the main one, and on subsequent invocations it reuses the previously registered workflow if the source code hasn't changed.
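
For example, a run from a local WDL file looks the same as with a URL (the file path and input name here are hypothetical); any WDL files it imports are bundled alongside it:

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/myflow/out" \
    workflows/main.wdl \
    sample_id=demo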

The command-line interface accepts WDL inputs with the input_key=value syntax, exactly like miniwdl run, including the option of a JSON file via --input FILE.json. Each File input must be an existing S3 URI that the service role can access.
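
For example, the quick-start inputs could equivalently be supplied through a JSON file (the filename test_inputs.json is arbitrary):

cat > test_inputs.json << EOF
{
    "input_txt_file": "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt",
    "docker": "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/omics:ubuntu-22.04"
}
EOF

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/out" \
    https://raw.githubusercontent.com/miniwdl-ext/miniwdl-omics-run/main/test/TestFlow.wdl \
    --input test_inputs.json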

Advice

  • Omics can use Docker images only from ECR repositories in the same account and region as the run.
    • This often means pulling, re-tagging, and pushing images, as illustrated above with ubuntu:22.04 (see also the sketch after this list).
    • It also often means editing any WDL tasks that hard-code public registry images in their runtime.docker.
    • Each ECR repository must have the Omics-specific repository policy set as shown above.
    • We therefore tend to use a single ECR repository for multiple Docker images, disambiguating them using lengthier tags.
    • If you prefer to use per-image repositories, just remember to set the repository policy on each one.
  • To quickly list a workflow's inputs, try miniwdl run workflow.wdl ?
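
A sketch of that re-tagging pattern for a third-party image (the source image is only an example; ECR_ENDPT is the registry endpoint defined in the quick start):

docker pull quay.io/biocontainers/samtools:1.17--h00cdaf9_0
docker tag quay.io/biocontainers/samtools:1.17--h00cdaf9_0 "${ECR_ENDPT}/omics:samtools-1.17--h00cdaf9_0"
docker push "${ECR_ENDPT}/omics:samtools-1.17--h00cdaf9_0"

The corresponding WDL task's runtime.docker would then reference ${ECR_ENDPT}/omics:samtools-1.17--h00cdaf9_0, either hard-coded or passed in as an input like the docker input of TestFlow.wdl.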

