WDL launcher for Amazon Omics

miniwdl-omics-run

This command-line tool makes it easier to launch WDL workflow runs on Amazon Omics. It uses miniwdl locally to register WDL workflows with the service, validate command-line inputs, and start a run.

pip3 install miniwdl-omics-run

miniwdl-omics-run \
    --role-arn {SERVICE_ROLE_ARN} \
    --output-uri s3://{BUCKET_NAME}/{PREFIX} \
    {MAIN_WDL_FILE} input1=value1 input2=value2 ...

Quick start

Prerequisites: an up-to-date AWS CLI installed locally and configured with full AdministratorAccess to your AWS account.
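
A quick way to sanity-check that setup (not part of the tool itself) is to confirm the CLI resolves an account and a default region:

aws --version
aws sts get-caller-identity --query Account --output text
aws configure get region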

S3 bucket

Create an S3 bucket with a test input file.

AWS_ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
AWS_DEFAULT_REGION=$(aws configure get region)

aws s3 mb --region "$AWS_DEFAULT_REGION" "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics"
echo test | aws s3 cp - "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt"

Service role

Create an IAM service role for your Omics workflow runs to use (to access S3, ECR, etc.).

aws iam create-role --role-name poweromics --assume-role-policy-document '{
    "Version":"2012-10-17",
    "Statement":[{
        "Effect":"Allow",
        "Action":"sts:AssumeRole",
        "Principal":{"Service":"omics.amazonaws.com"}
    }]
}'

aws iam attach-role-policy --role-name poweromics \
    --policy-arn arn:aws:iam::aws:policy/PowerUserAccess

WARNING: PowerUserAccess, suggested here only for brevity, is far more powerful than needed. See Omics docs on service roles for the least privileges necessary, especially if you plan to use third-party WDL and/or Docker images.
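
As a rough sketch of what a narrower grant might look like, an inline policy along these lines could stand in for PowerUserAccess; the policy name, the exact action list, and the log permissions below are assumptions for illustration only, so treat the Omics documentation as the authoritative minimum:

aws iam put-role-policy --role-name poweromics --policy-name omics-minimal-access \
    --policy-document "$(cat << EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics",
                "arn:aws:s3:::${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "ecr:GetDownloadUrlForLayer",
                "ecr:BatchGetImage",
                "ecr:BatchCheckLayerAvailability"
            ],
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogStreams"
            ],
            "Resource": "*"
        }
    ]
}
EOF
)"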

ECR repository

Create an ECR repository suitable for Omics to pull Docker images from.

aws ecr create-repository --repository-name omics
aws ecr set-repository-policy --repository-name omics --policy-text '{
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "omics workflow",
        "Effect": "Allow",
        "Principal": {"Service": "omics.amazonaws.com"},
        "Action": [
            "ecr:GetDownloadUrlForLayer",
            "ecr:BatchGetImage",
            "ecr:BatchCheckLayerAvailability"
        ]
    }]
}'

Push a plain Ubuntu image to the repository.

ECR_ENDPT="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"
aws ecr get-login-password | docker login --username AWS --password-stdin "$ECR_ENDPT"

docker pull ubuntu:22.04
docker tag ubuntu:22.04 "${ECR_ENDPT}/omics:ubuntu-22.04"
docker push "${ECR_ENDPT}/omics:ubuntu-22.04"

Run test workflow

pip3 install miniwdl-omics-run

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/out" \
    https://raw.githubusercontent.com/miniwdl-ext/miniwdl-omics-run/main/test/TestFlow.wdl \
    input_txt_file="s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt" \
    docker="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/omics:ubuntu-22.04"

This zips up the specified WDL, registers it as an Omics workflow, validates the given inputs, and starts the workflow run.

The main WDL source may be given as a local filename or a public HTTP(S) URL. The tool automatically bundles any WDL files imported by the main one. On subsequent invocations, it reuses the previously registered workflow if the source code hasn't changed.
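
For example, with a hypothetical local layout in which main.wdl imports tasks/align.wdl (the filenames and the sample_name input are illustrative), pointing the launcher at main.wdl zips and registers both files:

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/out" \
    main.wdl sample_name=NA12878

Re-running the same command without changing main.wdl or its imports should reuse the already-registered workflow rather than registering a new one.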

The command-line interface accepts WDL inputs using the same input_key=value syntax as miniwdl run, including the option of a JSON file with --input FILE.json. Each File input must be an existing S3 URI accessible to the service role.
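
For instance, the test inputs above could be supplied from a JSON file instead; the file name is illustrative, and --input is placed here alongside the other options:

cat > test_inputs.json << EOF
{
    "input_txt_file": "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/test.txt",
    "docker": "${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com/omics:ubuntu-22.04"
}
EOF

miniwdl-omics-run \
    --role-arn arn:aws:iam::${AWS_ACCOUNT_ID}:role/poweromics \
    --output-uri "s3://${AWS_ACCOUNT_ID}-${AWS_DEFAULT_REGION}-omics/test/out" \
    --input test_inputs.json \
    https://raw.githubusercontent.com/miniwdl-ext/miniwdl-omics-run/main/test/TestFlow.wdl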

Advice

  • Omics can only pull Docker images from private ECR repositories in the same AWS account and region.
    • This often means pulling, re-tagging, and pushing images, as illustrated above with ubuntu:22.04.
    • It may also mean editing any WDL tasks that hard-code public registry images in their runtime.docker.
    • Each ECR repository must have the Omics-specific repository policy set as shown above.
    • We therefore tend to use a single ECR repository for multiple Docker images, disambiguating them with lengthier tags (see the sketch after this list).
    • If you prefer to use per-image repositories, just remember to set the repository policy on each one.
  • To quickly list a workflow's inputs, try miniwdl run workflow.wdl ?
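
As a sketch of that single-repository convention, assuming ECR_ENDPT from above and purely hypothetical source images:

docker tag my-registry.example.com/bwa:0.7.17      "${ECR_ENDPT}/omics:bwa-0.7.17"
docker tag my-registry.example.com/samtools:1.17   "${ECR_ENDPT}/omics:samtools-1.17"
docker push "${ECR_ENDPT}/omics:bwa-0.7.17"
docker push "${ECR_ENDPT}/omics:samtools-1.17"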
