
Tool to submit genomics pipeline outputs to the ENCODE Portal

Project description


accession is a Python module and command line tool for submitting genomics pipeline analysis output files and metadata to the ENCODE Portal.

Installation

Note: installation requires Python >= 3.6

$ pip install accession

Next, provide your API keys from the ENCODE portal:

$ export DCC_API_KEY=XXXXXXXX
$ export DCC_SECRET_KEY=yyyyyyyyyyy

It is highly recommended to set the DCC_LAB and DCC_AWARD environment variables for ease of use. These correspond to the lab and award identifiers given by the ENCODE portal, e.g. /labs/foo/ and U00HG123456, respectively.

$ export DCC_LAB=XXXXXXXX
$ export DCC_AWARD=yyyyyyyyyyy

If you are accessioning workflows produced using the Caper local backend, then installation is complete. However, if using WDL metadata from pipeline runs on Google Cloud, you will also need to authenticate with Google Cloud. Run the following two commands and follow the prompts:

$ gcloud auth login --no-launch-browser
$ gcloud auth application-default login --no-launch-browser

If you would like to be able to pass Caper workflow IDs or labels, you will need to configure access to the Caper server. If you are invoking accession from a machine where you already have Caper set up, and you have the Caper configuration file available at ~/.caper/default.conf, then there is no extra setup required. If the Caper server is on another machine, you will need to configure HTTP access to it by setting the hostname and port values in the Caper conf file, as in the sketch below.
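
A minimal sketch of the relevant lines in ~/.caper/default.conf, assuming the usual key=value format of the Caper conf file and using a placeholder address (203.0.113.10) and the port 8000; substitute your Caper server's actual hostname or IP and port:

hostname=203.0.113.10
port=8000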

(Optional) Finally, to enable using Cloud Tasks to upload files from Google Cloud Storage to AWS S3, set the following two environment variables. If one or more of them is not set, then files will be uploaded using the same machine that the accessioning code is run from. For more information on how to set up Cloud Tasks and the upload service, see the docs for the gcs-s3-transfer-service.

$ export ACCESSION_CLOUD_TASKS_QUEUE_NAME=my-queue
$ export ACCESSION_CLOUD_TASKS_QUEUE_REGION=us-west1

To accession workflows produced on the AWS backend, you will need to set up AWS credentials. The easiest way to do this is to install the AWS CLI and run aws configure, as shown in the sketch below.
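
The interactive session looks roughly like this (the key, secret, and region values are placeholders):

$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: yyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyyy
Default region name [None]: us-west-2
Default output format [None]: json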

Usage

$ accession -m metadata.json \
            -p mirna \
            -s dev

Please see the docs for greater detail on these input parameters.

Deploying on Google Cloud

First, authenticate with Google Cloud via gcloud auth login if needed. Then install the API client with pip install google-api-python-client; it is recommended to do this inside of a venv. Next, create the firewall rule and deploy the instance by running python deploy.py --project $PROJECT. This will also install the accession package. Finally, SSH onto the new instance and run gcloud auth login to authenticate on the instance. The full sequence is sketched below.
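
A sketch of the deployment steps, assuming deploy.py is run from a checkout of this repository and that a fresh venv is used (the venv path is arbitrary):

$ gcloud auth login
$ python3 -m venv venv && . venv/bin/activate
$ pip install google-api-python-client
$ python deploy.py --project $PROJECT

Then, on the new instance:

$ gcloud auth login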

For Caper integration, once the instance is up, SSH onto it and create the Caper conf file at ~/.caper/default.conf, using the private IP of the Caper VM instance as the hostname and 8000 as the port. For the connection to work, the Caper VM will need to have the network tag caper-server; one way to add it is shown below. Also note that the deployment assumes the Cromwell server port is set to 8000.
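
If the tag is missing, it could be added with a command along these lines (CAPER-VM and the zone are placeholders for the actual instance name and zone):

$ gcloud compute instances add-tags CAPER-VM --tags caper-server --zone us-west1-a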

AWS Notes

To enable S3 to S3 copy from the pipeline buckets to the ENCODE buckets, ensure that the pipeline bucket policy grants read access to the ENCODE account. Here is an example policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DelegateS3AccessGet",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::618537831167:root",
                    "arn:aws:iam::159877419961:root"
                ]
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::PIPELINE-BUCKET/*"
        },
        {
            "Sid": "DelegateS3AccessList",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::618537831167:root",
                    "arn:aws:iam::159877419961:root"
                ]
            },
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::PIPELINE-BUCKET"
        }
    ]
}
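
Assuming the policy above is saved locally as policy.json (a hypothetical filename), it could be attached to the pipeline bucket with the AWS CLI:

$ aws s3api put-bucket-policy --bucket PIPELINE-BUCKET --policy file://policy.json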

Project Information

accession is released under the MIT license. Documentation lives on Read the Docs, the code is hosted on GitHub, and releases are published on PyPI.

Project details


Release history

This version

4.7.1

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

accession-4.7.1.tar.gz (100.7 kB)

Uploaded Source

Built Distribution

accession-4.7.1-py3-none-any.whl (131.0 kB)

Uploaded Python 3

File details

Details for the file accession-4.7.1.tar.gz.

File metadata

  • Download URL: accession-4.7.1.tar.gz
  • Upload date:
  • Size: 100.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.6.10

File hashes

Hashes for accession-4.7.1.tar.gz

  • SHA256: ba1c4198ec1ddbb2d6736d8a84c8ec0dea972880fae5ca6a3ac9772da669de59
  • MD5: 4a535ef58cedc7ee344d08406d39575c
  • BLAKE2b-256: 58d9efe27bb8276b05699fc119d0bade9ba90e401d8418f1cf50df404b1bad8d

See more details on using hashes here.
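
As a rough sketch, the SHA256 hash of the source distribution could be verified locally after downloading it with pip (assuming a Linux environment with sha256sum available):

$ pip download accession==4.7.1 --no-deps --no-binary :all:
$ sha256sum accession-4.7.1.tar.gz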

File details

Details for the file accession-4.7.1-py3-none-any.whl.

File metadata

  • Download URL: accession-4.7.1-py3-none-any.whl
  • Upload date:
  • Size: 131.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.0 requests-toolbelt/0.9.1 tqdm/4.42.1 CPython/3.6.10

File hashes

Hashes for accession-4.7.1-py3-none-any.whl

  • SHA256: b509f8e9acc3906a3fec2a7024f5b29f7ea19b6a733122c6538a6be1281cd840
  • MD5: 0965c2d51aa7eab4d0d414afab0e19af
  • BLAKE2b-256: e6936a65bdc7eff07e7e4e9abf9cd54f0b0a575b18aabe97632b312f46eb15cf

See more details on using hashes here.
