
accession

Python module and command line tool to submit genomics pipeline analysis output files and metadata to the ENCODE Portal

Installation

Install the module with pip:

$ pip install accession

Setting environment variables

You will need ENCODE DCC credentials from the ENCODE Portal. Set them in your shell like so:

$ export DCC_API_KEY=XXXXXXXX
$ export DCC_SECRET_KEY=yyyyyyyyyyy

You will also need Google Application Credentials in your environment. Obtain and set your service account credentials:

$ export GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
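Before running the tool, it can be useful to verify that all three variables are set. A minimal sketch, assuming only the variable names above (the helper `check_credentials` is illustrative, not part of the accession API):

```python
import os

# The three environment variables the tool expects, per the README above.
REQUIRED_VARS = ("DCC_API_KEY", "DCC_SECRET_KEY", "GOOGLE_APPLICATION_CREDENTIALS")

def check_credentials(environ=os.environ):
    """Return the names of any required credential variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]
```

Calling `check_credentials()` before invoking `accession` gives a clearer error than a mid-run authentication failure.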

Usage

$ accession --accession-metadata metadata.json \
            --accession-steps steps.json \
            --server dev \
            --lab /labs/encode-processing-pipeline/ \
            --award U41HG007000

Arguments

Metadata JSON

This file is the metadata output of a pipeline analysis run. It records all of the tasks that ran and the files they produced.
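Assuming the metadata follows Cromwell conventions (a top-level "calls" dict mapping workflow-prefixed task names to a list of attempts, each with an "outputs" dict), a produced file's path could be looked up like this. The helper and the metadata layout are assumptions for illustration, not the accession API:

```python
def get_task_output(metadata, task_name, filekey):
    """Return the output path stored under `filekey` for the named task.

    Assumes Cromwell-style metadata: metadata["calls"] maps names like
    "atac.filter" to a list of attempts, each holding an "outputs" dict.
    """
    for call_name, attempts in metadata.get("calls", {}).items():
        # Cromwell prefixes task names with the workflow name, e.g. "atac.filter".
        if call_name.split(".")[-1] == task_name:
            # Use the latest attempt's outputs.
            return attempts[-1].get("outputs", {}).get(filekey)
    return None
```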

Accession Steps

The accessioning steps configuration file specifies the task and file names in the output metadata JSON and the order in which the files and metadata will be submitted. Accessioning code will selectively submit the specified files to the ENCODE Portal. A single step is configured in the following way:

{
        "dcc_step_version":     "/analysis-step-versions/kundaje-lab-atac-seq-trim-align-filter-step-v-1-0/",
        "dcc_step_run":         "atac-seq-trim-align-filter-step-run-v1",
        "wdl_task_name":        "filter",
        "wdl_files":            [
            {
                "filekey":                  "nodup_bam",
                "output_type":              "alignments",
                "file_format":              "bam",
                "quality_metrics":          ["cross_correlation", "samtools_flagstat"],
                "derived_from_files":       [{
                    "derived_from_task":        "trim_adapter",
                    "derived_from_filekey":     "fastqs",
                    "derived_from_inputs":      "true"
                }]
            }
        ]
}

dcc_step_version and dcc_step_run must already exist on the ENCODE Portal.

wdl_task_name is the name of the task that has the files to be accessioned.

wdl_files specifies the set of files to be accessioned.

filekey is the key under which the file's path is stored in the metadata JSON.

output_type, file_format, and file_format_type are ENCODE-specific metadata required by the Portal.

quality_metrics is a list of methods that will be called during accessioning to attach quality metrics to the file.

possible_duplicate indicates that other files may have identical content. If the possible_duplicate flag is set and the current file's md5sum is identical to that of another file in the same task, the current file will not be accessioned. Optimal IDR peaks and conservative IDR peaks are an example of a file pair that can share an md5sum.

derived_from_files specifies the list of files the current file being accessioned derives from. The parent files must have been accessioned before the current file can be submitted.

derived_from_inputs indicates that the parent files were not produced during the pipeline analysis; instead, they are initial inputs to the pipeline. Raw fastqs and genome references are examples of such files.

derived_from_output_type is required when the parent file has a possible duplicate.
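Taken together, the required keys described above can be checked with a small validator before submission. This is an illustrative sketch built from the field descriptions, not part of the accession API:

```python
# Required keys, per the field descriptions above.
REQUIRED_STEP_KEYS = ("dcc_step_version", "dcc_step_run", "wdl_task_name", "wdl_files")
REQUIRED_FILE_KEYS = ("filekey", "output_type", "file_format")

def validate_step(step):
    """Return a list of missing required keys in one accessioning step config."""
    problems = [key for key in REQUIRED_STEP_KEYS if key not in step]
    for wdl_file in step.get("wdl_files", []):
        problems += [key for key in REQUIRED_FILE_KEYS if key not in wdl_file]
    return problems
```

Running the validator over each entry in steps.json would surface misconfigured steps before any files are uploaded.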

Server

prod and dev indicate the server the files are accessioned to: dev points to test.encodedcc.org. The server parameter can also be passed explicitly as test.encodedcc.org or encodeproject.org.
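The alias resolution described here can be sketched as follows (a hypothetical helper; the real CLI's handling may differ):

```python
# Aliases described in this section; full hostnames pass through unchanged.
SERVER_ALIASES = {
    "prod": "encodeproject.org",
    "dev": "test.encodedcc.org",
}

def resolve_server(value):
    """Map the --server argument to a portal hostname."""
    return SERVER_ALIASES.get(value, value)
```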

Lab and Award

These are unique identifiers that are expected to be already present on the ENCODE Portal.
