gnomonicus

Python code to integrate results of tb-pipeline and provide an antibiogram, mutations and variations

Provides a library of functions for use within scripts, as well as a CLI tool for linking the functions together to produce output

Usage

usage: gnomonicus [-h] --vcf_file VCF_FILE --genome_object GENOME_OBJECT [--catalogue_file CATALOGUE_FILE]
              [--ignore_vcf_filter] [--progress] [--output_dir OUTPUT_DIR] [--json] [--alt_json] [--fasta FASTA]

options:
  -h, --help            show this help message and exit
  --vcf_file VCF_FILE   the path to a single VCF file
  --genome_object GENOME_OBJECT
                        the path to a compressed gumpy Genome object or a genbank file
  --catalogue_file CATALOGUE_FILE
                        the path to the resistance catalogue
  --ignore_vcf_filter   whether to ignore the FILTER field in the vcf (e.g. necessary for some versions of
                        Clockwork VCFs)
  --progress            whether to show progress using tqdm
  --output_dir OUTPUT_DIR
                        Directory to save output files to. Defaults to wherever the script is run from.
  --json                Flag to create a single JSON output as well as the CSVs
  --alt_json            Whether to produce the alternate JSON format. Requires the --json flag too
  --fasta FASTA         Use to output a FASTA file of the resultant genome. Specify either 'fixed' or 'variable'
                        for fixed length and variable length FASTA respectively.
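A typical invocation combining the documented flags might look like the following (the file paths are placeholders, not files shipped with the package):

```shell
# Run gnomonicus on a single VCF, writing the CSVs plus a combined JSON
# and a fixed-length FASTA to ./out; input paths are hypothetical
gnomonicus \
    --vcf_file sample.vcf \
    --genome_object NC_000962.3.gbk \
    --catalogue_file catalogue.csv \
    --output_dir out \
    --json \
    --fasta fixed
```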

Helper usage

As the main script can utilise pickled gumpy.Genome objects, a helper script is supplied. It converts a GenBank file into a pickled gumpy.Genome, which saves significant time on subsequent runs. Due to the security implications of the pickle module, DO NOT SEND/RECEIVE PICKLES. This script should be run on a host VM before running Nextflow, to avoid re-instantiating the genome on every run. It supports gzip compression (via the --compress flag) to reduce the file size significantly.

usage: gbkToPkl FILENAME [--compress]
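The save/load pattern the helper relies on can be sketched with the standard library alone. The real script pickles a gumpy.Genome built from a GenBank file; a plain dict stands in here so the sketch has no third-party dependencies, and the file name is arbitrary:

```python
import gzip
import pickle
import tempfile
import os

# Placeholder for the gumpy.Genome object gbkToPkl would build from a
# GenBank file (values here are illustrative only)
genome = {"name": "NC_000962.3", "length": 4411532}

path = os.path.join(tempfile.gettempdir(), "genome.pkl.gz")

# Write a gzip-compressed pickle (the effect of the --compress flag)
with gzip.open(path, "wb") as fh:
    pickle.dump(genome, fh)

# Reload it later without re-parsing the GenBank file
with gzip.open(path, "rb") as fh:
    reloaded = pickle.load(fh)

assert reloaded == genome
```

As the README warns, unpickling executes arbitrary code, so this round-trip is only safe for pickles you created yourself on the same host.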

Install

Simple install using pip for the latest release

pip install gnomonicus

Install from source

git clone https://github.com/oxfordmmm/gnomonicus.git
cd gnomonicus
pip install -e .

Docker

A Docker image is built on each release. To open a shell with gnomonicus installed:

docker run -it oxfordmmm/gnomonicus:latest

Notes

When generating mutations, in cases of a synonymous amino acid mutation, the nucleotides changed are also included. This can lead to a mix of nucleotide and amino acid mutations for coding genes, but the nucleotide mutations are excluded from generating effects unless specified in the catalogue. This means that the default rule of gene@*= --> S remains in place regardless of the introduced gene@*?, which would otherwise take precedence. For example:

  'MUTATIONS': [
      {
          'MUTATION': 'F2F',
          'GENE': 'S',
          'GENE_POSITION': 2
      },
      {
          'MUTATION': 't6c',
          'GENE': 'S',
          'GENE_POSITION': 6
      },
  ],
  'EFFECTS': {
      'AAA': [
          {
              'GENE': 'S',
              'MUTATION': 'F2F',
              'PREDICTION': 'S'
          },
          {
              'PHENOTYPE': 'S'
          }
      ],

The nucleotide variation is included in the MUTATIONS, but explicitly removed from the EFFECTS unless it is specified within the catalogue. For this variation to contribute an effect, a line of S@F2F&S@t6c would have to be present in the catalogue.
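The precedence described above can be sketched as a small lookup: the joint rule is used only if the catalogue explicitly lists it, otherwise the default synonymous rule (gene@*= --> S) applies and the nucleotide variation is ignored. The function and catalogue shapes below are illustrative, not gnomonicus's actual API:

```python
def predict(mutations, catalogue, default="S"):
    """Hypothetical sketch of joint-rule precedence for synonymous
    mutations; `catalogue` maps rule strings to predictions."""
    # Build the combined rule key, e.g. 'S@F2F&S@t6c'
    joint = "&".join(sorted(mutations))
    if joint in catalogue:
        # An explicit joint rule takes precedence
        return catalogue[joint]
    # Otherwise fall back to the default synonymous rule gene@*= --> S
    return default

# No joint rule in the catalogue: the default 'S' prediction stands
print(predict(["S@F2F", "S@t6c"], {}))  # -> S

# An explicit joint rule overrides the default
print(predict(["S@F2F", "S@t6c"], {"S@F2F&S@t6c": "R"}))  # -> R
```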

User stories

  1. As a bioinformatician, I want to be able to run gnomonicus on the command line, passing it (i) a GenBank file (or pickled gumpy.Genome object), (ii) a resistance catalogue and (iii) a VCF file, and get back pandas.DataFrames of the genetic variants, mutations, effects and predictions/antibiogram. The latter is for all the drugs described in the passed resistance catalogue.

  2. As a GPAS developer, I want to be able to embed gnomonicus in a Docker image/NextFlow pipeline that consumes the outputs of tb-pipeline and emits a structured, well-designed JSON object describing the genetic variants, mutations, effects and predictions/antibiogram.

  3. In general, I would also like the option to output fixed- and variable-length FASTA files (the latter takes into account insertions and deletions described in any input VCF file).

Unit testing

For speed, rather than using NC_000962.3 (i.e. H37Rv M. tuberculosis), we use SARS-CoV-2 and have created a fictitious drug resistance catalogue, along with some VCF files and the expected outputs, in tests/.

These can be run with pytest -vv

Download files

Download the file for your platform.

Source Distribution

gnomonicus-1.1.1.tar.gz (18.3 kB)

Uploaded Source

Built Distribution

gnomonicus-1.1.1-py3-none-any.whl (18.0 kB)

Uploaded Python 3

File details

Details for the file gnomonicus-1.1.1.tar.gz.

File metadata

  • Download URL: gnomonicus-1.1.1.tar.gz
  • Upload date:
  • Size: 18.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for gnomonicus-1.1.1.tar.gz

  • SHA256: a8b907361efcb2c52bab49258b388dd1623d36b12e5037df2d7e55b664e1444f
  • MD5: c1dff3656ea699b5dede676ecc6dbd33
  • BLAKE2b-256: ccdd76244b54e000a2b8e7e2cf54375d81ec1a9ab4ddc252d67e93ffa70bd55e


File details

Details for the file gnomonicus-1.1.1-py3-none-any.whl.

File metadata

  • Download URL: gnomonicus-1.1.1-py3-none-any.whl
  • Upload date:
  • Size: 18.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.14

File hashes

Hashes for gnomonicus-1.1.1-py3-none-any.whl

  • SHA256: 0a9e5ac98ec0476c8883a170e0a840d3e3492b03412fae0c123843a375f6238d
  • MD5: e48b2a189598ccef6c4715f286998d82
  • BLAKE2b-256: fe2d5a53e731de47298f38b55e61e23f1cc35320e259f462772dcbc380d7f4db

