Common task classes used by the DKIST science data processing pipelines

This repository works in concert with dkist-processing-core and dkist-processing-*instrument* to form the DKIST calibration processing stack.

Usage

The classes in this repository should be used as the base of any DKIST processing pipeline tasks. Science tasks should subclass ScienceTaskL0ToL1Base.

Each class is built on an abstract base class that leaves the run method for a developer to implement with the steps the task should perform. The resulting class is then used as the callable object for the workflow and scheduling engine.

Example

from dkist_processing_common.tasks.base import ScienceTaskL0ToL1Base


class RemoveArtifacts(ScienceTaskL0ToL1Base):
    def run(self):
        # Task-specific processing steps go here; the line below is just a placeholder
        total = 2 + 5

Deployment

dkist-processing-common is deployed to PyPI.
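
It can therefore be installed like any other published package:

pip install dkist-processing-common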

Development

There are two prerequisites for test execution on a local machine (one way to start both is sketched after this list):

  • Redis. A running Redis instance on the local machine is required. The tests connect to the database using the default host of localhost and port 6379.

  • RabbitMQ. A running RabbitMQ instance on the local machine is required. The tests connect to the interservice bus using the default host of localhost and port 5672.
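
If neither service is already running, one convenient option (assuming Docker is available; this is only a sketch, and any locally running instances will do) is to start the official images on their default ports:

# Sketch only: start Redis and RabbitMQ via Docker on their default ports
docker run -d --name redis -p 6379:6379 redis
docker run -d --name rabbitmq -p 5672:5672 rabbitmq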

To run the tests locally, clone the repository and install the package in editable mode with the test extras.

git clone git@bitbucket.org:dkistdc/dkist-processing-common.git
cd dkist-processing-common
pre-commit install
pip install -e .[test]
# Redis and RabbitMQ must be running
pytest -v --cov dkist_processing_common

Changelog

When you make any change to this repository, it MUST be accompanied by a changelog file. The changelog for this repository uses the towncrier package. Entries in the changelog for the next release are added as individual files (one per change) to the changelog/ directory.

Writing a Changelog Entry

A changelog entry accompanying a change should be added to the changelog/ directory. The name of a file in this directory follows a specific template:

<PULL REQUEST NUMBER>.<TYPE>[.<COUNTER>].rst

The fields have the following meanings:

  • <PULL REQUEST NUMBER>: This is the number of the pull request, so people can jump from the changelog entry to the diff on Bitbucket.

  • <TYPE>: This is the type of the change and must be one of the values described below.

  • <COUNTER>: This is an optional field. If you make more than one change of the same type, append a counter to the subsequent changes, e.g. 100.bugfix.rst and 100.bugfix.1.rst for two bugfix changes in the same PR.

The list of possible types is defined in the towncrier section of pyproject.toml; the types are:

  • feature: This change is a new code feature.

  • bugfix: This is a change which fixes a bug.

  • doc: A documentation change.

  • removal: A deprecation or removal of public API.

  • misc: Any small change which doesn’t fit anywhere else, such as a change to the package infrastructure.
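
As an illustration, a bugfix entry for pull request 100 (the example number used above) might be created like this; the file body is free-form text briefly describing the change:

# Illustration only: create a hypothetical bugfix changelog entry
cat > changelog/100.bugfix.rst << 'EOF'
Fixed a bug in one of the common task classes.
EOF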

Rendering the Changelog at Release Time

When you are about to tag a release, you must first run towncrier to render the changelog. The steps are as follows (a sketch of the commands follows this list):

  • Run towncrier build --version vx.y.z using the version number you want to tag.

  • Agree to have towncrier remove the fragments.

  • Add and commit your changes.

  • Tag the release.
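
Put together, and using a placeholder version number, those steps look roughly like this:

# Sketch only: render the changelog and tag a release (v1.2.3 is a placeholder)
towncrier build --version v1.2.3
# towncrier will offer to delete the used fragments from changelog/
git add -A
git commit -m "Render changelog for v1.2.3"
git tag v1.2.3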

NOTE: If you forget to add a changelog entry to a tagged release (either manually or automatically with towncrier) then the Bitbucket pipeline will fail. To reuse the same tag you must delete it both locally and on the remote:

# First, actually update the CHANGELOG and commit the update
git commit

# Delete tags
git tag -d vWHATEVER.THE.VERSION
git push --delete origin vWHATEVER.THE.VERSION

# Re-tag with the same version
git tag vWHATEVER.THE.VERSION
git push --tags origin main
