Management of scripts that produce/consume data with specific labels

Project description

Overview

Over the past few years, I’ve organically standardized on a structure for the code I write for my research. I’ve preferred to have each step of an analysis pipeline implemented as a standalone script, though usually with functions and classes that are importable in other modules – such scripts often load some data, perform some processing, save that processed data, save plots/figures, etc.

This package provides utilities for creating and finding labeled paths, which are suitable for storing data and plots. It’s often important to be able to compare results between different versions of some analysis step, so these paths are timestamped to prevent repeated runs of a script from overwriting previous results.
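For illustration, the newest-path lookup over timestamped directories can be sketched with pathlib alone. This is not the package's actual implementation; the `label_<timestamp>` directory naming scheme here is an assumption made purely for the sketch:

```python
from pathlib import Path


def find_newest(base_dir: Path, label: str) -> Path:
    """Return the most recent directory for `label` under `base_dir`.

    Illustration only: assumes directories are named like
    'label_<timestamp>' with lexicographically sortable timestamps
    (e.g. 'previous_script_20240101-120000'). The real package's
    naming scheme is internal to it.
    """
    candidates = sorted(base_dir.glob(f'{label}_*'))
    if not candidates:
        raise FileNotFoundError(f'No data path found for label {label!r}')
    return candidates[-1]
```

Because each run gets a fresh timestamped directory, re-running a script never clobbers earlier results, and `find_newest_data_path` always resolves to the latest run.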

This package differentiates between “data” paths, used to save things that might be loaded by another script at a later stage of an analysis pipeline, and “output” paths, for plots and other artifacts intended only for people to examine.

The main interface to this code is through the complementary functions create_data_path and find_newest_data_path, which each take a single “label” string argument and return a pathlib.Path. These can be used as follows:

from data_path_utils import create_data_path, find_newest_data_path

# Locate the newest timestamped results of an earlier pipeline step
input_path = find_newest_data_path('previous_script')
with open(input_path / 'filename') as f:
    data = load(f)  # placeholder deserialization

processed_data = do_something_with(data)  # placeholder processing

# Create a fresh timestamped directory for this step's results
data_path = create_data_path('name_of_this_script')
with open(data_path / 'whatever_filename', 'w') as f:
    save(processed_data, f)  # placeholder serialization

Output paths are likewise created by create_output_path. It is recommended that scripts which call create_data_path use the name of the script as the “label” argument, but this is not enforced – one can include parameter values or anything else relevant.
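Since the label is just a string, parameter values can be embedded directly in it. The parameter names below are hypothetical:

```python
# Hypothetical analysis parameters encoded into a label string;
# nothing here comes from the package itself.
n_clusters = 8
seed = 42
label = f'cluster_analysis_k{n_clusters}_seed{seed}'
```

The resulting label (here, `cluster_analysis_k8_seed42`) could then be passed to `create_data_path` or `find_newest_data_path` like any other label.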

Additional functionality

With these calls to create_data_path and find_newest_data_path, one can model a set of such scripts as a directed graph, with nodes representing both scripts and data paths, and edges denoting a “requires” relationship, e.g. “script X requires data label Y, which is produced by script Z”. This package also contains standalone scripts (which require the NetworkX package) that parse the Python files in a project, construct this graph, and use it to provide other useful functionality:

  • Plotting this graph, using the pydotplus package and a call to the Graphviz dot executable

  • Listing the data/script dependencies of a certain script, by performing a topological sort on the subset of the graph reachable from a certain script node. Note that this requires that the subgraph reachable from a script node be acyclic (which it should be anyway).

  • Archiving all data dependencies of a certain script, by finding all data nodes reachable from a script node, and adding all files in those data paths to a zip file.
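The graph-based operations above can be sketched with NetworkX directly. The node names below are hypothetical, and the package's actual graph-construction scripts (which parse project source files) are not reproduced here:

```python
import networkx as nx

# Toy dependency graph. Edges point from a node to what it requires:
# a script requires a data label, and a data label "requires" (is
# produced by) a script.
g = nx.DiGraph()
g.add_edge('analyze.py', 'data:cleaned')  # analyze.py requires label 'cleaned'
g.add_edge('data:cleaned', 'clean.py')    # 'cleaned' is produced by clean.py
g.add_edge('clean.py', 'data:raw')        # clean.py requires label 'raw'
g.add_edge('data:raw', 'ingest.py')       # 'raw' is produced by ingest.py

# Everything analyze.py depends on, directly or transitively:
deps = nx.descendants(g, 'analyze.py')

# Topological sort over the reachable subgraph (which must be acyclic);
# reversing it gives an execution order: producers before consumers.
sub = g.subgraph(deps | {'analyze.py'})
order = list(reversed(list(nx.topological_sort(sub))))
```

Archiving the data dependencies of `analyze.py` then amounts to iterating over the `data:` nodes in `deps` and adding the files under each corresponding data path to a zip file.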

Requirements

Python 3.6 or newer.

Things listed under “Additional functionality” require NetworkX and pydotplus.

Download files

Download the file for your platform.

Source Distribution

data-path-utils-0.1.tar.gz (7.1 kB)

Uploaded Source

Built Distribution

data_path_utils-0.1-py3-none-any.whl (11.0 kB)

Uploaded Python 3

File details

Details for the file data-path-utils-0.1.tar.gz.

File metadata

File hashes

Hashes for data-path-utils-0.1.tar.gz
Algorithm     Hash digest
SHA256        d4143fd3eec15caa748a59cd09d95abde75ff8bc97d32f95ff31aa33a92347e3
MD5           427c6fa7912907f66ff4ea40ddc62fa1
BLAKE2b-256   a18e796509eb3370a360f739ccec695e59bef6341d7abfc699ba3a1a1f31e0dd


File details

Details for the file data_path_utils-0.1-py3-none-any.whl.

File metadata

File hashes

Hashes for data_path_utils-0.1-py3-none-any.whl
Algorithm     Hash digest
SHA256        e7f0272519a5ee99707c9194c058498ffa257a0d5324aadb2e97e49babc51fd5
MD5           fda91f1493662c5ec296044ddf794174
BLAKE2b-256   b4729e27b4a7c070b9757481415e2affdf6bcb50e4ae288e4379922ea4bf38f5

