# vptstools

Tools to work with vertical profile time series.

vptstools is a Python library to transfer and convert vpts data. VPTS (vertical profile time series) express the density, speed and direction of biological signals such as birds, bats and insects within a weather radar volume, grouped into altitude layers (height) and measured over time (datetime).

## Installation

Python 3.9+ is required.

```
pip install vptstools
```

If you need the tools/services to transfer data (SFTP, S3), install the additional dependencies:

```
pip install vptstools[transfer]
```

## Usage

As a library user interested in working with ODIM h5 and VPTS files, the most important functions provided by the package are {py:func}`vptstools.vpts.vp`, {py:func}`vptstools.vpts.vpts` and {py:func}`vptstools.vpts.vpts_to_csv`, which can be used to convert a single h5 file, convert a set of h5 files and save a vpts DataFrame to a CSV file, respectively:

- Convert a single local ODIM h5 file to a vp DataFrame:

  ```python
  from vptstools.vpts import vp

  file_path_h5 = "./NLDBL_vp_20080215T0010_NL50_v0-3-20.h5"
  df_vp = vp(file_path_h5)
  ```

- Convert a set of locally stored ODIM h5 files to a vpts DataFrame:

  ```python
  from pathlib import Path
  from vptstools.vpts import vpts

  # Get all h5 files within the data directory
  file_paths = sorted(Path("./data").rglob("*.h5"))
  df_vpts = vpts(file_paths)
  ```

- Store a vp or vpts DataFrame to a VPTS CSV file:

  ```python
  from vptstools.vpts import vpts_to_csv

  vpts_to_csv(df_vpts, "vpts.csv")
  ```
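
Putting these steps together, a minimal end-to-end sketch (with `./data` and the output file name as placeholders) converts all locally stored h5 files and writes them to a single VPTS CSV file:

```python
from pathlib import Path

from vptstools.vpts import vpts, vpts_to_csv

# Combine all ODIM h5 vp files under ./data into one vpts DataFrame
# and write the result to a VPTS CSV file
file_paths = sorted(Path("./data").rglob("*.h5"))
vpts_to_csv(vpts(file_paths), "vpts.csv")
```
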
Both {py:func}`vptstools.vpts.vp` and {py:func}`vptstools.vpts.vpts` accept two other optional parameters related to the
[VPTS-CSV data exchange format](https://aloftdata.eu/vpts-csv/). The `vpts_csv_version` parameter defines the version of the
[VPTS-CSV data exchange standard](https://aloftdata.eu/vpts-csv/) (default v1), whereas `source_file` provides a way to define
a custom [source_file](https://aloftdata.eu/vpts-csv/#source_file) field to reference the source from which the data were derived.
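
For example, a sketch that pins the VPTS-CSV version and overrides the `source_file` reference (the h5 file is the one from the example above; the `source_file` value shown is purely illustrative):

```python
from vptstools.vpts import vp

# Convert a single ODIM h5 file, explicitly pinning the VPTS-CSV version and
# passing a custom source_file reference (illustrative value)
df_vp = vp(
    "./NLDBL_vp_20080215T0010_NL50_v0-3-20.h5",
    vpts_csv_version="v1",
    source_file="NLDBL_vp_20080215T0010_NL50_v0-3-20.h5",
)
```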

To validate a vpts DataFrame against the frictionless data schema defined by the VPTS-CSV data exchange format and return a report, use {py:func}`vptstools.vpts.validate_vpts`:

```python
from vptstools.vpts import validate_vpts

report = validate_vpts(df_vpts, version="v1")
report.stats["errors"]
```
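
The returned object is a frictionless validation report; assuming a recent frictionless version, its boolean `valid` attribute offers a quick pass/fail check:

```python
# report.valid is True when df_vpts conforms to the VPTS-CSV schema
if not report.valid:
    print(f"{report.stats['errors']} validation error(s) found")
```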

Other modules in the package are:

- {py:mod}`vptstools.odimh5`: This module extends the implementation of the original odimh5 package, which is now deprecated.
- {py:mod}`vptstools.vpts_csv`: This module contains, for each version of the VPTS-CSV exchange format, the corresponding implementation that can be used to generate a vp or vpts DataFrame. For more information on how to support a new version of the VPTS-CSV format, see the contributing docs.
- {py:mod}`vptstools.s3`: This module contains the functions to manage the [Aloft data repository](https://aloftdata.eu/browse/) S3 bucket.

## CLI endpoints

In addition to using functions in Python scripts, two vptstools routines are available to be called from the command line after installing the package:

.. include:: click.rst
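
As an illustration, each routine exposes a `--help` option describing its arguments; the command names below are assumed from the package's CLI documentation and may differ in your installed version:

```
transfer_baltrad --help
vph5_to_vpts --help
```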

## Development instructions

See contributing for a detailed overview and set of guidelines. If you are familiar with tox, setting up a development environment boils down to:

```
tox -e dev                # Create development environment with venv and register an ipykernel
source venv/bin/activate  # Activate this environment to get started
```

Next, the following commands are available to support development:

```
tox              # Run the unit tests
tox -e docs      # Invoke sphinx-build to build the docs
tox -e format    # Run black code formatting

tox -e clean     # Remove old distribution files and temporary build artifacts (./build and ./dist)
tox -e build     # Build the package wheels and tar

tox -e linkcheck # Check for broken links in the documentation

tox -e publish   # Publish the package to a package index server (testpypi by default).
                 # To publish publicly to PyPI, use the `-- --repository pypi` option.
tox -av          # List all available tasks
```

To create a pinned `requirements.txt` set of dependencies, pip-tools is used:

```
pip-compile --extra transfer --resolver=backtracking
```

## Notes

- This project has been set up using PyScaffold 4.3.1. For details and usage information on PyScaffold, see https://pyscaffold.org/.
- The odimh5 module was originally developed and released to PyPI as a separate odimh5 package by Nicolas Noé (@niconoe). Version 0.1.0 has been included in this vptstools package.
