LivingPark utils

A collection of utility functions to write LivingPark notebooks.

Usage examples:

import livingpark_utils
from livingpark_utils import download
from livingpark_utils.clinical import moca2mmse
from livingpark_utils.dataset import ppmi

utils = livingpark_utils.LivingParkUtils()
downloader = download.ppmi.Downloader(utils.study_files_dir)

utils.notebook_init()
utils.get_study_files(["Demographics.csv"], default=downloader)
utils.get_T1_nifti_files(
    cohort, default=downloader
)  # `cohort` is of type: pd.DataFrame

# `x` is a row of the cohort DataFrame (see the sketch below)
ppmi.find_nifti_file_in_cache(x["PATNO"], x["EVENT_ID"], x["Description"])
ppmi.disease_duration()

moca2mmse(2)
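
For example, find_nifti_file_in_cache is typically applied row by row to a cohort DataFrame. A minimal sketch, assuming cohort has the PPMI columns used above and that the function returns a file path:

# Map each cohort row to the path of its cached NIfTI file.
# "File name" is an illustrative column name, not a library convention.
cohort["File name"] = cohort.apply(
    lambda x: ppmi.find_nifti_file_in_cache(
        x["PATNO"], x["EVENT_ID"], x["Description"]
    ),
    axis=1,
)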

Exclude subjects from a cohort without leaking patient information.

from livingpark_utils.ignore import (
    insert_hash,
    remove_ignored,
)

# Assuming a cohort definition stored in a DataFrame named `cohort`.
cohort = insert_hash(cohort, columns=["PATNO", "EVENT_ID", "Description"])
remove_ignored(cohort, ignore_file=".ppmiignore")
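
The idea is that insert_hash adds a hash column computed from the identifying columns, so the ignore file can list excluded subjects by hash rather than by raw patient identifiers.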

To execute utility notebooks:

from livingpark_utils.scripts import run

run.mri_metadata()
run.pd_status()

Note: Optionally use the %%capture cell magic to further hide notebook outputs.
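
For example, at the top of a notebook cell (a minimal sketch using the standard IPython cell magic):

%%capture
from livingpark_utils.scripts import run

run.mri_metadata()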

CLI commands

Download T1 NIfTI files using a cohort definition file.

$ get_T1_nifti_files <cohort_file> --downloader (ppmi) [--symlink=<bool>]
[--force=<bool>] [--timeout=<int>]

The cohort_file is a CSV file created beforehand. Depending on the chosen downloader, it must have the following columns:

  • PPMI: PATNO, EVENT_ID, and Description.
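
For example, a hypothetical invocation (the file name and option values are illustrative):

$ get_T1_nifti_files cohort.csv --downloader ppmi --symlink=true --timeout=600

Such a file can be written from a cohort DataFrame with, e.g., cohort[["PATNO", "EVENT_ID", "Description"]].to_csv("cohort.csv", index=False).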

Troubleshooting

Permission issues on Windows

We use symbolic links when creating the folder for cached data. Unfortunately, Windows does not allow users to create symbolic links by default. To fix this issue on your machine, please follow the guide from this blog post.

Contributing guidelines

We welcome contributions of any kind in the form of pull requests to this repository. See also the LivingPark contributing guidelines.

Make sure to:

  • Use Python type annotations
  • Include Python docstrings in numpydoc format for all functions (see the sketch below)
  • Format docstrings
  • Run psf/black on the files you modify
  • Run pre-commit run --all before committing; this will be checked in your PR
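
An illustrative sketch of these conventions (the function and its behavior are hypothetical, not part of livingpark_utils):

def scale_score(score: int, factor: float = 1.5) -> float:
    """Scale a clinical score by a constant factor.

    Parameters
    ----------
    score : int
        Raw score to scale.
    factor : float, optional
        Multiplicative factor. Default: 1.5.

    Returns
    -------
    float
        Scaled score.
    """
    return float(score) * factor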
